What is a Markov chain

  • We consider “dynamical systems”:

    • A finite set of variables whose values change over time. We can represent the values at each time as a state vector.

What is a state vector

  • It’s a vector whose entries give the probability of being in each state; the entries are related to one another because they must add to 1.

    • IE probability vector
      • we have three states
        • 0.3 probability of being in A
        • 0.6 probability of being in B
        • 0.1 probability of being in C
          • Note: the entries of a probability vector are nonnegative and add to 1 (0.3 + 0.6 + 0.1 = 1)
  • Ex

    • If the initial probability vector is (0.6, 0.4), what is the state after one transition? Multiply once by the transition matrix; see the sketch below.
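
A minimal sketch in Python/NumPy tying both ideas together. The 3-state probability vector matches the notes; the 2×2 transition matrix P is hypothetical, since the notes don’t specify one for the (0.6, 0.4) example.

```python
import numpy as np

# Probability vector from the notes: states A, B, C
x = np.array([0.3, 0.6, 0.1])
assert np.isclose(x.sum(), 1.0)  # entries of a probability vector sum to 1
assert (x >= 0).all()            # and are nonnegative

# One transition for the (0.6, 0.4) example.
# P is HYPOTHETICAL: the notes give no transition matrix, so any
# column-stochastic matrix (columns sum to 1) works for illustration.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
x0 = np.array([0.6, 0.4])
x1 = P @ x0                      # state after one transition
print(x1)                        # [0.62, 0.38], still sums to 1
```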

Markov chain

  • Definition

    • A Markov chain on n states is a system whose state vectors are probability vectors. The state vectors at times 0, 1, 2, … are denoted x_0, x_1, x_2, …, where x_0 is the initial probability vector, and the rest are determined by x_{k+1} = P x_k, with P the n × n (stochastic) transition matrix.
  • Ex: see the sketch below, which iterates a small chain for several steps.
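
A minimal sketch of the recurrence, reusing the hypothetical P from above: each step of the chain is one matrix-vector product x_{k+1} = P x_k.

```python
import numpy as np

# Hypothetical 2-state transition matrix (columns sum to 1); the notes
# do not fix a specific P, so this is illustration only.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
x = np.array([0.6, 0.4])       # x_0, the initial probability vector

# x_{k+1} = P x_k
for k in range(5):
    x = P @ x
    print(f"x_{k + 1} = {x}")  # each x_k remains a probability vector
```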

Steady state matrix
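
The notes end at the steady state. A common computation here is the steady-state vector q satisfying P q = q, i.e. an eigenvector of P for eigenvalue 1, rescaled so its entries sum to 1. A minimal sketch, again using the hypothetical P:

```python
import numpy as np

P = np.array([[0.9, 0.2],  # hypothetical transition matrix from above
              [0.1, 0.8]])

# The steady-state vector q satisfies P q = q, so q is an eigenvector
# of P for eigenvalue 1, normalized so its entries sum to 1.
vals, vecs = np.linalg.eig(P)
q = vecs[:, np.isclose(vals, 1.0)].ravel().real
q /= q.sum()
print(q)  # [0.6667, 0.3333], i.e. (2/3, 1/3) for this P
```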