A finite set of variables whose values change over time. We can represent this as a state vector
What is a state vector?
It is a vector with n entries, one entry per state, that together describe the system at a given time.
Ex: a probability vector
Say we have three states A, B, C:
0.3 probability of being in state A
0.6 probability of being in state B
0.1 probability of being in state C
Note: the entries of a probability vector add to 1
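The three-state example above can be checked directly; a probability vector has nonnegative entries that sum to 1:

```python
# Probability vector for the three states A, B, C from the notes.
x = [0.3, 0.6, 0.1]

# Both defining properties of a probability vector:
assert all(p >= 0 for p in x)          # no negative probabilities
assert abs(sum(x) - 1.0) < 1e-12       # entries add to 1
```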
Ex
If the initial probability state is (0.6, 0.4), what is the state after one transition?
x0 = [0.6, 0.4]
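The notes don't give the transition matrix for this example, so here is a sketch of one transition step using a hypothetical 2x2 matrix P (an assumption, not from the notes); each column of P is itself a probability vector:

```python
# Hypothetical transition matrix (NOT given in the notes) --
# each column sums to 1, so it is column-stochastic.
P = [[0.9, 0.2],
     [0.1, 0.8]]

x0 = [0.6, 0.4]  # initial probability vector from the example

# One transition: x1 = P x0, i.e. x1[i] = sum_j P[i][j] * x0[j]
x1 = [sum(P[i][j] * x0[j] for j in range(2)) for i in range(2)]
print(x1)

# The result is still a probability vector:
assert abs(sum(x1) - 1.0) < 1e-12
```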
Markov chain
Definition
A Markov chain on n states is a system whose state vectors are probability vectors. The state vectors at times 0, 1, 2, ... are denoted x0, x1, x2, ..., where x0 is the initial probability vector, and x1, x2, ... are determined by

x_{k+1} = P x_k,  for k = 0, 1, 2, ...,

where P is the n x n transition matrix (each column of P is a probability vector).