
Markov Process and Markov Chain

A stochastic process is said to be a Markov process if it satisfies the Markovian property, i.e.,

P{Xt+1 = j | X0 = k0, X1 = k1, ..., Xt-1 = kt-1, Xt = i}

= P{Xt+1 = j | Xt = i} for t = 0, 1, ...

 

i.e., the occurrence of a future state depends only on the immediately preceding state.

 

The conditional probabilities P{Xt+1 = j | Xt = i} are called one-step transition probabilities, since they describe the move of the system between times t and t + 1.

If, for each i and j,

P{Xt+1 = j | Xt = i} = P{X1 = j | X0 = i} for all t = 0, 1, ...,

then the one-step transition probabilities are said to be stationary and are usually denoted by pij. These stationary transition probabilities do not change over time.

 

If, for each i, j and n (= 0, 1, 2, ...), we have

P{Xt+n = j | Xt = i} = P{Xn = j | X0 = i} for all t = 0, 1, ...,

then these conditional probabilities are called n-step transition probabilities and are usually denoted by pij(n).

 

These probabilities satisfy the following properties:

(a) pij(n) ≥ 0 for all i and j, and n = 0, 1, 2, ...

(b) ∑j pij(n) = 1 for all i and n = 0, 1, 2, ..., where the sum runs over j = 1, ..., M and M is the number of states.
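These two properties can be checked numerically. A minimal sketch in Python (the matrix entries below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical 2-state one-step transition matrix (illustrative values).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Property (a): every pij is non-negative.
assert (P >= 0).all()

# Property (b): for each state i, the probabilities of moving to the
# M possible states sum to 1, i.e. each row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```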

 

In matrix notation, we represent the n-step transition probabilities as P(n) = [pij(n)], an M × M matrix, and for the one-step transition probabilities we can take P = [pij]. P is called the one-step transition matrix, and P(n) the n-step transition matrix.
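Anticipating the standard Chapman–Kolmogorov result that P(n) is the n-th matrix power of the one-step matrix P, the n-step matrix can be computed directly. A sketch with a hypothetical two-state matrix (not one taken from the text):

```python
import numpy as np

# Hypothetical one-step transition matrix P.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# n-step transition matrix P(n), computed as the n-th matrix power of P.
P2 = np.linalg.matrix_power(P, 2)  # two-step transition probabilities

# Each row of P(n) is still a probability distribution over the states.
print(P2)
```

Note that P2[i][j] accumulates the probability of going from i to j over every possible intermediate state, which is exactly the matrix-product rule.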

 

A stochastic process {Xt} (t = 0, 1, 2, ...) is said to be a finite-state Markov chain if it has the following properties:

 

(a) A finite number of states,

(b) The Markovian property,

(c) Stationary transition probabilities, and

(d) A set of initial probabilities P {X0 = i} for all i.

 

Example 3. Consider the one-step transition matrix with three states:

A transition diagram is given below. The arrows from each state indicate the possible states to which the process can move from the given state.
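Since the matrix and diagram for Example 3 are not reproduced above, the sketch below uses a hypothetical three-state matrix to show how such a chain evolves; the possible moves drawn as arrows in a transition diagram correspond to the nonzero entries in each row:

```python
import numpy as np

# Hypothetical 3-state one-step transition matrix (NOT the matrix of
# Example 3, which is not reproduced in the text).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

rng = np.random.default_rng(seed=42)
state = 0            # initial state, i.e. P{X0 = 0} = 1
path = [state]
for _ in range(10):
    # The next state depends only on the current state (Markov property):
    # it is drawn from row `state` of P.
    state = int(rng.choice(3, p=P[state]))
    path.append(state)
print(path)
```

Each step samples the next state from the row of P indexed by the current state, which is all the Markov property allows the process to depend on.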