A stochastic process is said to be a Markov process if it satisfies the Markovian property, *i.e.,*

P {X_{t+1} = *j* | X_{0} = *k*_{0}, X_{1} = *k*_{1}, ..., X_{t-1} = *k*_{t-1}, X_{t} = i} = P {X_{t+1} = *j* | X_{t} = i} for *t* = 0, 1, ...

*i.e.,* the occurrence of a future state depends on the immediately preceding state and only on it.

The conditional probabilities P {X_{t+1} = *j* | X_{t} = i} are called one step transition probabilities, as they describe the system between *t* and *t* + 1.
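A minimal Python sketch of how one step transition probabilities drive a chain: given the current state, the next state is sampled from the corresponding row of the matrix. The two-state matrix and its values here are assumed purely for illustration, not taken from the text.

```python
import random

# Hypothetical two-state chain; the probabilities are illustrative only.
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(state, P, rng=random):
    """Sample X_{t+1} given X_t = state, using row `state` of P."""
    u = rng.random()
    cum = 0.0
    for j, p in enumerate(P[state]):
        cum += p
        if u < cum:
            return j
    return len(P) - 1  # guard against floating-point round-off

# Simulate a short trajectory starting in state 0.
state = 0
path = [state]
for _ in range(10):
    state = step(state, P)
    path.append(state)
print(path)
```

Note that only the current state is consulted at each step, which is exactly the Markovian property stated above.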

If, for each i and *j*,

P {X_{t+1} = *j* | X_{t} = i} = P {X_{1} = *j* | X_{0} = i} for all *t* = 0, 1, ..., then the one step transition probabilities are said to be *stationary* and are usually denoted by *p_{ij}*. These stationary transition probabilities do not change in time.

If, for each i, *j* and *n* (= 0, 1, 2, ...), we have

P {X_{t+n} = *j* | X_{t} = i} = P {X_{n} = *j* | X_{0} = i} for all *t* = 0, 1, ..., then these conditional probabilities are called *n-step transition probabilities* and are usually denoted by *p_{ij}^{(n)}*.
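A standard way to compute the n-step probabilities is as entries of the n-th power of the one step matrix (the Chapman–Kolmogorov result). The sketch below uses an illustrative two-state matrix, not one from the text.

```python
import numpy as np

# Hypothetical one step matrix (values assumed for illustration).
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def n_step(P, n):
    """p_ij^(n) is the (i, j) entry of the n-th matrix power of P."""
    return np.linalg.matrix_power(P, n)

P3 = n_step(P, 3)
print(P3[0, 1])  # 3-step probability of moving from state 0 to state 1 (= 0.312)
```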

These probabilities will satisfy the following properties:

*(a)* *p_{ij}^{(n)}* ≥ 0 for all i and *j*, and *n* = 0, 1, 2, ...

*(b)* ∑_{j} *p_{ij}^{(n)}* = 1 for all i and *n* = 0, 1, 2, ..., the sum being taken over all M states *j*, where M = No. of states.
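Properties (a) and (b) can be checked numerically for any candidate matrix: every entry of every power must be nonnegative, and every row must sum to 1. The matrix below is assumed for illustration.

```python
import numpy as np

# Illustrative one step matrix; any valid transition matrix must pass.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

for n in (1, 2, 3):
    Pn = np.linalg.matrix_power(P, n)
    assert (Pn >= 0).all()                   # property (a): nonnegative entries
    assert np.allclose(Pn.sum(axis=1), 1.0)  # property (b): rows sum to 1
print("properties (a) and (b) hold for n = 1, 2, 3")
```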

In matrix notation, we represent the n-step transition probabilities by the matrix *P^{(n)}* = [*p_{ij}^{(n)}*], and for the one step transition probabilities we can take *P* = [*p_{ij}*]; *P* is called the one step transition matrix, whereas *P^{(n)}* is the *n*-step transition matrix.

A stochastic process {X_{t}} (*t* = 0, 1, 2, ...) is said to be a *finite state Markov chain* if it has the following properties:

*(a)* A finite number of states,

*(b)* The Markovian property,

*(c)* Stationary transition probabilities, and

*(d)* A set of initial probabilities P {X_{0} = i} for all *i*.
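Together, the initial probabilities and the transition matrix determine the distribution of the chain at every time: the distribution of X_{t} is the initial row vector multiplied by the t-th power of P. A sketch with an assumed two-state matrix and assumed initial distribution:

```python
import numpy as np

# Illustrative finite state chain: one step matrix P and initial
# probabilities P{X_0 = i} (both values assumed, not from the text).
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])
pi0 = np.array([1.0, 0.0])  # start in state 0 with certainty

# Distribution of X_3: pi_3 = pi_0 * P^3
pi3 = pi0 @ np.linalg.matrix_power(P, 3)
print(pi3)  # [0.688 0.312]
```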


**Example 3.** *Consider the one step transition matrix with three states:*

A transition diagram is given below. The arrows from each state indicate the possible states to which a process can move from the given state.
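To illustrate how a transition diagram is read off a matrix, the sketch below uses a hypothetical three-state matrix (its values are assumed, not the matrix of this example): each arrow i → j in the diagram corresponds to a nonzero entry *p_{ij}*.

```python
import numpy as np

# Hypothetical three-state one step matrix (values assumed for illustration).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])

# An arrow i -> j exists in the transition diagram exactly when p_ij > 0.
for i in range(3):
    targets = [j for j in range(3) if P[i, j] > 0]
    print(f"state {i} -> states {targets}")
```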