Let *f_{ij}^{(n)}* = probability of arriving at *j* at time *n* for the first time, given that the process starts at *i*

= P[X_{n} = j, X_{n-1} ≠ j, X_{n-2} ≠ j, ..., X_{1} ≠ j | X_{0} = i].

Let

*T_{ij}* = min {*n* : X_{n} = *j* | X_{0} = *i*}. Then

*f_{ij}^{(n)}* = P[T_{ij} = n],

and in particular

*f_{ij}^{(1)}* = P[X_{1} = j | X_{0} = i] = *p_{ij}* (the (*i*, *j*)-th entry of the transition matrix),

and

*f_{ij}* = Σ_{n=1}^{∞} *f_{ij}^{(n)}* = probability of ever reaching *j* starting from *i*.
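The first-passage probabilities defined above can be computed numerically: zeroing out the column of the target state *j* in the transition matrix gives a matrix whose powers count only paths that avoid *j*. A minimal NumPy sketch (the function name and 0-based state indexing are my own conventions, not from the text):

```python
import numpy as np

def first_passage_prob(P, i, j, n):
    """f_ij^(n): probability of hitting state j for the first time
    at step n, starting from state i (states are 0-indexed)."""
    # Q is P with the column for j zeroed out, so Q^(n-1) only
    # accumulates probability along paths that never enter j
    # before the final step.
    Q = P.copy()
    Q[:, j] = 0.0
    return float(np.linalg.matrix_power(Q, n - 1)[i] @ P[:, j])
```

Summing `first_passage_prob(P, i, i, n)` over *n* approximates *f_{ii}*; for a finite irreducible chain the partial sums approach 1, illustrating recurrence.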

We can prove the following result:

*p_{ij}^{(n)}* = Σ_{k=1}^{n} *f_{ij}^{(k)}* *p_{jj}^{(n-k)}*, *n* ≥ 1.

A state *i* of a Markov chain is said to be transient if *f_{ii}* < 1 and recurrent if *f_{ii}* = 1. Also, the mean recurrence time for a recurrent state *i* is

µ_{ii} = Σ_{n=1}^{∞} *n* *f_{ii}^{(n)}* = E[T_{ii}].
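For an irreducible positive-recurrent chain, the mean recurrence time of state *i* equals 1/π_{i}, where π is the stationary distribution solving πP = π. A small sketch of this computation (assuming NumPy; the function name is mine):

```python
import numpy as np

def mean_recurrence_times(P):
    """mu_ii = 1/pi_i for an irreducible chain, where pi is the
    stationary distribution solving pi P = pi with sum(pi) = 1."""
    n = P.shape[0]
    # Stack (P^T - I) with a row of ones to impose the
    # normalization constraint, then solve in least squares.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return 1.0 / pi
```

This is the approach one would use to check answers like those to Problem 4 below, given the chain's transition matrix.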

**PROBLEMS**

1. Test whether the following Markov chains are periodic or aperiodic.

(a) [transition matrix not reproduced]

(b) [transition matrix not reproduced]

2. Test whether the Markov chain having the following transition matrix is regular and ergodic.

3. Consider the three-state Markov chain with the following transition probability matrix.

Prove that the chain is irreducible.

4. Find the mean recurrence time for each state of the following Markov chain.

**ANSWERS**

1. (a) Periodic, (b) Aperiodic

2. Ergodic but not regular

4. µ_{00} = 3.9604, µ_{11} = 2.5381, µ_{22} = 2.8289.