
Probability and Random Processes
Lecture no. (18)
Dr Sherif Rabia
Department of Engineering Mathematics and Physics,
Faculty of Engineering, Alexandria University
sherif.rabia@gmail.com
sherif.rabia@alexu.edu.eg

4.3 Markov chains


- 1-step transition matrix
- System dynamics
- Initial distribution
- m-step transition matrix
- Complete characterization

Math 9 - 2015/2016 (second term)

Transition probabilities
The probability of moving from state i to state j in one step:

pij = P[Xn+1 = j | Xn = i]

These probabilities are assumed to be the same for all n (time-homogeneous chain).

Example (Taxi position)
P[X3 = 6 | X2 = 5] = P[X10 = 6 | X9 = 5] = P[X19 = 6 | X18 = 5] = p56

The 1-step transition probability matrix

    P = [ p11  p12  ...  p1L
          p21  p22  ...  p2L
          ...  ...       ...
          pL1  pL2  ...  pLL ]

Example (Taxi position)

    P = [ 0.10  0.40  0.40  0.10
          0.20  0.40  0.20  0.20
          0.25  0.25  0.25  0.25
          0     0.10  0.10  0.80 ]
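As a quick sanity check, every row of a transition matrix must sum to 1, since from each state the chain must move somewhere. A minimal NumPy sketch (not part of the lecture) for the taxi matrix above:

```python
import numpy as np

# 1-step transition matrix from the taxi example (4 states)
P = np.array([[0.10, 0.40, 0.40, 0.10],
              [0.20, 0.40, 0.20, 0.20],
              [0.25, 0.25, 0.25, 0.25],
              [0.00, 0.10, 0.10, 0.80]])

# Every entry is a probability, and every row sums to 1
assert np.all(P >= 0)
assert np.allclose(P.sum(axis=1), 1.0)
print(P.sum(axis=1))  # [1. 1. 1. 1.]
```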

Examples
Example 18.1
A particle moves through the shown grid. At each time n, the particle makes
a random movement to another reachable position.
Let {Xn, n = 0, 1, 2, . . .} be the particle position at time n, where X0 is
its initial position.
Is {Xn} a Markov chain? If yes, write its transition matrix.

Example 18.2
Consider repeated tosses of a die.
Let {Xn, n = 0, 1, 2, . . .} be the maximum of the numbers that appear in
the first (n + 1) trials, where X0 is the first number to appear.
Is {Xn} a Markov chain? If yes, write its transition matrix.
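For Example 18.2, the running maximum is indeed a Markov chain: from state i the next roll d gives X_{n+1} = max(i, d), so the chain stays at i with probability i/6 and jumps to each j > i with probability 1/6. A sketch (not from the lecture) that builds this transition matrix:

```python
import numpy as np

# Transition matrix for Example 18.2: Xn = maximum of the first n+1 die rolls.
L = 6
P = np.zeros((L, L))
for i in range(1, L + 1):
    P[i - 1, i - 1] = i / 6          # any roll d <= i keeps the maximum at i
    for j in range(i + 1, L + 1):
        P[i - 1, j - 1] = 1 / 6      # roll d = j > i raises the maximum to j

print(P)
print(P.sum(axis=1))  # each row sums to 1; state 6 is absorbing
```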

Math 9 - 2015/2016 (second term)

Dynamics of the Markov chain

The transition matrix contains almost all the information needed to identify
the state of the process at any time.

Another piece of information (the initial distribution) will be needed later.

Example 18.3
Let {Xn, n = 0, 1, 2, . . .} be a Markov chain with state space {1, 2} and
the shown transition matrix:

    P = [ 0.50  0.50
          0.25  0.75 ]

Evaluate
(a) P[X1 = 2 | X0 = 1]
(b) P[X5 = 1 | X4 = 2, X3 = 1]
(c) P[X6 = 1, X7 = 2 | X5 = 2]
(d) P[X9 = 2, X10 = 1, X11 = 1 | X8 = 2]
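By the Markov property, each of (a)-(d) is just a product of entries of P along the specified path. A sketch in Python (states 1 and 2 map to array indices 0 and 1):

```python
import numpy as np

# Transition matrix from Example 18.3
P = np.array([[0.50, 0.50],
              [0.25, 0.75]])

# (a) P[X1 = 2 | X0 = 1] = p12
a = P[0, 1]                          # 0.50
# (b) P[X5 = 1 | X4 = 2, X3 = 1] = p21 (only the most recent state matters)
b = P[1, 0]                          # 0.25
# (c) P[X6 = 1, X7 = 2 | X5 = 2] = p21 * p12
c = P[1, 0] * P[0, 1]                # 0.25 * 0.50 = 0.125
# (d) P[X9 = 2, X10 = 1, X11 = 1 | X8 = 2] = p22 * p21 * p11
d = P[1, 1] * P[1, 0] * P[0, 0]      # 0.75 * 0.25 * 0.50 = 0.09375
print(a, b, c, d)  # 0.5 0.25 0.125 0.09375
```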

How to evaluate
P[X1 = 2]?
P[X2 = 2 | X0 = 1]?


Initial distribution π(0)

The initial distribution is the pdf of X0. It gives probabilistic
information about the starting state of the process.

This pdf looks like

    x0     | 1     2     ...  L
    f0(x0) | f0(1) f0(2) ...  f0(L)

π(0) = [f0(1), f0(2), . . ., f0(L)]

For example,
π(0) = [0.5, 0.25, 0.25]
means P(X0 = 1) = 0.5, P(X0 = 2) = 0.25, P(X0 = 3) = 0.25

Example 18.3 (contd)

Evaluate (given π(0) = [0.2 0.8])
(e) P[X0 = 1, X1 = 2]
(f) P[X1 = 2]
(g) P[X1 = 2, X2 = 1]
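Combining the initial distribution with the transition matrix: the unconditional distribution of X1 is the row vector π(0)·P. A sketch for (e)-(g):

```python
import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])
pi0 = np.array([0.2, 0.8])   # initial distribution pi(0)

# (e) P[X0 = 1, X1 = 2] = pi0(1) * p12
e = pi0[0] * P[0, 1]         # 0.2 * 0.50 = 0.10
# (f) P[X1 = 2] = sum over i of pi0(i) * p_i2, i.e. the second entry of pi0 @ P
f = (pi0 @ P)[1]             # 0.2*0.50 + 0.8*0.75 = 0.70
# (g) P[X1 = 2, X2 = 1] = P[X1 = 2] * p21
g = f * P[1, 0]              # 0.70 * 0.25 = 0.175
print(e, f, g)
```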


m-step transition probabilities

pij(m) = P[Xn+m = j | Xn = i]

For example,
p12(3) = P[Xn+3 = 2 | Xn = 1]
p43(5) = P[Xn+5 = 3 | Xn = 4]

Example 18.3 (contd)

    P = [ 0.50  0.50
          0.25  0.75 ]

Evaluate
(h) P[X3 = 1 | X1 = 1]
(i) P[X3 = 2 | X1 = 1]

[State-time trellis diagrams showing the two-step paths]

Conditioning on the intermediate state:
p11(2) = p11 p11 + p12 p21
p12(2) = p11 p12 + p12 p22
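The two-step formulas above are exactly the entries of the matrix product P·P, so (h) and (i) can be read off from P². A sketch:

```python
import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])

P2 = P @ P   # 2-step transition matrix P(2)

# (h) P[X3 = 1 | X1 = 1] = p11(2) = p11*p11 + p12*p21
h = P2[0, 0]   # 0.5*0.5 + 0.5*0.25 = 0.375
# (i) P[X3 = 2 | X1 = 1] = p12(2) = p11*p12 + p12*p22
i = P2[0, 1]   # 0.5*0.5 + 0.5*0.75 = 0.625
print(h, i)  # 0.375 0.625
```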


m-step transition probability matrix P(m)

2-step transition probability matrix:

    P(2) = [ p11(2)  p12(2)
             p21(2)  p22(2) ]

m-step transition probability matrix:

    P(m) = [ p11(m)  p12(m)  ...  p1L(m)
             p21(m)  p22(m)  ...  p2L(m)
             ...     ...          ...
             pL1(m)  pL2(m)  ...  pLL(m) ]

In general, P(m) = P^m

Example 18.3 (contd)

    P = [ 0.50  0.50
          0.25  0.75 ]

Evaluate
(j) P[X3 = 1 | X1 = 2]
(k) P[X4 = 1 | X1 = 1]
(l) P[X0 = 1, X2 = 2] (given π(0) = [0.2 0.8])

Having the 1-step transition matrix and the initial distribution of a
Markov chain, we have a complete characterization.
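Since P(m) = P^m, parts (j)-(l) reduce to matrix powers. A sketch using NumPy's matrix_power:

```python
import numpy as np

P = np.array([[0.50, 0.50],
              [0.25, 0.75]])
pi0 = np.array([0.2, 0.8])   # initial distribution pi(0)

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

# (j) P[X3 = 1 | X1 = 2] = p21(2)
j = P2[1, 0]             # 0.3125
# (k) P[X4 = 1 | X1 = 1] = p11(3)
k = P3[0, 0]             # 0.34375
# (l) P[X0 = 1, X2 = 2] = pi0(1) * p12(2)
l = pi0[0] * P2[0, 1]    # 0.2 * 0.625 = 0.125
print(j, k, l)
```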


Homework
Sections covered
12.1,
12.2 (partially)
Problems
12.1.1, 12.1.2, 12.1.4, 12.1.5


End of the course
