Therefore,
$$
P\{R > t + u \mid R > t, X(s) = i\} = P\{X(s+t+u) = i \mid X(s+t) = i, X(s) = i\}
$$
$$
= P\{X(s+t+u) = i \mid X(s+t) = i\} = P\{X(s+u) = i \mid X(s) = i\}
= P\{R > u \mid X(s) = i\},
$$
so the residual holding time in state $i$ is memoryless, hence exponential with a rate $\lambda_i$ that depends on $i$ but not on $t$.
In the case of the Poisson process the intensity is constant, $\lambda_i = \lambda$.
MC of the transitions
Let $T_n$ denote the time of the $n$-th change of state, with $T_0 = 0$.
Theorem
The process $X_n = X(T_n)$, $n = 0, 1, \dots$ is a Markov chain, with transition matrix $P$ such that $P_{ii} = 0$.
Poisson: In the case of the Poisson process, the transition is always $i \to i+1$, with probability 1.
Exercise: Let $N(t)$, $t \ge 0$, be a Poisson process with rate $\lambda$. Define $X(t) = N(t) \bmod 2$. Therefore,
$$
P\{X(t) = 0\} = \sum_{k=0}^{\infty} e^{-\lambda t}\frac{(\lambda t)^{2k}}{(2k)!} = e^{-\lambda t}\cosh(\lambda t) = \frac{1}{2}\left(1 + e^{-2\lambda t}\right),
$$
$$
P\{X(t) = 1\} = \sum_{k=0}^{\infty} e^{-\lambda t}\frac{(\lambda t)^{2k+1}}{(2k+1)!} = e^{-\lambda t}\sinh(\lambda t) = \frac{1}{2}\left(1 - e^{-2\lambda t}\right).
$$
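As a quick numerical check (a Python sketch; the rate $\lambda$ and time $t$ below are arbitrary test values), the truncated series for $P\{X(t) = 0\}$ can be compared with the closed form $\frac{1}{2}(1 + e^{-2\lambda t})$:

```python
import math

def parity_prob_even(lam, t, terms=60):
    """Truncated series: sum over k of e^{-lam t} (lam t)^{2k} / (2k)!."""
    return sum(math.exp(-lam * t) * (lam * t) ** (2 * k) / math.factorial(2 * k)
               for k in range(terms))

def parity_closed_form(lam, t):
    """Closed form (1 + e^{-2 lam t}) / 2 from the cosh identity."""
    return 0.5 * (1.0 + math.exp(-2.0 * lam * t))

lam, t = 1.5, 2.0  # arbitrary test values
print(parity_prob_even(lam, t), parity_closed_form(lam, t))
```

The two values agree to machine precision, confirming the cosh identity.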
Continuous-time MC: second definition
Theorem
Given independent exponential waiting times with state-dependent intensity (rate) $\lambda(i)$, and a transition matrix $P$ with zero diagonal, $P_{ii} = 0$, the process obtained by waiting in each state for the given exponential time, then moving to a new state according to the transition matrix, is a continuous-time MC.
Simulation of a CTMC
The following algorithm generates a continuous-time MC. Let $s = 0$; then
Step 1 Generate a value $x$ from the distribution of $X_0$.
Step 2 Generate a value $t$ from Exp(1); the trajectory is $x(u) = x$ for $s \le u < s + t/\lambda(x)$; set $s = s + t/\lambda(x)$.
Step 3 Generate a new value for $x$ from the $x$-row of $P$, and return to Step 2.
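The steps above can be sketched in Python (the three-state chain used in the demo, its rates, and its transition matrix are illustrative assumptions, not from the text):

```python
import random

def simulate_ctmc(rates, P, x0, t_max, rng=random):
    """Simulate a continuous-time MC: hold in state x for an Exp(rates[x])
    time (an Exp(1) draw rescaled by the rate, as in Step 2), then jump
    according to row x of P. Returns the (jump_time, state) pairs."""
    s, x = 0.0, x0
    path = [(s, x)]
    while True:
        t = rng.expovariate(1.0) / rates[x]   # Step 2
        s += t
        if s >= t_max:
            return path
        # Step 3: sample the next state from the x-row of P
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[x]):
            acc += p
            if u < acc:
                x = j
                break
        path.append((s, x))

# Illustrative 3-state chain (rates and P are assumptions for the demo)
rates = [1.0, 2.0, 3.0]
P = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
path = simulate_ctmc(rates, P, x0=0, t_max=10.0)
```

Because $P$ has zero diagonal, consecutive states in the path always differ, matching the embedded-chain description.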
Ross Example 6.5
Consider a shoe-shine shop consisting of two chairs (1 and 2). Assumptions are:
1. Potential customers arrive according to a Poisson process with intensity $\lambda$.
2. Service starts in chair 1 if both chairs are free.
3. After an Exp($\mu_1$) service time, the customer moves to chair 2.
4. The service time in chair 2 is Exp($\mu_2$).
Then the state space is $\{0, 1, 2\}$, where 0 means that no chair is occupied, and 1 or 2 means that that chair is occupied. Transitions are deterministic. The exit times from 1 and 2 are exponential with rates $\mu_1$, $\mu_2$:
$$
P = \begin{array}{c|ccc}
 & 0 & 1 & 2\\ \hline
0 & 0 & 1 & 0\\
1 & 0 & 0 & 1\\
2 & 1 & 0 & 0
\end{array}
\qquad \lambda(1) = \mu_1, \quad \lambda(2) = \mu_2.
$$
Discussion
Let $T_2$ be an exit time from state 2. After that, the system goes to state 0 and can accept a new client. Note that $T_2$ is independent of any inter-arrival time of the Poisson process posterior to the last client accepted.
Let $S$ be such an inter-arrival time. The probability for the exit time from 0 to be greater than $t$ is then
$$
P\{S > T_2 + t \mid S > T_2\} = P\{S > t\} = e^{-\lambda t},
$$
therefore $\lambda(0) = \lambda$.
A complement on the Poisson process
A continuous-time MC can be described either by its continuous-time transitions $P_{ij}(t) = P\{X(t) = j \mid X(0) = i\}$, or by its transition rates $\lambda(i)$ together with the discrete-time transitions $P_{ij}$. The connection between the two descriptions rests on the following property of the Poisson process.
Let $N(t)$, $t \ge 0$, be a Poisson process with rate $\lambda$. Then
1. $P\{N(h) = 1\} = \lambda h + o(h)$ as $h \to 0$.
2. $P\{N(h) \ge 2\} = o(h)$ as $h \to 0$.
This follows directly from the explicit formulae
$$
P\{N(h) = 1\} = e^{-\lambda h}\frac{\lambda h}{1!}, \qquad P\{N(h) \ge 2\} = e^{-\lambda h}\sum_{k \ge 2}\frac{(\lambda h)^k}{k!},
$$
by Calculus. Therefore
$$
P_{ij}(h) = P_{ij}\,\lambda(i)\,h + o(h), \quad h \to 0, \quad j \ne i.
$$
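A minimal numerical illustration (Python sketch; $\lambda = 2$ is an arbitrary choice): as $h$ shrinks, $P\{N(h) = 1\}/h$ approaches $\lambda$ while $P\{N(h) \ge 2\}/h$ vanishes.

```python
import math

lam = 2.0  # arbitrary rate for the illustration
for h in (0.1, 0.01, 0.001):
    p1 = math.exp(-lam * h) * lam * h    # P{N(h) = 1}
    p2 = 1.0 - math.exp(-lam * h) - p1   # P{N(h) >= 2} = 1 - P{0} - P{1}
    print(h, p1 / h, p2 / h)
```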
Birth and death process
From state $i$, let $B \sim \mathrm{Exp}(\lambda_i)$ be the time to the next birth and $D \sim \mathrm{Exp}(\mu_i)$ the time to the next death, independent. Then $\min(B, D)$ is $\mathrm{Exp}(\lambda_i + \mu_i)$ and $P\{B < D\} = \lambda_i/(\lambda_i + \mu_i)$. Hence
$$
\lambda(i) = \lambda_i + \mu_i, \qquad P_{i,i+1} = \frac{\lambda_i}{\lambda_i + \mu_i}, \qquad P_{i,i-1} = \frac{\mu_i}{\lambda_i + \mu_i}, \quad i > 0.
$$
Examples of B&D process
Yule: $\lambda_n = n\lambda$, $\mu_n = 0$.
Linear growth with immigration: $\lambda_n = n\lambda + \theta$, $\mu_n = n\mu$; $\theta$ is the immigration rate. Let $M(t) = E[X(t) \mid X(0) = i]$ be the expected population at time $t$, given $X(0) = i$. By computing the transition probability for small $h$, we can find the differential equation
$$
M'(t) = (\lambda - \mu)M(t) + \theta,
$$
whose solution for $\lambda \ne \mu$ is
$$
M(t) = \frac{\theta}{\lambda - \mu}\left(e^{(\lambda - \mu)t} - 1\right) + i\,e^{(\lambda - \mu)t}.
$$
Note the behavior when $t \to \infty$.
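The ODE and its solution can be checked numerically (Python sketch; the rates $\lambda$, $\mu$, $\theta$ and initial size $i$ below are arbitrary test values): Euler-integrating $M' = (\lambda - \mu)M + \theta$ should reproduce the closed form.

```python
import math

lam, mu, theta, i0 = 0.8, 0.5, 0.3, 10  # arbitrary test parameters

def m_closed(t):
    """Closed-form mean: theta/(lam-mu) (e^{(lam-mu)t} - 1) + i0 e^{(lam-mu)t}."""
    g = math.exp((lam - mu) * t)
    return theta / (lam - mu) * (g - 1.0) + i0 * g

def m_euler(t, steps=200000):
    """Euler integration of M'(t) = (lam - mu) M(t) + theta, M(0) = i0."""
    dt, m = t / steps, float(i0)
    for _ in range(steps):
        m += dt * ((lam - mu) * m + theta)
    return m

print(m_closed(5.0), m_euler(5.0))
```

With $\lambda > \mu$ both computations grow exponentially in $t$, as the note on $t \to \infty$ suggests.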
M/M/1: Customers arrive as a Poisson process with rate $\lambda$. They enter the queue and are served with the priority rule first arrived, first served. The process of services is a Poisson process with rate $\mu$. The length of the queue is a B&D process with constant rates.
M/M/s: If there are $s$ servers, then the birth rate is constant, while the death rate is $\mu_n = n\mu$ for $1 \le n \le s$ and $\mu_n = s\mu$ for $n > s$.
Expected transition times in B&D process
A possible question related to a birth and death process is the time $T_i$ needed to move from state $i$ to state $i + 1$ (not necessarily in a single transition), or its expected value.
Obviously, $E[T_0] = 1/\lambda_0$, while for the others it can be shown that
$$
E[T_i] = \frac{1}{\lambda_i} + \frac{\mu_i}{\lambda_i}E[T_{i-1}],
$$
so that $E[T_i]$ may be calculated recursively.
In the particular case $\lambda_i = \lambda$ and $\mu_i = \mu$, $\lambda \ne \mu$, we get
$$
E[T_0] = \frac{1}{\lambda}, \qquad E[T_n] = \frac{1 - (\mu/\lambda)^{n+1}}{\lambda - \mu}, \quad n \ge 1.
$$
In the case $\lambda = \mu$, $E[T_n] = \frac{n+1}{\lambda}$.
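The recursion is straightforward to code (Python sketch; the constant rates $\lambda = 1$, $\mu = 0.5$ are test values) and can be checked against the constant-rate closed form:

```python
def expected_passage_times(lam_rates, mu_rates):
    """E[T_i] via E[T_0] = 1/lam_0, E[T_i] = 1/lam_i + (mu_i/lam_i) E[T_{i-1}]."""
    e = [1.0 / lam_rates[0]]
    for i in range(1, len(lam_rates)):
        e.append(1.0 / lam_rates[i] + mu_rates[i] / lam_rates[i] * e[i - 1])
    return e

lam, mu, n = 1.0, 0.5, 6  # arbitrary constant rates, lam != mu
e = expected_passage_times([lam] * n, [mu] * n)
closed = [(1 - (mu / lam) ** (i + 1)) / (lam - mu) for i in range(n)]
print(e)
print(closed)
```

The recursive and closed-form values coincide, e.g. $E[T_1] = 1/\lambda + \mu/\lambda^2 = 1.5$ here.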
Transition probability function $P_{ij}(t)$
Let $X(t)$, $t \ge 0$, be a continuous-time MC.
Definition
The transition probability function is defined to be
$$
P_{ij}(t) = P\{X(s + t) = j \mid X(s) = i\}.
$$
For the Yule process (pure birth process) with distinct birth rates $\lambda_i$, let $X_i$ be the time spent in state $i$ before moving to state $i + 1$. The time spent before entering state $j$ is $\sum_{k=i}^{j-1} X_k$. Then the events
$$
\{X(t) < j, X(0) = i\}, \qquad \{X_i + \dots + X_{j-1} > t, X(0) = i\}
$$
are equal. Then, for $i < j$,
$$
P\{X(t) < j \mid X(0) = i\} = P\{X_i + \dots + X_{j-1} > t \mid X(0) = i\}
= \sum_{k=i}^{j-1} e^{-\lambda_k t} \prod_{\substack{r=i \\ r \ne k}}^{j-1} \frac{\lambda_r}{\lambda_r - \lambda_k}.
$$
Proof is a (long) exercise.
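The sum-of-exponentials tail can be sanity-checked by Monte Carlo (Python sketch; the distinct rates and the time $t$ below are arbitrary test values):

```python
import math
import random

def yule_tail(rates, t):
    """P{X_i + ... + X_{j-1} > t} for independent Exp(rates[k]) stages with
    distinct rates: sum over k of e^{-r_k t} prod_{r != k} r_r / (r_r - r_k)."""
    total = 0.0
    for k, rk in enumerate(rates):
        prod = 1.0
        for r, rr in enumerate(rates):
            if r != k:
                prod *= rr / (rr - rk)
        total += math.exp(-rk * t) * prod
    return total

rates, t = [1.0, 2.0, 3.0], 1.2  # distinct test rates, arbitrary time
rng = random.Random(0)
n = 200000
hits = sum(sum(rng.expovariate(r) for r in rates) > t for _ in range(n))
print(yule_tail(rates, t), hits / n)
```

The formula and the empirical frequency agree to Monte Carlo accuracy; at $t = 0$ the formula returns exactly 1, as it must.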
Instantaneous transition rates
Let $X(t)$, $t \ge 0$, be a continuous-time MC with state transition rates $\lambda(i)$ and state transition probabilities $P_{ij}$.
Definition: For any couple of states $i, j$ we define the instantaneous transition rates
$$
q_{ij} = \lambda(i) P_{ij}.
$$
Vice versa:
$$
\sum_j q_{ij} = \lambda(i) \qquad \text{and} \qquad \frac{q_{ij}}{\sum_j q_{ij}} = \frac{q_{ij}}{\lambda(i)} = P_{ij}.
$$
Q and P(t): $P_{ii}(h)$ is the probability of no transition, or of at least two transitions ending back in $i$; $P_{ij}(h)$ is the probability of one transition from $i$ to $j$, or of at least two transitions ending in $j$. Then, for $h \to 0$,
$$
1 - P_{ii}(h) = \lambda(i)h + o(h), \qquad \lim_{h \to 0} \frac{1 - P_{ii}(h)}{h} = \lambda(i),
$$
$$
P_{ij}(h) = \lambda(i)P_{ij}\,h + o(h), \qquad \lim_{h \to 0} \frac{P_{ij}(h)}{h} = q_{ij}.
$$
Chapman-Kolmogorov
Exactly as in the discrete-time case, we derive the CK equations:
$$
\begin{aligned}
P_{ij}(t + s) &= P\{X(t + s) = j \mid X(0) = i\}\\
&= \sum_k P\{X(t + s) = j, X(t) = k \mid X(0) = i\}\\
&= \sum_k P\{X(t + s) = j \mid X(t) = k, X(0) = i\}\,P\{X(t) = k \mid X(0) = i\}\\
&= \sum_k P\{X(t + s) = j \mid X(t) = k\}\,P\{X(t) = k \mid X(0) = i\}\\
&= \sum_k P\{X(s) = j \mid X(0) = k\}\,P\{X(t) = k \mid X(0) = i\}\\
&= \sum_k P_{ik}(t)P_{kj}(s).
\end{aligned}
$$
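For a two-state chain with rates $\lambda(0) = \lambda$ and $\lambda(1) = \mu$, the transition function is known in closed form, so the CK identity $P(t+s) = P(t)P(s)$ can be verified directly (Python sketch; $\lambda$, $\mu$, $t$, $s$ are arbitrary test values):

```python
import math

def two_state_P(lam, mu, t):
    """Closed-form P(t) for the chain 0 <-> 1 with rates lam (0->1), mu (1->0)."""
    e = math.exp(-(lam + mu) * t)
    p00 = mu / (lam + mu) + lam / (lam + mu) * e
    p10 = mu / (lam + mu) - mu / (lam + mu) * e
    return [[p00, 1.0 - p00], [p10, 1.0 - p10]]

def matmul(A, B):
    """2x2 matrix product, mirroring sum_k P_ik(t) P_kj(s)."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

lam, mu, t, s = 1.3, 0.7, 0.8, 1.1  # arbitrary test values
lhs = two_state_P(lam, mu, t + s)
rhs = matmul(two_state_P(lam, mu, t), two_state_P(lam, mu, s))
print(lhs)
print(rhs)
```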
Kolmogorov's backward equations
Using the previous definitions and results,
$$
\begin{aligned}
P_{ij}(t + h) - P_{ij}(t) &= \sum_k P_{ik}(h)P_{kj}(t) - P_{ij}(t)\\
&= \sum_{k \ne i} P_{ik}(h)P_{kj}(t) - [1 - P_{ii}(h)]P_{ij}(t)\\
&= \sum_{k \ne i} q_{ik}hP_{kj}(t) - \lambda(i)hP_{ij}(t) + o(h);
\end{aligned}
$$
then, we have Kolmogorov's Backward Equations
$$
P'_{ij}(t) = \sum_{k \ne i} q_{ik}P_{kj}(t) - \lambda(i)P_{ij}(t).
$$
It is a system of linear differential equations. Note the matrix product $QP$.
Kolmogorov's forward equations
By splitting the product in the CK equations differently, we get
$$
\begin{aligned}
P_{ij}(t + h) - P_{ij}(t) &= \sum_k P_{ik}(t)P_{kj}(h) - P_{ij}(t)\\
&= \sum_{k \ne j} P_{ik}(t)P_{kj}(h) - [1 - P_{jj}(h)]P_{ij}(t)\\
&= \sum_{k \ne j} P_{ik}(t)q_{kj}h - \lambda(j)hP_{ij}(t) + o(h);
\end{aligned}
$$
then, we obtain Kolmogorov's Forward Equations:
$$
P'_{ij}(t) = \sum_{k \ne j} P_{ik}(t)q_{kj} - \lambda(j)P_{ij}(t).
$$
Note that here the product is $PQ$. The (informal) derivation above fails in some cases of countable state space, while the backward equations are always satisfied.
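As an illustration (Python sketch; a two-state chain with assumed rates $\lambda$ and $\mu$), the forward system $P'(t) = P(t)Q$ can be Euler-integrated from $P(0) = I$ and compared with the known closed form $P_{00}(t) = \frac{\mu}{\lambda+\mu} + \frac{\lambda}{\lambda+\mu}e^{-(\lambda+\mu)t}$:

```python
import math

lam, mu = 1.0, 2.0  # assumed rates: q_01 = lam, q_10 = mu
Q = [[-lam, lam], [mu, -mu]]  # generator: off-diagonal q_ij, diagonal -lambda(i)

def forward_euler(t, steps=100000):
    """Euler steps of P'(t) = P(t) Q, starting from the identity matrix."""
    dt = t / steps
    P = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(steps):
        P = [[P[i][j] + dt * sum(P[i][k] * Q[k][j] for k in range(2))
              for j in range(2)] for i in range(2)]
    return P

t = 0.7
p00_exact = mu / (lam + mu) + lam / (lam + mu) * math.exp(-(lam + mu) * t)
print(forward_euler(t)[0][0], p00_exact)
```

Since each row of $Q$ sums to zero, the Euler iterates keep row sums equal to 1, as transition probabilities should.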
Example: For birth and death processes, Kolmogorov's forward equations are:
$$
P'_{i0}(t) = \mu_1 P_{i1}(t) - \lambda_0 P_{i0}(t),
$$
$$
P'_{ij}(t) = \lambda_{j-1}P_{i,j-1}(t) + \mu_{j+1}P_{i,j+1}(t) - (\lambda_j + \mu_j)P_{ij}(t), \quad j \ne 0.
$$
Balance equations
Remember: for MCs, under suitable assumptions, $P_j = \lim_{n \to \infty} P^n_{ij}$ is the probability of being in state $j$ for $n \to \infty$. Similarly, it can be proved that
$$
P_j = \lim_{t \to \infty} P_{ij}(t), \quad j \in S.
$$
Letting $t \to \infty$ in the Kolmogorov forward equations:
$$
\lim_{t \to \infty} P'_{ij}(t) = \lim_{t \to \infty}\left[\sum_{k \ne j} q_{kj}P_{ik}(t) - \lambda_j P_{ij}(t)\right],
$$
$$
0 = \sum_{k \ne j} q_{kj}P_k - \lambda_j P_j.
$$
Thus, the probabilities $P_j$ may be found by means of
$$
\sum_{j \in S} P_j = 1 \qquad \text{and} \qquad \sum_{k \ne j} q_{kj}P_k = \lambda_j P_j, \quad j \in S.
$$
Example: for an M/M/1 queue with rates $\lambda$ and $\mu$, if $\lambda < \mu$ then
$$
P_n = \frac{(\lambda/\mu)^n}{\sum_{m=0}^{\infty}(\lambda/\mu)^m} = \left(\frac{\lambda}{\mu}\right)^n\left(1 - \frac{\lambda}{\mu}\right), \quad n \ge 0.
$$
There are no limiting values of $P_n$ for $\lambda \ge \mu$, since in this case the states are all transient or null recurrent.
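The geometric solution can be double-checked numerically (Python sketch; $\lambda$, $\mu$ and the truncation level are test choices): for a birth and death chain the balance equations reduce to $\lambda P_{n-1} = \mu P_n$, which we solve recursively and normalize.

```python
lam, mu, N = 1.0, 2.0, 60  # test rates (lam < mu) and truncation level

# Solve the truncated balance equations lam * P_{n-1} = mu * P_n recursively,
# then normalize so the probabilities sum to one.
p = [1.0]
for n in range(1, N):
    p.append(lam / mu * p[-1])
total = sum(p)
p = [x / total for x in p]

rho = lam / mu
closed = [(rho ** n) * (1 - rho) for n in range(N)]
print(p[:4])
print(closed[:4])
```

The truncation error is of order $\rho^N$, negligible here, so the two lists agree to machine precision.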
Erlang's Loss Formula
Consider an M/M/k queue where a client that finds all servers occupied moves out (loss system). Let $\lambda$ be the arrival rate, and $\mu$ the service rate. The balance equations are:
$$
\lambda P_0 = \mu P_1, \quad \dots, \quad \lambda P_{k-1} = k\mu P_k.
$$
Adding the condition $\sum_{j \in S} P_j = 1$, one gets:
$$
P_i = \frac{(\lambda/\mu)^i / i!}{\sum_{j=0}^{k}(\lambda/\mu)^j / j!}, \quad i = 0, 1, \dots, k.
$$
This formula can be generalized to an M/G/k queueing loss model (see Ross, Theorem 5.7.4).
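The loss probability $P_k$ (Erlang's B formula) is easy to evaluate directly; a numerically stable alternative is the standard recursion $B(0) = 1$, $B(j) = aB(j-1)/(j + aB(j-1))$ with offered load $a = \lambda/\mu$ (Python sketch; the rates and $k$ below are test values):

```python
import math

def erlang_b_direct(k, a):
    """P_k = (a^k / k!) / sum_{j=0}^k a^j / j!  (blocking probability)."""
    return (a ** k / math.factorial(k)) / sum(a ** j / math.factorial(j)
                                              for j in range(k + 1))

def erlang_b_recursive(k, a):
    """Standard recursion: B(0) = 1, B(j) = a B(j-1) / (j + a B(j-1))."""
    b = 1.0
    for j in range(1, k + 1):
        b = a * b / (j + a * b)
    return b

lam, mu, k = 3.0, 1.0, 5  # test values; offered load a = lam / mu
a = lam / mu
print(erlang_b_direct(k, a), erlang_b_recursive(k, a))
```

The recursion avoids the large factorials of the direct form, which matters for big $k$.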