
MARKOV CHAINS WITH CONTINUOUS TIME PARAMETER

Consider a Markov process $\{X(t),\ t \ge 0\}$ with discrete state space $S$ and
continuous time parameter set.
Assume time homogeneity throughout (stationary transition probabilities), i.e.
$$P_{ij}(t) = P(X(s+t) = j \mid X(s) = i),$$
independent of $s$.

Chapman-Kolmogorov Equation
$$P_{ij}(t+s) = \sum_{k} P_{ik}(t)\,P_{kj}(s).$$
In matrix notation, $P(t+s) = P(t)P(s)$, where $P(t) = (P_{ij}(t))$.

Lemma 1. Specification of the initial distribution $p_i = P(X(0) = i)$
and the transition probabilities $P_{ij}(t)$ determines the joint
distribution of the process.
Proof: For $0 < t_1 < t_2 < \cdots < t_n$, by repeated use of the Markov property,
$$P(X(t_1) = i_1, \dots, X(t_n) = i_n) = \sum_{i} p_i\,P_{i i_1}(t_1)\,P_{i_1 i_2}(t_2 - t_1)\cdots P_{i_{n-1} i_n}(t_n - t_{n-1}).$$

Conditions to be satisfied by the probabilities

1. $P_{ij}(t) \ge 0$

2. $\sum_j P_{ij}(t) = 1$

3. $P_{ij}(t+s) = \sum_k P_{ik}(t)\,P_{kj}(s)$ (Chapman-Kolmogorov)

4. $P_{ij}(0) = \delta_{ij}$, with $P_{ij}(t) \to \delta_{ij}$ as $t \downarrow 0$

Given a family of $P_{ij}(t)$'s satisfying (1) - (4) and given an initial distribution $\{p_i\}$, it is
not difficult to verify directly that the distributions defined as in Lemma 1
are consistent in the sense of Kolmogorov. They do in fact define a process
which is Markov.

However, in any physical situation it is usually more natural to specify infinitesimal
probabilities and to deduce the transition probabilities from them
[c.f. the discrete-time Markov chain, where we defined one-step
transition probabilities and produced from them the n-step transition probabilities]. This
is the approach that will be used here.
Example 1. Poisson process

A M.C. such that, as $h \downarrow 0$,
$$P_{i,i+1}(h) = \lambda h + o(h), \quad P_{ii}(h) = 1 - \lambda h + o(h), \quad P_{ij}(h) = o(h) \text{ otherwise}.$$

Note: $P_{ij}(t) = e^{-\lambda t}\dfrac{(\lambda t)^{j-i}}{(j-i)!}$ for $j \ge i$.
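The formula in Example 1 can be checked numerically. A minimal sketch (the rate $\lambda = 2$ and the times below are illustrative, not from the notes), verifying condition (2) and the Chapman-Kolmogorov equation:

```python
from math import exp, factorial

# Poisson-process transition probability P_ij(t) = e^{-lt}(lt)^{j-i}/(j-i)!
# for j >= i, as stated in Example 1; lam is the mean rate (assumed value).
def p(i, j, t, lam=2.0):
    if j < i:
        return 0.0
    k = j - i
    return exp(-lam * t) * (lam * t) ** k / factorial(k)

# Condition (2): the row sums to 1 (truncated far into the Poisson tail).
row_sum = sum(p(0, j, 1.5) for j in range(60))

# Chapman-Kolmogorov: P_03(t+s) = sum_k P_0k(t) P_k3(s);
# only k = 0..3 contribute since P_kj = 0 for j < k.
lhs = p(0, 3, 1.0 + 0.5)
rhs = sum(p(0, k, 1.0) * p(k, 3, 0.5) for k in range(4))
```

The Chapman-Kolmogorov check works out exactly here because the binomial theorem collapses the sum.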

Example 2. Pure birth process

Infinitesimal probabilities are exactly as for the Poisson process except that
$\lambda$ is replaced by $\lambda_i$: the probability of a birth per unit time depends
on the population size $i$.
(If $\lambda_i = i\lambda$ we get the linear birth process.)

Linear Birth and Death Process

$X(t)$ = size of population at time $t$
$\lambda$ = birth rate per individual and $\mu$ = death rate per individual
Suppose $X(0) = 1$.
Lemma 2. For the pure birth process
$$P'_{ij}(t) = \lambda_{j-1}P_{i,j-1}(t) - \lambda_j P_{ij}(t), \quad j > i; \qquad P'_{ii}(t) = -\lambda_i P_{ii}(t).$$
Proof:
For $h > 0$,

(a) $P_{ii}(t+h) = P_{ii}(t)(1 - \lambda_i h) + o(h)$

(b) $P_{ij}(t+h) = P_{ij}(t)(1 - \lambda_j h) + P_{i,j-1}(t)\lambda_{j-1}h + o(h)$, $j > i$.

Subtracting $P_{ij}(t)$, dividing by $h$ and letting $h \downarrow 0$ gives the right derivatives.
[Replacing $t$ by $t-h$ in (a) and (b) and letting $h \downarrow 0$, using the continuity
of $P_{ij}$, the left derivatives also exist and satisfy the same equations as the
right derivatives.]

Solution:
$$P_{ii}(t) = e^{-\lambda_i t}, \qquad P_{ij}(t) = \lambda_{j-1}\,e^{-\lambda_j t}\int_0^t e^{\lambda_j s}\,P_{i,j-1}(s)\,ds, \quad j > i.$$
General Continuous Time Markov Chain
1. $P_{ij}(t) \ge 0$
2. $\sum_j P_{ij}(t) = 1$
3. $P_{ij}(t+s) = \sum_k P_{ik}(t)\,P_{kj}(s)$
4. $P_{ij}(0) = \delta_{ij}$

Let $P(t) = (P_{ij}(t))$, $i, j \in S$; then $P(t+s) = P(t)P(s)$, $P(0) = I$,

i.e. $\{P(t),\ t \ge 0\}$ is a semigroup of stochastic matrices.

Assume now, as $t \downarrow 0$,

5. $\dfrac{1 - P_{ii}(t)}{t} \to q_i$

6. $\dfrac{P_{ij}(t)}{t} \to q_{ij}$, $j \ne i$,
where $0 \le q_i < \infty$ and $\sum_{j \ne i} q_{ij} = q_i$.

Meaning of $q_i$, $q_{ij}$:
For small $h$, $P_{ii}(h) \approx 1 - q_i h$ and $P_{ij}(h) \approx q_{ij}h$, $j \ne i$.
The infinitesimal generator of the process is the matrix $A = (a_{ij})$ with $a_{ii} = -q_i$ and $a_{ij} = q_{ij}$, $j \ne i$ (not a transition
matrix).

Want to find $P(t)$ starting from the generator $A$.


Example: The generator for the Birth-Death process, with
$\lambda_i$ = birth rate and $\mu_i$ = death rate in state $i$, is given by

$$a_{i,i+1} = \lambda_i, \quad a_{i,i-1} = \mu_i, \quad a_{ii} = -(\lambda_i + \mu_i);$$ the other $a_{ij}$'s are all zero.

Sample paths of a homogeneous M.C. with generator A


Suppose $X(t_0) = i$; then we define the waiting time $W_i$ in state $i$ to be the
length of the time interval until the process first leaves state $i$, i.e. $W_i$ = time
until the jump out of state $i$.

Lemma 3. $W_i$ has exponential distribution with mean $1/q_i$, i.e. $P(W_i > t) = e^{-q_i t}$.

Proof: Let $G(t) = P(W_i > t)$. By the Markov property and time homogeneity, $G(t+s) = G(t)G(s)$ with $G(0) = 1$, so $G(t) = e^{-ct}$ for some $c \ge 0$; from $P_{ii}(h) = 1 - q_i h + o(h)$ we get $c = q_i$.
Example: For a Poisson process with mean rate $\lambda$,
$q_i = \lambda$ for all $i$, so $W_i$ is exponential with mean $1/\lambda$.

Lemma 4. $P(\text{process jumps from } i \text{ to } j) = \dfrac{q_{ij}}{q_i}$. Note: $\sum_{j \ne i} \dfrac{q_{ij}}{q_i} = 1$.

Proof: $P(\text{jump to } j \mid \text{jump in } (t, t+h]) = \dfrac{P_{ij}(h)}{1 - P_{ii}(h)} \to \dfrac{q_{ij}}{q_i}$ as $h \downarrow 0$.

Recall that $P_{ij}(h) = q_{ij}h + o(h)$ and $1 - P_{ii}(h) = q_i h + o(h)$.

Generation of Sample Paths


1. Select $X(0) = i_0$ according to the given initial distribution.
2. Select an exponential random variable $W_{i_0}$ with mean $1/q_{i_0}$.
3. At time $W_{i_0}$, the process jumps to state $i_1$, where $i_1$ is chosen according
to the distribution $q_{i_0 j}/q_{i_0}$, $j \ne i_0$.

4. Select an exponential random variable $W_{i_1}$ with mean $1/q_{i_1}$, etc.


Example: B-D Process starting from state 0
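The four steps above can be sketched directly in code. A minimal simulation of a birth-death chain started from state 0; the rates $\lambda_i = \lambda$ (constant birth) and $\mu_i = i\mu$ (linear death), and the values of $\lambda$, $\mu$, are illustrative assumptions, not from the notes:

```python
import random

random.seed(1)

lam, mu = 2.0, 1.0  # assumed birth/death rate parameters

def rates(i):
    # Jump rates out of state i: q_{i,i+1} = lam, q_{i,i-1} = i*mu.
    return {i + 1: lam, **({i - 1: i * mu} if i > 0 else {})}

def sample_path(i0, t_max):
    """Steps 1-4: hold an Exp(q_i) waiting time in each state,
    then jump to j with probability q_ij / q_i."""
    t, state = 0.0, i0
    path = [(t, state)]
    while True:
        q = rates(state)
        qi = sum(q.values())           # total rate q_i out of the state
        t += random.expovariate(qi)    # waiting time W_i, mean 1/q_i
        if t > t_max:
            return path
        targets, weights = zip(*q.items())
        state = random.choices(targets, weights=weights)[0]  # q_ij / q_i
        path.append((t, state))

path = sample_path(0, 10.0)
```

Every jump changes the state by exactly 1, as it must for a birth-death chain, and the jump times are strictly increasing.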

Note: The above construction may (depending on the generator A) lead to a


"dishonest" (or explosive) process, i.e. one for which $\sum_j P_{ij}(t) < 1$ for
all $t > 0$.
e.g. consider the pure birth process with $\lambda_n = n^2$,

i.e. $\sum_n 1/\lambda_n < \infty$.

Mean time for $X(t)$ to pass through all states in $S$ (to reach $\infty$) is


$\sum_n 1/\lambda_n < \infty$, and sample paths constructed as above explode in finite time.
Infinitely many different "honest" Markov chains can be constructed with
the generator A by returning the process to the state-space in some
suitable way as soon as $X(t)$ reaches $\infty$ (e.g. by returning it to any
particular state in $S$).
A sufficient condition for a birth process to be uniquely defined by its
generator (and honest) is
$$\sum_n \frac{1}{\lambda_n} = \infty$$
(refer to Feller vol. I).

c.f. if $T_n$ denotes the time at which state $n$ is reached, then $T_n \to T$ as $n \to \infty$ for some finite $T$ iff $\sum_n 1/\lambda_n < \infty$.

Assumptions: For the Kolmogorov equations, assume (1) - (6) as for the "General continuous time


Markov Chain".
For the forward equations we also need

$$\frac{1 - P_{jj}(h)}{h} \to q_j \quad \text{uniformly in } j.$$
The Kolmogorov differential equations; determination of $P(t)$ from $A$ (see


Feller p.470)
A. The Backward Equations
Let $h \downarrow 0$ in $P_{ij}(t+h) = \sum_k P_{ik}(h)P_{kj}(t)$:
(1) $P'_{ij}(t) = -q_i P_{ij}(t) + \sum_{k \ne i} q_{ik}P_{kj}(t)$, $j$ fixed, for $t \ge 0$.
Equivalently, (1) can be expressed as
$$P'(t) = AP(t),$$
where $P(t) = (P_{ij}(t))$, with initial condition $P(0) = I$.

B. Forward Equations
Let $h \downarrow 0$ in $P_{ij}(t+h) = \sum_k P_{ik}(t)P_{kj}(h)$ (using $\frac{1 - P_{jj}(h)}{h} \to q_j$ uniformly in $j$):
$$P'_{ij}(t) = -P_{ij}(t)q_j + \sum_{k \ne j} P_{ik}(t)q_{kj},$$
i.e.
(2) $P'(t) = P(t)A$ with initial condition $P(0) = I$.
Note: In matrix notation


Backward: $P'(t) = AP(t)$

Forward: $P'(t) = P(t)A$
Determination of the Probability Distribution of $X(t)$
Let $p_j(t) = P(X(t) = j)$ and let $p(t) = (p_0(t), p_1(t), \dots)$, a row vector.

Then,
$$p_j(t) = \sum_i p_i(0)\,P_{ij}(t),$$
i.e.
(3) $p(t) = p(0)P(t)$ (c.f. discrete parameter case
$p^{(n)} = p^{(0)}P^n$).

Alternatively,
$$p_j(t+h) = \sum_k p_k(t)\,P_{kj}(h).$$
Letting $h \downarrow 0$,

(4) $p'(t) = p(t)A$ (assuming $\frac{1 - P_{jj}(h)}{h} \to q_j$ uniformly in $j$).

Can use this differential equation with any given $p(0)$ to solve for $p(t)$.

Solution of the Kolmogorov equations for finite state-space

(5) $$P(t) = e^{At} = \sum_{n=0}^{\infty} \frac{(At)^n}{n!}.$$

Proof:
The power series converges (componentwise) for all $t$. Hence, the matrix
of derivatives with respect to $t$ is given by
$$P'(t) = \sum_{n=1}^{\infty} \frac{A^n t^{n-1}}{(n-1)!} = Ae^{At} = e^{At}A,$$
and $P(0) = I$.
Hence, both the backward and forward equations are satisfied.
Moreover, it is known from the theory of differential equations that the
solution of $P'(t) = AP(t)$ (and of $P'(t) = P(t)A$) with given $P(0)$ is
unique.
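For a finite chain, (5) can be evaluated by truncating the series. A minimal sketch for a two-state chain with assumed rates $q_{01} = a$, $q_{10} = b$ (the numbers are illustrative):

```python
from math import factorial

a, b = 1.5, 0.5                    # assumed jump rates q_01 and q_10
A = [[-a, a], [b, -b]]             # generator: each row sums to zero

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def expm(A, t, terms=60):
    # Truncated series e^{At} = sum_n (At)^n / n!  -- equation (5).
    At = [[t * x for x in row] for row in A]
    P = [[1.0, 0.0], [0.0, 1.0]]       # n = 0 term (identity)
    term = [[1.0, 0.0], [0.0, 1.0]]    # running power (At)^n
    for n in range(1, terms):
        term = mat_mul(term, At)
        P = [[P[i][j] + term[i][j] / factorial(n) for j in range(2)]
             for i in range(2)]
    return P

P1 = expm(A, 1.0)
```

Because the rows of $A$ sum to zero, each row of $e^{At}$ sums to 1, and the semigroup property $P(2t) = P(t)P(t)$ holds.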

Distribution of $X(t)$ if $S$ is finite
From the equations (3) and (5) above,
(6) $p(t) = p(0)\,e^{At}$.

Question:
What is $e^{At}$ when $A$ is a finite matrix with distinct eigenvalues?

Evaluation of $e^{At}$ when $A$ is finite with distinct eigenvalues $\lambda_1, \dots, \lambda_N$


(7) $$e^{At} = \sum_{k=1}^{N} e^{\lambda_k t}\,r_k l_k,$$ where $l_k r_k = 1$,

with $r_k$ = right e-vector (column) of $A$ corresponding to $\lambda_k$,

$l_k$ = left e-vector (row) of $A$ corresponding to $\lambda_k$.

Proof:
$$A = R\Lambda R^{-1} \implies e^{At} = R\,e^{\Lambda t}R^{-1} = \sum_k e^{\lambda_k t}\,r_k l_k,$$
where $R$ = matrix of right e-vectors of $A$, $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_N)$, and the rows of $R^{-1}$ are the left e-vectors $l_k$.
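For a two-state generator $A = \begin{pmatrix}-a & a\\ b & -b\end{pmatrix}$, the eigenvalues are $0$ and $-(a+b)$ and the e-vectors can be written down by hand, so (7) gives a closed form. A sketch with assumed rates (here $r_1 = (1,1)^T$, $l_1 = (b,a)/(a+b)$ and $r_2 = (a,-b)^T$, $l_2 = (1,-1)/(a+b)$, normalised so $l_k r_k = 1$):

```python
from math import exp

a, b = 1.5, 0.5     # assumed rates; generator A = [[-a, a], [b, -b]]
c = a + b           # A has distinct eigenvalues 0 and -c

def p_spectral(t):
    # Equation (7): e^{At} = e^{0*t} r_1 l_1 + e^{-c t} r_2 l_2,
    # which multiplies out entrywise to the closed form below.
    e = exp(-c * t)
    return [[(b + a * e) / c, (a - a * e) / c],
            [(b - b * e) / c, (a + b * e) / c]]

P = p_spectral(1.0)
```

As $t \to \infty$ the $e^{-ct}$ term vanishes and both rows converge to $(b, a)/(a+b)$, the limiting distribution.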


Solution of the forward equations for $p_j(t)$ using generating functions
(applies also when $S$ is infinite)
The idea is to derive, in place of a system of ordinary differential
equations for the $p_j(t)$, a single partial d.e. for
(8) $$G(s, t) = \sum_j p_j(t)\,s^j.$$
Applications to Birth and Death Processes


The forward Kolmogorov equations can be written
$$p'_j(t) = \lambda_{j-1}p_{j-1}(t) - (\lambda_j + \mu_j)p_j(t) + \mu_{j+1}p_{j+1}(t),$$
holding for $j \ge 0$ provided we define $\lambda_{-1} = 0$, $\mu_0 = 0$ and $p_{-1}(t) = 0$.

Multiplying by $s^j$ and summing over $j$ gives

(9) $$\frac{\partial G}{\partial t} = \sum_j \left[\lambda_{j-1}p_{j-1}(t) - (\lambda_j + \mu_j)p_j(t) + \mu_{j+1}p_{j+1}(t)\right]s^j,$$
with initial condition $G(s, 0) = s^{X(0)}$, provided the series on the right converges
uniformly in $s$.
Using (8) in (9), the right-hand side can be expressed in terms of $G$ and $\partial G/\partial s$ whenever $\lambda_j$ and $\mu_j$ are linear in $j$.
Note: Explicit solution of equation (9) is possible in a number of cases of


interest.

Linear birth and death process


$\lambda_j = j\lambda$ and $\mu_j = j\mu$
Note: 0 is an absorbing state
From (9),
$$\frac{\partial G}{\partial t} = (\lambda s - \mu)(s - 1)\frac{\partial G}{\partial s},$$
with $G(s, 0) = s$ (taking $X(0) = 1$).

Probability of Extinction at or before time $t$

With birth rate $\lambda$ and death rate $\mu$ per individual, and $X(0) = 1$,
$$p_0(t) = G(0, t) = \frac{\mu\left(e^{(\lambda - \mu)t} - 1\right)}{\lambda e^{(\lambda - \mu)t} - \mu}, \quad \lambda \ne \mu.$$

Probability of ultimate extinction $= \lim_{t \to \infty} p_0(t) = \begin{cases} 1, & \lambda \le \mu \\ \mu/\lambda, & \lambda > \mu. \end{cases}$

Note: We assume a "platonic" situation, i.e. no mating is involved: each
individual gives birth and dies independently of the others.
[As expected, this is the same as the probability of ultimate ruin in the
gambler's ruin problem with $p = \lambda/(\lambda + \mu)$ and $q = \mu/(\lambda + \mu)$ against an infinitely rich adversary.]
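The extinction-probability formula can be evaluated directly. A minimal sketch with assumed rates $\lambda > \mu$, so the ultimate extinction probability should be $\mu/\lambda$:

```python
from math import exp

lam, mu = 2.0, 1.0   # assumed per-individual birth and death rates

def p_extinct(t):
    # G(0, t) for the linear birth-death process with X(0) = 1,
    # valid for lam != mu.
    r = exp((lam - mu) * t)
    return mu * (r - 1.0) / (lam * r - mu)

probs = [p_extinct(t) for t in (1.0, 5.0, 50.0)]
```

Since $\lambda > \mu$ here, `probs` increases with $t$ toward $\mu/\lambda = 0.5$ rather than toward 1.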

Limiting distributions for continuous parameter MCs


Theorem 1. If the chain is irreducible and honest then the system
(10) $$\pi A = 0, \qquad \sum_j \pi_j = 1, \quad \pi_j \ge 0$$
has at most one solution.

If (10) has a solution $\pi$ then $P_{ij}(t) \to \pi_j$ as $t \to \infty$, for all $i$.


If (10) has no solution then $P_{ij}(t) \to 0$ as $t \to \infty$, for all $i, j$.

Corollary: Every finite irreducible chain has a limiting distribution,
uniquely determined by equation (10).

Theorem 2. An irreducible B & D process has a limiting distribution if and


only if $\sum_j \rho_j < \infty$, where
$$\rho_0 = 1, \qquad \rho_j = \frac{\lambda_0\lambda_1\cdots\lambda_{j-1}}{\mu_1\mu_2\cdots\mu_j}, \quad j \ge 1.$$
The limiting distribution is $\pi_j = \rho_j \Big/ \sum_k \rho_k$.

Note: The B-D process is honest under an analogous condition on the $\lambda_n$ and $\mu_n$.

In particular, it is honest if $\sum_n 1/\lambda_n = \infty$. This latter condition is necessary and


sufficient for the pure birth process.
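Theorem 2 can be checked numerically. A sketch with assumed constant rates $\lambda_j = \lambda$, $\mu_j = \mu$ and $\lambda < \mu$ (an M/M/1-type queue), where $\rho_j = (\lambda/\mu)^j$ and the limiting distribution is geometric:

```python
lam, mu = 1.0, 2.0          # assumed rates with lam < mu, so sum_j rho_j < inf
N = 200                     # truncation point for the infinite sums

# rho_j = (lam_0 ... lam_{j-1}) / (mu_1 ... mu_j) = (lam/mu)^j here.
rho = [(lam / mu) ** j for j in range(N)]
total = sum(rho)
pi = [r / total for r in rho]

# Check pi A = 0 at an interior state j:
# lam*pi[j-1] - (lam + mu)*pi[j] + mu*pi[j+1] = 0.
j = 5
balance = lam * pi[j - 1] - (lam + mu) * pi[j] + mu * pi[j + 1]
```

Here $\pi_0 \approx 1 - \lambda/\mu$, the familiar geometric form, up to a negligible truncation error.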
Questions related to Markov Chains:
1. Find the n-step transition probabilities from the one-step transition
probabilities.
In principle, this can be done using the matrix relationship
$$P^{(n)} = P^n, \qquad \text{where } P = (p_{ij}).$$

If $p_j^{(n)} = P(X_n = j)$, then $p_j^{(n)} = \sum_i p_i^{(0)}\,p_{ij}^{(n)}$.

Thus, the distribution of $X_n$ is easily found from the initial distribution $p^{(0)}$ and the n-step
tpm $P^{(n)}$. In matrix notation, $p^{(n)} = p^{(0)}P^n$.
2. What is the long-run behaviour of $p_{ij}^{(n)}$? In particular, is there a
probability distribution $\pi = (\pi_j)$ such that $p_{ij}^{(n)} \to \pi_j$ as $n \to \infty$? Does the
limit (if it exists) depend on $i$?

3. How long does it take for $X_n$ to get from a given state to another
given state or set of states?
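Question 1 above can be sketched in code: raise the one-step tpm to the n-th power and multiply by the initial distribution. The two-state matrix and the initial distribution below are illustrative assumptions:

```python
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    # n-step tpm: P^(n) = P^n by repeated multiplication.
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Assumed two-state one-step tpm for illustration.
P = [[0.9, 0.1],
     [0.4, 0.6]]
P10 = mat_pow(P, 10)

# Distribution of X_10 from an assumed initial distribution p^(0) = (1, 0).
p0 = [1.0, 0.0]
p10 = [sum(p0[i] * P10[i][j] for i in range(2)) for j in range(2)]
```

Anticipating question 2, the rows of `P10` are already nearly equal: both approach the limiting distribution $\pi = (0.8, 0.2)$ of this chain.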

Classification of States of a Markov Chain


As already explained earlier, the state space $S$ can be assumed without
loss of generality to be a subset of the integers, i.e. the state at
time $n$ is $X_n \in S \subseteq \{0, 1, 2, \dots\}$.

$i$ leads to $j$ (or $j$ is accessible from $i$) means there exists $n \ge 0$ such that


$p_{ij}^{(n)} > 0$, written as $i \to j$.
$i$ communicates with $j$ means $i \to j$ and $j \to i$, written as $i \leftrightarrow j$.

Lemma 1: $\leftrightarrow$ is an equivalence relation.


Proof:

(i) $i \leftrightarrow i$ since $p_{ii}^{(0)} = 1$ (reflexive)
(ii) $i \leftrightarrow j \iff j \leftrightarrow i$ by definition (symmetric)
(iii) $i \leftrightarrow j$ and $j \leftrightarrow k \implies i \leftrightarrow k$ (transitive),
since $p_{ik}^{(m+n)} \ge p_{ij}^{(m)}p_{jk}^{(n)} > 0$ for suitable $m, n$, where $j$ is some
intermediate state between $i$ and $k$; similarly $k \to i$.

Corollary: $S$ is partitioned into equivalence classes.


Proof: $S = \bigcup_i C(i)$, where $C(i)$ = set of states which communicate with $i$. By
(ii) and (iii), $C(i) = C(j)$ if $i \leftrightarrow j$: since if there exists $k \in C(i)$,
then $k \leftrightarrow i \leftrightarrow j$, so
everything in $C(i)$ communicates with $j$.

Note: Either $C(i)$ and $C(j)$ are the same or they are mutually exclusive, since any
common state would communicate with both $i$ and $j$.
Hence, a subfamily of the $C(i)$'s partitions $S$.

A closed set of states is a set $C$ such that $p_{ij} = 0$ for all $i \in C$, $j \notin C$.

Note: An equivalence class of states (under $\leftrightarrow$) need not be closed.

The closure of a set of states is the smallest closed set containing it.


Note:
(1) The closure of $\{i\}$ is the set $\{j : i \to j\}$.
(2) If the closure of $\{i\}$ is $\{i\}$ then $p_{ii} = 1$ and state $i$ is said to be an
absorbing state.
(3) If $C$ is a closed subset of $S$ and $X_0 \in C$, then $X_n \in C$ for all $n$
with probability 1, and $\{X_n\}$ can be described as a
Markov chain with state-space $C$ and transition probabilities
$p_{ij}$, $i, j \in C$.
A Markov Chain is irreducible if its state-space consists of a single
equivalence class, i.e. $i \leftrightarrow j$ for all $i, j \in S$.
Examples:
1) Gambler's ruin problem with finite state space $\{0, 1, \dots, a\}$: players $A$ and $B$ conduct
independent trials at times $n = 1, 2, \dots$. At each trial, $A$'s fortune
increases by 1 with probability $p$, decreases by 1 with probability
$q = 1 - p$. The game stops when either player is bankrupt.
Let $X_n$ = $A$'s fortune at time $n$; $\{X_n\}$ is a Markov chain with tpm
$$P = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ q & 0 & p & \cdots & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & \cdots & q & 0 & p \\ 0 & \cdots & 0 & 0 & 1 \end{pmatrix}.$$
For example (taking $a = 3$):
$\{0\}$ closed
$\{1, 2\}$ not closed
$\{3\}$ closed

$\{0, 3\}$ closed
$\{0, 1\}$ not closed
$\{1, 2, 3\}$ not closed
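The closed/not-closed classification above can be verified mechanically from the tpm. A minimal sketch (the state-space size $a = 3$ and $p = 0.5$ are illustrative):

```python
def ruin_tpm(a, p):
    # Gambler's ruin tpm on states 0..a: 0 and a absorbing;
    # interior states move up with prob p, down with prob q = 1 - p.
    q = 1.0 - p
    P = [[0.0] * (a + 1) for _ in range(a + 1)]
    P[0][0] = P[a][a] = 1.0
    for i in range(1, a):
        P[i][i - 1] = q
        P[i][i + 1] = p
    return P

def is_closed(P, C):
    # C is closed iff no one-step transition leads out of C.
    return all(P[i][j] == 0.0
               for i in C for j in range(len(P)) if j not in C)

P = ruin_tpm(3, 0.5)
```

`is_closed` implements the definition directly: a set is closed exactly when $p_{ij} = 0$ for every $i$ in the set and $j$ outside it.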
Periodicity in a Markov Chain
Period of a state $i$: $d(i)$ = greatest common divisor (gcd) of the set of $n \ge 1$
such that $p_{ii}^{(n)} > 0$.
If $d(i) = 1$, $i$ is aperiodic.

If $p_{ii}^{(n)} = 0$ for all $n \ge 1$, define $d(i) = \infty$.
Example:

Simple random walk on the integers:
each state has period 2.

To return to $i$, we need the same number of steps to the right as to the left.


Consequently, $p_{ii}^{(n)} > 0$ only if $n$ is even.
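The period can be computed from the n-step transition probabilities. A sketch using a random walk on $\{0, 1, 2, 3\}$ with reflecting ends (an assumed finite analogue of the walk on the integers, chosen so the sums stay finite); every state should come out with period 2:

```python
from math import gcd

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, n_max=20):
    # gcd of { n >= 1 : p_ii^(n) > 0 }, scanning n up to n_max.
    d, Pn = 0, P
    for n in range(1, n_max + 1):
        if Pn[i][i] > 0:
            d = gcd(d, n)
        Pn = mat_mul(Pn, P)
    return d

# Reflecting random walk on {0, 1, 2, 3}: the chain is bipartite
# (even states vs. odd states), so returns need an even number of steps.
P = [[0.0, 1.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 1.0, 0.0]]
periods = [period(P, i) for i in range(4)]
```

Scanning only up to a finite `n_max` is a heuristic, but it suffices here since returns at $n = 2$ and $n = 4$ already pin the gcd down to 2.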

Period is a class property, i.e. $i \leftrightarrow j \implies d(i) = d(j)$.

Proof: Suppose $i \leftrightarrow j$. Choose $m, n$ such that $p_{ij}^{(m)} > 0$ and $p_{ji}^{(n)} > 0$. Then
$$p_{ii}^{(m+n)} \ge p_{ij}^{(m)}p_{ji}^{(n)} > 0,$$
so $m + n$ is a multiple of $d(i)$.

If $p_{jj}^{(s)} > 0$, then $p_{ii}^{(m+s+n)} \ge p_{ij}^{(m)}p_{jj}^{(s)}p_{ji}^{(n)} > 0$, so $m + s + n$ is a multiple of $d(i)$.

Note: Hence $s$ is divisible by $d(i)$ for every $s$ with $p_{jj}^{(s)} > 0$. But $d(j)$ is the gcd of such $s$.


Therefore $d(i)$ divides $d(j)$.
Interchanging $i$ and $j$, $d(j)$ divides $d(i)$.
Hence, $d(i) = d(j)$.

First-Passage and First-Return Probabilities


Let $f_{ij}^{(n)} = P(X_n = j,\ X_k \ne j \text{ for } 1 \le k \le n-1 \mid X_0 = i)$, $n \ge 1$.

Decompose $p_{ij}^{(n)}$ according to the time $k$ of first entrance into $j$, where $k$ is some intermediate time.

a) Case $i = j$:

i.e.

(1) $$p_{jj}^{(n)} = \sum_{k=1}^{n} f_{jj}^{(k)}\,p_{jj}^{(n-k)}, \quad n \ge 1.$$

In terms of generating functions:

Define
$$P_{ij}(s) = \sum_{n=0}^{\infty} p_{ij}^{(n)}s^n, \qquad F_{ij}(s) = \sum_{n=1}^{\infty} f_{ij}^{(n)}s^n, \quad |s| < 1.$$

Note: $F_{ij}(1) = \sum_n f_{ij}^{(n)} \le 1$, while $P_{jj}(1) = \sum_n p_{jj}^{(n)}$
can be finite or $\infty$.

Lemma 2.

(2) $$P_{jj}(s) = \frac{1}{1 - F_{jj}(s)}.$$

Proof: Multiply equation (1) by $s^n$ and sum over $n \ge 1$:
$$P_{jj}(s) - 1 = F_{jj}(s)\,P_{jj}(s)$$
[since $p_{jj}^{(0)} = 1$].
b) Case $i \ne j$:
By the same arguments leading to equation (1) we get

(3) $$p_{ij}^{(n)} = \sum_{k=1}^{n} f_{ij}^{(k)}\,p_{jj}^{(n-k)}, \quad n \ge 1.$$
But note that if $i \ne j$, then $p_{ij}^{(0)} = 0$, so equation (3) is valid also for $n = 0$. Hence,
multiplying equation (3) by $s^n$ and summing over $n \ge 0$ we get
(4) $$P_{ij}(s) = F_{ij}(s)\,P_{jj}(s).$$
Note: The first return and first entrance probabilities can be found from the
$n$-step transition probabilities (at least in principle) using equations (2) and
(4).
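The note above can be made concrete: equation (1) can be inverted recursively to recover $f_{jj}^{(n)}$ from the $p_{jj}^{(n)}$. A sketch using an assumed two-state chain for which the first-return probabilities are known in closed form ($f_{00}^{(n)} = 0.5^n$):

```python
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def first_return_probs(P, j, n_max):
    # Invert equation (1):
    #   f_jj^(n) = p_jj^(n) - sum_{k=1}^{n-1} f_jj^(k) p_jj^(n-k),
    # using p_jj^(0) = 1.
    p = [1.0]                       # p_jj^(n) for n = 0, 1, ..., n_max
    Pn = P
    for _ in range(n_max):
        p.append(Pn[j][j])
        Pn = mat_mul(Pn, P)
    f = [0.0]                       # f_jj^(0) unused; index by n
    for n in range(1, n_max + 1):
        f.append(p[n] - sum(f[k] * p[n - k] for k in range(1, n)))
    return f

# Assumed chain: from either state, move to each state with prob 0.5.
# First return to 0 at time n means 0 -> 1 -> ... -> 1 -> 0, prob 0.5^n.
P = [[0.5, 0.5],
     [0.5, 0.5]]
f = first_return_probs(P, 0, 8)
```

Here $\sum_{n \le 8} f_{00}^{(n)} = 1 - 0.5^8$, consistent with $F_{00}(1) = 1$ (state 0 is recurrent).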

The first return period is called the recurrence interval in some practical applications, e.g. the
recurrence interval of "algal bloom" in Laguna Lake.

Readings:
1. Degram, G.G.S. 1971. "A Note on the Use of Markov Chains in
Hydrology", Journal of Hydrology, vol. 13, pp. 216-230.
2. Matis, J.H. et al. 1985. "A Markov Chain Approach to Crop Yield
Forecasting", Agricultural Systems, vol. 18, pp. 171-187.
