General Continuous Time Markov Chain

Let {X(t), t ≥ 0} be a Markov process with countable state space S and stationary transition probabilities p_ij(t) = P(X(s+t) = j | X(s) = i). These satisfy:
1. p_ij(t) ≥ 0
2. Σ_j p_ij(t) = 1
3. p_ij(s+t) = Σ_k p_ik(s) p_kj(t) (Chapman–Kolmogorov equations)
4. p_ij(t) → δ_ij as t → 0+

Assume now as t → 0+:
5. p_ij(t) = q_ij t + o(t) for j ≠ i
6. p_ii(t) = 1 + q_ii t + o(t)

where q_ij ≥ 0 for j ≠ i and q_ii = −Σ_{j≠i} q_ij ≤ 0.

Meaning of q_ij, q_ii:
For small h > 0, P(X(t+h) = j | X(t) = i) ≈ q_ij h for j ≠ i, so q_ij is the instantaneous rate of transition from i to j, and −q_ii is the total rate of leaving state i.
The infinitesimal generator of the process is the matrix Q = (q_ij) (not a transition matrix: its off-diagonal entries are rates, its diagonal entries are ≤ 0, and its rows sum to 0).
Proof: Let t → 0+ in Σ_j p_ij(t) = 1. By postulates 5 and 6, 1 = 1 + (Σ_j q_ij) t + o(t), hence Σ_j q_ij = 0.
Example: For a Poisson process with mean rate λ, q_i,i+1 = λ, q_ii = −λ, and q_ij = 0 otherwise. The holding time in each state is exponential with mean 1/λ.
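As a numerical sanity check (an illustration, not part of the notes), the following Python sketch builds the generator of a Poisson process on a truncated state space and sums the power series for e^{Qt}; the entries p_0j(t) should match the Poisson probabilities e^{−λt}(λt)^j/j!. The rate λ = 2, time t = 0.5, and truncation level are arbitrary choices for the example.

```python
import math
import numpy as np

lam, t = 2.0, 0.5
m = 40  # truncate the infinite state space {0, 1, 2, ...} at m states

# Generator of the Poisson process: q_{i,i} = -lam, q_{i,i+1} = lam.
Q = np.zeros((m, m))
for i in range(m - 1):
    Q[i, i] = -lam
    Q[i, i + 1] = lam
Q[m - 1, m - 1] = -lam

# P(t) = e^{Qt} summed term by term: term_n = (t Q)^n / n!
P = np.eye(m)
term = np.eye(m)
for n in range(1, 100):
    term = term @ Q * (t / n)
    P = P + term

# p_{0j}(t) should equal the Poisson(lam*t) pmf e^{-lam t}(lam t)^j / j!
for j in range(5):
    pmf = math.exp(-lam * t) * (lam * t) ** j / math.factorial(j)
    assert abs(P[0, j] - pmf) < 1e-10
```

The truncation is harmless here because the probability of reaching the last state by time t is negligible for these parameter values.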
Lemma 4 (Backward equations): p_ij′(t) = Σ_k q_ik p_kj(t), i.e. P′(t) = Q P(t). Note: at t = 0 this gives p_ij′(0) = q_ij.
Proof: p_ij(t+h) = Σ_k p_ik(h) p_kj(t) by Chapman–Kolmogorov. Subtract p_ij(t), divide by h, and let h → 0+, using postulates 5 and 6.
Recall that
(p_kj(h) − δ_kj)/h → q_kj as h → 0+
i.e. the convergence in postulates 5 and 6 is assumed to hold uniformly in k.
B. Forward Equations
Let h → 0+ in p_ij(t+h) = Σ_k p_ik(t) p_kj(h) (using (p_kj(h) − δ_kj)/h → q_kj uniformly in k):
p_ij′(t) = Σ_k p_ik(t) q_kj
i.e.
P′(t) = P(t) Q   (2)
with initial condition P(0) = I.
Forward: P′(t) = P(t) Q. Backward: P′(t) = Q P(t).
Determination of the Probability Distribution of X(t)
Let p_j(t) = P(X(t) = j) and let p(t) = (p_j(t)) be the corresponding row vector. Then,
p_j(t) = Σ_i p_i(0) p_ij(t)
i.e.
p(t) = p(0) P(t)   (3)
(c.f. discrete parameter case p^(n) = p^(0) P^n)
Alternatively, differentiating (3) and using the forward equations (2),
p′(t) = p(t) Q   (4)
Can use this differential equation with any given p(0) to solve for p(t).
Letting P(t) be the matrix exponential:
P(t) = e^{Qt} = Σ_{n≥0} (t^n / n!) Q^n   (5)
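The differential equation p′(t) = p(t) Q can be solved numerically for any given p(0). Below is a minimal Python sketch using plain Euler steps on a hypothetical two-state chain (the rates a, b are chosen for the example; the standard closed-form solution p_1(t) = a/(a+b)(1 − e^{−(a+b)t}) for a start in state 0 is used only as a check):

```python
import math
import numpy as np

# Hypothetical two-state chain: 0 -> 1 at rate a, 1 -> 0 at rate b.
a, b = 1.5, 0.5
Q = np.array([[-a, a],
              [b, -b]])

def solve_forward(p0, Q, t, steps=200_000):
    """Euler integration of p'(t) = p(t) Q from the initial distribution p0."""
    p = np.array(p0, dtype=float)
    h = t / steps
    for _ in range(steps):
        p = p + h * (p @ Q)
    return p

t = 2.0
p = solve_forward([1.0, 0.0], Q, t)

# Closed form for the chain started in state 0: p_1(t) = a/(a+b)(1 - e^{-(a+b)t})
exact = a / (a + b) * (1 - math.exp(-(a + b) * t))
assert abs(p[1] - exact) < 1e-4
```

Note that the Euler iteration preserves Σ_j p_j exactly, because the rows of Q sum to 0.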
Proof:
The power series in (5) converges (componentwise) for all t. Hence, the matrix of derivatives with respect to t is given by
P′(t) = Σ_{n≥1} (t^{n−1} / (n−1)!) Q^n = Q e^{Qt}
And, factoring Q out on the right instead,
P′(t) = e^{Qt} Q
Hence, both the backward and forward equations are satisfied.
Moreover, it is known from the theory of differential equations that the solution of P′(t) = Q P(t) (and of P′(t) = P(t) Q) with given P(0) = I is unique.
Distribution of X(t) if S is finite
From the equations (3) and (5) above,
p(t) = p(0) e^{Qt}   (6)
Question:
What is e^{Qt} when Q is a finite m × m matrix with distinct eigenvalues d_1, …, d_m?
Answer: Write
Q = U D U^{−1}, D = diag(d_1, …, d_m)   (7)
where the columns of U are right eigenvectors of Q (U is invertible because the eigenvalues are distinct). Then
Q^n = U D^n U^{−1}, n ≥ 0   (8)
Proof:
e^{Qt} = Σ_{n≥0} (t^n / n!) Q^n   (9)
with initial condition e^{Q·0} = I, provided the series on the right converges uniformly in t (it does, on bounded t-intervals). Using (8) in (9) gives
e^{Qt} = U (Σ_{n≥0} (t^n / n!) D^n) U^{−1} = U e^{Dt} U^{−1}
with e^{Dt} = diag(e^{d_1 t}, …, e^{d_m t}).
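The spectral formula e^{Qt} = U e^{Dt} U^{−1} is easy to evaluate numerically. A Python sketch follows (the 3 × 3 generator is a made-up example with distinct eigenvalues, not taken from the notes):

```python
import numpy as np

# Hypothetical generator: off-diagonal rates >= 0, rows summing to 0.
Q = np.array([[-2.0, 1.0, 1.0],
              [0.5, -1.0, 0.5],
              [1.0, 2.0, -3.0]])

t = 0.7
d, U = np.linalg.eig(Q)                            # Q = U diag(d) U^{-1}
P = U @ np.diag(np.exp(d * t)) @ np.linalg.inv(U)  # e^{Qt} = U e^{Dt} U^{-1}
P = P.real                                          # discard numerical imaginary parts

# Sanity checks: P(t) is a stochastic matrix.
assert np.allclose(P.sum(axis=1), 1.0)
assert np.all(P >= -1e-12)
```

One eigenvalue of a generator is always 0 (its rows sum to 0); the remaining eigenvalues have nonpositive real part, which is what drives p(t) toward a limit as t → ∞.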
Thus, the distribution of X_n is easily found from the initial distribution p^(0) and the n-step tpm P^(n) = P^n. In matrix notation,
p^(n) = p^(0) P^n
2. What is the long-run behaviour of p^(n)? In particular, is there a probability distribution π such that p^(n) → π as n → ∞? Does the limit (if it exists) depend on p^(0)?
3. How long does it take for the chain to get from a given state to another given state or set of states?
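These questions are easy to explore numerically. A Python sketch for the first two (the 2 × 2 tpm below is a made-up example; for it, π = (0.8, 0.2) solves πP = π and is used only as a check):

```python
import numpy as np

# Hypothetical transition probability matrix.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def dist_after(p0, P, n):
    """p^(n) = p^(0) P^n."""
    return np.array(p0) @ np.linalg.matrix_power(P, n)

# Two different starting distributions converge to the same limit pi.
pa = dist_after([1.0, 0.0], P, 100)
pb = dist_after([0.0, 1.0], P, 100)
assert np.allclose(pa, pb)
assert np.allclose(pa, [0.8, 0.2])
```

For this chain the limit is independent of p^(0); the second eigenvalue of P (here 0.5) governs the speed of convergence.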
The relation ↔ ("communicates with") on S satisfies:
(i) i ↔ i (reflexive)
(ii) i ↔ j iff j ↔ i, by definition (symmetric)
(iii) i ↔ j and j ↔ k imply i ↔ k (transitive)
since p_ik^(m+n) ≥ p_ij^(m) p_jk^(n) > 0, where j is some intermediate state between i and k.
Note: Either C(i) and C(j) are the same or they are mutually exclusive, since either i ↔ j or i and j do not communicate.
Hence, the family of communicating classes {C(i)} is a partition of S.
(Examples followed here: transition diagrams of several chains, with each class labelled "closed" or "not closed".)
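Communicating classes, and whether each class is closed, can be computed mechanically from the transition matrix: i ↔ j iff each state is accessible from the other, and a class is closed iff no probability leaves it. A Python sketch (the 3-state matrix is a made-up example in which {0, 1} is a class that is not closed and {2} is closed):

```python
import numpy as np

# Hypothetical transition matrix: states 0 and 1 communicate, state 2 absorbs.
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.7, 0.0],
              [0.0, 0.0, 1.0]])
n = P.shape[0]

# reach[i, j] = 1 iff j is accessible from i in 0 or more steps.
reach = ((P > 0) | np.eye(n, dtype=bool)).astype(int)
for _ in range(n):  # repeated squaring gives the transitive closure
    reach = ((reach @ reach) > 0).astype(int)

# i <-> j iff reach[i, j] and reach[j, i]; group states into classes.
classes = []
for i in range(n):
    cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
    if cls not in classes:
        classes.append(cls)

def is_closed(cls, P):
    """A class C is closed iff p_ij = 0 for every i in C and j outside C."""
    return all(P[i, j] == 0 for i in cls for j in range(n) if j not in cls)

for cls in classes:
    print(sorted(cls), "closed" if is_closed(cls, P) else "not closed")
```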
Periodicity in a Markov Chain
Period of a state i: d(i) = greatest common divisor (gcd) of {n ≥ 1 : p_ii^(n) > 0}.
If d(i) = 1, i is aperiodic.
If p_ii^(n) = 0 for all n ≥ 1, define d(i) = 0.
Example:
Random walk on S = {0, ±1, ±2, …}
Each state has period 2, since a return to the starting state is possible only after an even number of steps.
Periodicity is a class property: if i ↔ j, choose m, n with p_ij^(m) > 0 and p_ji^(n) > 0. If p_jj^(s) > 0, then p_ii^(m+n) ≥ p_ij^(m) p_ji^(n) > 0 and p_ii^(m+s+n) ≥ p_ij^(m) p_jj^(s) p_ji^(n) > 0, so both m+n and m+s+n are multiples of d(i); hence s is a multiple of d(i). It follows that d(i) divides d(j), and by symmetry d(j) divides d(i), so d(i) = d(j).
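The period of a state can be computed directly from the definition, as the gcd of the return times observed in the first few matrix powers. A Python sketch (the deterministic 4-cycle below is a made-up example in which every state has period 4):

```python
import math
import numpy as np

# Hypothetical chain: deterministic cycle 0 -> 1 -> 2 -> 3 -> 0.
P = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)

def period(P, i, n_max=50):
    """gcd of {n >= 1 : p_ii^(n) > 0} for n up to n_max; 0 if the set is empty."""
    d = 0
    Pn = np.eye(P.shape[0])
    for n in range(1, n_max + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            d = math.gcd(d, n)
    return d

assert [period(P, i) for i in range(4)] == [4, 4, 4, 4]
```

Scanning only up to n_max is a practical shortcut; for a finite chain a modest bound suffices, since the gcd stabilizes quickly.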
a) Case i ≠ j
Consider the probability that, starting from i, the chain enters state j for the first time at step n, i.e.
f_ij^(n) = P(X_n = j, X_k ≠ j for k = 1, …, n−1 | X_0 = i)   (1)
Define
f_ij = Σ_{n≥1} f_ij^(n) = P(chain ever enters j | X_0 = i)
Note:
f_ij ≤ 1; the first passage time from i to j can be finite or ∞.
Lemma 2.
p_ij^(n) = Σ_{k=1}^{n} f_ij^(k) p_jj^(n−k), n ≥ 1   (2)
[since the first entrance to j must occur at some step k = 1, …, n, and p_jj^(0) = 1]
b) Case i = j (first return probabilities f_jj^(n))
By the same arguments leading to equation (1) we get
p_jj^(n) = Σ_{k=1}^{n} f_jj^(k) p_jj^(n−k), n ≥ 1   (3)
But note that if we adopt the convention p_jj^(0) = 1, equation (2) is valid also for i = j. Hence, multiplying equation (3) by s^n and summing over n ≥ 1 we get
P_jj(s) − 1 = F_jj(s) P_jj(s), i.e. P_jj(s) = 1 / (1 − F_jj(s))   (4)
where P_jj(s) = Σ_{n≥0} p_jj^(n) s^n and F_jj(s) = Σ_{n≥1} f_jj^(n) s^n, |s| < 1.
Note: The first return and first entrance probabilities can be found from the
n-step transition probabilities (at least in principle) using equations (2) and
(4).
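Equation (3) can be inverted step by step: f_jj^(n) = p_jj^(n) − Σ_{k=1}^{n−1} f_jj^(k) p_jj^(n−k), which recovers the first return probabilities from the n-step transition probabilities. A Python sketch (the two-state tpm is a made-up example; the first few f_00^(n) are checked against values found by direct path counting):

```python
import numpy as np

# Hypothetical two-state transition probability matrix.
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])

def first_return_probs(P, j, n_max):
    """Recover f_jj^(n) from p_jj^(n) via equation (3):
       f_jj^(n) = p_jj^(n) - sum_{k=1}^{n-1} f_jj^(k) p_jj^(n-k)."""
    p = [1.0]                      # p_jj^(0) = 1
    Pn = np.eye(P.shape[0])
    for _ in range(n_max):
        Pn = Pn @ P
        p.append(Pn[j, j])
    f = [0.0] * (n_max + 1)
    for n in range(1, n_max + 1):
        f[n] = p[n] - sum(f[k] * p[n - k] for k in range(1, n))
    return f

f = first_return_probs(P, 0, 40)

# Direct checks: f_00^(1) = 0.5, f_00^(2) = 0.5*0.2 = 0.1, f_00^(3) = 0.5*0.8*0.2 = 0.08
assert abs(f[1] - 0.5) < 1e-12
assert abs(f[2] - 0.1) < 1e-12
assert abs(f[3] - 0.08) < 1e-12
# This chain returns to state 0 with probability 1, so the f_00^(n) sum to ~1.
assert abs(sum(f) - 1.0) < 1e-3
```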
Readings:
1. Degram, G.G.S. 1971. "A Note on the Use of Markov Chains in Hydrology". Journal of Hydrology, vol. 13, pp. 216-230.
2. Matis, J.H. et al. 1985. "A Markov Chain Approach to Crop Yield Forecasting". Agricultural Systems, vol. 18, pp. 171-187.