
Random processes
Chapter 4 Random process
4.1 Random process
Random process
O Random process, stochastic process
The infinite set $\{X_t,\ t \in T\}$ of random variables is called a random process, where the index set $T$ is an infinite set.
K In other words, a random vector with an infinite number of elements is called a random process.
O Discrete time process, continuous time process
A random process is said to be discrete time if the index set is a countably infinite set. When the index set is an uncountable set, the random process is called a continuous time random process.
O Discrete (alphabet) process, continuous (alphabet) process
A random process is called a discrete alphabet, discrete amplitude, or discrete state process if all finite length random vectors drawn from the random process are discrete random vectors. A process is called a continuous alphabet, continuous amplitude, or continuous state process if all finite length random vectors drawn from the random process are continuous random vectors.
K A random process $\{X(\omega)\}$ maps an element $\omega$ of the sample space to a time function $X(\omega, t)$.
K A random process $\{X(\omega)\}$ is the collection of time functions $X(t)$, called sample functions. This collection is called an ensemble. The value of the time function $X(t)$ at each fixed $t$ is a random variable.
K (Figure) A random process and its sample functions.
K Since a random process is a collection of random variables, with $X(t)$ denoting a random variable at $t$, the statistical characteristics of the random process can be considered via the cdf and pdf of $X(t)$.
K For example, the first-order cdf, first-order pdf, second-order cdf, and $n$th-order cdf of the random process $\{X(t)\}$ are
$F_{X(t)}(x) = \Pr\{X(t) \le x\}$,
$f_{X(t)}(x) = \dfrac{dF_{X(t)}(x)}{dx}$,
$F_{X(t_1),X(t_2)}(x_1, x_2) = \Pr\{X(t_1) \le x_1,\ X(t_2) \le x_2\}$,
and
$F_{X(t_1),\ldots,X(t_n)}(x_1, \ldots, x_n) = \Pr\{X(t_1) \le x_1, \ldots, X(t_n) \le x_n\}$,
respectively.
O Mean function
The mean function $m_X(t)$ of a random process $\{X(t)\}$ is defined by
$m_X(t) = E\{X(t)\} = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\, dx$.
O Autocorrelation function
The autocorrelation function $R_X(t_1, t_2)$ of a random process $\{X(t)\}$ is defined by
$R_X(t_1, t_2) = E\{X(t_1) X^*(t_2)\}$.
O Known signal
An extreme example of a random process is a known or deterministic signal. When $X(t) = s(t)$ is a known signal, we have
$m(t) = E\{s(t)\} = s(t)$,
$R(t_1, t_2) = E\{s(t_1) s(t_2)\} = s(t_1) s(t_2)$.
K Consider the random process $\{X(t)\}$ with mean $E\{X(t)\} = 3$ and autocorrelation function $R(t_1, t_2) = 9 + 4\exp(-0.2|t_1 - t_2|)$. If $Z = X(5)$ and $W = X(8)$, we can easily obtain $E\{Z\} = E\{X(5)\} = 3$, $E\{W\} = 3$, $E\{Z^2\} = R(5, 5) = 13$, $E\{W^2\} = R(8, 8) = 13$, $\mathrm{Var}\{Z\} = 13 - 3^2 = 4$, $\mathrm{Var}\{W\} = 13 - 3^2 = 4$, and $E\{ZW\} = R(5, 8) = 9 + 4e^{-0.6} \approx 11.195$. In other words, the random variables $Z$ and $W$ have variance $\sigma^2 = 4$ and covariance $\mathrm{Cov}(Z, W) = 4e^{-0.6} \approx 2.195$.
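A quick numeric check of this example (not in the original slides; it simply encodes the given mean and autocorrelation functions in Python):

```python
import numpy as np

# Given: m(t) = 3 and R(t1, t2) = 9 + 4*exp(-0.2*|t1 - t2|).
m = 3.0
R = lambda t1, t2: 9.0 + 4.0 * np.exp(-0.2 * abs(t1 - t2))

print(R(5, 5))            # E{Z^2} = 13
print(R(5, 5) - m**2)     # Var{Z} = 4
print(R(5, 8))            # E{ZW} = 9 + 4e^{-0.6} ~ 11.195
print(R(5, 8) - m * m)    # Cov(Z, W) = 4e^{-0.6} ~ 2.195
```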
O The autocorrelation function of $X(t) = A e^{j\omega t}$, defined with a random variable $A$, can be obtained as
$R_X(t_1, t_2) = E\{A e^{j\omega t_1} A^* e^{-j\omega t_2}\} = e^{j\omega(t_1 - t_2)} E\{|A|^2\}$.
O Autocovariance function
The autocovariance function $K_X(t_1, t_2)$ of a random process $\{X(t)\}$ is defined by
$K_X(t_1, t_2) = E\{[X(t_1) - m_X(t_1)][X^*(t_2) - m_X^*(t_2)]\}$.
K In general, the autocovariance and autocorrelation functions are functions of $t_1$ and $t_2$.
K The autocovariance function can be expressed in terms of the autocorrelation and mean functions as
$K_X(t_1, t_2) = R_X(t_1, t_2) - m_X(t_1) m_X^*(t_2)$.
O Uncorrelated random process
A random process $\{X_t\}$ is said to be uncorrelated if $R_X(t, s) = E\{X_t\} E\{X_s^*\}$ or, equivalently, $K_X(t, s) = 0$ for $t \ne s$.
K If a random process $\{X(t)\}$ is uncorrelated, the autocorrelation and autocovariance functions are
$R_X(t, s) = E\{X_t X_s^*\} = \begin{cases} E\{|X_t|^2\}, & t = s, \\ E\{X_t\} E\{X_s^*\}, & t \ne s, \end{cases}$
and
$K_X(t, s) = \begin{cases} \sigma_{X_t}^2, & t = s, \\ 0, & t \ne s, \end{cases}$
respectively.
O Correlation coefficient function
The correlation coefficient function $\rho_X(t_1, t_2)$ of a random process $\{X(t)\}$ is defined by
$\rho_X(t_1, t_2) = \dfrac{K_X(t_1, t_2)}{\sqrt{K_X(t_1, t_1)}\sqrt{K_X(t_2, t_2)}} = \dfrac{K_X(t_1, t_2)}{\sigma(t_1)\sigma(t_2)}$,
where $\sigma(t_i)$ is the standard deviation of $X(t_i)$.
K We can show that
$|\rho_X(t_1, t_2)| \le 1, \qquad \rho_X(t, t) = 1$.
O Crosscorrelation function
The crosscorrelation function $R_{XY}(t_1, t_2)$ of random processes $\{X(t)\}$ and $\{Y(t)\}$ is defined by
$R_{XY}(t_1, t_2) = E\{X(t_1) Y^*(t_2)\}$.
K The autocorrelation and crosscorrelation functions satisfy
$R_X(t, t) = E\{X(t) X^*(t)\} = \sigma_X^2(t) + |E\{X(t)\}|^2 \ge 0$,
$R_X(t_1, t_2) = R_X^*(t_2, t_1)$,
$R_{XY}(t_1, t_2) = R_{YX}^*(t_2, t_1)$.
K The autocorrelation function $R_X(t_1, t_2)$ is positive semi-definite. In other words, $\sum_i \sum_j a_i a_j^* R(t_i, t_j) \ge 0$ for any constants $\{a_k\}$.
O Crosscovariance function
The crosscovariance function $K_{XY}(t_1, t_2)$ of random processes $\{X(t)\}$ and $\{Y(t)\}$ is defined by
$K_{XY}(t_1, t_2) = E\{[X(t_1) - m_X(t_1)][Y^*(t_2) - m_Y^*(t_2)]\} = R_{XY}(t_1, t_2) - m_X(t_1) m_Y^*(t_2)$.
O Two random processes which are uncorrelated
The random processes $\{X(t)\}$ and $\{Y(t)\}$ are said to be uncorrelated if $R_{XY}(t_1, t_2) = E\{X(t_1)\} E\{Y^*(t_2)\}$ or, equivalently, $K_{XY}(t_1, t_2) = 0$ for all $t_1$ and $t_2$.
O Orthogonality
The two random processes $\{X(t)\}$ and $\{Y(t)\}$ are said to be orthogonal if $R_{XY}(t_1, t_2) = 0$ for all $t_1$ and $t_2$.
4.2 Properties of random processes
Stationary process and independent process
O A random process is said to be stationary if its probabilistic properties do not change under time shifts.
O Stationary process
A random process $\{X(t)\}$ is stationary, strict-sense stationary (s.s.s.), or strongly stationary if the joint cdf of $\{X(t_1), X(t_2), \ldots, X(t_n)\}$ is the same as the joint cdf of $\{X(t_1 + \tau), X(t_2 + \tau), \ldots, X(t_n + \tau)\}$ for all $n$, $\tau$, and $t_1, t_2, \ldots, t_n$.
O Wide-sense stationary (w.s.s.) process
A random process $\{X(t)\}$ is w.s.s., weakly stationary, or second-order stationary if (1) the mean function is constant and (2) the autocorrelation function $R_X(t, s)$ depends only on $t - s$ but not on $t$ and $s$ individually.
K The mean function $m_X$ and the autocorrelation function $R_X$ of a w.s.s. process $\{X(t)\}$ are thus
$m_X(t) = m$
and
$R_X(t_1, t_2) = R(t_1 - t_2)$,
respectively.
K In other words, the autocorrelation function $R_X(t_1, t_2)$ of a w.s.s. process $\{X(t)\}$ is a function of $\tau = t_1 - t_2$. For all $t$ and $\tau$, we have
$R_X(\tau) = E\{X(t + \tau) X^*(t)\}$.
When $\tau = 0$, $R_X(0) = E\{|X(t)|^2\}$.
K Consider two sequences of uncorrelated random variables $A_0, A_1, \ldots, A_m$ and $B_0, B_1, \ldots, B_m$ having mean zero and variances $\sigma_k^2$. Assume that the two sequences are uncorrelated with each other. Let $\omega_0, \omega_1, \ldots, \omega_m$ be distinct frequencies in $[0, 2\pi)$, and let
$X_n = \sum_{k=0}^{m} \{A_k \cos n\omega_k + B_k \sin n\omega_k\}$ for $n = 0, 1, 2, \ldots$.
Then we can obtain
$E\{X_n X_{n+l}\} = \sum_{k=0}^{m} \sigma_k^2 \cos l\omega_k$,
$E\{X_n\} = 0$.
Thus, $\{X_n\}$ is w.s.s..
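As a sanity check (not part of the original slides), the following Python sketch simulates this harmonic process for illustrative, arbitrarily chosen frequencies and variances and compares the empirical correlation with $\sum_k \sigma_k^2 \cos l\omega_k$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n_len, trials = 3, 100, 20_000
omega = rng.uniform(0, 2 * np.pi, size=m + 1)   # distinct frequencies (a.s.)
sigma = rng.uniform(0.5, 1.5, size=m + 1)       # per-component standard deviations

n = np.arange(n_len)
A = rng.normal(0.0, sigma, size=(trials, m + 1))  # zero-mean, uncorrelated
B = rng.normal(0.0, sigma, size=(trials, m + 1))
X = A @ np.cos(np.outer(omega, n)) + B @ np.sin(np.outer(omega, n))

l = 3
print(np.mean(X[:, 50] * X[:, 50 + l]))          # empirical E{X_n X_{n+l}}
print(np.sum(sigma**2 * np.cos(l * omega)))      # sum_k sigma_k^2 cos(l*omega_k)
```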
O Properties of the autocorrelation function $R_X(\tau)$ of a real stationary process $\{X(t)\}$
K $R_X(\tau) = R_X(-\tau)$: $R_X(\tau)$ is an even function.
K $|R_X(\tau)| \le R_X(0)$: $R_X(\tau)$ is maximum at the origin.
K If $R_X(\tau)$ is continuous at $\tau = 0$, then it is continuous at every value of $\tau$.
K If there is a constant $T > 0$ such that $R_X(0) = R_X(T)$, then $R_X(\tau)$ is periodic with period $T$.
O Independent random process
A random process is said to be independent if the joint cdf satisfies
$F_{X_{t_1}, X_{t_2}, \ldots, X_{t_n}}(\mathbf{x}) = \prod_{i=1}^{n} F_{X_{t_i}}(x_i)$
for all $n$ and $t_1, t_2, \ldots, t_n$, $x_1, x_2, \ldots, x_n$.
O Independent and identically distributed (i.i.d.) process
A random process is said to be i.i.d. if the joint cdf satisfies
$F_{X_{t_1}, X_{t_2}, \ldots, X_{t_n}}(\mathbf{x}) = \prod_{i=1}^{n} F_X(x_i)$
for all $n$ and $t_1, t_2, \ldots, t_n$, $x_1, x_2, \ldots, x_n$.
O The i.i.d. process is sometimes called a memoryless process or a white noise. The i.i.d. process is the simplest process, and yet it is the most stochastic process in that past outputs do not carry any information about the future.
O Bernoulli process
An i.i.d. random process with two possible values is called a Bernoulli process. For example, consider the random process $\{X_n\}$ defined by
$X_n = \begin{cases} 1, & \text{if the } n\text{th result is a head}, \\ 0, & \text{if the } n\text{th result is a tail}, \end{cases}$
when we toss a coin infinitely many times. The random process $\{X_n\}$ is a discrete-time discrete-amplitude random process. The success (head) and failure (tail) probabilities are
$P\{X_n = 1\} = p$
and
$P\{X_n = 0\} = 1 - p$,
respectively.
We can easily obtain
$m_X(n) = E\{X_n\} = p$
and
$K_X(n_1, n_2) = E\{X_{n_1} X_{n_2}\} - m_X(n_1) m_X(n_2) = \begin{cases} p(1 - p), & n_1 = n_2, \\ 0, & n_1 \ne n_2. \end{cases}$
The mean function $m_X(n)$ of a Bernoulli process is not a function of time but a constant. The autocovariance $K_X(n_1, n_2)$ depends not on $n_1$ and $n_2$ individually, but only on the difference $n_1 - n_2$.
K Clearly, the Bernoulli process is w.s.s..
O Two random processes independent of each other
The random processes $\{X(t)\}$ and $\{Y(t)\}$ are said to be independent of each other if the random vector $(X_{t_1}, X_{t_2}, \ldots, X_{t_k})$ is independent of the random vector $(Y_{s_1}, Y_{s_2}, \ldots, Y_{s_l})$ for all $k$, $l$, and $t_1, t_2, \ldots, t_k$, $s_1, s_2, \ldots, s_l$.
O Normal process, Gaussian process
A random process $\{X_t\}$ is said to be normal if $(X_{t_1}, X_{t_2}, \ldots, X_{t_k})$ is a $k$-dimensional normal random vector for all $k$ and $t_1, t_2, \ldots, t_k$.
K A stationary process is always w.s.s., but the converse is not always true. On the other hand, a w.s.s. normal process is s.s.s.. This result can be obtained from the pdf
$f_{\mathbf{X}}(\mathbf{x}) = \dfrac{1}{(2\pi)^{n/2} |K_X|^{1/2}} \exp\left\{ -\dfrac{1}{2} (\mathbf{x} - \mathbf{m}_X)^T K_X^{-1} (\mathbf{x} - \mathbf{m}_X) \right\}$
of a jointly normal random vector, which depends only on the mean vector and covariance matrix.
O Jointly w.s.s. processes
Two random processes are said to be jointly w.s.s. if (1) the mean functions are constants and (2) the autocorrelation functions and crosscorrelation function are all functions only of time differences.
K If two random processes $\{X(t)\}$ and $\{Y(t)\}$ are jointly w.s.s., then $\{X(t)\}$ and $\{Y(t)\}$ are both w.s.s.. The crosscorrelation function of $\{X(t)\}$ and $\{Y(t)\}$ is
$R_{XY}(t + \tau, t) = R_{XY}(\tau) = E\{X(t + \tau) Y^*(t)\} = \left[ E\{Y(t) X^*(t + \tau)\} \right]^* = R_{YX}^*(-\tau)$.
K The crosscorrelation function $R_{XY}$ of two jointly w.s.s. random processes has the following properties:
1. $R_{YX}(\tau) = R_{XY}^*(-\tau)$.
2. $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0) R_{YY}(0)} \le \frac{1}{2}\{R_{XX}(0) + R_{YY}(0)\}$.
3. $R_{XY}(\tau)$ is not always maximum at the origin.
O Linear transformation and jointly w.s.s. processes
Two processes $\{X(t)\}$ and $\{Y(t)\}$ are jointly w.s.s. if the linear combination $Z(t) = aX(t) + bY(t)$ is w.s.s. for all $a$ and $b$. The converse is also true.
O Moving average (MA) process
Let $a_1, a_2, \ldots, a_l$ be a sequence of real numbers and $W_0, W_1, W_2, \ldots$ be a sequence of uncorrelated random variables with mean $E\{W_n\} = m$ and variance $\mathrm{Var}\{W_n\} = \sigma^2$. Then the following process $\{X_n\}$ is called a moving average process:
$X_n = a_1 W_n + a_2 W_{n-1} + \cdots + a_l W_{n-l+1} = \sum_{i=1}^{l} a_i W_{n-i+1}$.
The mean and variance of $X_n$ are
$E\{X_n\} = (a_1 + a_2 + \cdots + a_l)\, m$,
$\mathrm{Var}\{X_n\} = (a_1^2 + a_2^2 + \cdots + a_l^2)\, \sigma^2$.
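A short simulation (not from the slides; the coefficients and moments below are arbitrary) that checks the MA mean and variance formulas:

```python
import numpy as np

rng = np.random.default_rng(1)
a = np.array([0.5, 1.0, -0.3])            # illustrative coefficients, l = 3
m_w, sigma = 2.0, 1.5
W = rng.normal(m_w, sigma, size=500_000)  # uncorrelated driving sequence

# X_n = sum_i a_i W_{n-i+1}; np.convolve reverses a, which leaves the
# mean and variance unchanged since they only involve sum(a) and sum(a^2).
X = np.convolve(W, a, mode='valid')

print(X.mean(), a.sum() * m_w)            # (a1 + a2 + a3) m
print(X.var(), (a**2).sum() * sigma**2)   # (a1^2 + a2^2 + a3^2) sigma^2
```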
Since $E\{\tilde{X}_n^2\} = \sigma^2$ when $\tilde{X}_n = W_n - m$, $\{X_n\}$ is w.s.s. from
$\mathrm{Cov}(X_n, X_{n+k}) = E\left\{ \left( X_n - m \sum_{i=1}^{l} a_i \right)\left( X_{n+k} - m \sum_{i=1}^{l} a_i \right) \right\}$
$= E\left\{ \left( \sum_{i=1}^{l} a_i \tilde{X}_{n-i+1} \right)\left( \sum_{j=1}^{l} a_j \tilde{X}_{n+k-j+1} \right) \right\}$
$= \begin{cases} E\left\{ a_l a_{l-k} \tilde{X}_{n+k-l+1}^2 + a_{l-1} a_{l-k-1} \tilde{X}_{n+k-l+2}^2 + \cdots + a_{k+1} a_1 \tilde{X}_n^2 \right\}, & k \le l - 1, \\ 0, & k \ge l, \end{cases}$
$= \begin{cases} (a_l a_{l-k} + \cdots + a_{k+1} a_1)\, \sigma^2, & k \le l - 1, \\ 0, & k \ge l. \end{cases}$
O Autoregressive (AR) process
Let the variance of an uncorrelated zero-mean random sequence $Z_0, Z_1, \ldots$ be
$\mathrm{Var}\{Z_n\} = \begin{cases} \dfrac{\sigma^2}{1 - \rho^2}, & n = 0, \\ \sigma^2, & n \ge 1, \end{cases}$
where $\rho^2 < 1$. Then the random process $\{X_n\}$ defined by
$X_0 = Z_0, \qquad X_n = \rho X_{n-1} + Z_n, \quad n \ge 1,$
is called the first-order autoregressive process. We can obtain
$X_n = \rho(\rho X_{n-2} + Z_{n-1}) + Z_n = \rho^2 X_{n-2} + \rho Z_{n-1} + Z_n = \cdots = \sum_{i=0}^{n} \rho^{n-i} Z_i$.
Thus the autocovariance function of $\{X_n\}$ is
$\mathrm{Cov}(X_n, X_{n+m}) = \mathrm{Cov}\left( \sum_{i=0}^{n} \rho^{n-i} Z_i,\ \sum_{i=0}^{n+m} \rho^{n+m-i} Z_i \right)$
$= \sum_{i=0}^{n} \rho^{n-i} \rho^{n+m-i}\, \mathrm{Cov}(Z_i, Z_i)$
$= \sigma^2 \rho^{2n+m} \left( \dfrac{1}{1 - \rho^2} + \sum_{i=1}^{n} \rho^{-2i} \right)$
$= \dfrac{\sigma^2 \rho^{m}}{1 - \rho^2}$.
Now, from the result above and the fact that the mean of $\{X_n\}$ is $E\{X_n\} = 0$, it follows that $\{X_n, n \ge 0\}$ is w.s.s..
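A simulation sketch (parameters chosen arbitrarily, not from the slides) confirming that the AR(1) autocovariance is $\sigma^2\rho^m/(1-\rho^2)$ regardless of $n$:

```python
import numpy as np

rng = np.random.default_rng(2)
rho, sigma, n_len, trials = 0.8, 1.0, 200, 50_000

Z = rng.normal(0.0, sigma, size=(trials, n_len))
Z[:, 0] = rng.normal(0.0, sigma / np.sqrt(1 - rho**2), size=trials)  # Var{Z_0}

X = np.empty_like(Z)
X[:, 0] = Z[:, 0]
for k in range(1, n_len):              # X_n = rho X_{n-1} + Z_n
    X[:, k] = rho * X[:, k - 1] + Z[:, k]

m_lag = 5
print(np.mean(X[:, 100] * X[:, 100 + m_lag]))    # empirical Cov (mean is 0)
print(sigma**2 * rho**m_lag / (1 - rho**2))      # theoretical value
```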
O Square-law detector
Let $Y(t) = X^2(t)$, where $\{X(t)\}$ is a Gaussian random process with mean $0$ and autocorrelation $R_X(\tau)$. Then the expectation of $Y(t)$ is $E\{Y(t)\} = E\{X^2(t)\} = R_X(0)$. Since $X(t + \tau)$ and $X(t)$ are jointly Gaussian with mean $0$, the autocorrelation of $Y(t)$ can be found, using the Gaussian moment factoring (Isserlis) theorem, as
$R_Y(t, t + \tau) = E\{X^2(t) X^2(t + \tau)\}$
$= E\{X^2(t + \tau)\} E\{X^2(t)\} + 2E^2\{X(t + \tau) X(t)\}$
$= R_X^2(0) + 2R_X^2(\tau)$.
Thus $E\{Y^2(t)\} = R_Y(0) = 3R_X^2(0)$ and $\sigma_Y^2 = 3R_X^2(0) - R_X^2(0) = 2R_X^2(0)$. Clearly, $\{Y(t)\}$ is w.s.s..
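A Monte Carlo check (illustrative values of $R_X(0)$ and $R_X(\tau)$, not from the slides) of the Gaussian moment-factoring result used above:

```python
import numpy as np

rng = np.random.default_rng(3)
trials, r0, r_tau = 500_000, 2.0, 1.2   # R_X(0) and R_X(tau) at one fixed lag

# (X(t), X(t+tau)) is zero-mean jointly Gaussian with this covariance matrix.
cov = [[r0, r_tau], [r_tau, r0]]
XY = rng.multivariate_normal([0.0, 0.0], cov, size=trials)
Y1, Y2 = XY[:, 0]**2, XY[:, 1]**2

print(np.mean(Y1 * Y2), r0**2 + 2 * r_tau**2)  # R_Y(tau) = R_X^2(0) + 2 R_X^2(tau)
print(np.mean(Y1**2), 3 * r0**2)               # E{Y^2} = 3 R_X^2(0)
```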
O Limiter
Let $\{Y(t)\} = \{g(X(t))\}$ be a random process defined by a random process $\{X(t)\}$ and a limiter
$g(x) = \begin{cases} 1, & x > 0, \\ -1, & x < 0. \end{cases}$
Then we can easily obtain $P\{Y(t) = 1\} = P\{X(t) > 0\} = 1 - F_X(0)$ and $P\{Y(t) = -1\} = P\{X(t) < 0\} = F_X(0)$. Thus the mean and autocorrelation of $\{Y(t)\}$ are
$E\{Y(t)\} = 1 \cdot P\{Y(t) = 1\} + (-1) \cdot P\{Y(t) = -1\} = 1 - 2F_X(0)$,
$R_Y(\tau) = E\{Y(t) Y(t + \tau)\}$
$= P\{Y(t) Y(t + \tau) = 1\} - P\{Y(t) Y(t + \tau) = -1\}$
$= P\{X(t) X(t + \tau) > 0\} - P\{X(t) X(t + \tau) < 0\}$.
Now, if $\{X(t)\}$ is a stationary Gaussian random process with mean $0$, then $X(t + \tau)$ and $X(t)$ are jointly Gaussian with mean $0$, variance $R_X(0)$, and correlation coefficient $r = R_X(\tau)/R_X(0)$. Clearly, $F_X(0) = 1/2$.
We have (refer to (3.100), p. 156, Random Processes, Park, Song, Nam, 2004)
$P\{X(t) X(t + \tau) < 0\} = P\left\{ \dfrac{X(t)}{X(t + \tau)} < 0 \right\} = F_Z(0)$
$= \dfrac{1}{2} + \dfrac{1}{\pi} \tan^{-1} \dfrac{-r}{\sqrt{1 - r^2}}$
$= \dfrac{1}{2} - \dfrac{1}{\pi} \sin^{-1} r = \dfrac{1}{2} - \dfrac{1}{\pi} \sin^{-1} \dfrac{R_X(\tau)}{R_X(0)}$,
$P\{X(t) X(t + \tau) > 0\} = 1 - P\{X(t) X(t + \tau) < 0\} = \dfrac{1}{2} + \dfrac{1}{\pi} \sin^{-1} \dfrac{R_X(\tau)}{R_X(0)}$.
Thus the autocorrelation of the limiter output is
$R_Y(\tau) = \dfrac{2}{\pi} \sin^{-1} \dfrac{R_X(\tau)}{R_X(0)}$,
from which we have $E\{Y^2(t)\} = R_Y(0) = 1$ and $\sigma_Y^2 = 1 - \{1 - 2F_X(0)\}^2 = 1$.
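The arcsine law above is easy to verify numerically; the sketch below (with an arbitrary correlation value) hard-limits jointly Gaussian pairs:

```python
import numpy as np

rng = np.random.default_rng(4)
trials, r0, r_tau = 1_000_000, 1.0, 0.6

cov = [[r0, r_tau], [r_tau, r0]]
X = rng.multivariate_normal([0.0, 0.0], cov, size=trials)
Y = np.sign(X)                              # limiter output at t and t + tau

print(np.mean(Y[:, 0] * Y[:, 1]))           # empirical R_Y(tau)
print((2 / np.pi) * np.arcsin(r_tau / r0))  # (2/pi) arcsin(R_X(tau)/R_X(0))
```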
Power of a random process
O Power spectrum
The Fourier transform of the autocorrelation function of a w.s.s. random process is called the power spectrum, power spectral density, or spectral density. It is usually assumed that the mean is zero.
K When the autocorrelation is $R_X(\cdot)$, the power spectrum is
$S_X(\omega) = \mathcal{F}\{R_X\} = \begin{cases} \sum_{k} R_X(k) e^{-j\omega k}, & \text{discrete time}, \\ \int_{-\infty}^{\infty} R_X(\tau) e^{-j\omega\tau}\, d\tau, & \text{continuous time}. \end{cases}$
K If the mean is not zero, the power spectral density is defined by the Fourier transform of the autocovariance instead of the autocorrelation.
O White noise, white process
Suppose that a discrete time random process $\{X_n\}$ is uncorrelated so that $R_X(k) = \sigma^2 \delta_k$. Then we have
$S_X(\omega) = \sum_k \sigma^2 \delta_k e^{-j\omega k} = \sigma^2$.
Such a process is called a white noise or white process. If $\{X_n\}$ is Gaussian in addition, it is called a white Gaussian noise.
O Telegraph signal
Consider Poisson points with parameter $\lambda$, and let $N(t)$ be the number of points in the interval $(0, t]$. Consider the continuous-time random process $X(t) = (-1)^{N(t)}$ with $X(0) = 1$, where $W_i$ denotes the time between adjacent Poisson points. Assuming $\tau > 0$, the autocorrelation of $X(t)$ is
$R_X(\tau) = E\{X(t + \tau) X(t)\}$
$= 1 \cdot P\{\text{the number of points in } (t, t + \tau] \text{ is even}\} + (-1) \cdot P\{\text{the number of points in } (t, t + \tau] \text{ is odd}\}$
$= e^{-\lambda\tau} \left\{ 1 + \dfrac{(\lambda\tau)^2}{2!} + \cdots \right\} - e^{-\lambda\tau} \left\{ \lambda\tau + \dfrac{(\lambda\tau)^3}{3!} + \cdots \right\}$
$= e^{-\lambda\tau} \cosh \lambda\tau - e^{-\lambda\tau} \sinh \lambda\tau = e^{-2\lambda\tau}$.
We can obtain a similar result when $\tau < 0$. Combining the two results, we have $R_X(\tau) = e^{-2\lambda|\tau|}$. Consider a random variable $A$ which is independent of $X(t)$ and takes on $1$ or $-1$ with equal probability. Let $Y(t) = AX(t)$. We then have
$E\{Y(t)\} = E\{A\} E\{X(t)\} = 0$,
$E\{Y(t_1) Y(t_2)\} = E\{A^2\} E\{X(t_1) X(t_2)\} = E\{A^2\} R_X(t_1 - t_2) = e^{-2\lambda|t_1 - t_2|}$
since $E\{A\} = 0$ and $E\{A^2\} = 1$. Thus $\{Y(t)\}$ is w.s.s., and the power spectral density of $\{Y(t)\}$ is
$S_Y(\omega) = \mathcal{F}\{e^{-2\lambda|\tau|}\} = \dfrac{4\lambda}{\omega^2 + 4\lambda^2}$.
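A simulation sketch of the telegraph autocorrelation (assumed parameter values): since the sign flips at every Poisson point, $X(t+\tau)X(t) = (-1)^{N(t+\tau)-N(t)}$ with a Poisson($\lambda\tau$) increment:

```python
import numpy as np

rng = np.random.default_rng(5)
lam, tau, trials = 1.5, 0.7, 400_000

increments = rng.poisson(lam * tau, size=trials)  # points in (t, t + tau]
print(np.mean((-1.0) ** increments))              # empirical R_X(tau)
print(np.exp(-2 * lam * tau))                     # e^{-2*lambda*tau}
```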
O Band limited noise, colored noise
When $W > 0$, let us consider a random process with the power spectral density
$S_X(\omega) = \begin{cases} 1, & \omega \in [-W, W], \\ 0, & \text{otherwise}. \end{cases}$
Such a process is called a colored noise. The autocorrelation of this colored noise is thus
$R_X(\tau) = \mathcal{F}^{-1}\{S_X(\omega)\} = \dfrac{\sin(W\tau)}{\pi\tau}$.
O Power spectral density is not less than $0$. That is, $S_X(\omega) \ge 0$.
O Cross power spectral density
The cross power spectral density $S_{XY}(\omega)$ of jointly w.s.s. processes $\{X(t)\}$ and $\{Y(t)\}$ is
$S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau) e^{-j\omega\tau}\, d\tau$.
K Thus $S_{XY}(\omega) = S_{YX}^*(\omega)$, and the inverse Fourier transform of $S_{XY}(\omega)$ is
$R_{XY}(\tau) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega) e^{j\omega\tau}\, d\omega$.
O Time delay process
Consider a w.s.s. process $\{X(t)\}$ whose power spectral density is $S_X(\omega)$. Letting $Y(t) = X(t - d)$, we have
$R_Y(t, s) = E\{Y(t) Y^*(s)\} = E\{X(t - d) X^*(s - d)\} = R_X(t - s)$.
Thus the process $\{Y(t)\}$ is w.s.s. and the power spectral density $S_Y(\omega)$ equals $S_X(\omega)$. In addition, the crosscorrelation and cross power spectral density of $\{X(t)\}$ and $\{Y(t)\}$ are
$R_{XY}(t, s) = E\{X(t) Y^*(s)\} = E\{X(t) X^*(s - d)\} = R_X(t - s + d) = R_X(\tau + d)$
and
$S_{XY}(\omega) = \mathcal{F}\{R_X(\tau + d)\} = \int_{-\infty}^{\infty} R_X(\tau + d) e^{-j\omega\tau}\, d\tau = \int_{-\infty}^{\infty} R_X(u) e^{-j\omega u} e^{j\omega d}\, du = S_X(\omega) e^{j\omega d}$.
That is, $\{X(t)\}$ and $\{Y(t)\}$ are jointly w.s.s..
Random process and linear systems
O If the input random process is two-sided and w.s.s., the output of a linear time invariant (LTI) filter is also w.s.s..
O If the input random process is one-sided and w.s.s., however, the output of an LTI filter is not w.s.s. in general.
O Let $h(t)$ be the impulse response of an LTI system and let $H(\omega) = \mathcal{F}\{h(t)\}$ be the transfer function. Then the crosscorrelation $R_{XY}(t_1, t_2)$ of the input random process $\{X(t)\}$ and output random process $\{Y(t)\}$, and the autocorrelation $R_Y(t_1, t_2)$ of the output, are
$R_{XY}(t_1, t_2) = E\{X(t_1) Y^*(t_2)\} = E\left\{ X(t_1) \int X^*(t_2 - \beta)\, h^*(\beta)\, d\beta \right\}$
$= \int E\{X(t_1) X^*(t_2 - \beta)\}\, h^*(\beta)\, d\beta = \int R_X(t_1, t_2 - \beta)\, h^*(\beta)\, d\beta$
and
$R_Y(t_1, t_2) = E\{Y(t_1) Y^*(t_2)\} = E\left\{ \int X(t_1 - \alpha)\, h(\alpha)\, d\alpha \int X^*(t_2 - \beta)\, h^*(\beta)\, d\beta \right\}$
$= \iint R_X(t_1 - \alpha, t_2 - \beta)\, h(\alpha) h^*(\beta)\, d\alpha\, d\beta = \int R_{XY}(t_1 - \alpha, t_2)\, h(\alpha)\, d\alpha$,
respectively.
K If the input and output are jointly w.s.s., we can obtain
$R_{XY}(\tau) = \int R_X(\tau + \beta)\, h^*(\beta)\, d\beta = R_X(\tau) * h^*(-\tau)$,
$R_Y(\tau) = R_{XY}(\tau) * h(\tau)$,
since $R_X(t_1, t_2 - \beta) = R_X(t_1 - t_2 + \beta) = R_X(\tau + \beta)$ and $R_{XY}(t_1 - \alpha, t_2) = R_{XY}(t_1 - t_2 - \alpha) = R_{XY}(\tau - \alpha)$.
K The cross power spectral density and power spectral density of the output are
$S_{XY}(\omega) = S_X(\omega) H^*(\omega)$,
$S_Y(\omega) = S_{XY}(\omega) H(\omega)$,
respectively.
K We can express the autocorrelation and power spectral density of the output in terms of those of the input. Specifically, we have
$R_Y(\tau) = R_X(\tau) * \rho_h(\tau)$
and
$S_Y(\omega) = S_X(\omega) |H(\omega)|^2$,
where $\rho_h(t)$ is called the deterministic autocorrelation of $h(t)$ and is defined by
$\rho_h(t) = \mathcal{F}^{-1}(|H(\omega)|^2) = h(t) * h^*(-t) = \int h(t + \beta)\, h^*(\beta)\, d\beta$.
K Let $S_Y(\omega)$ be the power spectral density of the output process $\{Y_t\}$. Then we can obtain
$R_Y(\tau) = \mathcal{F}^{-1}\{S_Y(\omega)\} = \mathcal{F}^{-1}\{|H(\omega)|^2 S_X(\omega)\}$.
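A rough numerical illustration (not from the slides) of $S_Y(\omega) = |H(\omega)|^2 S_X(\omega)$: feed discrete-time white noise through an arbitrary short FIR filter and average periodograms over independent blocks:

```python
import numpy as np

rng = np.random.default_rng(6)
blocks, n, sigma = 200, 1024, 1.0
h = np.array([1.0, 0.5, 0.25])            # illustrative FIR impulse response
H = np.fft.rfft(h, n)

S_Y = np.zeros(n // 2 + 1)
for _ in range(blocks):
    X = rng.normal(0.0, sigma, size=n)    # white noise, S_X(omega) = sigma^2
    Y = np.convolve(X, h)[:n]
    S_Y += np.abs(np.fft.rfft(Y))**2 / n  # periodogram of the output
S_Y /= blocks                             # averaging reduces periodogram noise

for k in (50, 200, 400):                  # compare a few frequency bins
    print(S_Y[k], sigma**2 * np.abs(H[k])**2)
```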
O Coherence function
A measure of the degree to which two w.s.s. processes are related by an LTI transformation is the coherence function $\gamma(\omega)$ defined by
$\gamma_{XY}(\omega) = \dfrac{S_{XY}(\omega)}{[S_X(\omega) S_Y(\omega)]^{1/2}}$.
Here, $|\gamma(\omega)| = 1$ if and only if $\{X(t)\}$ and $\{Y(t)\}$ are linearly related, that is, $Y(t) = X(t) * h(t)$. Note the similarity between the coherence function $\gamma(\omega)$ and the correlation coefficient $\rho$, a measure of the degree to which two random variables are linearly related. The coherence function exhibits the property
$|\gamma(\omega)| \le 1$,
similar to the correlation coefficient between two random variables.
Ergodic theorem*
O If there exists a random variable $\hat{X}$ such that
$\dfrac{1}{n} \sum_{i=0}^{n-1} X_i \xrightarrow[n \to \infty]{} \hat{X}$ (discrete time random process),
$\dfrac{1}{T} \int_0^T X(t)\, dt \xrightarrow[T \to \infty]{} \hat{X}$ (continuous time random process),
then $\{X_n, n \in I\}$ is said to satisfy an ergodic theorem.
K When a process satisfies an ergodic theorem, the sample mean converges to something, which may be different from the expectation.
K In some cases, a random process with time-varying mean satisfies an ergodic theorem as shown in the example below.
K Suppose that nature at the beginning of time randomly selects one of two coins with equal probability, one having bias $p$ and the other having bias $q$. After the coin is selected it is flipped once per second forever. The output random process is a one-zero sequence depending on the face of the coin. Clearly, the time average will converge: it will converge to $p$ if the first coin was selected and to $q$ if the second coin was selected. That is, the time average will converge to a random variable. In particular, it will not converge to the expected value $p/2 + q/2$.
O If $\lim_{n \to \infty} E\{(Y_n - Y)^2\} = 0$, then $Y_n$, $n = 1, 2, \ldots$, is said to converge to $Y$ in mean square, which is denoted as
$\operatorname*{l.i.m.}_{n \to \infty} Y_n = Y$,
where l.i.m. denotes limit in the mean.
O Mean ergodic theorem
Let $\{X_n\}$ be an uncorrelated discrete time random process with finite mean $E\{X_n\} = m$ and finite variance $\sigma_{X_n}^2 = \sigma_X^2$. Then the sample mean $S_n = \frac{1}{n}\sum_{i=0}^{n-1} X_i$ converges to the expected value $E\{X_n\} = m$ in mean square. That is,
$\operatorname*{l.i.m.}_{n \to \infty} \dfrac{1}{n} \sum_{i=0}^{n-1} X_i = m$.
O A sufficient condition for a w.s.s. discrete time random process $\{X_n, n \in I\}$ to satisfy a mean ergodic theorem is $K_X(0) < \infty$ and $\lim_{n \to \infty} K_X(n) = 0$.
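A small demonstration of the mean ergodic theorem (i.i.d. samples used as a special case of an uncorrelated process; parameters arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
m, sigma = 1.0, 2.0

for n in (10, 100, 1000):
    X = rng.normal(m, sigma, size=(5000, n))
    S_n = X.mean(axis=1)                           # sample mean over n terms
    print(n, np.mean((S_n - m)**2), sigma**2 / n)  # E{(S_n - m)^2} = sigma^2/n
```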
O Let $\{X_n\}$ be a discrete time random process with mean $E\{X_n\}$ and autocovariance function $K_X(i, j)$. The process need not be stationary in any sense. A necessary and sufficient condition for
$\operatorname*{l.i.m.}_{n \to \infty} \dfrac{1}{n} \sum_{i=0}^{n-1} X_i = m$
is
$\lim_{n \to \infty} \dfrac{1}{n} \sum_{i=0}^{n-1} E\{X_i\} = m$
and
$\lim_{n \to \infty} \dfrac{1}{n^2} \sum_{i=0}^{n-1} \sum_{k=0}^{n-1} K_X(i, k) = 0$.
That is, the sample mean converges in mean square if and only if the process is asymptotically uncorrelated and its sample averages converge.
O Mean square ergodic theorem
Let $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$, where $\{X_n, n \ge 1\}$ is a second order stationary process with mean $m$ and autocovariance $K(i) = \mathrm{Cov}(X_n, X_{n+i})$. Then $\lim_{n \to \infty} E\{(\bar{X}_n - m)^2\} = 0$ if and only if $\lim_{n \to \infty} \frac{1}{n}\sum_{i=0}^{n-1} K(i) = 0$.
K Let $K(i)$ be the autocovariance of $\{X_n\}$, a second order stationary Gaussian process with mean $0$. If
$\lim_{T \to \infty} \dfrac{1}{T} \sum_{i=1}^{T} K^2(i) = 0$,
then
$\lim_{T \to \infty} E\{|\hat{K}_T(i) - K(i)|^2\} = 0$
for $i = 1, 2, \ldots$, where $\hat{K}_T(i) = \frac{1}{T}\sum_{l=1}^{T} X_l X_{l+i}$ is the sample autocovariance.
Ergodicity*
O Shift operator
An operator $T$ for which $T\omega = \{x_{t+1}, t \in I\}$, where $\omega = \{x_t, t \in I\}$ is an infinite sequence in the probability space $(\mathbb{R}^I, \mathcal{B}(\mathbb{R})^I, P)$, is called a shift operator.
O Stationary process
A discrete time random process with process distribution $P$ is stationary if $P(T^{-1}F) = P(F)$ for any element $F$ in $\mathcal{B}(\mathbb{R})^I$.
O Invariant event
An event $F$ is said to be invariant with respect to the shift operator $T$ if and only if $T^{-1}F = F$.
O Ergodicity, ergodic process
A random process is said to be ergodic if $P(F) = 0$ or $P(F) = 1$ for any invariant event $F$.
K Consider a two-sided process with distribution
$P(\ldots, x_{-1} = 1, x_0 = 0, x_1 = 1, x_2 = 0, \ldots) = p$,
$P(\ldots, x_{-1} = 0, x_0 = 1, x_1 = 0, x_2 = 1, \ldots) = 1 - p$.
Clearly, $F = \{\text{sequences of alternating 0 and 1}\}$ is an invariant event and has probability $P(F) = 1$. Any other invariant event that does not include $F$ (for example, the all-1 sequence) has probability $0$. Thus the random process is ergodic.
K Ergodicity has nothing to do with stationarity or convergence of sample averages.
4.3 Process with i.s.i.
Process with i.s.i.
Process with independent increments
A random process $\{Y_t, t \in I\}$ is said to have independent increments or to be an independent increment process if, for all choices of $k = 1, 2, \ldots$ and all choices of ordered sample times $\{t_0, t_1, \ldots, t_k\}$, the $k$ increments $Y_{t_i} - Y_{t_{i-1}}$, $i = 1, 2, \ldots, k$, are independent random variables.
Process with stationary increments
When the increments $\{Y_t - Y_s\}$ are stationary, the random process $\{Y_t\}$ is called a stationary increment random process.
Process with i.s.i.
A random process is called an independent and stationary increment (i.s.i.) process if its increments are independent and stationary.
A discrete time random process is an i.s.i. process if and only if it can be represented as the sum of i.i.d. random variables.
Mean and autocovariance of i.s.i. process
The mean and autocovariance of a discrete time i.s.i. process are
$E\{Y_t\} = t\, E\{Y_1\}, \quad t \ge 0$,
and
$K_Y(t, s) = \sigma_{Y_1}^2 \min(t, s), \quad t, s \ge 0$.
Clearly, an i.s.i. process itself is not stationary.
Let the process $\{X_t, t \in T\}$ be an i.s.i. process. If
$m_0 = E\{X_0\}, \qquad m_1 = E\{X_1\} - m_0$,
$\sigma_0^2 = E\{(X_0 - m_0)^2\}, \qquad \sigma_1^2 = E\{(X_1 - m_1)^2\} - \sigma_0^2$,
we have
$E\{X_t\} = m_0 + m_1 t$,
$\mathrm{Var}\{X_t\} = \sigma_0^2 + \sigma_1^2 t$.
Point process and counting process
A sequence $T_1 \le T_2 \le T_3 \le \cdots$ of ordered random variables is called a point process. For example, the set of times defined by Poisson points is a point process. A counting process $Y(t)$ can be defined as the number of points in the interval $[0, t)$. We have, with $T_0 = 0$,
$Y(t) = i, \quad T_i \le t < T_{i+1}, \quad i = 0, 1, \ldots$.
(Figure) A counting process.
A process constructed by summing the outputs of a Bernoulli process
Let $\{X_n, n = 1, 2, \ldots\}$ be a Bernoulli process with parameter $p$. Define the random process $\{Y_n, n = 1, 2, \ldots\}$ by
$Y_0 = 0$,
$Y_n = \sum_{i=1}^{n} X_i = Y_{n-1} + X_n, \quad n = 1, 2, \ldots$.
Since the random variable $Y_n$ represents the number of 1s in $\{X_1, X_2, \ldots, X_n\}$, we have
$Y_n = Y_{n-1}$ or $Y_n = Y_{n-1} + 1, \quad n = 2, 3, \ldots$.
In general, a discrete time process satisfying this relation is called a counting process since it is nondecreasing, and when it jumps, it is always with an increment of 1.
Properties of the random process $\{Y_n\}$
$E\{Y_n\} = np, \qquad \mathrm{Var}\{Y_n\} = np(1 - p), \qquad K_Y(k, j) = p(1 - p)\min(k, j)$.
Marginal pmf of $Y_n$
$p_{Y_n}(y) = \Pr\{\text{there are exactly } y \text{ ones in } X_1, X_2, \ldots, X_n\} = \dbinom{n}{y} p^y (1 - p)^{n-y}, \quad y = 0, 1, \ldots, n$.
Since the marginal pmf is binomial, the process $\{Y_n\}$ is called a binomial counting process.
The process $\{Y_n\}$ is not stationary since the marginal pmf depends on the time index $n$.
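A simulation sketch (assumed $p$ and sample times) checking the binomial counting moments:

```python
import numpy as np

rng = np.random.default_rng(8)
p, n, trials = 0.3, 50, 100_000
X = (rng.random((trials, n)) < p).astype(float)   # Bernoulli(p) process
Y = np.cumsum(X, axis=1)                          # binomial counting process

k, j = 10, 30
print(Y[:, n - 1].mean(), n * p)                  # E{Y_n} = np
print(Y[:, n - 1].var(), n * p * (1 - p))         # Var{Y_n} = np(1-p)
print(np.cov(Y[:, k - 1], Y[:, j - 1])[0, 1], p * (1 - p) * min(k, j))
```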
Random walk process
One dimensional random walk, random walk
Consider the modified Bernoulli process for which the event failure is represented by $-1$ instead of $0$:
$Z_n = \begin{cases} +1, & \text{for success in the } n\text{th trial}, \\ -1, & \text{for failure in the } n\text{th trial}. \end{cases}$
Let $p = \Pr\{Z_n = 1\}$, and consider the sum
$W_n = \sum_{i=1}^{n} Z_i$
of the variables $Z_n$. The process $\{W_n\}$ is referred to as a one dimensional random walk, or simply a random walk.
Since
$Z_i = 2X_i - 1$,
it follows that the random walk process $\{W_n\}$ is related to the binomial counting process $\{Y_n\}$ by
$W_n = 2Y_n - n$.
Using the results on the mean and autocorrelation functions of the binomial counting process and the linearity of expectation, we have
$m_W(n) = (2p - 1)n$,
$K_W(n_1, n_2) = 4p(1 - p)\min(n_1, n_2)$,
$\sigma_{W_n}^2 = 4p(1 - p)n$.
4.4 Discrete time process with i.s.i.
Discrete time process with i.s.i.
Discrete time discrete alphabet process $\{Y_n\}$ with i.s.i.
As mentioned before, $\{Y_n\}$ can be defined by the sum of i.i.d. random variables $\{X_i\}$. Let us consider the following conditional pmf:
$p_{Y_n|\mathbf{Y}^{n-1}}(y_n|\mathbf{y}^{n-1}) = p_{Y_n|Y_{n-1},\ldots,Y_1}(y_n|y_{n-1}, \ldots, y_1) = \Pr(Y_n = y_n \mid \mathbf{Y}^{n-1} = \mathbf{y}^{n-1})$,
where $\mathbf{Y}^{n} = (Y_n, Y_{n-1}, \ldots, Y_1)$ and $\mathbf{y}^{n} = (y_n, y_{n-1}, \ldots, y_1)$.
The conditioning event $\{Y_i = y_i, i = 1, 2, \ldots, n-1\}$ above is the same as the event $\{X_1 = y_1, X_i = y_i - y_{i-1}, i = 2, \ldots, n-1\}$. In addition, under the conditioning event, we have $Y_n = y_n$ if and only if $X_n = y_n - y_{n-1}$.
Assuming $y_0 = 0$,
$p_{Y_n|\mathbf{Y}^{n-1}}(y_n|\mathbf{y}^{n-1}) = \Pr(Y_n = y_n \mid X_1 = y_1, X_i = y_i - y_{i-1}, i = 2, 3, \ldots, n-1)$
$= \Pr(X_n = y_n - y_{n-1} \mid X_i = y_i - y_{i-1}, i = 1, 2, \ldots, n-1)$
$= p_{X_n|\mathbf{X}^{n-1}}(y_n - y_{n-1} \mid y_{n-1} - y_{n-2}, \ldots, y_2 - y_1, y_1)$,
where $\mathbf{X}^{n-1} = (X_{n-1}, X_{n-2}, \ldots, X_1)$.
If the $\{X_n\}$ are i.i.d.,
$p_{Y_n|\mathbf{Y}^{n-1}}(y_n|\mathbf{y}^{n-1}) = p_X(y_n - y_{n-1})$
since $X_n$ is independent of $X_k$ for $k < n$.
Thus the joint pmf is
$p_{\mathbf{Y}^n}(\mathbf{y}^n) = p_{Y_n|\mathbf{Y}^{n-1}}(y_n|\mathbf{y}^{n-1})\, p_{\mathbf{Y}^{n-1}}(\mathbf{y}^{n-1})$
$= p_{Y_1}(y_1) \prod_{i=2}^{n} p_{Y_i|Y_{i-1},\ldots,Y_1}(y_i|y_{i-1}, \ldots, y_1) = \prod_{i=1}^{n} p_X(y_i - y_{i-1})$.
Applying the result above to the binomial counting process, we obtain
$p_{\mathbf{Y}^n}(\mathbf{y}^n) = \prod_{i=1}^{n} p^{(y_i - y_{i-1})} (1 - p)^{1 - (y_i - y_{i-1})}$,
where $y_i - y_{i-1} = 0$ or $1$ for $i = 1, 2, \ldots, n$ and $y_0 = 0$.
Properties of processes with i.s.i.
We can express the conditional pmf of $Y_n$ given $Y_{n-1}$ as follows:
$p_{Y_n|Y_{n-1}}(y_n|y_{n-1}) = \Pr(Y_n = y_n \mid Y_{n-1} = y_{n-1}) = \Pr(X_n = y_n - y_{n-1} \mid Y_{n-1} = y_{n-1})$.
The conditioning event $\{Y_{n-1} = y_{n-1}\}$ depends only on $X_k$ for $k < n$, and $X_n$ is independent of $X_k$ for $k < n$. Thus, this conditioning event does not affect $X_n$. Consequently,
$p_{Y_n|Y_{n-1}}(y_n|y_{n-1}) = p_X(y_n - y_{n-1})$.
Discrete time i.s.i. processes (such as the binomial counting process and the discrete random walk) have the following property:
$p_{Y_n|\mathbf{Y}^{n-1}}(y_n|\mathbf{y}^{n-1}) = p_{Y_n|Y_{n-1}}(y_n|y_{n-1})$,
that is,
$\Pr\{Y_n = y_n \mid \mathbf{Y}^{n-1} = \mathbf{y}^{n-1}\} = \Pr\{Y_n = y_n \mid Y_{n-1} = y_{n-1}\}$.
Roughly speaking, given the most recent past sample (or the current sample), the other past samples do not affect the probability of what happens next.
A discrete time discrete alphabet random process with this property is called a Markov process. Thus all i.s.i. processes are Markov processes.
Gambler's ruin problem
A person wants to buy a new car whose price is $N$ won. The person has $k$ ($0 < k < N$) won, and he intends to earn the difference from gambling. The game this person is going to play is that if a toss of a coin results in a head, he will earn 1 won, and if it results in a tail, he will lose 1 won. Let $p$ represent the probability of heads, and $q = 1 - p$. Assuming the man continues to play the game until he earns enough money for a new car or loses all the money he has, what is the probability that the man loses all the money he has?
Let $A_k$ be the event that the man loses all the money when he has started with $k$ won, and let $B$ be the event that the man wins a game. Then
$P(A_k) = P(A_k|B)P(B) + P(A_k|B^c)P(B^c)$.
Since the game will start again with $k + 1$ won if a toss of a coin results in a head and $k - 1$ won if a toss of a coin results in a tail, it is easy to see that
$P(A_k|B) = P(A_{k+1})$,
$P(A_k|B^c) = P(A_{k-1})$.
Let $p_k = P(A_k)$; then $p_0 = 1$, $p_N = 0$, and
$p_k = p\, p_{k+1} + q\, p_{k-1}, \quad 1 \le k \le N - 1$.
Assuming $p_k = \alpha^k$, we get from the equation above
$p\alpha^2 - \alpha + q = 0$,
which gives $\alpha_1 = 1$ and $\alpha_2 = q/p$.
If $p \ne 0.5$, $p_k = A_1 \alpha_1^k + A_2 \alpha_2^k$. Thus, using the boundary conditions $p_0 = 1$ and $p_N = 0$, we can find
$p_k = \dfrac{(q/p)^k - (q/p)^N}{1 - (q/p)^N}$.
If $p = 0.5$, we have $p_k = (A_1 + A_2 k)\alpha_1^k$ since $q/p = 1$. Thus, using the boundary conditions $p_0 = 1$ and $p_N = 0$, we can find
$p_k = 1 - \dfrac{k}{N}$.
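The ruin probability is easy to check by direct simulation (illustrative $p$, $N$, $k$; the loop below replays the game many times):

```python
import numpy as np

rng = np.random.default_rng(9)
p, N, k, trials = 0.45, 10, 4, 20_000
q = 1 - p

ruined = 0
for _ in range(trials):
    capital = k
    while 0 < capital < N:                  # play until ruin or the target N
        capital += 1 if rng.random() < p else -1
    ruined += (capital == 0)

print(ruined / trials)                                  # empirical P(A_k)
print(((q / p)**k - (q / p)**N) / (1 - (q / p)**N))     # formula for p != 0.5
```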
Discrete time Wiener process
Let $\{X_n\}$ be an i.i.d. zero-mean Gaussian process with variance $\sigma^2$. The discrete time Wiener process $\{Y_n\}$ is defined by
$Y_0 = 0$,
$Y_n = \sum_{i=1}^{n} X_i = Y_{n-1} + X_n, \quad n = 1, 2, \ldots$.
The discrete time Wiener process is also called the discrete time diffusion process or discrete time Brownian motion.
Since the discrete time Wiener process is formed as sums of an i.i.d. process, it has i.s.i.. Thus we have $E\{Y_n\} = 0$ and $K_Y(k, j) = \sigma^2 \min(k, j)$.
The Wiener process is a first-order autoregressive process.
The discrete time Wiener process is a Gaussian process with mean function $m(t) = 0$ and autocovariance function $K_X(t, s) = \sigma^2 \min(t, s)$.
Since the discrete time Wiener process is an i.s.i. process, we have
$f_{Y_n|Y_{n-1}}(y_n|y_{n-1}) = f_X(y_n - y_{n-1})$,
$f_{Y_n|\mathbf{Y}^{n-1}}(y_n|\mathbf{y}^{n-1}) = f_{Y_n|Y_{n-1}}(y_n|y_{n-1}) = f_X(y_n - y_{n-1})$.
As in the discrete alphabet case, a process with this property is called a Markov process.
Markov process
A discrete time random process $\{Y_n\}$ is called a first order Markov process if it satisfies
$\Pr\{Y_n \le y_n \mid y_{n-1}, y_{n-2}, \ldots\} = \Pr\{Y_n \le y_n \mid y_{n-1}\}$
for all $n$ and $y_n, y_{n-1}, y_{n-2}, \ldots$.
4.5 Continuous time i.s.i. process
Continuous time i.s.i. process
When we deal with a continuous time process with i.s.i., we need to consider a more general collection of sample times than in the discrete time case.
In the continuous time case, we assume that we are given the cdf of the increments as
$F_{Y_t - Y_s}(y) = F_{Y_{|t-s|} - Y_0}(y) = F_{Y_{|t-s|}}(y), \quad t > s$.
The joint probability functions of a continuous time process
Define the random variables $\{X_n\}$ by
$X_n = Y_{t_n} - Y_{t_{n-1}}$.
Then the $\{X_n\}$ are independent and
$Y_{t_n} = \sum_{i=1}^{n} X_i$,
$\Pr\{Y_{t_n} \le y_n \mid Y_{t_{n-1}} = y_{n-1}, Y_{t_{n-2}} = y_{n-2}, \ldots\} = F_{X_n}(y_n - y_{n-1}) = F_{Y_{t_n} - Y_{t_{n-1}}}(y_n - y_{n-1})$.
As in the case of discrete time processes, these can be used to find the joint pmf or pdf as
$p_{Y_{t_1}, \ldots, Y_{t_n}}(y_1, \ldots, y_n) = \prod_{i=1}^{n} p_{Y_{t_i} - Y_{t_{i-1}}}(y_i - y_{i-1})$,
$f_{Y_{t_1}, \ldots, Y_{t_n}}(y_1, \ldots, y_n) = \prod_{i=1}^{n} f_{Y_{t_i} - Y_{t_{i-1}}}(y_i - y_{i-1})$.
If a process $\{Y_t\}$ has i.s.i., and the cdf, pdf, or pmf for $Y_t = Y_t - Y_0$ is given, the process can be completely described as shown above.
As in the discrete time case, a continuous time random process $\{Y_t\}$ is called a Markov process if we have
$\Pr\{Y_{t_n} \le y_n \mid Y_{t_{n-1}} = y_{n-1}, Y_{t_{n-2}} = y_{n-2}, \ldots\} = \Pr\{Y_{t_n} \le y_n \mid Y_{t_{n-1}} = y_{n-1}\}$,
$f_{Y_{t_n}|Y_{t_{n-1}}, \ldots, Y_{t_1}}(y_n|y_{n-1}, \ldots, y_1) = f_{Y_{t_n}|Y_{t_{n-1}}}(y_n|y_{n-1})$,
or
$p_{Y_{t_n}|Y_{t_{n-1}}, \ldots, Y_{t_1}}(y_n|y_{n-1}, \ldots, y_1) = p_{Y_{t_n}|Y_{t_{n-1}}}(y_n|y_{n-1})$
for all $n$, $y_n, y_{n-1}, \ldots$, and $t_1, t_2, \ldots, t_n$.
A continuous time i.s.i. process is a Markov process.
Wiener process
A process $\{W(t)\}$ is called a Wiener process if it satisfies the following:
The initial position is zero. That is, $W(0) = 0$.
The mean is zero. That is, $E\{W(t)\} = 0$, $t \ge 0$.
The increments of $W(t)$ are independent, stationary, and Gaussian.
The Wiener process is a continuous time i.s.i. process.
The increments of the Wiener process are Gaussian random variables with zero mean.
The first order pdf of the Wiener process is
$f_{W_t}(x) = \dfrac{1}{\sqrt{2\pi t \sigma^2}} \exp\left\{ -\dfrac{x^2}{2t\sigma^2} \right\}$.
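A discretized sketch of the Wiener process (step count and parameters are assumptions): summing independent Gaussian increments over $[0, t]$ reproduces the first order pdf's zero mean and variance $\sigma^2 t$:

```python
import numpy as np

rng = np.random.default_rng(10)
sigma, t, steps, trials = 1.3, 2.0, 1000, 100_000

dt = t / steps
increments = rng.normal(0.0, sigma * np.sqrt(dt), size=(trials, steps))
W_t = increments.sum(axis=1)            # W(t) as a sum of i.s.i. increments

print(W_t.mean())                       # E{W(t)} = 0
print(W_t.var(), sigma**2 * t)          # Var{W(t)} = sigma^2 t
```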
Wiener process, Brownian motion
The Wiener process is the limit of the random-walk process.
Properties of the Wiener process
The distribution of $X_{t_2} - X_{t_1}$, $t_2 > t_1$, depends only on $t_2 - t_1$, not on $t_1$ and $t_2$ individually.
Let us show that the Wiener process is Gaussian.
From the definition of the Wiener process, the random variables $W(t_1), W(t_2) - W(t_1), W(t_3) - W(t_2), \ldots, W(t_k) - W(t_{k-1})$ are independent Gaussian random variables. Thus the random variables $W(t_1), W(t_2), W(t_3), \ldots, W(t_k)$ can be obtained from the following linear transformation of $W(t_1)$ and the increments:
$W(t_1) = W(t_1)$,
$W(t_2) = W(t_1) + \{W(t_2) - W(t_1)\}$,
$W(t_3) = W(t_1) + \{W(t_2) - W(t_1)\} + \{W(t_3) - W(t_2)\}$,
$\vdots$
$W(t_k) = W(t_1) + \{W(t_2) - W(t_1)\} + \cdots + \{W(t_k) - W(t_{k-1})\}$.
Since $W(t_1), W(t_2), W(t_3), \ldots, W(t_k)$ are jointly Gaussian, $\{W(t)\}$ is Gaussian.
Poisson counting process
A continuous time counting process $\{N_t, t \ge 0\}$ with the following properties is called the Poisson counting process:
$N_0 = 0$.
The process has i.s.i.. Hence, the increments over non-overlapping time intervals are independent random variables.
In a very small time interval, the probability of an increment of 1 is proportional to the length of the interval, and the probability of an increment larger than 1 is negligible. Thus, $\Pr\{N_{t+\Delta t} - N_t = 1\} = \lambda \Delta t + o(\Delta t)$, $\Pr\{N_{t+\Delta t} - N_t \ge 2\} = o(\Delta t)$, and $\Pr\{N_{t+\Delta t} - N_t = 0\} = 1 - \lambda \Delta t + o(\Delta t)$, where $\lambda$ is a proportionality constant.
The Poisson counting process is a continuous time discrete alphabet i.s.i. process.
We have obtained the Wiener process as the limit of a discrete time discrete amplitude random-walk process. Similarly, the Poisson counting process can be derived as the limit of a binomial counting process using the Poisson approximation.
The probability mass function $p_{N_t - N_0}(k) = p_{N_t}(k)$ of the increment $N_t - N_0$ between the starting time $0$ and $t > 0$
Let us use the notation
$p(k, t) = p_{N_t - N_0}(k), \quad t > 0$.
Using the independence of increments and the third property of the Poisson counting process, we have
$p(k, t + \Delta t) = \sum_{n=0}^{k} \Pr\{N_t = n\} \Pr\{N_{t+\Delta t} - N_t = k - n \mid N_t = n\}$
$= \sum_{n=0}^{k} \Pr(N_t = n) \Pr(N_{t+\Delta t} - N_t = k - n)$
$\approx p(k, t)(1 - \lambda \Delta t) + p(k - 1, t)\,\lambda \Delta t$,
which yields
$\dfrac{p(k, t + \Delta t) - p(k, t)}{\Delta t} = \lambda\, p(k - 1, t) - \lambda\, p(k, t)$.
When $\Delta t \to 0$, the equation above becomes the differential equation
$\dfrac{d}{dt} p(k, t) + \lambda\, p(k, t) = \lambda\, p(k - 1, t), \quad t > 0$,
where the initial conditions are
$p(k, 0) = \begin{cases} 0, & k \ne 0, \\ 1, & k = 0, \end{cases}$
since $\Pr\{N_0 = 0\} = 1$. Solving the differential equation gives
$p_{N_t}(k) = p(k, t) = \dfrac{(\lambda t)^k e^{-\lambda t}}{k!}, \quad k = 0, 1, 2, \ldots, \ t \ge 0$.
The pmf for $k$ jumps in an arbitrary interval $(s, t)$, $t \ge s$, is
$p_{N_t - N_s}(k) = \dfrac{(\lambda(t - s))^k e^{-\lambda(t - s)}}{k!}, \quad k = 0, 1, \ldots, \ t \ge s$.
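A quick empirical check of this pmf (assumed rate and horizon), drawing Poisson increments directly:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(11)
lam, t, trials = 2.0, 1.5, 500_000

counts = rng.poisson(lam * t, size=trials)    # samples of N_t - N_0
for k in range(5):
    print(k, np.mean(counts == k),
          (lam * t)**k * exp(-lam * t) / factorial(k))  # (lambda t)^k e^{-lambda t} / k!
```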
Martingale
Martingale property
An independent increment process $\{X_t\}$ with zero mean satisfies
$E\{X(t_n) - X(t_{n-1}) \mid X(t_1), X(t_2), \ldots, X(t_{n-1})\} = 0$
for all $t_1 < t_2 < \cdots < t_n$ and integer $n \ge 2$. This property can be rewritten as
$E\{X(t_n) \mid X(t_1), X(t_2), \ldots, X(t_{n-1})\} = X(t_{n-1})$,
which is called the martingale property.
Martingale
A process $\{X_n, n \ge 0\}$ with the following properties is called a martingale process:
$E\{|X_n|\} < \infty$.
$E\{X_{n+1} \mid X_0, \ldots, X_n\} = X_n$.
4.6 Compound process*
Discrete time compound process
Let $\{N_k, k = 0, 1, \ldots\}$ be a discrete time counting process such as the binomial counting process, and let $\{X_k, k = 0, 1, \ldots\}$ be an i.i.d. random process. Assume that the two processes are independent of each other. Define the random process $\{Y_k, k = 0, 1, \ldots\}$ by
$Y_0 = 0$,
$Y_k = \sum_{i=1}^{N_k} X_i, \quad k = 1, 2, \ldots$,
where we assume $Y_k = 0$ if $N_k = 0$. The process $\{Y_k\}$ is called a discrete time compound process. The process is also referred to as a doubly stochastic process because of the two sources of randomness, $\{N_k\}$ and $\{X_k\}$.
The expectation of $Y_k$ can be obtained using conditional expectation as
$E\{Y_k\} = E\left\{ \sum_{i=1}^{N_k} X_i \right\} = E\{E\{Y_k \mid N_k\}\} = \sum_n p_{N_k}(n)\, E\{Y_k \mid N_k = n\} = \sum_n p_{N_k}(n)\, n E\{X\} = E\{N_k\} E\{X\}$.
Let $X_1, X_2, \ldots$ be an i.i.d. sequence of random variables and $G_X$ be their common moment generating function. Let the random variable $N$ be independent of the $X_i$ and $G_N$ be the moment generating function of $N$. Then the moment generating function of $S = \sum_{i=1}^{N} X_i$ is
$G_S(t) = G_N(G_X(t))$.
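A simulation sketch of $E\{Y_k\} = E\{N_k\}E\{X\}$ for a compound process (the binomial counting value and exponential $X_i$ below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(12)
trials, n, p = 200_000, 40, 0.3      # N_k ~ Binomial(n, p) at some fixed time k
mu_X = 2.5                           # mean of the i.i.d. X_i (exponential here)

N = rng.binomial(n, p, size=trials)
X = rng.exponential(mu_X, size=(trials, n))
mask = np.arange(n) < N[:, None]     # keep the first N_k summands per trial
Y = (X * mask).sum(axis=1)

print(Y.mean(), n * p * mu_X)        # E{Y_k} = E{N_k} E{X}
```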
Continuous time compound process
When a continuous time counting process $\{N_t, t \ge 0\}$ and an i.i.d. process $\{X_i, i = 1, 2, \ldots\}$ are independent of each other, the process
$Y_t = Y(t) = \sum_{i=1}^{N_t} X_i$
is called a continuous time compound process. Here, we put $Y(t) = 0$ when $N_t = 0$.
We have
$E\{Y_t\} = E\{N_t\} E\{X\}$,
$M_{Y_t}(u) = E\{M_X^{N_t}(u)\}$.
A compound process is continuous and differentiable except at the jumps, where a new random variable is added.
If $\{N_t\}$ is a Poisson counting process, we have
$E\{Y_t\} = \lambda t\, E\{X\}$,
$M_{Y_t}(u) = \sum_{k=0}^{\infty} \dfrac{(\lambda t)^k e^{-\lambda t}}{k!} M_X^k(u) = e^{-\lambda t} \sum_{k=0}^{\infty} \dfrac{(\lambda t\, M_X(u))^k}{k!} = e^{-\lambda t (1 - M_X(u))}$.