
EE 407 Stochastic processes and Modeling in EE (Lecture 9)

Omar Siddiqui, Department of Electrical Engineering, College of Engineering, Taiba University, Madinah. Email: ofsiddiqui@yahoo.com

The Standard Bivariate Gaussian Distribution


It is the standard Gaussian distribution for two RVs X and Y, each having a mean of 0 and a variance of 1:

p_{XY}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\!\left( -\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)} \right)

Here both X and Y are standard normal distributions, i.e.

p_X(x) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\tfrac{1}{2}x^2\right), \qquad -\infty < x < \infty

p_Y(y) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\tfrac{1}{2}y^2\right), \qquad -\infty < y < \infty

where \rho will later be shown to be the correlation coefficient, which takes on values -1 < \rho < 1.
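The joint PDF above can be checked numerically. A minimal sketch (the function names are mine, not from the slides): with \rho = 0 the joint PDF should equal the product of the two standard normal marginals.

```python
import math

def bivariate_standard_pdf(x, y, rho):
    """Standard bivariate Gaussian PDF with correlation rho (|rho| < 1)."""
    norm = 1.0 / (2.0 * math.pi * math.sqrt(1.0 - rho**2))
    expo = -(x**2 - 2.0 * rho * x * y + y**2) / (2.0 * (1.0 - rho**2))
    return norm * math.exp(expo)

def standard_pdf(t):
    """Univariate standard Gaussian PDF."""
    return math.exp(-t**2 / 2.0) / math.sqrt(2.0 * math.pi)

# With rho = 0 the joint PDF factors into the product of the marginals
assert abs(bivariate_standard_pdf(0.5, -1.2, 0.0)
           - standard_pdf(0.5) * standard_pdf(-1.2)) < 1e-12
```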

Examples of Gaussian Bivariate distributions


[Figure: surface plot of the joint PDF and its constant-PDF contours]

Constant-PDF contours are curves along which p_{XY}(x, y) is constant; they are the ellipses on which x and y satisfy

x^2 - 2\rho x y + y^2 = \text{const.}


Marginal PDFs
If p_{XY}(x, y) is the joint PDF of random variables X and Y, then the marginal PDFs are written as:

p_X(x) = \int_{-\infty}^{\infty} p_{XY}(x, y)\, dy, \qquad p_Y(y) = \int_{-\infty}^{\infty} p_{XY}(x, y)\, dx

For the standard Gaussian bivariate,

p_X(x) = \int_{-\infty}^{\infty} \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\!\left(-\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)}\right) dy

Completing the square in y,

x^2 - 2\rho x y + y^2 = (y - \rho x)^2 + (1-\rho^2)\, x^2

so that

p_X(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi(1-\rho^2)}}\, e^{-(y - \rho x)^2 / (2(1-\rho^2))}\, dy = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}

since the remaining integral is over a Gaussian PDF in y and equals 1. So the marginal PDF of a standard Gaussian bivariate distribution is the standard Gaussian distribution.
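This marginalization can be verified numerically. A sketch (helper names are mine): integrating the joint PDF over y with the trapezoidal rule should reproduce the standard normal PDF at any x.

```python
import math

def joint_pdf(x, y, rho):
    """Standard bivariate Gaussian joint PDF (illustrative helper)."""
    return math.exp(-(x*x - 2*rho*x*y + y*y) / (2*(1 - rho**2))) \
           / (2*math.pi*math.sqrt(1 - rho**2))

def marginal_x(x, rho, lo=-8.0, hi=8.0, n=4000):
    """Numerically integrate the joint PDF over y (trapezoidal rule)."""
    h = (hi - lo) / n
    s = 0.5 * (joint_pdf(x, lo, rho) + joint_pdf(x, hi, rho))
    for i in range(1, n):
        s += joint_pdf(x, lo + i*h, rho)
    return s * h

# The numerical marginal should match the standard normal PDF at x
x = 0.7
assert abs(marginal_x(x, 0.6) - math.exp(-x*x/2) / math.sqrt(2*math.pi)) < 1e-9
```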

The Cumulative Distribution Function (CDF)


For one continuous RV:

F_X(x) = P[X \le x] = \int_{-\infty}^{x} p_X(t)\, dt, \qquad -\infty < x < \infty

For two continuous RVs:

F_{XY}(x, y) = P[X \le x,\; Y \le y] = \int_{-\infty}^{x} \int_{-\infty}^{y} p_{XY}(u, v)\, dv\, du
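For the standard Gaussian there is no closed form for the integral, but the one-RV CDF can be written with the error function; and when \rho = 0 the joint CDF factors into the product of the marginal CDFs. A small sketch (function names are mine):

```python
import math

def gaussian_cdf(x):
    """Standard Gaussian CDF: F_X(x) = (1 + erf(x / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def joint_cdf_rho0(x, y):
    """Joint CDF for independent (rho = 0) standard normals: it factors."""
    return gaussian_cdf(x) * gaussian_cdf(y)

assert abs(gaussian_cdf(0.0) - 0.5) < 1e-12
assert abs(joint_cdf_rho0(0.0, 0.0) - 0.25) < 1e-12
```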

The Cumulative Distribution Function (CDF)


Example: Find the joint CDF of the following PDF

Joint PDF from joint CDF


The joint PDF can be obtained from the joint CDF by differentiation:

p_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x\, \partial y}

Example

Independence of the RVs
Two RVs are independent if:

F_{XY}(x, y) = F_X(x)\, F_Y(y), \qquad \text{equivalently} \qquad p_{XY}(x, y) = p_X(x)\, p_Y(y)

Example 1

Expected Values
Vector of expected values of two continuous RVs:

E\begin{bmatrix} X \\ Y \end{bmatrix} = \begin{bmatrix} \int x\, p_X(x)\, dx \\ \int y\, p_Y(y)\, dy \end{bmatrix}

i.e. the vector of the expected values of the marginal PDFs.

Expected value of a function of two jointly distributed continuous RVs:

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, p_{XY}(x, y)\, dx\, dy

Expected values of two independent RVs:

E[XY] = E[X]\, E[Y]

Covariance of two RVs:

\mathrm{cov}(X, Y) = E\big[(X - E[X])(Y - E[Y])\big] = E[XY] - E[X]\, E[Y]
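The covariance definition can be illustrated by simulation. A sketch (not from the slides): a standard bivariate pair can be generated as X = Z1, Y = \rho Z1 + \sqrt{1-\rho^2} Z2 with Z1, Z2 independent standard normals, and the sample covariance should then be close to \rho.

```python
import math, random

random.seed(0)
rho = 0.8
n = 200_000
acc_x = acc_y = acc_xy = 0.0
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = z1
    y = rho * z1 + math.sqrt(1 - rho**2) * z2   # cov(X, Y) = rho by construction
    acc_x += x; acc_y += y; acc_xy += x * y

# cov(X, Y) = E[XY] - E[X]E[Y]; for this standard pair it equals rho
cov = acc_xy / n - (acc_x / n) * (acc_y / n)
assert abs(cov - rho) < 0.05
```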

Covariance of X and Y for the standard Gaussian bivariate:

\mathrm{cov}(X, Y) = E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x\, y\, p_{XY}(x, y)\, dx\, dy

which can be shown to be equal to \rho.


Correlation coefficient between the X and Y RVs of the standard Gaussian bivariate:

\rho_{XY} = \frac{\mathrm{cov}(X, Y)}{\sqrt{\mathrm{var}(X)\, \mathrm{var}(Y)}} = \frac{\rho}{1} = \rho

Gaussian Bivariate with zero correlation coefficient


Consider a Gaussian bivariate for which the correlation coefficient is zero (\rho = 0). Then the joint PDF factors:

p_{XY}(x, y) = \frac{1}{2\pi} e^{-(x^2 + y^2)/2} = p_X(x)\, p_Y(y)

Hence the joint RVs X and Y are independent. This is also true for a non-standard Gaussian bivariate: for a Gaussian bivariate PDF, zero covariance \Rightarrow independence.

This result is not generally true for other distributions. In general, independence \Rightarrow zero covariance, but zero covariance does not imply independence.
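The classic counterexample can be checked by simulation (a sketch, not from the slides): with X standard normal and Y = X^2, Y is a deterministic function of X (so the pair is clearly dependent), yet cov(X, Y) = E[X^3] - E[X]E[X^2] = 0.

```python
import random

random.seed(1)
n = 200_000
sx = sy = sxy = 0.0
for _ in range(n):
    x = random.gauss(0, 1)
    y = x * x          # Y is a deterministic function of X: clearly dependent
    sx += x; sy += y; sxy += x * y

# Sample covariance should be near zero despite the dependence
cov = sxy / n - (sx / n) * (sy / n)
assert abs(cov) < 0.05
```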

Covariance Matrix of the Gaussian Bivariate


The covariance matrix for a joint distribution of two RVs X and Y can be written as:

C = \begin{bmatrix} \mathrm{var}(X) & \mathrm{cov}(X, Y) \\ \mathrm{cov}(X, Y) & \mathrm{var}(Y) \end{bmatrix}

The Gaussian bivariate distribution can be written in terms of the covariance matrix as follows:

p_{XY}(\mathbf{x}) = \frac{1}{2\pi\sqrt{\det C}} \exp\!\left( -\tfrac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^T C^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right), \qquad \mathbf{x} = \begin{bmatrix} x \\ y \end{bmatrix}

For a standard Gaussian bivariate: \mu_x = \mu_y = 0, and

C = \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix}, \qquad C^{-1} = \frac{1}{1-\rho^2} \begin{bmatrix} 1 & -\rho \\ -\rho & 1 \end{bmatrix}, \qquad \det C = 1 - \rho^2

Putting these back recovers the standard bivariate form:

p_{XY}(x, y) = \frac{1}{2\pi\sqrt{1-\rho^2}} \exp\!\left( -\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)} \right)

Estimation of outcome based on joint RV


Let X and Y have a joint PDF given by p_{XY}(x, y). The outcome of Y can be predicted from the outcome of X by using the covariance between the two variables. An example is the linear predictor given by:

\hat{Y} = aX + b

where \hat{Y} is the predicted (estimated) value of the variable Y. Minimizing the mean square error E[(Y - aX - b)^2] gives the predicted values of Y as:

\hat{Y} = E[Y] + \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)} \big(x - E[X]\big)
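The MMSE linear predictor can be packaged as a small helper (the function name and the test values are mine, chosen for illustration):

```python
def linear_predictor(mean_y, mean_x, cov_xy, var_x):
    """Return (a, b) of the MMSE linear predictor Yhat = a*x + b,
    where a = cov(X, Y)/var(X) and b = E[Y] - a*E[X]."""
    a = cov_xy / var_x
    return a, mean_y - a * mean_x

# Illustrative moments: E[Y]=2, E[X]=1, cov(X,Y)=0.5, var(X)=2
a, b = linear_predictor(2.0, 1.0, 0.5, 2.0)
assert (a, b) == (0.25, 1.75)   # Yhat = 0.25*x + 1.75
```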

Estimation of outcome based on joint RV Examples


Example 7.13: The joint outcomes of X and Y are equally likely (each with probability p = 1/4). Find the best linear prediction of Y if the marginal PMFs are given by:

p_X[0] = \tfrac14, \quad p_X[1] = \tfrac14, \quad p_X[2] = \tfrac12
p_Y[0] = p_Y[1] = p_Y[2] = p_Y[3] = \tfrac14

Solution: Calculation of expected values:

E[X] = \sum_i x_i\, p_X[i] = 0\cdot\tfrac14 + 1\cdot\tfrac14 + 2\cdot\tfrac12 = \tfrac54
E[X^2] = \sum_i x_i^2\, p_X[i] = 0\cdot\tfrac14 + 1\cdot\tfrac14 + 4\cdot\tfrac12 = \tfrac94
E[Y] = \sum_j y_j\, p_Y[j] = 0\cdot\tfrac14 + 1\cdot\tfrac14 + 2\cdot\tfrac14 + 3\cdot\tfrac14 = \tfrac32
E[XY] = \sum_{i,j} x_i\, y_j\, p_{XY}[i, j] = 0\cdot 0\cdot\tfrac14 + 1\cdot 1\cdot\tfrac14 + 2\cdot 2\cdot\tfrac14 + 2\cdot 3\cdot\tfrac14 = \tfrac{11}{4}

Calculation of variance and covariance:

\mathrm{var}(X) = E[X^2] - E^2[X] = \tfrac94 - \left(\tfrac54\right)^2 = \tfrac{11}{16}
\mathrm{cov}(X, Y) = E[XY] - E[X]\, E[Y] = \tfrac{11}{4} - \tfrac54 \cdot \tfrac32 = \tfrac78

Estimated value of Y:

\hat{Y} = E[Y] + \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)}\big(x - E[X]\big) = \tfrac32 + \tfrac{7/8}{11/16}\left(x - \tfrac54\right) = \tfrac32 + \tfrac{14}{11}\left(x - \tfrac54\right)

\hat{Y} = \tfrac{14}{11}\, x - \tfrac{1}{11}
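Example 7.13 can be reproduced exactly with rational arithmetic. A sketch; the four joint outcomes below are inferred from the E[XY] terms and the marginal PMFs on the slide:

```python
from fractions import Fraction as F

# Four equally likely joint outcomes (each with probability 1/4),
# inferred from the slide's marginal PMFs and E[XY] computation.
points = [(0, 0), (1, 1), (2, 2), (2, 3)]
p = F(1, 4)

ex  = sum(p * x     for x, _ in points)   # E[X]   = 5/4
ex2 = sum(p * x * x for x, _ in points)   # E[X^2] = 9/4
ey  = sum(p * y     for _, y in points)   # E[Y]   = 3/2
exy = sum(p * x * y for x, y in points)   # E[XY]  = 11/4

var_x = ex2 - ex**2          # var(X)    = 11/16
cov   = exy - ex * ey        # cov(X, Y) = 7/8
a = cov / var_x              # slope     = 14/11
b = ey - a * ex              # intercept = -1/11

assert (a, b) == (F(14, 11), F(-1, 11))   # Yhat = (14*x - 1)/11
```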

Estimation of outcome based on joint RV Examples


Example on page 412: A Gaussian bivariate PDF has zero means for the RVs X and Y, and its covariance matrix gives \mathrm{var}(X) = 1 and \mathrm{cov}(X, Y) = 0.9.

Find the estimated value of Y in terms of X.

Solution: The variances and covariance are read off the covariance matrix. Estimated value of Y:

\hat{Y} = E[Y] + \frac{\mathrm{cov}(X, Y)}{\mathrm{var}(X)}\big(x - E[X]\big) = 0 + \frac{0.9}{1}\,(x - 0)

\hat{Y} = 0.9\, x
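The same predictor formula gives this result in two lines (moments as read off the example):

```python
# Moments from the example: zero means, var(X) = 1, cov(X, Y) = 0.9
mean_x = mean_y = 0.0
var_x, cov_xy = 1.0, 0.9

a = cov_xy / var_x        # slope of the MMSE linear predictor
b = mean_y - a * mean_x   # intercept
assert (a, b) == (0.9, 0.0)   # Yhat = 0.9 * x
```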

Problems Related to Chapter 12
