Mathematical Expectation
Variance of a Random Variable
Problem: Let the random variable X represent the number of
automobiles that are used for official business purposes on any
given workday. The probability distribution for company A is
Solution:
Variance of a Random Variable
Problem: Let X be a random variable having the following density
function. Find the variance of the random variable g(X) = 4X + 3.
f(x) = x²/3,  −1 < x < 2,
f(x) = 0,  elsewhere.
Solution:
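The worked solution is not reproduced here, but the computation can be sketched symbolically with SymPy: integrate g(X) = 4X + 3 and its square against the density given in the problem, then apply Var[g(X)] = E[g(X)²] − (E[g(X)])².

```python
import sympy as sp

x = sp.symbols('x')
f = x**2 / 3                       # density of X on (-1, 2)

# E[g(X)] and E[g(X)^2] for g(X) = 4X + 3
Eg  = sp.integrate((4*x + 3) * f, (x, -1, 2))
Eg2 = sp.integrate((4*x + 3)**2 * f, (x, -1, 2))

# Var[g(X)] = E[g(X)^2] - (E[g(X)])^2
var_g = sp.simplify(Eg2 - Eg**2)
print(var_g)  # 51/5
```

The same answer follows from the shortcut σ²_{4X+3} = 16·σ²_X, since adding the constant 3 does not change the variance.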
Covariance
Let X and Y be random variables with joint probability
distribution f(x, y). The covariance of X and Y is
Cov(X, Y), also written σ_XY:
σ_XY = E[(X − μ_X)(Y − μ_Y)]
     = E(XY − μ_X·Y − μ_Y·X + μ_X·μ_Y)
     = E(XY) − μ_X·E(Y) − μ_Y·E(X) + μ_X·μ_Y
     = E(XY) − μ_X·μ_Y − μ_Y·μ_X + μ_X·μ_Y
     = E(XY) − μ_X·μ_Y.
Covariance
Problem: For the following joint probability distribution, find the
covariance of X and Y.
f(x, y) = 16y/x³,  x > 2, 0 < y < 1,
f(x, y) = 0,  elsewhere.
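Taking the joint density to be f(x, y) = 16y/x³ on x > 2, 0 < y < 1 (which integrates to 1 over that region), the covariance σ_XY = E(XY) − E(X)E(Y) can be checked with SymPy:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 16*y / x**3                    # joint density on x > 2, 0 < y < 1

def E(expr):
    # expectation: integrate expr * f over the support
    return sp.integrate(sp.integrate(expr * f, (y, 0, 1)), (x, 2, sp.oo))

EX, EY, EXY = E(x), E(y), E(x*y)
cov = sp.simplify(EXY - EX*EY)
print(EX, EY, EXY, cov)  # 4 2/3 8/3 0
```

For this density the covariance works out to zero, which foreshadows the discussion on the next slide: here it happens because the density factors into its marginals, so X and Y are independent.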
Covariance
The sign of the covariance indicates whether the relationship
between two dependent random variables is positive or
negative.
When X and Y are statistically independent, it can be shown
that the covariance is zero.
The converse, however, is not generally true.
Two variables may have zero covariance and still not be
statistically independent.
Note that the covariance only describes the linear relationship
between two random variables.
Therefore, if the covariance between X and Y is zero, X and Y
may still have a nonlinear relationship; zero covariance does
not imply independence.
Correlation Coefficient
Let X and Y be random variables with covariance σ_XY and
standard deviations σ_X and σ_Y, respectively. The correlation
coefficient of X and Y is
ρ_XY = σ_XY / (σ_X·σ_Y).
The correlation coefficient satisfies the inequality −1 ≤ ρ_XY ≤ 1.
It assumes a value of zero when σ_XY = 0.
Where there is an exact linear dependency, say Y = a + bX,
ρ_XY = 1 if b > 0 and ρ_XY = −1 if b < 0.
Problem: For the following joint density function, find the
correlation coefficient between X and Y.
f(x, y) = 16y/x³,  x > 2, 0 < y < 1,
f(x, y) = 0,  elsewhere.
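Assuming this is the same density as in the covariance problem, one route is to check independence directly: if f(x, y) factors into the product of its marginals, then X and Y are independent, so σ_XY = 0 and the numerator of ρ_XY vanishes.

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 16*y / x**3                      # joint density on x > 2, 0 < y < 1

g = sp.integrate(f, (y, 0, 1))       # marginal of X
h = sp.integrate(f, (x, 2, sp.oo))   # marginal of Y
print(g, h)                          # 8/x**3 2*y
print(sp.simplify(f - g*h))          # 0 -> f factors, X and Y independent
```

Since the joint density equals the product of the marginals, X and Y are statistically independent and the correlation coefficient is zero.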
Means and Variances of Linear Combinations of
Random Variables
If a and b are constants, then E(aX + b) = aE(X) + b.
The expected value of the sum or difference of two or more
functions of a random variable X is the sum or difference of the
expected values of the functions. That is,
E[g(X) ± h(X)] = E[g(X)] ± E[h(X)].
The expected value of the sum or difference of two or more
functions of the random variables X and Y is the sum or
difference of the expected values of the functions. That is,
E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)].
Let X and Y be two independent random variables. Then
E(XY) = E(X)·E(Y).
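These rules can be verified exactly on a small discrete example. The two distributions below are hypothetical, chosen only for illustration; exact fractions avoid floating-point noise.

```python
from fractions import Fraction as F

# Hypothetical independent discrete distributions (illustration only)
pX = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}
pY = {1: F(1, 3), 3: F(2, 3)}

EX = sum(x * p for x, p in pX.items())
EY = sum(y * p for y, p in pY.items())

# Linearity: E(aX + b) = a E(X) + b
a, b = 4, 3
E_lin = sum((a*x + b) * p for x, p in pX.items())
assert E_lin == a*EX + b

# Independence: E(XY) = E(X) E(Y), using the product joint distribution
EXY = sum(x * y * pX[x] * pY[y] for x in pX for y in pY)
assert EXY == EX * EY
```

Note that the product rule E(XY) = E(X)E(Y) relies on independence; for dependent variables the gap E(XY) − E(X)E(Y) is exactly the covariance.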
Means and Variances of Linear Combinations of
Random Variables
If X and Y are random variables with joint probability distribution
f(x, y) and a, b, and c are constants, then
σ²_{aX+bY+c} = a²σ²_X + b²σ²_Y + 2ab·σ_XY.
In particular, taking a = 1 and b = 0 gives σ²_{X+c} = σ²_X:
adding a constant shifts the distribution but leaves its variance unchanged.
If X and Y are independent, then σ_XY = 0, so
σ²_{aX+bY} = a²σ²_X + b²σ²_Y,
σ²_{aX−bY} = a²σ²_X + b²σ²_Y.
More generally, for independent random variables X_1, X_2, …, X_n,
σ²_{a_1X_1 + a_2X_2 + ⋯ + a_nX_n} = a_1²σ²_{X_1} + a_2²σ²_{X_2} + ⋯ + a_n²σ²_{X_n} = Σ_{i=1}^{n} a_i²σ²_{X_i}.
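The variance formula for aX + bY + c can also be checked exactly on a small discrete example. The joint distribution below is hypothetical, chosen to have nonzero covariance so the 2ab·σ_XY term actually matters.

```python
from fractions import Fraction as F

# Hypothetical joint distribution with nonzero covariance (illustration only)
joint = {(0, 0): F(1, 8), (0, 1): F(1, 8), (1, 0): F(1, 4), (1, 1): F(1, 2)}

def E(fn):
    # expectation of fn(X, Y) under the joint distribution
    return sum(fn(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
var_X = E(lambda x, y: x*x) - EX**2
var_Y = E(lambda x, y: y*y) - EY**2
cov   = E(lambda x, y: x*y) - EX*EY

# Var(aX + bY + c) computed directly vs. the formula
a, b, c = 4, -3, 7
lhs = E(lambda x, y: (a*x + b*y + c)**2) - E(lambda x, y: a*x + b*y + c)**2
assert lhs == a**2 * var_X + b**2 * var_Y + 2*a*b*cov
```

Changing c has no effect on either side, which mirrors the σ²_{X+c} = σ²_X special case.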
Skewness Coefficient:
Skewness = E[(X − μ_X)³] / σ³_X.
If the value is zero, the distribution is symmetric; if it is positive, the
dispersion is greater above the mean than below it; and if it is
negative, the dispersion is greater below the mean.
Problem