
Discrete Random Variables

Distribution, Expectation, Variance

Random Variable
A random variable is a function whose domain is the sample space and whose range is a subset of the set of real numbers. In other words, a random variable associates a unique numerical value with every outcome of an experiment. The value of the random variable will vary from trial to trial as the experiment is repeated. *

Experiment : Toss a coin twice
Sample Space : {HH, HT, TH, TT}
X = number of heads
X(HH) = 2, X(HT) = 1, X(TH) = 1, X(TT) = 0
X : Ω → {0, 1, 2} is a function. Hence X is a random variable.

(*http://www.stats.gla.ac.uk/steps/glossary/probability_distributions.html#randvar)

Experiment : Draw three balls at random from an urn containing two red and two green balls.
Sample Space = {RRG, RGG}
X = number of red balls
X(RRG) = 2, X(RGG) = 1
X : Ω → {1, 2} is a function. Hence X is a random variable.

Experiment : Toss two dice.
Sample Space = {(1,1), (1,2), ..., (1,6), ..., (6,1), ..., (6,6)}
X = sum of the scores obtained in the two tosses
X((1,1)) = 2, X((1,2)) = 3, ..., X((6,6)) = 12
X : Ω → {2, ..., 12} is a function. Hence X is a random variable.
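As an aside (not part of the original slides), the two-dice example can be checked with a short Python sketch; the names sample_space and X below are purely illustrative:

    from itertools import product

    # Sample space of two tosses of a die: all ordered pairs (i, j)
    sample_space = list(product(range(1, 7), repeat=2))

    # The random variable X assigns to each outcome the sum of the two scores
    def X(outcome):
        return outcome[0] + outcome[1]

    # X maps the 36 outcomes onto the set {2, 3, ..., 12}
    print(sorted({X(omega) for omega in sample_space}))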

Discrete Random Variable


A random variable is said to be discrete if the range of values taken by the random variable is either finite or countably infinite. All the examples of random variables given so far are discrete random variables.

An example of a random variable which is not discrete is given below:
Experiment : Choose three students at random from PGP-1
Sample Space : {{a, b, c} : a, b, c are students of PGP-1}
X = total sum of weights of the three students
X can take any positive value. Hence X is not a discrete random variable.

Distribution of a Discrete Random Variable


Let the random variable X take the values x1, x2, ..., xn.
P(X = xi) = P({ω : X(ω) = xi})
Let pi = P(X = xi). Then pi ≥ 0 and Σ pi = 1.
The list of values of X along with their corresponding probabilities of occurrence is called the probability distribution of the random variable X:

X        : x1  x2  ...  xn
P(X = x) : p1  p2  ...  pn
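As a rough illustration (not from the slides), a probability distribution can be stored as a value-to-probability mapping and the two defining conditions checked directly; the numbers below are the number-of-heads distribution from the example that follows:

    # Distribution of the number of heads in two tosses of a fair coin
    distribution = {0: 0.25, 1: 0.50, 2: 0.25}

    # Defining conditions: every p_i >= 0 and the p_i sum to 1
    assert all(p >= 0 for p in distribution.values())
    assert abs(sum(distribution.values()) - 1.0) < 1e-12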

Example : Let X be the number of tails in two tosses of a fair coin. Then X takes the values 0, 1 and 2.
P(X = 0) = P({HH}) = 1/4
P(X = 1) = P({HT, TH}) = 1/2
P(X = 2) = P({TT}) = 1/4
Therefore the probability distribution of the random variable X is

X        : 0    1    2
P(X = x) : 1/4  1/2  1/4

Example : A die has one side marked 1, two sides marked 2 and three sides marked 3. Toss it twice, and let X be the sum of the scores in the two tosses. Note that X can take the values 2, 3, 4, 5 and 6. The probability distribution of X is
X        : 2     3    4     5    6
P(X = x) : 1/36  1/9  5/18  1/3  1/4
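If helpful, this distribution can be reproduced by enumeration; the following Python sketch (illustrative, not from the slides) uses exact fractions:

    from fractions import Fraction
    from itertools import product
    from collections import defaultdict

    # Single-toss probabilities: one face marked 1, two faces 2, three faces 3
    face_prob = {1: Fraction(1, 6), 2: Fraction(2, 6), 3: Fraction(3, 6)}

    # Distribution of X = sum of the scores in two independent tosses
    dist = defaultdict(Fraction)
    for a, b in product(face_prob, repeat=2):
        dist[a + b] += face_prob[a] * face_prob[b]

    print({x: str(p) for x, p in sorted(dist.items())})
    # {2: '1/36', 3: '1/9', 4: '5/18', 5: '1/3', 6: '1/4'}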

Three Coins are tossed.


Identify the Sample Space. Let us define the random variable

X = Number of Heads - Number of Tails

Find the Probability Distribution of X

Outcome   X = (Heads - Tails)   P(X = x)
HHH        3                     1/8
HHT        1                     1/8
HTH        1                     1/8
HTT       -1                     1/8
THH        1                     1/8
THT       -1                     1/8
TTH       -1                     1/8
TTT       -3                     1/8

The probability distribution of X is therefore

X        : 3    1    -1   -3
P(X = x) : 1/8  3/8  3/8  1/8
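The same distribution can be obtained by enumerating the eight equally likely outcomes; the sketch below is only illustrative:

    from fractions import Fraction
    from itertools import product
    from collections import defaultdict

    # Sample space of three tosses of a fair coin, each outcome with probability 1/8
    sample_space = list(product("HT", repeat=3))

    # X = number of heads minus number of tails
    def X(outcome):
        return outcome.count("H") - outcome.count("T")

    dist = defaultdict(Fraction)
    for omega in sample_space:
        dist[X(omega)] += Fraction(1, 8)

    print({x: str(p) for x, p in sorted(dist.items())})
    # {-3: '1/8', -1: '3/8', 1: '3/8', 3: '1/8'}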

Expectation of a Discrete Random Variable


Let X be a discrete random variable with distribution

X        : x1  x2  ...  xn
P(X = x) : p1  p2  ...  pn

Then the Expectation of X, denoted E(X), is defined as
E(X) = Σ xi pi = Σ x P(X = x) = Σ (value × probability)

Example : Let X be the number of heads in two tosses of a fair coin. The probability distribution of the random variable X is

X        : 0    1    2
P(X = x) : 1/4  1/2  1/4

Thus E(X) = 0 · (1/4) + 1 · (1/2) + 2 · (1/4) = 1.

Example : In the case when the random variable X has distribution

X        : 2     3    4     5    6
P(X = x) : 1/36  1/9  5/18  1/3  1/4

E(X) = 2 · (1/36) + 3 · (1/9) + 4 · (5/18) + 5 · (1/3) + 6 · (1/4) = 14/3
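The last calculation can also be checked numerically; a small Python sketch (illustrative only) with exact fractions:

    from fractions import Fraction

    # Distribution of X from the weighted-die example
    dist = {2: Fraction(1, 36), 3: Fraction(1, 9), 4: Fraction(5, 18),
            5: Fraction(1, 3), 6: Fraction(1, 4)}

    # E(X) = sum of (value x probability)
    expectation = sum(x * p for x, p in dist.items())
    print(expectation)   # 14/3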

Expectation as Theoretical Average: An Intuitive View


Let X be a random variable which takes values x1, x2, ..., xk with probabilities p1, p2, ..., pk respectively. Suppose the experiment is repeated a large number of times, say n times, and each time the value of X is recorded. Suppose that among these n values, x1 occurs f1 times, x2 occurs f2 times, and so on. The average value of X is then
(x1 f1 + ... + xk fk) / n
As n → ∞, fi/n → pi, and therefore
(x1 f1 + ... + xk fk) / n → x1 p1 + ... + xk pk = E(X)
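This long-run-average view can be illustrated by simulation; the sketch below (not from the slides) draws n observations of the number of heads in two tosses of a fair coin, for which E(X) = 1:

    import random

    values = [0, 1, 2]          # possible values of X
    probs = [0.25, 0.50, 0.25]  # their probabilities

    # Average of n simulated observations of X; approaches E(X) = 1 as n grows
    n = 100_000
    samples = random.choices(values, weights=probs, k=n)
    print(sum(samples) / n)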

Properties of Expectation
Let X be a random variable and let a, b be real numbers. Then
E[aX + b] = a E[X] + b
If P(X ≥ 0) = 1, then E(X) ≥ 0.

Properties of Expectation
Suppose g(X) is a real-valued function of X. Then we define the expectation of a function of a random variable as
E[g(X)] = Σ g(xi) P(X = xi) = Σ g(xi) pi
Let g(X) = X². Thus E(X²) = Σ xi² P(X = xi) = Σ xi² pi
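For example (an illustrative sketch, not part of the slides), with X = number of heads in two tosses of a fair coin and g(X) = X², the sum Σ xi² pi gives E(X²) = 3/2:

    from fractions import Fraction

    # Number of heads in two tosses of a fair coin
    dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

    # E[g(X)] = sum of g(x_i) * p_i, here with g(X) = X^2
    second_moment = sum(x**2 * p for x, p in dist.items())
    print(second_moment)   # 3/2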

VARIANCE
If a random variable X has the following probability distribution (t > 0)

X        : -t   t
P(X = x) : 1/2  1/2

then E(X) = 0 for any t, but as t increases the spread of the distribution increases. Variance is a measure of the spread of a probability distribution. The variance of a random variable X is defined as
Var(X) = E(X - E(X))² = E(X²) - (E(X))²   (proof of the equality: exercise)
For the above random variable Var(X) = t². So as t increases, Var(X) increases.
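Both forms of the variance formula can be checked on the two-point distribution above; the sketch below fixes t = 3 purely for illustration:

    from fractions import Fraction

    t = 3   # any t > 0; the spread grows with t
    dist = {-t: Fraction(1, 2), t: Fraction(1, 2)}

    mean = sum(x * p for x, p in dist.items())                   # E(X) = 0
    var_def = sum((x - mean)**2 * p for x, p in dist.items())    # E[(X - E(X))^2]
    var_alt = sum(x**2 * p for x, p in dist.items()) - mean**2   # E(X^2) - (E(X))^2
    print(mean, var_def, var_alt)   # 0 9 9, i.e. Var(X) = t^2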

STANDARD DEVIATION (S.D.)


The positive square root of Var(X) is called the standard deviation of X, written s.d.(X). The main advantage of the standard deviation is that it is measured in the same units as the original data, whereas the variance is measured in squared units.

Properties of Variance and Standard Deviation


Let X be a random variable and let a, b be real numbers.
Var(aX + b) = a² Var(X)
s.d.(aX + b) = |a| s.d.(X)
Var(X) = 0 if and only if there is a real number t such that P(X = t) = 1.
Since Var(X) ≥ 0, E(X²) ≥ (E(X))².
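These properties are easy to verify on a concrete distribution; the following sketch (illustrative only, with a = 3 and b = 7 chosen arbitrarily) uses the number of heads in two tosses of a fair coin, for which Var(X) = 1/2:

    from fractions import Fraction

    dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

    def variance(d):
        mean = sum(x * p for x, p in d.items())
        return sum((x - mean)**2 * p for x, p in d.items())

    a, b = 3, 7
    shifted = {a * x + b: p for x, p in dist.items()}   # distribution of aX + b
    print(variance(shifted), a**2 * variance(dist))     # both equal 9/2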

Markov's Inequality
If X is a non-negative random variable and t is a real number > 0, then
P(X > t) ≤ E(X) / t
Thus P(X > 2E(X)) ≤ 1/2, P(X > 4E(X)) ≤ 1/4, and P(X > 100 E(X)) ≤ 0.01.
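A quick numerical check of Markov's inequality (an illustrative sketch, not from the slides), again using the number of heads in two tosses of a fair coin, for which E(X) = 1:

    from fractions import Fraction

    # A non-negative random variable with E(X) = 1
    dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
    mean = sum(x * p for x, p in dist.items())

    for t in (1, 2, 3):
        tail = sum(p for x, p in dist.items() if x > t)   # P(X > t)
        print(t, tail, mean / t, tail <= mean / t)        # bound always holds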

Chebyshev's Inequality
If X is a random variable with E(X) = μ and standard deviation σ, and if t is a real number > 0, then
P(|X - μ| > tσ) ≤ 1 / t²
Thus P(|X - μ| > 2σ) ≤ 1/4, P(|X - μ| > 3σ) ≤ 1/9, and P(|X - μ| > 10σ) ≤ 0.01.
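A similar check for Chebyshev's inequality on the same illustrative distribution (μ = 1, σ² = 1/2):

    import math
    from fractions import Fraction

    dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
    mu = sum(x * p for x, p in dist.items())
    sigma = math.sqrt(sum((x - mu)**2 * p for x, p in dist.items()))

    for t in (1, 2, 3):
        tail = sum(p for x, p in dist.items() if abs(x - mu) > t * sigma)  # P(|X - mu| > t*sigma)
        print(t, float(tail), 1 / t**2, float(tail) <= 1 / t**2)           # bound always holds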
