Random Variable
A random variable is a function whose domain is the sample space and whose range is a subset of the set of real numbers; it associates a unique numerical value with every outcome of an experiment. The value of the random variable will vary from trial to trial as the experiment is repeated.*
Experiment : Toss a coin twice.
Sample Space : {HH, HT, TH, TT}
X = number of heads
X(HH) = 2, X(HT) = 1, X(TH) = 1, X(TT) = 0
X : S → {0, 1, 2} is a function. Hence X is a random variable.
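The mapping from outcomes to values can be checked by enumerating the sample space; a small sketch in Python (the helper name count_heads is ours, not from the notes):

```python
from itertools import product

# Sample space for two tosses of a coin: all H/T sequences of length 2.
sample_space = ["".join(outcome) for outcome in product("HT", repeat=2)]

# X assigns a real number to each outcome: the count of heads.
def count_heads(outcome):
    return outcome.count("H")

values = {outcome: count_heads(outcome) for outcome in sample_space}
print(values)                        # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
print(sorted(set(values.values())))  # range of X: [0, 1, 2]
```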
(*http://www.stats.gla.ac.uk/steps/glossary/probability_distributions.html#randvar)
Experiment : Draw three balls at random from an urn containing two red and two green balls.
Sample Space = {RRG, RGG}
X = number of red balls
X(RRG) = 2, X(RGG) = 1
X : S → {1, 2} is a function. Hence X is a random variable.
Experiment : Toss two dice.
Sample Space = {(1,1), (1,2), ..., (1,6), ..., (6,1), ..., (6,6)}
X = sum of the scores obtained in the two tosses
X((1,1)) = 2, X((1,2)) = 3, ..., X((6,6)) = 12
X : S → {2, ..., 12} is a function. Hence X is a random variable.
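The two-dice example can also be enumerated directly; a brief illustrative sketch (variable names ours):

```python
from itertools import product

# Sample space for two dice: all ordered pairs (i, j) with i, j in 1..6.
space = list(product(range(1, 7), repeat=2))

# X = sum of the two scores, as a mapping outcome -> real number.
X = {outcome: sum(outcome) for outcome in space}

print(X[(1, 1)], X[(6, 6)])          # 2 12
print(sorted(set(X.values())))       # [2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
```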
Example : Let X be the number of tails in two tosses of a fair coin. Then X takes the values 0, 1 and 2, with
P(X=0) = P({HH}) = 1/4
P(X=1) = P({HT, TH}) = 1/2
P(X=2) = P({TT}) = 1/4
Therefore the probability distribution of the random variable X is
X      :  0    1    2
P(X=x) : 1/4  1/2  1/4
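This distribution can be recovered by counting equally likely outcomes; a minimal sketch, assuming a fair coin:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=2))   # 4 equally likely outcomes
dist = {}
for outcome in outcomes:
    x = outcome.count("T")                 # X = number of tails
    dist[x] = dist.get(x, Fraction(0)) + Fraction(1, len(outcomes))

print(dist)   # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```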
Example : Toss a die with one side marked 1, two sides marked 2 and three sides marked 3, twice. Let X be the sum of the scores in the two tosses. Note that X can take the values 2, 3, 4, 5 and 6. The probability distribution of X is
X      :   2     3     4    5    6
P(X=x) : 1/36  1/9  5/18  1/3  1/4
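The table can be verified by convolving the single-toss distribution with itself; a sketch assuming the two tosses are independent:

```python
from fractions import Fraction
from collections import defaultdict

# Single-toss distribution: one face marked 1, two marked 2, three marked 3.
face = {1: Fraction(1, 6), 2: Fraction(2, 6), 3: Fraction(3, 6)}

dist = defaultdict(Fraction)
for a, pa in face.items():
    for b, pb in face.items():
        dist[a + b] += pa * pb   # tosses independent, so probabilities multiply

for x in sorted(dist):
    print(x, dist[x])   # 2 1/36, 3 1/9, 4 5/18, 5 1/3, 6 1/4
```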
Example : Toss a fair coin three times and let X = (number of heads) − (number of tails). The sample space is {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}, so X takes the values 3, 1, −1 and −3; for instance X(TTH) = −1 and X(TTT) = −3. The probability distribution of X is
X      :   3    1   -1   -3
P(X=x) :  1/8  3/8  3/8  1/8
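The same distribution falls out of enumerating all eight outcomes; a short sketch for a fair coin:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

outcomes = list(product("HT", repeat=3))   # 8 equally likely outcomes
# X = number of heads minus number of tails.
counts = Counter(o.count("H") - o.count("T") for o in outcomes)
dist = {x: Fraction(n, len(outcomes)) for x, n in counts.items()}

for x in sorted(dist, reverse=True):
    print(x, dist[x])   # 3 1/8, 1 3/8, -1 3/8, -3 1/8
```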
EXPECTATION
The expectation (or mean) of a random variable X with distribution P(X = xi) = pi is E(X) = Σi xi pi.
Example : Let X be the number of heads in two tosses of a fair coin. The probability distribution of the random variable X is
X      :  0    1    2
P(X=x) : 1/4  1/2  1/4
Thus E(X) = 0 · 1/4 + 1 · 1/2 + 2 · 1/4 = 1.
Example : In the case when the random variable X has distribution
X      :   2     3     4    5    6
P(X=x) : 1/36  1/9  5/18  1/3  1/4
E(X) = 2 · 1/36 + 3 · 1/9 + 4 · 5/18 + 5 · 1/3 + 6 · 1/4 = 14/3.
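Both expectations above can be computed from the tables with exact arithmetic; a sketch (function name ours):

```python
from fractions import Fraction as F

def expectation(dist):
    # E(X) = sum over values x of x * P(X = x)
    return sum(x * p for x, p in dist.items())

coin = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}
die_sum = {2: F(1, 36), 3: F(1, 9), 4: F(5, 18), 5: F(1, 3), 6: F(1, 4)}

print(expectation(coin))     # 1
print(expectation(die_sum))  # 14/3
```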
Properties of Expectation
Let X be a random variable and a, b real numbers. Then
E[aX + b] = a E[X] + b.
If P(X ≥ 0) = 1, then E(X) ≥ 0.
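The linearity property can be checked on a concrete distribution: aX + b has the same probabilities attached to the shifted values a·x + b. A sketch with illustrative choices a = 3, b = 5:

```python
from fractions import Fraction as F

dist = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}   # number of heads in two tosses

def E(dist):
    return sum(x * p for x, p in dist.items())

a, b = 3, 5
# Distribution of aX + b: each value x maps to a*x + b, same probabilities.
lhs = E({a * x + b: p for x, p in dist.items()})
rhs = a * E(dist) + b
print(lhs, rhs)   # 8 8  (since E(X) = 1, E(3X + 5) = 3*1 + 5)
```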
Suppose g(X) is a real-valued function of X. Then we define the expectation of a function of a random variable as
E[g(X)] = Σi g(xi) P(X = xi) = Σi g(xi) pi.
Let g(X) = X². Thus E(X²) = Σi xi² P(X = xi) = Σi xi² pi.
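The E[g(X)] formula is a one-line fold over the distribution; a sketch computing E(X²) for the number-of-heads example (helper name ours):

```python
from fractions import Fraction as F

dist = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}   # number of heads in two tosses

def E_g(dist, g):
    # E[g(X)] = sum of g(x_i) * p_i over the support
    return sum(g(x) * p for x, p in dist.items())

second_moment = E_g(dist, lambda x: x * x)
print(second_moment)   # 3/2  (0*1/4 + 1*1/2 + 4*1/4)
```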
VARIANCE
If a random variable X has the following probability distribution (t > 0)
X      :  -t    t
P(X=x) :  1/2  1/2
then E(X) = 0 for any t, but as t increases the spread of the distribution increases. Variance is a measure of the spread of a probability distribution. The variance of a random variable X is defined as
Var(X) = E[(X − E(X))²] = E(X²) − (E(X))²  (Proof - exercise)
For the above random variable Var(X) = t². So as t increases, Var(X) increases.
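Both the identity Var(X) = E(X²) − (E(X))² and the Var(X) = t² claim for the two-point distribution can be checked numerically; a sketch (function names ours):

```python
from fractions import Fraction as F

def E(dist):
    return sum(x * p for x, p in dist.items())

def var(dist):
    # Var(X) = E(X^2) - (E(X))^2
    mean = E(dist)
    return sum(x * x * p for x, p in dist.items()) - mean * mean

for t in (1, 2, 10):
    two_point = {-t: F(1, 2), t: F(1, 2)}
    assert E(two_point) == 0        # mean is 0 for every t
    assert var(two_point) == t * t  # spread grows with t
```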
Markov's Inequality
If X is a non-negative random variable and t is a real number > 0, then
P(X > t) ≤ E(X) / t.
Thus P(X > 2E(X)) ≤ 1/2, P(X > 4E(X)) ≤ 1/4, P(X > 100 E(X)) ≤ .01.
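The bound can be verified exactly on a small non-negative distribution; a sketch using the number-of-heads example (tail-probability helper is ours):

```python
from fractions import Fraction as F

dist = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}   # non-negative random variable
mean = sum(x * p for x, p in dist.items())     # E(X) = 1

def tail(dist, t):
    # P(X > t): total probability of values strictly above t
    return sum(p for x, p in dist.items() if x > t)

# Markov: P(X > t) <= E(X) / t for every t > 0.
for t in (F(1, 2), 1, F(3, 2), 3):
    assert tail(dist, t) <= mean / t
```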
Chebyshev's Inequality
If X is a random variable with E(X) = μ and standard deviation σ, and if t is a real number > 0, then
P(|X − μ| > tσ) ≤ 1 / t².
Thus P(|X − μ| > 2σ) ≤ 1/4, P(|X − μ| > 3σ) ≤ 1/9, P(|X − μ| > 10σ) ≤ .01.
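Chebyshev's bound can be checked exactly on the weighted-die distribution from earlier; comparing squared deviations avoids square roots, so everything stays in exact fractions (helper name ours):

```python
from fractions import Fraction as F

dist = {2: F(1, 36), 3: F(1, 9), 4: F(5, 18), 5: F(1, 3), 6: F(1, 4)}
mu = sum(x * p for x, p in dist.items())                 # E(X) = 14/3
sigma2 = sum((x - mu) ** 2 * p for x, p in dist.items()) # Var(X)

def tail(t):
    # P(|X - mu| > t*sigma), compared via squares to stay exact
    return sum(p for x, p in dist.items() if (x - mu) ** 2 > t * t * sigma2)

# Chebyshev: P(|X - mu| > t*sigma) <= 1/t^2.
for t in (1, 2, 3, 10):
    assert tail(t) <= F(1, t * t)
```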