
Lecture Notes in Undergraduate Probability


Lecture 1

Unnikrishna Pillai

Electrical and Computer Engineering Department,


Tandon School of Engineering, New York University


Lecture 1

1 Sets and set operations

2 Probability

3 Independence

4 Conditional probability and inference

5 Partition

6 Advanced Topics


Sets and set operations


Whole space and outcomes

Figure: (a) The whole space S and its elementary outcomes f1, f2, . . . , fn; the
singletons {fi} are the elementary outcomes. (b) An event A (in general more
complicated than an elementary outcome) shown as a subset of S. For the coin
tossing experiment, the whole space consists of the two elementary outcomes H and T.


Set operations

Figure: Elementary set operations — the union A ∪ B, the intersection AB, and the
complements A^c, B^c.

De Morgan's Laws:

(A ∪ B)^c = A^c B^c,        (AB)^c = A^c ∪ B^c.


Set operations (contd...)

More generally,

(A ∪ B ∪ C)^c = (A ∪ D)^c,  where D = B ∪ C
              = A^c D^c = A^c (B ∪ C)^c
              = A^c (B^c C^c) = A^c B^c C^c.
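As a quick numerical sanity check (an added sketch, not part of the original notes), De Morgan's laws can be verified with Python's built-in set type for any concrete choice of S, A, B, C:

```python
# Universe and three arbitrary subsets chosen only for illustration.
S = set(range(10))
A, B, C = {0, 1, 2, 3}, {2, 3, 4, 5}, {5, 6, 7}

def comp(X):
    """Complement of X with respect to the whole space S."""
    return S - X

# Two-set De Morgan laws
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

# The three-set version derived above
assert comp(A | B | C) == comp(A) & comp(B) & comp(C)
print("De Morgan's laws hold for this example")
```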


Partition of whole space

Figure: A ∩ B = ∅, i.e., A and B are mutually exclusive sets.
Figure: Partition of the whole set, where A1, A2, . . . , An are disjoint sets
whose union gives the whole set:

Ai ∩ Aj = ∅  for i ≠ j,
A1 ∪ A2 ∪ . . . ∪ An = S.


Probability


Introduction

It is easy to assign probabilities to simple (elementary) random events.
For example, a coin tossing experiment has two possible outcomes,
Head (H) or Tail (T). For a fair coin, the probability of getting a head or
a tail is 0.5, as seen from actual observations over a large number of
trials or by simple deduction. For more complicated random events, we
need rules to figure out their probabilities.


Axioms of Probability

Let P (A) denote the probability of any event A. Then


1. Probability is a non-negative number:

P(A) ≥ 0. (1)

2. Probability of the whole set is unity:

P(S) = 1. (2)

3. If A and B are mutually exclusive sets (i.e., A ∩ B = ∅), then

P(A ∪ B) = P(A) + P(B). (3)

The axioms are self-evident; we take them as given.


Axioms of Probability

If A and B are mutually exclusive, then axiom (3) says that the
probability of their union is the sum of their probabilities.

What about P(A ∪ B) if A and B are not mutually exclusive? And what
about P(A^c)?

We can use the above axioms to answer all these questions, and
similar ones.


Axioms of Probability

If A, B and C are pairwise mutually exclusive, then what is P(A ∪ B ∪ C)?

Let B ∪ C = D; then A ∩ D = ∅.

P(A ∪ B ∪ C) = P(A ∪ D)
             = P(A) + P(D)
             = P(A) + P(B ∪ C)
             = P(A) + P(B) + P(C),

where A and D are mutually exclusive, and so are B and C, so axiom (3)
can be applied at each step.

Hence, if A, B, C, D, . . . are pairwise mutually exclusive, then in general

P(A ∪ B ∪ C ∪ D ∪ . . .) = P(A) + P(B) + P(C) + P(D) + . . . .


Axioms of Probability

Show that P(A^c) = 1 − P(A).

Proof:
A ∪ A^c = S, and A ∩ A^c = ∅.
Hence, by axiom (3),

P(A ∪ A^c) = P(A) + P(A^c).

But
P(A ∪ A^c) = P(S) = 1.

Hence,

P(A) + P(A^c) = 1,
or P(A^c) = 1 − P(A). (4)


Axioms of Probability

In general,

P(A ∪ B) = P(A) + P(B) − P(AB). (5)

A and B are in general not mutually exclusive. But their union can be
written as the union of two mutually exclusive sets as follows:

A ∪ B = A ∪ A^c B,

where A and B are not mutually exclusive but A and A^c B are, as shown
in the figure above.


Axioms of Probability

Now using axiom (3) (since A and A^c B are mutually exclusive),

P(A ∪ B) = P(A) + P(A^c B). (6)

Also,
B = (A ∪ A^c) ∩ B = AB ∪ A^c B, (7)

where we have expressed B as a union of two mutually exclusive sets.

Hence
P(B) = P(AB) + P(A^c B) (8)

by axiom (3).
Or,
P(A^c B) = P(B) − P(AB). (9)

Using this expression for P(A^c B) in (6) we get

P(A ∪ B) = P(A) + P(A^c B) = P(A) + P(B) − P(AB). (10)



Axioms of Probability

In general,

P(A ∪ B ∪ C) = P(A ∪ D)
             = P(A) + P(D) − P(AD),  where D = B ∪ C
             = P(A) + P(B ∪ C) − P(A(B ∪ C))
             = P(A) + {P(B) + P(C) − P(BC)} − P(AB ∪ AC).

Let E = AB and F = AC; then

P(AB ∪ AC) = P(E ∪ F) = P(E) + P(F) − P(EF)
           = P(AB) + P(AC) − P(ABC).

Substituting this back into the expansion above, we get

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(AB)
               − P(BC) − P(AC) + P(ABC). (11)
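Identity (11) is easy to check numerically. The sketch below (added for illustration; the three events on a fair die roll are arbitrary choices) estimates both sides of (11) by Monte Carlo:

```python
import random

# Hypothetical events on one roll of a fair die (chosen only for this example):
# A = "roll is even", B = "roll >= 4", C = "roll is a multiple of 3".
def roll_events():
    x = random.randint(1, 6)
    return (x % 2 == 0, x >= 4, x % 3 == 0)

N = 200_000
counts = {"A": 0, "B": 0, "C": 0, "AB": 0, "BC": 0, "AC": 0, "ABC": 0, "union": 0}
for _ in range(N):
    a, b, c = roll_events()
    counts["A"] += a; counts["B"] += b; counts["C"] += c
    counts["AB"] += a and b; counts["BC"] += b and c; counts["AC"] += a and c
    counts["ABC"] += a and b and c
    counts["union"] += a or b or c

p = {k: v / N for k, v in counts.items()}
rhs = p["A"] + p["B"] + p["C"] - p["AB"] - p["BC"] - p["AC"] + p["ABC"]
print(p["union"], rhs)   # both estimates should agree (exact value is 5/6 here)
```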


Independence


Independence

Events A and B are said to be independent if

P(AB) = P(A) P(B). (12)

Independence is a probabilistic notion, not a set-theoretic notion like
mutual exclusivity.
If A and B are independent, then A and B^c, A^c and B, and A^c and B^c
are also independent.


Independence

If A and B are independent, show that

P(A^c B^c) = P(A^c)P(B^c). (13)

Proof: By De Morgan's law, A^c B^c = (A ∪ B)^c.
Hence, using Eqn. (4),

P(A^c B^c) = P((A ∪ B)^c) = 1 − P(A ∪ B)
           = 1 − (P(A) + P(B) − P(AB))      using Eqn. (5)
           = 1 − P(A) − P(B) + P(AB)
           = 1 − P(A) − P(B) + P(A)P(B)     since A, B are independent
           = P(A^c) − P(B){1 − P(A)}
           = P(A^c) − P(B)P(A^c)
           = P(A^c)(1 − P(B)) = P(A^c)P(B^c).

⇒ If A and B are independent, then so are A^c and B^c.



Independence

If A and B are independent and mutually exclusive, then

P(AB) = P(A)P(B) = 0, which implies either P(A) = 0, or P(B) = 0, or
both P(A) = P(B) = 0. Thus at least one of them must have zero
probability of occurrence.
A, B, and C are independent if

P (AB ) = P (A)P (B ) (14a)

P (BC ) = P (B )P (C ) (14b)

P (AC ) = P (A)P (C ) (14c)

P (ABC ) = P (A)P (B )P (C ) (14d)


Independence

Note that pairwise independence, Eqns. (14a)–(14c), is not enough
for three events to be independent; it is essential that Eqn. (14d) be
satisfied as well. Similarly, there are additional conditions to be
satisfied when the number of events is larger.
If A and B are independent, then we can write

P(A ∪ B) = P(A) + P(B) − P(AB) = P(A) + P(B) − P(A)P(B). (15)


Independence

If A1, A2, . . . , An are independent, then what is P(A1 ∪ A2 ∪ . . . ∪ An)?

Let us denote the event B = A1 ∪ A2 ∪ . . . ∪ An.
Then B^c = (A1 ∪ A2 ∪ . . . ∪ An)^c = A1^c A2^c . . . An^c.
Thus P(B^c) = P(A1^c A2^c . . . An^c), where A1^c, A2^c, . . . , An^c are
independent events by Eqn. (13).
Hence P(B^c) = P(A1^c)P(A2^c) . . . P(An^c), so that

1 − P(B) = (1 − P(A1)) (1 − P(A2)) . . . (1 − P(An))

         = ∏_{i=1}^{n} [1 − P(Ai)],

P(B) = 1 − ∏_{i=1}^{n} [1 − P(Ai)],

P(A1 ∪ A2 ∪ . . . ∪ An) = 1 − ∏_{i=1}^{n} [1 − P(Ai)]. (16)
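Eqn. (16) translates directly into a one-line computation. The following sketch (an added illustration; the probabilities used are made up) evaluates it and cross-checks it by simulation:

```python
import math
import random

def prob_at_least_one(probs):
    """P(A1 ∪ ... ∪ An) for independent events with P(Ai) = probs[i], via Eqn. (16)."""
    return 1.0 - math.prod(1.0 - p for p in probs)

# Illustrative (made-up) probabilities for three independent events.
probs = [0.1, 0.2, 0.05]
print(prob_at_least_one(probs))          # 1 - 0.9*0.8*0.95 = 0.316

# Quick Monte Carlo cross-check of the same quantity.
N = 200_000
hits = sum(any(random.random() < p for p in probs) for _ in range(N))
print(hits / N)                          # should be close to 0.316
```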


Conditional Probability

Figure: (a) An event A and the conditioning event M, with their intersection AM;
(b) two mutually exclusive events A and B and their intersections AM and BM with
the conditioning event M.


Conditional Probability

Let P (A|M ) denote the probability of an event A given that some other
event M has already occurred.
Then P (A|M ) is defined as,

P(A|M) = P(AM) / P(M) ≥ 0, (17)

provided P(M) > 0.
Does definition (17) satisfy the probability axioms (1)–(3)?


Conditional Probability

It is easy to see that the above definition obeys all three axioms,
Eqns. (1)–(3). To start with,

i. P(A|M) = P(AM)/P(M) ≥ 0.

ii. P(S|M) = P(S ∩ M)/P(M) = P(M)/P(M) = 1.

iii. If A ∩ B = ∅, then

P(A ∪ B | M) = P[(A ∪ B) ∩ M] / P(M) = P(AM ∪ BM) / P(M),

where AM and BM are mutually exclusive (from the figure).

Hence,

P(A ∪ B | M) = [P(AM) + P(BM)] / P(M)
             = P(AM)/P(M) + P(BM)/P(M) = P(A|M) + P(B|M),

satisfying axiom (3) as well.
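As a small worked illustration of definition (17) (added here, not part of the original slides), conditional probabilities for an equally likely whole space can be computed by simple counting:

```python
from fractions import Fraction

# Whole space for one roll of a fair die; each elementary outcome has probability 1/6.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # "roll is even"
M = {4, 5, 6}        # conditioning event: "roll is greater than 3"

def prob(event):
    return Fraction(len(event), len(S))   # equally likely outcomes

# Eqn. (17): P(A|M) = P(AM) / P(M), provided P(M) > 0
p_A_given_M = prob(A & M) / prob(M)
print(p_A_given_M)    # 2/3
```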



Note that if event A is independent of event B, then

P(A|B) = P(AB)/P(B) = P(A)P(B)/P(B) = P(A),

i.e., the occurrence of B has no influence on event A.

We can use conditional probability to update our assessment of events
of interest based on events that have already been observed. Bayes'
theorem provides the proper framework for this.


Bayes' Theorem (1812)

From (17),

P(A|B) = P(AB)/P(B)  ⇒  P(AB) = P(A|B)P(B). (18)

Also,

P(B|A) = P(BA)/P(A)  ⇒  P(BA) = P(B|A)P(A). (19)

But AB ≡ BA, so P(AB) = P(BA).
Hence from Eqns. (18) and (19),

P(A|B)P(B) = P(B|A)P(A).

Or,

P(A|B) = P(B|A)P(A)/P(B),   or equivalently   P(B|A) = P(A|B)P(B)/P(A). (20)


Partition: Whole space as the union of disjoint events

Figure: A partition C1, C2, . . . , Cn of the whole space S into disjoint sets.


Partition: Whole space as the union of disjoint events

If C1, C2, . . . , Cn form a partition, then we must have Ci ∩ Cj = ∅ for
i ≠ j, and C1 ∪ C2 ∪ . . . ∪ Cn = S.
Hence B = B ∩ S = B ∩ (C1 ∪ C2 ∪ . . . ∪ Cn) = BC1 ∪ BC2 ∪ . . . ∪ BCn,
where the BCi's are mutually exclusive because the Ci's are mutually
exclusive.
Then,

P(B) = P(BC1 ∪ BC2 ∪ . . . ∪ BCn)
     = P(BC1) + P(BC2) + . . . + P(BCn)
     = P(B|C1)P(C1) + P(B|C2)P(C2) + . . . + P(B|Cn)P(Cn). (21)


Partition: Whole space as the union of disjoint events

Substituting (21) into (20), we get the generalized Bayes rule

P(A|B) = P(B|A)P(A) / ∑_{i=1}^{n} P(B|Ci)P(Ci), (22)

where C1, C2, . . . , Cn form a partition.
Replacing A with Ck,

P(Ck|B) = P(B|Ck)P(Ck) / ∑_{i=1}^{n} P(B|Ci)P(Ci),   k = 1, 2, . . . , n. (23)
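Eqns. (21) and (23) are straightforward to implement. The sketch below (an added illustration; the function names are ad hoc) computes the total probability and the posterior over a partition, using the numbers of Example 2 from the following slides:

```python
def total_probability(prior, likelihood):
    """Eqn. (21): P(B) = sum_i P(B|Ci) P(Ci) for a partition C1..Cn.

    prior[i]      = P(Ci)
    likelihood[i] = P(B|Ci)
    """
    return sum(p_b_given_c * p_c for p_b_given_c, p_c in zip(likelihood, prior))

def bayes_posterior(prior, likelihood):
    """Eqn. (23): P(Ck|B) = P(B|Ck) P(Ck) / sum_i P(B|Ci) P(Ci)."""
    p_b = total_probability(prior, likelihood)
    return [l * p / p_b for l, p in zip(likelihood, prior)]

# Example 2 below: two equally likely boxes,
# P(Red|Box1) = 100/200, P(Red|Box2) = 150/200.
prior = [0.5, 0.5]
likelihood = [0.5, 0.75]
print(total_probability(prior, likelihood))   # 0.625 = 5/8
print(bayes_posterior(prior, likelihood))     # [0.4, 0.6]
```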


Example 1

A box contains m black balls and n white balls. Two balls are
drawn without replacement. What is the probability that the first
ball is white and the second one is black?

Solution:
Let W1: "First ball is white" and B2: "Second ball is black".
Then we need P(B2 W1) = P(B2|W1)P(W1).
Now,

P(W1) = n / (n + m)

and

P(B2|W1) = m / (n + m − 1)

[since there will be m black balls and n − 1 white balls left after a white
ball is drawn].
Hence,

P(B2 W1) = [n / (n + m)] · [m / (n + m − 1)] = nm / [(n + m)(n + m − 1)].

Example 2

Box 1 contains 100 Red balls and 100 Blue balls; Box 2 contains
150 Red balls and 50 Blue balls. A ball is drawn at random from
one of the boxes without knowing which box it came from.
(a) What is the probability that the ball drawn is Red?
(b) Given that the ball drawn is Red, what is the probability that
it came from Box 2?


Example 2

Solution: The two boxes contain the same number of balls (200 each).
Hence it is reasonable to assume that the probability of picking either
box is 0.5.
Let Bi: "The ball is picked from Box i" (i = 1, 2).
Then B1 and B2 form a partition, since the ball has to come from one
of the two boxes. Similarly, let
R: "A Red ball is drawn"
B: "A Blue ball is drawn"
These two events also form a (different) partition.

Figure: (a) Box 1: 100 Red and 100 Blue balls; (b) Box 2: 150 Red and 50 Blue balls.

Example 2

(a) We can use the partition theorem in Eqn. (21) to compute P(R):

P(R) = P(R|B1)P(B1) + P(R|B2)P(B2) = (100/200)·(1/2) + (150/200)·(1/2) = 5/8.

Thus the probability of picking a Red ball is somewhat higher than 1/2.
This makes sense, as Box 2 has a lot more Red balls than Blue balls.


Example 2

(b) Knowing that the ball drawn is Red, what can we say about whether
it came from Box 1 or Box 2? Clearly Box 2 has more Red balls, so
chances are higher that it came from Box 2. How high is that
probability? Bayes' theorem answers this quantitatively.
From (20),

P(B2|R) = P(R|B2)P(B2) / P(R) = [(3/4)·(1/2)] / (5/8) = 0.6.

Hence, given that the ball is Red, there is a 60% chance that it came
from Box 2.
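Both parts of Example 2 can also be checked by direct simulation (an added sketch; the experiment is exactly the one described above):

```python
import random

# Monte Carlo check of parts (a) and (b): pick a box at random, then a ball at random.
boxes = {1: ["R"] * 100 + ["B"] * 100,   # Box 1
         2: ["R"] * 150 + ["B"] * 50}    # Box 2

N = 200_000
red_draws = 0
red_from_box2 = 0
for _ in range(N):
    box = random.choice([1, 2])          # each box picked with probability 0.5
    ball = random.choice(boxes[box])
    if ball == "R":
        red_draws += 1
        red_from_box2 += (box == 2)

print(red_draws / N)                # ≈ 5/8 = 0.625, part (a)
print(red_from_box2 / red_draws)    # ≈ 0.6, part (b): P(Box 2 | Red)
```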


Example 3

(a) In the circuit shown in the figure (switches S1 and S2 in series, with
S3 in parallel across the series pair), switches S1, S2, S3 close with
probability p and open with probability (1 − p). They operate
independently. Find the probability of receiving the input signal at
the output.
(b) If an input is received at the output, what is the probability that
switch S3 is closed?


Example 3

Solution: Let Ai: "Switch Si is closed", i = 1, 2, 3.
Then Ai^c: "Switch Si is open", with

P(Ai) = p,   P(Ai^c) = 1 − p,   i = 1, 2, 3.

Also let R: "Input signal is received at the output".

(a) From the figure, for the input to be received at the output, we need
either S1 and S2 both closed, or S3 closed. That implies

R = A1 A2 ∪ A3.

Then,

P(R) = P(A1 A2 ∪ A3)
     = P(A1 A2) + P(A3) − P(A1 A2 A3)
     = P(A1)P(A2) + P(A3) − P(A1)P(A2)P(A3)
     = p + p^2 − p^3,

where we have used the independence of the switches.

Example 3

(b) P(R|A3) = 1, because if S3 is closed, the input will surely be received
at the output.
Hence,

P(A3|R) = P(R|A3)P(A3) / P(R)
        = (1 · p) / (p + p^2 − p^3)
        = 1 / (1 + p − p^2).
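The two closed-form answers can be verified by enumerating the eight open/closed patterns of the three switches (an added sketch, not part of the original notes):

```python
from itertools import product

def analysis(p):
    """Exact P(R) and P(A3|R) by summing over the 8 possible switch states."""
    p_R = 0.0
    p_R_and_A3 = 0.0
    for s1, s2, s3 in product([0, 1], repeat=3):
        # probability of this particular open/closed pattern (independent switches)
        weight = 1.0
        for s in (s1, s2, s3):
            weight *= p if s else (1 - p)
        received = (s1 and s2) or s3        # series pair S1, S2 in parallel with S3
        if received:
            p_R += weight
            if s3:
                p_R_and_A3 += weight
    return p_R, p_R_and_A3 / p_R

p = 0.3
print(analysis(p))                              # enumeration
print(p + p**2 - p**3, 1 / (1 + p - p**2))      # closed-form answers from above
```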


Borel-Cantelli Lemma and Murphy's law

Consider an infinite sequence of events A1, A2, A3, . . .

Let P(Ak) = pk ≥ 0, k = 1, 2, . . .
Consider the infinite sum

∑_{k=1}^{∞} pk = M,   where either M < ∞ (Case 1) or M = ∞ (Case 2).

The sum can only be either finite or infinite. Each case is interesting by
itself and leads to interesting conclusions.


Borel-Cantelli Lemma and Murphy's law

Case 1: M is finite. If ∑_{k=1}^{∞} pk = M < ∞, then pk → 0 as k → ∞,
and the tail sums become arbitrarily small. In this case only finitely
many of the Ak occur in the long run (with probability one); beyond
some point, the remaining Ak effectively stop occurring.
Case 2: M is infinite, i.e., ∑_{k=1}^{∞} pk = ∞. In this case, provided the
Ak are independent, infinitely many of the Ak occur in the long run
(with probability one), even though each individual P(Ak) may be small.
P(Ak) ≠ 0 ⇒ Ak can occur with nonzero probability; since infinitely
many such Ak remain available, they keep occurring in the long run.
This is also Murphy's law: if an event A can occur (P(A) ≠ 0), then if
we wait long enough, it will occur.
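The dichotomy can be illustrated by a small simulation (an added sketch; the two choices of pk are arbitrary examples of a summable and a non-summable sequence):

```python
import random

def run(pk, K=20_000, seed=0):
    """Simulate independent events A_1..A_K with P(A_k) = pk(k); report how many
    occur and the index of the last occurrence within the horizon K."""
    rng = random.Random(seed)
    occurred = [k for k in range(1, K + 1) if rng.random() < pk(k)]
    return len(occurred), (occurred[-1] if occurred else None)

# Case 1: sum of 1/k^2 is finite -> typically only a few early events occur.
print(run(lambda k: 1 / k**2))
# Case 2: sum of 1/k diverges -> occurrences typically keep appearing far out.
print(run(lambda k: 1 / k))
```

In a typical run the summable case produces only one or two early occurrences, while the non-summable case keeps producing occurrences arbitrarily far out in the sequence, illustrating the two cases above.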


Gambler's Ruin Problem

A and B play against each other by flipping a coin repeatedly. The
winner gets one dollar from the loser after every toss. If A and B start
with $a and $b respectively, what is the probability of A's ruin? That is,
what is the probability that A eventually loses all of his/her money?

Solution: A is ultimately ruined if A loses all of his/her capital, and
similarly B is ruined if B loses all of his/her capital. Suppose A has $n
at some stage of the game, and let Xn denote the event "A is ultimately
ruined given that his/her wealth is $n".


Gambler's Ruin Problem

Let Pn = P(Xn) denote the probability of ruin for A given that his/her
wealth is $n.
A wins the next toss with probability P(H) = p, incrementing his/her
wealth to $(n + 1), or loses the next toss with probability q = 1 − p,
decreasing his/her wealth to $(n − 1). When A's wealth is $(n + 1), the
associated probability of ruin is Pn+1 = P(Xn+1). Similarly, when A's
wealth goes down to $(n − 1), the probability of ruin is

Pn−1 = P(Xn−1).

We can assume p < 0.5, as generally any such game is rigged against
the player.
We can link the above probabilities as follows:

Xn = Xn ∩ S = Xn ∩ (H ∪ H^c) = Xn H ∪ Xn H^c


Here H represents a Head appearing in the next toss. Naturally, either
a head (H) or a tail (H^c) must appear in the next toss.
Hence,

Pn = P(Xn)
   = P(Xn H) + P(Xn H^c)
   = P(Xn|H)P(H) + P(Xn|H^c)P(H^c)
   = p Pn+1 + q Pn−1 (24)

is the basic iterative identity for the gambler's ruin problem.

Gambler's Ruin Problem

This is because the event (Xn|H) = "A is ultimately ruined given that
his/her wealth is $n and that A wins the next toss" is the same as
(Xn|H) = "A is ultimately ruined given that his/her wealth is $(n + 1)"
= Xn+1.
Similarly,
Xn|H^c = Xn−1.

Now Eqn. (24) reads

(p + q)Pn = p Pn+1 + q Pn−1,   or   p(Pn+1 − Pn) = q(Pn − Pn−1),

so that

Pn+1 − Pn = (q/p)(Pn − Pn−1) = (q/p)^2 (Pn−1 − Pn−2) = . . . = (q/p)^n (P1 − P0),

Pn+1 − Pn = (q/p)^n (P1 − 1), (25)

since P0 = P(A's ruin given his/her wealth is $0) = 1. Similarly,
Pa+b = 0, since in that case B is completely ruined and A has all the money.

Gambler's Ruin Problem

Now,

∑_{k=n}^{a+b−1} (Pk+1 − Pk) = (Pa+b − Pa+b−1) + (Pa+b−1 − Pa+b−2)
                              + . . . + (Pn+2 − Pn+1) + (Pn+1 − Pn)
                            = Pa+b − Pn
                            = −Pn, (26)

since Pa+b = 0. But from (25),

∑_{k=n}^{a+b−1} (Pk+1 − Pk) = ∑_{k=n}^{a+b−1} (q/p)^k (P1 − 1). (28)


Gambler's Ruin Problem

Equating (26) and (28),

−Pn = (P1 − 1) ∑_{k=n}^{a+b−1} (q/p)^k,

so that

Pn = (1 − P1) ∑_{k=n}^{a+b−1} (q/p)^k
   = (1 − P1) (q/p)^n [1 + (q/p) + (q/p)^2 + . . . + (q/p)^{a+b−n−1}]
   = (1 − P1) (q/p)^n · [1 − (q/p)^{a+b−n}] / [1 − (q/p)].

Or,

Pn = (1 − P1) · [(q/p)^n − (q/p)^{a+b}] / [1 − (q/p)]. (29)


Gambler's Ruin Problem

Let n = 0; then

P0 = 1 = (1 − P1) · [1 − (q/p)^{a+b}] / [1 − (q/p)]. (30)

Divide Eqn. (29) by Eqn. (30) to get

Pn = [(q/p)^n − (q/p)^{a+b}] / [1 − (q/p)^{a+b}].

Set n = a to get the desired probability of ruin for A:

Pa = [(q/p)^a − (q/p)^{a+b}] / [1 − (q/p)^{a+b}]
   = [1 − (p/q)^b] / [1 − (p/q)^{a+b}]. (31)


Gambler's Ruin Problem

Or,

Pa = [(p/q)^b − 1] / [(p/q)^{a+b} − 1] = [1 − (p/q)^b] / [1 − (p/q)^{a+b}]. (32)

This gives the probability of ruin for A given that his/her current wealth
is $a. Usually it is safe to assume p < 0.5, as in general any casino
game is slightly unfavorable to the player.
To keep Pa close to zero, b must be small (close to zero). But one
could view $b simply as the intended "profit" for A (instead of B's
entire wealth). So for A to survive (not be ruined), the intended profit
$b should be kept low (less greed).
Pa can be interpreted as the risk involved in trying to earn $b by
investing $a in a game whose favorability factor is p.
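The closed-form result (32) is easy to cross-check by simulation. The sketch below (added for illustration; names are ad hoc) plays the game repeatedly and compares the empirical ruin frequency with the formula:

```python
import random

def ruin_probability(a, b, p):
    """Closed form (32): probability that A, starting with $a against $b, is ruined."""
    r = p / (1 - p)                       # p/q; assumes p != 0.5
    return (1 - r**b) / (1 - r**(a + b))

def simulate_ruin(a, b, p, games=100_000, seed=1):
    """Monte Carlo estimate: play until A reaches $0 (ruin) or $(a+b) (B ruined)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(games):
        wealth = a
        while 0 < wealth < a + b:
            wealth += 1 if rng.random() < p else -1
        ruined += (wealth == 0)
    return ruined / games

a, b, p = 5, 1, 0.48
print(ruin_probability(a, b, p), simulate_ruin(a, b, p))
```

For (a, b) = (5, 1) and p = 0.48 both numbers should come out near 0.2, consistent with the roughly 20% ruin probability quoted for the (5,1) strategy on the next slides.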


Gambler's Ruin Problem

Fig. (a) shows the probability of ruin Pa versus (a, b), where $a
represents A's current wealth and $b represents the intended profit.
This is plotted for various values of p, the favorability factor of the
underlying game for A.

Figure: (a) Probability of ruin for player A (Pa) and (b) average duration of the
game (Na), plotted against the pairs (a, b) = (3,1), (4,1), (5,1), (5,2), (6,1), (6,2),
(8,2), (10,1), (20,2), (20,4), for p = 0.4, 0.45, 0.48, 0.52, 0.55.
(a) Probability of ruin of A, i.e., B wins; (b) average time taken for A to win.

Gambler's Ruin Problem

Notice that at a 20% probability of ruin (Pa = 0.2) for a slightly
unfavorable game (p = 0.48), the (5,1) and (6,1) strategies are feasible
solutions, leading to 20% and 16.7% returns respectively.
The time taken to realize this gain is also plotted in Fig. (b). We can
see that the two strategies take about 5 units of time (hours, days,
weeks, or months, whichever is appropriate).
One can use the above graphs to design appropriate investment
scenarios depending on the investor's appetite for risk.


Gambler's Ruin Problem

From the graphs, at low risk (Pa = 0.15) a (10,1) game is appropriate
if the favorability factor is p = 0.48. Thus it is possible to make gains
in moderately unfavorable situations (p = 0.48), provided the greed
factor $b is kept low compared to the investment.
Hence a 10% return carries about a 15% risk, even if each transaction
is slightly unfavorable (p = 0.48).


Gambler's Ruin Problem

Thus for any a, Pa → 1 if b is large and p/q < 1; i.e., it is unwise to
keep playing the same game against a skillful opponent who is also
wealthy, as your ruin in the long run is practically guaranteed.
Casinos run on this principle: all games are rigged in their favor
(p < 1/2 for the player) and the opponent (i.e., the casino) has a large
reserve (b → ∞).
