
STAT 342 Statistical Methods for Engineers

Probability, Random Variables, and Sampling Distributions

Probability Concepts
The science of uncertainty is called Probability Theory.

Probability Theory enables you to evaluate and control the likelihood that a statistical inference is correct.

Probability Theory provides the mathematical basis for inferential statistics.

Probability of an event:

P(event) = f / N

where f is the number of outcomes favorable to the event and N is the total number of equally likely outcomes.

For an experiment with equally likely outcomes, probabilities are identical to relative frequencies (or percentages).

Possible sums when two dice are rolled (rows: first die, columns: second die):

        1    2    3    4    5    6
  1     2    3    4    5    6    7
  2     3    4    5    6    7    8
  3     4    5    6    7    8    9
  4     5    6    7    8    9   10
  5     6    7    8    9   10   11
  6     7    8    9   10   11   12

Frequency and Probability Chart (sum of two dice)

Sum           1      2       3       4       5       6       7       8       9       10      11      12      13
Frequency     0      1       2       3       4       5       6       5       4       3       2       1       0
Probability   0      0.0278  0.0556  0.0833  0.1111  0.1389  0.1667  0.1389  0.1111  0.0833  0.0556  0.0278  0
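As a quick check, a short Python sketch (illustrative only, not part of the slides) reproduces the frequencies and probabilities above by enumerating all 36 equally likely outcomes:

```python
from collections import Counter

# Tally how often each sum occurs among the 36 equally likely outcomes.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

for s in range(2, 13):
    print(f"sum={s:2d}  frequency={counts[s]}  probability={counts[s] / 36:.4f}")
```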
Meaning of Probability: a probability is a generalization of the concept of a percentage.
Basic Properties of Probabilities:

Property 1: the probability of an event is always between 0 and 1, inclusive.

Property 2: the probability of an event that cannot occur is 0. (Impossible Event)

Property 3: the probability of an event that must occur is 1. (Certain Event)


Example: (the Dice)

Determine the probability that:

a) the sum of the dice is 1.

b) the sum of the dice is 12 or less.
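Worked answers (they follow directly from the basic properties above, since the sum of two dice is always between 2 and 12):

a) P(sum = 1) = 0, because a sum of 1 is an impossible event (Property 2).
b) P(sum ≤ 12) = 1, because a sum of 12 or less is a certain event (Property 3).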

Random Experiment: an experiment that can result in different outcomes, even though it is repeated in the same manner every time.
Introducing Events:
Playing Cards: a deck of playing cards contains 52 cards. When we perform the experiment of randomly selecting one card from the deck, we will get one of these 52 cards.
The collection of all 52 cards (the possible outcomes) is called the SAMPLE SPACE for this experiment.

Example:
List the outcomes constituting each of the following events:
a) The event that the card selected is the king of hearts.
b) The event that the card selected is a king.
c) The event that the card selected is a heart.
d) The event that the card selected is a face card.
Introducing Events:

P(event) = (number of outcomes in the event) / (total number of outcomes)
Solution:

a) P(king of hearts) = 1/52 ≈ 0.0192 = 1.92%
b) P(a king) = 4/52 ≈ 0.0769 = 7.69%
c) P(a heart) = 13/52 = 0.250 = 25.0%
d) P(a face card) = 12/52 ≈ 0.231 = 23.1%
Sample Space (S): The collection of all possible outcomes for a random experiment.
Event (E): A collection of outcomes for the experiment, that is, any subset of the sample space.
An event occurs if and only if the outcome of the experiment is a member of the event.

P(S) = 1        0 ≤ P(E) ≤ 1

A = The event that the card selected is a king.


B = The event that the card selected is a heart.
C = The event that the card selected is a face.
D = The event that the card selected is the king of hearts.

Discrete Event: an event that consists of a finite or countably infinite set of outcomes.

Continuous Event: an event that consists of an interval (either finite or infinite) of real numbers.
Notations and Graphical Display of Events
Venn Diagrams:

Relationship Among Events:

(Not E; ~E): the event "E does not occur". Not E is called the complement of event E, written E^c (or E').

P(E) = 1 - P(E^c)        (E^c)^c = E

(A ∩ B) (A & B): the event "both A and B occur".

(A ∪ B) (A or B): the event "either A or B or both occur".

[Venn diagrams illustrating E^c, (A & B), and (A or B)]
Example:
Determine the following probabilities (based on a Venn diagram containing 20 equally likely outcomes: 10 in A, 8 in B, and 4 in both A and B):
P(A) =?
P(~A) =?
P(B) =?
P(A & B) =?
P(A or B) =?
Solution:
P(A) =10/20 =0.5
P(~A) =10/20 =0.5
P(B) = 8/20 =0.4
P(A & B) = 4/20 =0.2
P(A or B) = 14/20 =0.7

P(A)=1-P(not A)
P(A or B)= P(A)+P(B)-P(A and B)
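A small Python sketch (illustrative only; the counts come from the Venn-diagram example above) checks both rules numerically:

```python
# 20 equally likely outcomes: 10 in A, 8 in B, 4 in both A and B.
total, n_A, n_B, n_A_and_B = 20, 10, 8, 4

P_A, P_B = n_A / total, n_B / total
P_A_and_B = n_A_and_B / total

# Complement rule: P(A) = 1 - P(not A)
P_not_A = (total - n_A) / total
print(P_A, 1 - P_not_A)              # 0.5 0.5

# General addition rule: P(A or B) = P(A) + P(B) - P(A and B)
print(P_A + P_B - P_A_and_B)         # 0.7
```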
Notations and Graphical Display of Events: Relationship Among Events
Descriptive statements:
At least x: greater than or equal to x
At most y: less than or equal to y
Between x and y inclusive: greater than or equal to x and less than or equal to y

Mutually Exclusive Events: two or more events are mutually exclusive if no two of them have outcomes in common.

If A and B are mutually exclusive: A ∩ B = ∅, so P(A ∩ B) = 0.

[Venn diagrams: two mutually exclusive events versus two non-mutually exclusive events with common outcomes]
Counting Techniques
Multiplication Rule for counting techniques:
Assume an operation can be described as a sequence of k steps, and

➢ the number of ways of completing step 1 is n1, and

➢ the number of ways of completing step 2 is n2 for each way of completing

step 1, and

➢ the number of ways of completing step 3 is n3 for each way of completing

step 2, and so forth.

The total number of ways of completing the operation is

n1 × n2 × … × nk
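For instance (an illustrative example, not from the slides): a code consisting of one letter followed by two digits can be formed in 26 × 10 × 10 = 2,600 ways.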
Permutations
The number of permutations of n different elements is n!, where

n! = n × (n - 1) × (n - 2) × … × 3 × 2 × 1
The number of permutations of subsets of r elements selected from a set of n different elements is

nPr = n × (n - 1) × (n - 2) × … × (n - r + 1) = n! / (n - r)!

The number of permutations of n = n1 + n2 + … + nr objects, of which n1 are of one type, n2 of a second type, …, and nr of the r-th type, is

n! / (n1! n2! n3! … nr!)
Combinations
The number of combinations (subsets) of r elements that can be selected from a set of n elements is denoted (n choose r) or nCr, and

(n choose r) = nCr = n! / (r! (n - r)!)
Example

Let A = {1, 2, 3}. The 2-permutations of A are:
(1,2), (2,1), (1,3), (3,1), (2,3), (3,2); 6 in total.

Consider a collection of objects consisting of the five letters a, b, c, d, e.

List all possible combinations of three letters from this collection of five letters.
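A short Python sketch (illustrative only) enumerates both answers with itertools:

```python
from itertools import permutations, combinations

# 2-permutations of A = {1, 2, 3}: order matters, so there are 3!/(3 - 2)! = 6 of them.
print(list(permutations([1, 2, 3], 2)))

# 3-combinations of {a, b, c, d, e}: order does not matter, so there are 5!/(3!·2!) = 10 of them.
print(list(combinations("abcde", 3)))
```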
Contingency Tables; Joint and Marginal Probabilities
Introducing Contingency Tables:

EE CE ME Ind&SE Comp-E Total

A1 A2 A3 A4 A5
UNIV-1 B1 5 2 2 1 0 10
UNIV-2 B2 5 2 5 0 1 13
UNIV-3 B3 5 3 1 1 1 11
UNIV-4 B4 2 5 2 0 0 9
UNIV-5 B5 0 1 8 7 3 19
UNIV-6 B6 7 3 1 3 3 17

Total 24 16 19 12 8 79
Each cell of the contingency table above corresponds to a joint event (Bi & Aj):

(B1 & A1) (B1 & A2) (B1 & A3) (B1 & A4) (B1 & A5)
(B2 & A1) (B2 & A2) (B2 & A3) (B2 & A4) (B2 & A5)
(B3 & A1) (B3 & A2) (B3 & A3) (B3 & A4) (B3 & A5)
(B4 & A1) (B4 & A2) (B4 & A3) (B4 & A4) (B4 & A5)
(B5 & A1) (B5 & A2) (B5 & A3) (B5 & A4) (B5 & A5)
(B6 & A1) (B6 & A2) (B6 & A3) (B6 & A4) (B6 & A5)

From the counts in the table:
P(A1) = 24/79;   P(B1) = 10/79;   P(A1 & B1) = 5/79
Contingency Tables; Joint and Marginal Probabilities

EE CE ME Ind&SE Comp-E Total

A1 A2 A3 A4 A5
UNIV-1 B1 0.0633 0.0253 0.0253 0.0127 0 0.1266
UNIV-2 B2 0.0633 0.0253 0.0633 0 0.0127 0.1646
UNIV-3 B3 0.0633 0.038 0.0127 0.0127 0.0127 0.1392
UNIV-4 B4 0.0253 0.0633 0.0253 0 0 0.1139
UNIV-5 B5 0 0.0127 0.1013 0.0886 0.038 0.2405
UNIV-6 B6 0.0886 0.038 0.0127 0.038 0.038 0.2152
Total 0.3038 0.2025 0.2405 0.152 0.1013 1
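The joint (cell) and marginal (row and column total) probabilities above are simply the counts divided by the grand total of 79. A short Python sketch (illustrative only) reproduces a few of them:

```python
# Counts from the university/major contingency table (rows B1..B6, columns A1..A5).
counts = [
    [5, 2, 2, 1, 0],   # UNIV-1 (B1)
    [5, 2, 5, 0, 1],   # UNIV-2 (B2)
    [5, 3, 1, 1, 1],   # UNIV-3 (B3)
    [2, 5, 2, 0, 0],   # UNIV-4 (B4)
    [0, 1, 8, 7, 3],   # UNIV-5 (B5)
    [7, 3, 1, 3, 3],   # UNIV-6 (B6)
]
grand_total = sum(sum(row) for row in counts)                  # 79

joint = [[c / grand_total for c in row] for row in counts]     # P(Bi & Aj)
row_marginals = [sum(row) for row in joint]                    # P(Bi)
col_marginals = [sum(col) for col in zip(*joint)]              # P(Aj)

print(round(joint[0][0], 4), round(row_marginals[0], 4), round(col_marginals[0], 4))
# 0.0633 0.1266 0.3038
```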
Contingency Tables; Joint and Marginal Probabilities

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
P(A ∪ B) = P(A) + P(B)                      if A and B are mutually exclusive

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C)
P(A ∪ B ∪ C) = P(A) + P(B) + P(C)           if A, B and C are mutually exclusive

For a set of events that are all mutually exclusive:

P(E1 ∪ E2 ∪ E3 ∪ … ∪ Ek) = P(E1) + P(E2) + P(E3) + … + P(Ek)
Conditional Probability

Conditional Probability: the probability that event B occurs given that event A occurs.

P(B | A) = the probability of B given A

The Conditional Probability Rule: if A and B are any two events with P(A) > 0, then

P(B | A) = P(A & B) / P(A) = P(A ∩ B) / P(A)

Using the joint probability table above, determine:
P(B1) = ?
P(A1 & B1) = ?
P(A1 | B1) = ?
P(A1) = ?
P(B1 | A1) = ?
Solution (using the joint probability table above):

P(B1) = 0.1266
P(A1 & B1) = 0.0633
P(A1 | B1) = 0.0633 / 0.1266 = 0.5

P(A1) = 0.3038
P(B1 | A1) = 0.0633 / 0.3038 ≈ 0.2083
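A minimal Python check of these conditional probabilities (illustrative only; the counts are the same as in the contingency table):

```python
grand_total = 79
n_B1 = 10          # row total for UNIV-1
n_A1 = 24          # column total for EE
n_A1_and_B1 = 5    # cell count for (UNIV-1, EE)

P_B1 = n_B1 / grand_total
P_A1 = n_A1 / grand_total
P_A1_and_B1 = n_A1_and_B1 / grand_total

print(round(P_A1_and_B1 / P_B1, 4))   # P(A1 | B1) = 0.5
print(round(P_A1_and_B1 / P_A1, 4))   # P(B1 | A1) = 0.2083
```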
The Multiplication Rule, Independence
Conditional Probability: the probability that event B occurs given that event A occurs.

P(B | A) = the probability of B given A

The Conditional Probability Rule: if A and B are any two events with P(A) > 0, then

P(B | A) = P(A ∩ B) / P(A)
The General Multiplication Rule: if A and B are any two events, then

P(A ∩ B) = P(A) · P(B | A) = P(B) · P(A | B)
For any two events, the probability that both occur equals the probability that a specified one occurs times the
conditional probability of the other event, given the specified event.

Random Sampling: to select randomly implies that, at each step of the sampling, the items that remain in the batch are equally likely to be selected.

Total Probability (two events):

B = (A ∩ B) ∪ (A^c ∩ B)
P(B) = P(B ∩ A) + P(B ∩ A^c) = P(B | A) P(A) + P(B | A^c) P(A^c)

Total Probability (multiple events):

P(B) = P(B ∩ E1) + P(B ∩ E2) + … + P(B ∩ Ek)
     = P(B | E1) P(E1) + P(B | E2) P(E2) + … + P(B | Ek) P(Ek)

assuming E1, E2, …, Ek are mutually exclusive and exhaustive events.
Independent Events: Event B is said to be independent of event A if P(B| A)=P(B).

One event is independent of another event if knowing whether the latter event occurs does not affect the
probability of the former event.

The Special Multiplication Rule: if A and B are independent events, then

P(A & B) = P(A) × P(B)
And conversely, if P(A & B)=P(A)*P(B), then A and B are independent events.

Two events are independent if and only if the probability that both occur equals the product of their individual
probabilities.

The Special Multiplication Rule: if A, B, C, …. are independent events, then

P(A & B & C & …) = P(A) × P(B) × P(C) × …


Mutually exclusive versus Independent events

Mutually exclusive events are those that cannot occur simultaneously.

Independent events are those for which the occurrence of some does not affect the
probabilities of others occurring

If two or more events are mutually exclusive, the occurrence of one precludes the occurrence of the others.

Two or more (nonimpossible) events cannot be both mutually exclusive and independent.

Example: A company has a program to drill a well on each of two identified structures.
The probability that well 1 is successful is P(W1) = 0.35.
The probability that well 2 is successful is P(W2) = 0.45.

What is the probability of having two successful wells?

What is the probability of having only one successful well?
Solution (the two wells are treated as independent):

P(W1 & W2) = 0.35 × 0.45 = 0.1575
P(W1 & not W2) = 0.35 × 0.55 = 0.1925
P(not W1 & W2) = 0.65 × 0.45 = 0.2925
P(not W1 & not W2) = 0.65 × 0.55 = 0.3575

P(two successful wells) = P(W1 & W2) = 0.1575
P(only one successful well) = 0.1925 + 0.2925 = 0.485
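A short Python sketch (illustrative only, treating the two wells as independent) enumerates the four joint outcomes and reproduces these answers:

```python
from itertools import product

p = {1: 0.35, 2: 0.45}   # success probabilities of well 1 and well 2

both = exactly_one = 0.0
for s1, s2 in product([True, False], repeat=2):
    prob = (p[1] if s1 else 1 - p[1]) * (p[2] if s2 else 1 - p[2])
    if s1 and s2:
        both += prob
    elif s1 or s2:
        exactly_one += prob

print(round(both, 4), round(exactly_one, 4))   # 0.1575 0.485
```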

Example: A company has a program to drill three wells on a structure. If there is a 60% chance for each well to be successful, estimate the following:
1) Only one well is successful?
2) Exactly two successful drilling?
3) At least two successful wells?
4) At most two successful wells?
Solution (each well is treated as independent, with P(success) = 0.60 and P(failure) = 0.40 per well):

Outcome   Probability
sss       0.216
ssf       0.144
sfs       0.144
sff       0.096
fss       0.144
fsf       0.096
ffs       0.096
fff       0.064

1) Only one well is successful: 0.096 + 0.096 + 0.096 = 0.288
2) Exactly two successful wells: 0.144 + 0.144 + 0.144 = 0.432
3) At least two successful wells: 0.432 + 0.216 = 0.648
4) At most two successful wells: 0.064 + 0.288 + 0.432 = 0.784
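The same answers can be obtained by enumerating the eight outcomes in Python (illustrative only, wells treated as independent):

```python
from itertools import product

p_success = 0.60
probs = {}   # outcome string such as 'ssf' -> probability

for outcome in product("sf", repeat=3):
    prob = 1.0
    for result in outcome:
        prob *= p_success if result == "s" else 1 - p_success
    probs["".join(outcome)] = prob

def total(condition):
    return round(sum(p for k, p in probs.items() if condition(k.count("s"))), 3)

print(total(lambda s: s == 1))   # 0.288  only one successful
print(total(lambda s: s == 2))   # 0.432  exactly two successful
print(total(lambda s: s >= 2))   # 0.648  at least two successful
print(total(lambda s: s <= 2))   # 0.784  at most two successful
```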

Example: A company has a program to drill THIRTY wells on a structure. If there is a 60% chance for each well to be successful, do you think a tree diagram is suitable to use?

You should use the Binomial Probability Formula:

P(X = x) = (n choose x) p^x q^(n-x),   where q = 1 - p
The Binomial Distribution
Binomial Coefficients:
If n is a positive integer and x is a nonnegative integer less than or equal to n, then the binomial coefficient
is defined as:
n n!
 
 x   x!( n  x )!
 
This definition is equivalent to the definition of Combination Rule: the number of possible combinations of
x objects from a collection of n objects
nCx = n! / (x! (n - x)!)
Binomial probability Formula:
Let X denote the total number of successes in n Bernoulli trials with success probability p. Then the
probability distribution of random variable X is given by
P(X = x) = (n choose x) p^x q^(n-x),   q = 1 - p
The random variable X is called a binomial random variable and is said to have the binomial
distribution with parameters n and p
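A minimal Python sketch (illustrative only) evaluates this formula directly using math.comb:

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) for a binomial random variable with parameters n and p."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

# For example, the probability of exactly 18 successful wells out of 30 when p = 0.60:
print(round(binomial_pmf(18, 30, 0.60), 4))
```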
Example

Suppose that you take a multiple-choice test with 10 questions, and each question has 5 answer choices (a, b, c, d, e). What is the probability that you get exactly 4 questions correct just by guessing?
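A worked answer (not shown on the slide): guessing gives success probability p = 1/5 = 0.2 on each of the n = 10 questions, so

P(X = 4) = C(10, 4) × (0.2)^4 × (0.8)^6 = 210 × 0.0016 × 0.262144 ≈ 0.088.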
Bayes’ Theorem
The Rule of Total Probability: Suppose that events A1, A2, A3, …, Ak are mutually exclusive and exhaustive; that is, exactly one of the events must occur. Then for any event B,

P(B) = Σ (j = 1 to k) P(Aj) · P(B | Aj)

[Diagram: event B partitioned into the pieces (A1 & B), (A2 & B), (A3 & B)]

Recall: P(B | A) = P(A & B) / P(A)
Bayes’ Rule
Suppose that events A1, A2, A3, …, Ak are mutually exclusive and exhaustive. Then for any event B,

P(Aj | B) = P(Aj) · P(B | Aj) / [ Σ (i = 1 to k) P(Ai) · P(B | Ai) ]

Recall: P(B | A) = P(A & B) / P(A)

Example: A student from the Abiar Ali area comes to classes at UPM either in his private car or with a friend.
The probability of using his private car is 30%. If he comes in his private car, the probability of being late is 5%. If he comes with a friend, the probability of being late is 10%.

1) Find the probability that he is late for classes.

2) Given that he is late for classes, find the probability he used his private car.
Solution: let C = the event "he comes by his private car" and L = the event "he is late for classes".
Then P(C) = 0.30, P(L | C) = 0.05, P(not C) = 0.70, P(L | not C) = 0.10.

P(L & C) = 0.30 × 0.05 = 1.5%          P(not L & C) = 0.30 × 0.95 = 28.5%
P(L & not C) = 0.70 × 0.10 = 7%        P(not L & not C) = 0.70 × 0.90 = 63%

1) Probability of being late: P(L) = 1.5% + 7% = 8.5%

2) Given that he is late, the probability he used his private car:
P(C | L) = P(L & C) / P(L) = 1.5 / 8.5 ≈ 17.6%
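A minimal Python check of this solution (illustrative only):

```python
P_car = 0.30                 # P(C): he uses his private car
P_late_given_car = 0.05      # P(L | C)
P_friend = 1 - P_car         # P(not C): he comes with a friend
P_late_given_friend = 0.10   # P(L | not C)

# Rule of total probability
P_late = P_late_given_car * P_car + P_late_given_friend * P_friend
print(round(P_late, 4))                               # 0.085

# Bayes' rule: probability he used his car, given that he is late
print(round(P_late_given_car * P_car / P_late, 4))    # 0.1765
```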
Discrete Random Variable

A Random Variable is a quantitative variable whose value depends on chance.

Probability Distribution and Probability Histogram


Probability Distribution: a listing of the possible values and corresponding probabilities of a discrete random variable, or a formula for the probabilities.

Probability Histogram: a graph of the probability distribution that displays the possible values of a discrete random variable on the horizontal axis and the probabilities of those values on the vertical axis. The probability of each value is represented by a vertical bar whose height equals the probability.
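As an illustration (a sketch only; it assumes matplotlib is installed, and the plotting lines can be dropped), the probability distribution and probability histogram of the random variable X = sum of two fair dice can be built as follows:

```python
from collections import Counter
import matplotlib.pyplot as plt

# Probability distribution of X = sum of two fair dice: value -> probability.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
distribution = {x: counts[x] / 36 for x in sorted(counts)}

# Probability histogram: possible values on the horizontal axis,
# probabilities on the vertical axis.
plt.bar(list(distribution.keys()), list(distribution.values()))
plt.xlabel("x (sum of two dice)")
plt.ylabel("P(X = x)")
plt.show()
```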
