
• One way to think of the Poisson distribution is as an approximation to the binomial distribution when n is large and p is small.
• When n is large and p is small, the mass function depends almost entirely on the mean np, and very little on the specific values of n and p.
• We can therefore approximate the binomial mass function in terms of the single quantity λ = np. (Note: λ is the parameter of the Poisson distribution.)
• The Poisson distribution is useful when n is large and p is small, and when independent events occur over a period of time.
• It can also be used when items are distributed with a given density over an area or volume.
If X ~ Poisson(λ), the probability mass function of X is

$$p(x) = P(X = x) = \begin{cases} \dfrac{e^{-\lambda}\lambda^{x}}{x!}, & x = 0, 1, 2, \ldots \\[1ex] 0, & \text{otherwise} \end{cases}$$

Mean: $\mu_X = \lambda$

Variance: $\sigma_X^2 = \lambda$
Note: X must be a discrete random variable and λ must be a positive constant.
If X ~ Poisson(3), compute P(X = 2), P(X = 10), P(X = 0), and P(X = 0.5).

P(X = 2) = 0.2240
P(X = 10) = 0.0008
P(X = 0) = 0.0498
P(X = 0.5) = 0
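
These values can be checked directly from the mass function above; a minimal Python sketch (standard library only; the helper name poisson_pmf is ours):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """Poisson mass function: e^(-lam) * lam^x / x! for x = 0, 1, 2, ..."""
    if x < 0 or x != int(x):
        return 0.0                                  # impossible values, e.g. x = 0.5
    return exp(-lam) * lam ** int(x) / factorial(int(x))

for x in (2, 10, 0, 0.5):
    print(f"P(X = {x}) = {poisson_pmf(x, lam=3):.4f}")
# P(X = 2) = 0.2240, P(X = 10) = 0.0008, P(X = 0) = 0.0498, P(X = 0.5) = 0.0000
```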
If X ~ Poisson(4), compute P(X ≤ 2) and P(X > 1).

P(X ≤ 2) = 0.2381
P(X > 1) = 1 - P(X ≤ 1) = 0.9084
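
A similar sketch for the cumulative probabilities, again assuming plain Python:

```python
from math import exp, factorial

lam = 4
pmf = lambda x: exp(-lam) * lam ** x / factorial(x)

p_le_2 = sum(pmf(x) for x in range(3))      # P(X <= 2) = p(0) + p(1) + p(2)
p_gt_1 = 1 - sum(pmf(x) for x in range(2))  # P(X > 1) = 1 - P(X <= 1)
print(round(p_le_2, 4), round(p_gt_1, 4))   # 0.2381 0.9084
```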
Particles are suspended in a liquid medium
at a concentration of 6 particles per mL.
A large volume of the suspension is
thoroughly agitated, and then 3 mL are
withdrawn. What is the probability that
exactly 15 particles are withdrawn?

The mean number of particles in the 3 mL withdrawn is λ = (6)(3) = 18, so X ~ Poisson(18), and P(X = 15) = 0.0786.
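
A quick check of this value, assuming λ = 18 as computed above:

```python
from math import exp, factorial

lam = 6 * 3          # 6 particles/mL * 3 mL withdrawn = mean of 18 particles
x = 15
p = exp(-lam) * lam ** x / factorial(x)
print(round(p, 4))   # 0.0786
```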
Hypergeometric Distribution:
• Consider a finite population containing two
types of items, which may be called
successes and failures.
• A simple random sample is drawn from the
population.
• Each item sampled constitutes a Bernoulli
trial.
Hypergeometric Distribution:
• As each item is selected, the probability of success in the remaining population decreases or increases, depending on whether the sampled item was a success or a failure.
• For this reason the trials are not independent,
so the number of successes in the sample does
not follow a binomial distribution.
• The distribution that properly describes the
number of successes is the hypergeometric
distribution.
Hypergeometric Distribution:
• Assume a finite population contains N items, of
which R are classified as successes and N – R are
classified as failures.
• Assume that n items are sampled from this
population, and let X represent the number of
successes in the sample. Then X has a
hypergeometric distribution with parameters N,
R, and n, which can be denoted
X ~ H(N, R, n).
Hypergeometric Distribution:
The probability mass function of X is

$$p(x) = P(X = x) = \begin{cases} \dfrac{\dbinom{R}{x}\dbinom{N-R}{n-x}}{\dbinom{N}{n}}, & \max(0,\, R + n - N) \le x \le \min(n, R) \\[2ex] 0, & \text{otherwise} \end{cases}$$

Mean: $\mu_X = \dfrac{nR}{N}$

Variance: $\sigma_X^2 = n\left(\dfrac{R}{N}\right)\left(1 - \dfrac{R}{N}\right)\left(\dfrac{N-n}{N-1}\right)$
Of 50 buildings in an industrial park, 12 have electrical code violations. If 10 buildings are selected at random for inspection, what is the probability that exactly 3 of the 10 have code violations? What are the mean and variance of X?

• X ~ H(50, 12, 10), x = 3
• P(X = 3) = 0.2703
• Mean = 2.4
• Variance = 1.4890
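
A minimal Python check of these three answers, using the N, R, n notation from the formulas above:

```python
from math import comb

N, R, n = 50, 12, 10   # population size, buildings with violations, sample size
x = 3

p_3  = comb(R, x) * comb(N - R, n - x) / comb(N, n)
mean = n * R / N
var  = n * (R / N) * (1 - R / N) * (N - n) / (N - 1)
print(round(p_3, 4), round(mean, 4), round(var, 4))   # 0.2703 2.4 1.489
```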
Geometric Distribution:
• Assume that a sequence of independent Bernoulli trials is conducted, each with the same probability of success, p.
• Let X represent the number of trials up to and including the first success.
• Then X is a discrete random variable, which is said to have the geometric distribution with parameter p, written as follows: X ~ Geom(p).
Geometric Distribution:
The probability mass function of X is

$$p(x) = P(X = x) = \begin{cases} p(1-p)^{x-1}, & x = 1, 2, \ldots \\[1ex] 0, & \text{otherwise} \end{cases}$$

Mean: $\mu_X = \dfrac{1}{p}$

Variance: $\sigma_X^2 = \dfrac{1-p}{p^2}$
A test of weld strength involves loading
welded joints until a fracture occurs. For a
certain type of weld, 80% of the fractures
occur in the weld itself, while the other 20%
occur in the beam. A number of welds are
tested. Let X be the number of tests up to
and including the first test that results in a
beam fracture.
1. What is the distribution of X?
2. What is the mean and variance of X?
3. What is the P(X = 3)?
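
A short Python sketch, assuming X ~ Geom(0.2) (a "success" here is a beam fracture, so p = 0.2), that computes the three requested quantities from the formulas above:

```python
p = 0.2                          # P(beam fracture), so X ~ Geom(0.2)

mean  = 1 / p                    # 1/p
var   = (1 - p) / p ** 2         # (1 - p)/p^2
p_eq3 = p * (1 - p) ** (3 - 1)   # P(X = 3) = p(1 - p)^(x - 1)
print(round(mean, 4), round(var, 4), round(p_eq3, 4))   # 5.0 20.0 0.128
```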
Negative Binomial Distribution:
• The negative binomial distribution is an extension of the geometric distribution. Let r be a positive integer. Assume that independent Bernoulli trials, each with success probability p, are conducted, and let X denote the number of trials up to and including the rth success. Then X has the negative binomial distribution with parameters r and p. We write X ~ NB(r, p).
• Note: If X ~ NB(r, p), then X = Y1 + … + Yr, where Y1, …, Yr are independent random variables, each with the Geom(p) distribution.
Negative Binomial Distribution:
If X ~ NB(r, p), then the pmf of X is

$$p(x) = P(X = x) = \begin{cases} \dbinom{x-1}{r-1} p^{r}(1-p)^{x-r}, & x = r, r+1, \ldots \\[1ex] 0, & \text{otherwise} \end{cases}$$

Mean: $\mu_X = \dfrac{r}{p}$

Variance: $\sigma_X^2 = \dfrac{r(1-p)}{p^2}$
A test of weld strength involves loading
welded joints until a fracture occurs. For a
certain type of weld, 80% of the fractures
occur in the weld itself, while the other 20%
occur in the beam. A number of welds are
tested. Find the mean and variance of X,
where X represents the number of tests up
to and including the third beam fracture.
Find P(X = 8).
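
A short Python sketch for this case, assuming X ~ NB(3, 0.2) as described:

```python
from math import comb

r, p = 3, 0.2                 # third beam fracture, P(beam fracture) = 0.2
x = 8

mean  = r / p                 # r/p
var   = r * (1 - p) / p ** 2  # r(1 - p)/p^2
p_eq8 = comb(x - 1, r - 1) * p ** r * (1 - p) ** (x - r)
print(round(mean, 4), round(var, 4), round(p_eq8, 4))   # 15.0 60.0 0.0551
```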
Multinomial Trials:
• A Bernoulli trial is a process that results in one
of two possible outcomes. A generalization of
the Bernoulli trial is the multinomial trial, which
is a process that can result in any of k outcomes,
where k ≥ 2.
• We denote the probabilities of the k outcomes
by p1,…,pk.
Multinomial Trials:
• Now assume that n independent multinomial trials are conducted, each with k possible outcomes and with the same probabilities p1,…,pk.
• Number the outcomes 1, 2, …, k. For each outcome i, let Xi denote the number of trials that result in that outcome.
• Then X1,…,Xk are discrete random variables.
• The collection X1,…,Xk is said to have the multinomial distribution with parameters n, p1,…,pk. We write X1,…,Xk ~ MN(n, p1,…,pk).
Multinomial Trials:
If X1,…,Xk ~ MN(n, p1,…,pk), then the pmf of
X1,…,Xk is

$$p(x_1, \ldots, x_k) = P(X_1 = x_1, \ldots, X_k = x_k) = \begin{cases} \dfrac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}, & x_i = 0, 1, 2, \ldots, n \text{ and } \sum_{i=1}^{k} x_i = n \\[2ex] 0, & \text{otherwise} \end{cases}$$
Note that if X1,…,Xk ~ MN(n, p1,…,pk), then for each i, Xi ~ Bin(n, pi).
The items produced on an assembly line are
inspected, and each is classified as either
conforming (acceptable), downgraded, or rejected.
Overall, 70% of the items are conforming, 20% are
downgraded, and 10% are rejected. Assume that
four items are chosen independently and at random.
Let X1, X2, X3 denote the numbers among the 4 that
are conforming, downgraded, and rejected,
respectively.
a) What is the distribution of X1, X2, X3?
b) What is the probability that 3 are conforming and
1 is rejected in a given sample?
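
A minimal Python check of part (b), assuming X1, X2, X3 ~ MN(4, 0.7, 0.2, 0.1) and the count vector (3, 0, 1):

```python
from math import factorial, prod

n      = 4
probs  = (0.7, 0.2, 0.1)   # conforming, downgraded, rejected
counts = (3, 0, 1)         # 3 conforming, 0 downgraded, 1 rejected

coef = factorial(n) // prod(factorial(c) for c in counts)      # n!/(x1! x2! x3!)
p    = coef * prod(pi ** xi for pi, xi in zip(probs, counts))  # * p1^x1 p2^x2 p3^x3
print(round(p, 4))   # 0.1372
```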
