Bayesian Inference
• This topic is so important that we have a very special guest lecturer today:
https://www.youtube.com/watch?v=Zr_xWfThjJ0
Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?"

[Figure: three closed doors, each marked with a question mark.]
Bayesian Inference
• If the winning door was chosen arbitrarily, why would the probability of winning be anything other than 1/3?
• The host will not open the door with the car, so their choice gives us valuable information. We need to update our beliefs about where the car is in light of our observation of the host’s choice.
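The intuition that switching helps can be checked empirically. A minimal simulation (not from the slides; door indexing, the seed, and the trial count are arbitrary choices):

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Simulate the Monty Hall game and return the win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)   # door hiding the car
        pick = rng.randrange(3)  # contestant's initial pick
        # Host opens a goat door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {monty_hall(switch=False):.3f}")  # ≈ 1/3
print(f"switch: {monty_hall(switch=True):.3f}")   # ≈ 2/3
```

Staying wins only when the initial pick was right (probability 1/3); switching wins in exactly the remaining cases.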
Bayesian inference and AI
• Bayesian inference predates AI by around 200 years
– Thomas Bayes’ paper published 1763
– John McCarthy coined the term AI in 1955
Answer
• All are true, but the answer is B).
Answer
• D). Laplace was one of the first to promote Bayesian ideas, after Bayes.
Answer
• D). The posterior predictive distribution is used to predict new data.
Bayes’ rule
• Our goal: the posterior distribution p(H | D), our beliefs about the hypothesis H after observing the data D.
• We have:
  – Prior beliefs about H, encoded as a probability distribution: the prior distribution p(H)
  – A model for how the data were generated, given the hypothesis: the likelihood p(D | H)
Bayes’ rule
p(H | D) = p(D | H) p(H) / p(D)
where p(D) is a normalizing constant.
Deconstructing Bayes’ rule
• Product rule: p(H, D) = p(D | H) p(H) = p(H | D) p(D)
• Sum rule: p(D) = Σ_H p(D | H) p(H)
• Combining the two: p(H | D) = p(D | H) p(H) / Σ_H′ p(D | H′) p(H′)
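The product and sum rules combine into a small, generic posterior computation over discrete hypotheses. A sketch (the fair vs. two-headed coin example and its numbers are hypothetical, chosen only for illustration):

```python
def bayes_posterior(prior, likelihood):
    """Posterior over hypotheses via Bayes' rule.
    prior: dict hypothesis -> p(H)
    likelihood: dict hypothesis -> p(D | H)
    """
    # Product rule: unnormalized posterior p(D | H) p(H)
    unnorm = {h: likelihood[h] * prior[h] for h in prior}
    # Sum rule: normalizing constant p(D) = sum over H of p(D | H) p(H)
    p_data = sum(unnorm.values())
    return {h: u / p_data for h, u in unnorm.items()}

# Hypothetical example: a coin is either fair or two-headed,
# and we observe a single flip come up heads.
prior = {"fair": 0.9, "two-headed": 0.1}
likelihood = {"fair": 0.5, "two-headed": 1.0}
post = bayes_posterior(prior, likelihood)
# p(fair | heads) = 0.45 / 0.55 ≈ 0.818
```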
The Monty Hall problem
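Bayes’ rule resolves the puzzle directly. A sketch of the computation, assuming the standard setup (we picked door 1, the host opened door 3, and the host chooses uniformly at random when both goat doors are available):

```python
from fractions import Fraction

# Hypotheses: the car is behind door 1, 2, or 3; uniform prior.
prior = {d: Fraction(1, 3) for d in (1, 2, 3)}

# We picked door 1. Likelihood that the host opens door 3, per hypothesis:
#   car behind 1: host picks door 2 or 3 at random -> 1/2
#   car behind 2: host must open door 3            -> 1
#   car behind 3: host never reveals the car       -> 0
likelihood = {1: Fraction(1, 2), 2: Fraction(1), 3: Fraction(0)}

unnorm = {d: likelihood[d] * prior[d] for d in prior}
evidence = sum(unnorm.values())
posterior = {d: u / evidence for d, u in unnorm.items()}
# posterior: door 1 -> 1/3, door 2 -> 2/3, door 3 -> 0
# so switching doubles our chance of winning.
```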
Probability and Inference

[Figure: "Data generating process" and "Observed data", connected by "Probability" in the forward direction and "Inference" in the reverse direction.]
Figure based on one by Larry Wasserman, "All of Statistics"
Bayesian Statistics
Bayesian and Frequentist Interpretations of Probability
• Bayesian interpretation:
  – Probability refers to degrees of belief
  – A probability distribution quantifies our uncertainty
  – Typically subjective
• Frequentist interpretation:
  – Probabilities are long-run frequencies
  – I.e., if I repeat this experiment many times, what proportion of the time will the event occur?
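The frequentist reading can be made concrete by simulation: repeat an experiment many times and watch the relative frequency settle toward the probability. A quick illustration with a fair coin (the seed and trial counts are arbitrary):

```python
import random

rng = random.Random(0)
for trials in (10, 1_000, 100_000):
    # "Experiment" = one fair-coin flip; count how often heads occurs.
    heads = sum(rng.random() < 0.5 for _ in range(trials))
    print(trials, heads / trials)  # relative frequency approaches 0.5
```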
Limitations of Frequentist Probabilities
• We cannot assign frequentist probabilities to many “degree of belief” statements that we’d like to use in our everyday lives.
Model parameters
• In Bayesian statistics, model parameters are random variables. It is meaningful to talk about their probabilities: p(θ) is the prior, and p(θ | D) is the posterior.
Elements of Bayesian Inference
• Posterior ∝ Likelihood × Prior:
  p(θ | D) ∝ p(D | θ) p(θ)
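For a concrete instance of posterior ∝ likelihood × prior, here is a grid approximation for a coin’s heads-probability θ. The data (7 heads, 3 tails), the grid resolution, and the uniform prior are all illustrative assumptions, not from the slides:

```python
# Grid of candidate values for theta, the coin's heads-probability.
thetas = [i / 100 for i in range(101)]
prior = [1 / len(thetas)] * len(thetas)                    # uniform prior p(theta)
heads, tails = 7, 3
likelihood = [t**heads * (1 - t)**tails for t in thetas]   # p(D | theta)
unnorm = [l * p for l, p in zip(likelihood, prior)]        # likelihood x prior
posterior = [u / sum(unnorm) for u in unnorm]              # normalize: p(theta | D)

# The posterior peaks at the observed frequency of heads.
map_theta = thetas[posterior.index(max(posterior))]
print(map_theta)  # 0.7
```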
Subjectiveness of Bayesian Inference
• Bayesian inference is sometimes criticized for being subjective. The choice of prior may affect the conclusions, and so two Bayesians may get different answers to the same query.
Fully Bayesian inference: modeling uncertainty

[Figure: how the posterior changes with the number of observations.]
Point Estimates
• Maximum likelihood estimator (MLE): θ̂_MLE = argmax_θ p(D | θ)
• Maximum a posteriori (MAP) estimator: θ̂_MAP = argmax_θ p(θ | D) = argmax_θ p(D | θ) p(θ)
MAP estimate can result in overfitting
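To see the overfitting, consider a hypothetical run of 3 flips that all come up heads. The Beta(2, 2) prior below is an assumption chosen for illustration:

```python
# Hypothetical coin example: 3 flips, all heads.
heads, n = 3, 3

# MLE: argmax of p(D | theta) is the observed frequency heads / n.
theta_mle = heads / n  # 1.0 -- claims tails are impossible

# MAP with a Beta(a, b) prior (a = b = 2 assumed here):
# argmax of p(theta | D) = (heads + a - 1) / (n + a + b - 2)
a, b = 2, 2
theta_map = (heads + a - 1) / (n + a + b - 2)  # 0.8

print(theta_mle, theta_map)
```

The MLE commits fully to the data; the MAP estimate is pulled toward the prior, but it is still a single point that discards all posterior uncertainty.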
Answer
• B) and C) are both good answers.
Making Predictions
• The posterior predictive distribution is the distribution of new, unseen data x̃, given the observed data D:
  p(x̃ | D) = ∫ p(x̃ | θ) p(θ | D) dθ
• E.g., given that we have seen 10 coin flips come up heads, and one come up tails, the posterior predictive distribution gives us the probability that the next coin flip is heads or tails.
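Under a Beta-Bernoulli model that integral has a closed form. A sketch for the 10-heads, 1-tail example, assuming a uniform Beta(1, 1) prior (the slides do not specify a prior):

```python
from fractions import Fraction

# With a Beta(1, 1) prior, after h heads and t tails the posterior over the
# coin's heads-probability is Beta(1 + h, 1 + t), and the posterior
# predictive probability of heads is the posterior mean:
#   p(heads | D) = (1 + h) / (2 + h + t)
h, t = 10, 1
p_heads = Fraction(1 + h, 2 + h + t)
print(p_heads)      # 11/13, about 0.846
print(1 - p_heads)  # 2/13 for tails
```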
Simulating the posterior predictive
• We can rewrite p(x̃ | D) as the expected value of the likelihood of the new data point with respect to the posterior:
  p(x̃ | D) = E_{θ ~ p(θ | D)}[ p(x̃ | θ) ]
• This suggests a simple simulation: draw samples from the posterior and average the likelihood of x̃ under each sample.
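A sketch of that simulation for the same coin example, assuming the Beta(11, 2) posterior that follows from 10 heads, 1 tail, and a uniform prior (the seed and sample count are arbitrary):

```python
import random

rng = random.Random(0)
S = 100_000
# Draw theta samples from the posterior Beta(11, 2).
samples = [rng.betavariate(11, 2) for _ in range(S)]
# p(heads | D) ≈ (1/S) * sum of p(heads | theta_s), and p(heads | theta) = theta,
# so the estimate is just the mean of the posterior samples.
p_heads = sum(samples) / S
print(round(p_heads, 3))  # close to the exact value 11/13 ≈ 0.846
```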
Announcements
• Homework 1 is due 4/14/2016