Lecture 1: MDAS presentation
Lecture 2: Probability, Bayes' theorem, random variables and probability densities
Lecture 3: Catalogue of pdfs (uni-dimensional)
Lecture 4: Catalogue of pdfs (multi-dimensional)
Expectation values
Consider a continuous r.v. x with pdf f(x). Define the expectation (mean) value as

E[x] = \int_{-\infty}^{\infty} x f(x) \, dx

Notation (often): \mu = E[x]

Define the variance as

V[x] = E[(x - E[x])^2] = E[x^2] - \mu^2

Notation: \sigma^2 = V[x]

Standard deviation: \sigma = \sqrt{V[x]}
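A minimal numerical sketch of these definitions, using the exponential pdf f(x) = (1/ξ) e^(−x/ξ) as an arbitrary example and a midpoint Riemann sum standing in for the integral:

```python
import math

# Numerically check E[x] and V[x] for an example pdf, the exponential
# f(x) = (1/xi) exp(-x/xi); xi = 2.0 is an arbitrary example value.
xi = 2.0
dx = 1e-3
upper = 40.0 * xi                    # effectively infinity for this pdf

def f(x):
    return math.exp(-x / xi) / xi

xs = [(i + 0.5) * dx for i in range(int(upper / dx))]  # midpoints

mean = sum(x * f(x) * dx for x in xs)       # E[x]   = integral of x f(x) dx
ex2 = sum(x * x * f(x) * dx for x in xs)    # E[x^2]
var = ex2 - mean ** 2                       # V[x]   = E[x^2] - (E[x])^2
```

For the exponential, the closed forms are E[x] = ξ and V[x] = ξ², which the sums above reproduce.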
Binomial distribution
Consider N independent experiments (Bernoulli trials): the outcome of each is "success" or "failure", with probability of success p the same in each trial. The probability of a specific sequence (in a fixed order) with n successes and N - n failures is

p^n (1 - p)^{N - n}

But the order is not important; there are

\binom{N}{n} = \frac{N!}{n!(N - n)!}

ways (permutations) to get n successes in N trials, so the total probability for n is the sum of the probabilities for each permutation:

f(n; N, p) = \binom{N}{n} p^n (1 - p)^{N - n}

random variable: n (number of successes)
parameters: N (number of trials), p (success probability)

E[n] = Np, \quad V[n] = Np(1 - p)
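The binomial formula can be sketched directly in code; N and p below are arbitrary example values:

```python
from math import comb

# Binomial pmf: f(n; N, p) = C(N, n) p^n (1-p)^(N-n).
def binomial_pmf(n, N, p):
    return comb(N, n) * p**n * (1 - p)**(N - n)

N, p = 10, 0.3                                  # example values
probs = [binomial_pmf(n, N, p) for n in range(N + 1)]

total = sum(probs)                              # pmf sums to 1
mean = sum(n * pr for n, pr in enumerate(probs))  # E[n] = N p = 3.0
```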
Poisson distribution
Consider the binomial n in the limit N \to \infty, p \to 0, with \nu = Np constant. One obtains the Poisson distribution:

f(n; \nu) = \frac{\nu^n}{n!} e^{-\nu}, \quad n = 0, 1, 2, \ldots

E[n] = \nu, \quad V[n] = \nu
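The limit can be checked numerically; a sketch comparing the two pmfs for large N and small p with ν = Np held fixed (ν and N below are example values):

```python
from math import comb, exp, factorial

# Binomial pmf and Poisson pmf, compared in the limit N -> inf, p -> 0,
# with nu = N p fixed. nu = 3.0 and N = 10000 are example values.
def binomial_pmf(n, N, p):
    return comb(N, n) * p**n * (1 - p)**(N - n)

def poisson_pmf(n, nu):
    return nu**n * exp(-nu) / factorial(n)

nu = 3.0
N = 10_000
p = nu / N

# Maximum pointwise difference over the first 20 values of n
max_diff = max(abs(binomial_pmf(n, N, p) - poisson_pmf(n, nu))
               for n in range(20))
```

For N this large the two pmfs agree to better than one part in a thousand at every n checked.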
Uniform distribution
Consider a continuous r.v. x with -\infty < x < \infty. The uniform pdf is:

f(x; \alpha, \beta) = \begin{cases} \frac{1}{\beta - \alpha}, & \alpha \le x \le \beta \\ 0, & \text{otherwise} \end{cases}

E[x] = \frac{1}{2}(\alpha + \beta), \quad V[x] = \frac{1}{12}(\beta - \alpha)^2
N.B. For any r.v. x with cumulative distribution F(x), y = F(x) is uniform in [0,1].
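A Monte Carlo sketch of that N.B. (the probability integral transform), using the exponential with mean 1 as an example choice of x, so F(x) = 1 − e^(−x):

```python
import math
import random

# If x has cdf F, then y = F(x) should be Uniform[0, 1].
# Example: x exponential with mean 1, so F(x) = 1 - exp(-x).
random.seed(1)
ys = [1 - math.exp(-random.expovariate(1.0)) for _ in range(100_000)]

# Uniform[0,1] checks: mean near 1/2, and the fraction of values
# below any u in [0,1] near u (u = 0.30 is an arbitrary example).
mean_y = sum(ys) / len(ys)
frac_below = sum(y < 0.30 for y in ys) / len(ys)
```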
Exponential distribution
The exponential pdf for the continuous r.v. x is defined by:

f(x; \xi) = \frac{1}{\xi} e^{-x/\xi}, \quad x \ge 0

E[x] = \xi, \quad V[x] = \xi^2

Example: proper decay time t of an unstable particle,

f(t; \tau) = \frac{1}{\tau} e^{-t/\tau} \quad (\tau = \text{mean lifetime})

Lack of memory (unique to the exponential):

f(t - t_0 \mid t \ge t_0) = f(t)
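The memoryless property can be illustrated by simulation: the probability that a decay takes at least a further time s, given that it has already survived to t₀, matches the unconditional probability of surviving time s. The values of τ, t₀, and s below are arbitrary examples:

```python
import random

# Memorylessness of the exponential: P(t > t0 + s | t > t0) = P(t > s).
random.seed(2)
tau = 2.0                                         # example mean lifetime
ts = [random.expovariate(1.0 / tau) for _ in range(200_000)]

t0, s = 1.0, 1.5                                  # example times
survivors = [t for t in ts if t > t0]
p_cond = sum(t > t0 + s for t in survivors) / len(survivors)
p_uncond = sum(t > s for t in ts) / len(ts)       # should match p_cond
```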
Gaussian distribution
The Gaussian (normal) pdf for a continuous r.v. x is defined by:

f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-(x - \mu)^2 / 2\sigma^2}

E[x] = \mu, \quad V[x] = \sigma^2

(N.B. often \mu, \sigma^2 denote the mean and variance of any r.v., not only the Gaussian.)

Special case: \mu = 0, \sigma^2 = 1 (standard Gaussian):

\varphi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}

Central limit theorem: for n independent r.v.s x_i with finite variances \sigma_i^2, otherwise arbitrary pdfs, consider the sum

y = \sum_{i=1}^{n} x_i

In the limit n \to \infty, y is a Gaussian r.v. with E[y] = \sum_i \mu_i and V[y] = \sum_i \sigma_i^2.

Measurement errors are often the sum of many contributions, so frequently measured values can be treated as Gaussian r.v.s.
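A quick central-limit sketch: sums of n Uniform[0,1] variables (μ = 1/2, σ² = 1/12) already look quite Gaussian for modest n; n = 12 below is an arbitrary example choice:

```python
import random
import statistics

# Sums of n Uniform[0,1] r.v.s: mean -> n/2, variance -> n/12,
# and the distribution approaches a Gaussian.
random.seed(3)
n = 12                                             # example value
sums = [sum(random.random() for _ in range(n)) for _ in range(100_000)]

mean_s = statistics.fmean(sums)                    # expect about n/2 = 6
var_s = statistics.pvariance(sums)                 # expect about n/12 = 1
# Gaussian check: roughly 68.3% of values within one sigma of the mean
frac_1sigma = sum(abs(s - mean_s) < var_s ** 0.5 for s in sums) / len(sums)
```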
Breit-Wigner distribution

The Breit-Wigner pdf for the continuous r.v. x is defined by:

f(x; \Gamma, x_0) = \frac{1}{\pi} \frac{\Gamma/2}{(\Gamma/2)^2 + (x - x_0)^2}

(\Gamma = 2, x_0 = 0 is the Cauchy pdf.)

E[x] not well defined; V[x] \to \infty.

x_0 = mode (most probable value)
\Gamma = full width at half maximum

Example: mass of a resonance particle, e.g. \rho, K^*, \phi^0, ...
\Gamma = decay rate (inverse of mean lifetime)
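The "full width at half maximum" interpretation of Γ can be verified directly: the pdf at x₀ ± Γ/2 is exactly half its peak value at the mode. Γ and x₀ below are arbitrary example values:

```python
import math

# Breit-Wigner pdf: f(x; Gamma, x0) = (1/pi) (Gamma/2) / ((Gamma/2)^2 + (x-x0)^2)
def breit_wigner(x, gamma, x0):
    return (1 / math.pi) * (gamma / 2) / ((gamma / 2) ** 2 + (x - x0) ** 2)

gamma, x0 = 1.5, 0.5                       # example values
peak = breit_wigner(x0, gamma, x0)         # maximum, attained at the mode x0
half_left = breit_wigner(x0 - gamma / 2, gamma, x0)
half_right = breit_wigner(x0 + gamma / 2, gamma, x0)
# Both half_left and half_right equal peak / 2: Gamma is the FWHM.
```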
Beta distribution
Often used to represent the pdf of a continuous r.v. that is nonzero only between finite limits:

f(x; \alpha, \beta) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} x^{\alpha - 1} (1 - x)^{\beta - 1}, \quad 0 \le x \le 1
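A small check that the Gamma-function prefactor indeed normalizes the beta pdf to 1 on [0, 1]; α and β below are arbitrary example values, and a midpoint sum stands in for the integral:

```python
import math

# Beta pdf with its Gamma-function normalization.
def beta_pdf(x, a, b):
    norm = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return norm * x ** (a - 1) * (1 - x) ** (b - 1)

a, b = 2.5, 4.0                            # example shape parameters
dx = 1e-5
integral = sum(beta_pdf((i + 0.5) * dx, a, b) * dx for i in range(100_000))
# integral is (numerically) 1: the pdf is properly normalized
```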
Student's t distribution

The Student's t pdf for the continuous r.v. x is:

f(x; \nu) = \frac{\Gamma\left(\frac{\nu + 1}{2}\right)}{\sqrt{\nu\pi}\,\Gamma\left(\frac{\nu}{2}\right)} \left(1 + \frac{x^2}{\nu}\right)^{-\frac{\nu + 1}{2}}

\nu = number of degrees of freedom. For \nu = 1 this is the Cauchy pdf; for \nu \to \infty it approaches the standard Gaussian.

Developed in 1908 by William Gosset, who worked under the pseudonym "Student" for the Guinness Brewery.
Multivariate distributions
Outcome of an experiment characterized by several values, e.g. an n-component vector, (x_1, \ldots, x_n), described by a joint pdf f(x_1, \ldots, x_n):

P(x_1 \in [x_1, x_1 + dx_1], \ldots, x_n \in [x_n, x_n + dx_n]) = f(x_1, \ldots, x_n) \, dx_1 \cdots dx_n

Normalization:

\int \cdots \int f(x_1, \ldots, x_n) \, dx_1 \cdots dx_n = 1
Marginal pdf
Sometimes we want the pdf of only some (or one) of the components. For two components x, y with joint pdf f(x, y), the marginal pdfs are:

f_x(x) = \int f(x, y) \, dy, \qquad f_y(y) = \int f(x, y) \, dx
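The same idea in discrete form: summing the joint probabilities over the unwanted component gives the marginal. The 2×3 probability table below is an arbitrary example:

```python
# Marginalization of a discrete joint distribution f(x, y):
# f_x(x) = sum over y of f(x, y). Example joint table (sums to 1).
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p   # sum over y
# marginal_x is {0: 0.4, 1: 0.6}
```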
Conditional pdf
Sometimes we want to consider some components of the joint pdf as constant. Recall conditional probability:

P(A \mid B) = \frac{P(A \cap B)}{P(B)}

The conditional pdfs are:

g(x \mid y) = \frac{f(x, y)}{f_y(y)}, \qquad h(y \mid x) = \frac{f(x, y)}{f_x(x)}

Recall A, B independent if

P(A \cap B) = P(A) P(B)

Similarly, x, y independent if

f(x, y) = f_x(x) \, f_y(y)

Define the covariance as \text{cov}[x, y] = E[xy] - \mu_x \mu_y. If x, y are independent, i.e., f(x, y) = f_x(x) f_y(y), then

E[xy] = \mu_x \mu_y \quad \Rightarrow \quad \text{cov}[x, y] = 0

i.e., x and y are uncorrelated.
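A Monte Carlo sketch of this: for independent x and y, the sample estimate of E[xy] − E[x]E[y] comes out near zero. The two distributions below are arbitrary example choices:

```python
import random

# For independent x, y: E[xy] = E[x] E[y], so cov[x, y] = 0.
random.seed(4)
pairs = [(random.gauss(1.0, 2.0),          # x: Gaussian, mu=1, sigma=2
          random.expovariate(0.5))         # y: exponential, mean 2
         for _ in range(200_000)]

ex = sum(x for x, _ in pairs) / len(pairs)
ey = sum(y for _, y in pairs) / len(pairs)
exy = sum(x * y for x, y in pairs) / len(pairs)
cov = exy - ex * ey                        # near 0 for independent x, y
```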
Correlation (cont.)

The correlation coefficient is defined as

\rho_{xy} = \frac{\text{cov}[x, y]}{\sigma_x \sigma_y}, \qquad -1 \le \rho_{xy} \le 1

N.B. the converse does not hold: x, y uncorrelated does not in general imply x, y independent.
Multinomial distribution
Like the binomial but now with m outcomes instead of two; the probabilities are

\vec{p} = (p_1, \ldots, p_m), \qquad \sum_{i=1}^{m} p_i = 1

For N trials, the probability to obtain n_1 of outcome 1, n_2 of outcome 2, ..., n_m of outcome m is:

f(\vec{n}; N, \vec{p}) = \frac{N!}{n_1! n_2! \cdots n_m!} p_1^{n_1} p_2^{n_2} \cdots p_m^{n_m}

Example: \vec{n} = (n_1, \ldots, n_m) represents a histogram with m bins and N total entries.

For m = 2 this is the binomial distribution.
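The multinomial formula can be sketched directly; the bin probabilities and N below are arbitrary example values, and summing the pmf over all histograms with N entries gives 1:

```python
from math import factorial

# Multinomial pmf: f(n; N, p) = N! / (n1! ... nm!) * p1^n1 ... pm^nm
def multinomial_pmf(ns, ps):
    coef = factorial(sum(ns))
    for n in ns:
        coef //= factorial(n)
    prob = float(coef)
    for n, p in zip(ns, ps):
        prob *= p ** n
    return prob

ps = [0.2, 0.3, 0.5]            # example bin probabilities, sum to 1
N = 4                           # example number of entries

# Total probability over every histogram (i, j, N-i-j) with N entries
total = sum(multinomial_pmf((i, j, N - i - j), ps)
            for i in range(N + 1) for j in range(N - i + 1))
```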