
Chapter 5: Random Vectors, Joint Distributions (Lectures 18-23)

In many real-life problems, one often encounters multiple random objects. For example, one may be interested in the future prices of two different stocks in a stock market. Since the price of one stock can affect the price of the other, it is not advisable to analyze them separately. To model such phenomena, we need to introduce many random variables on a single platform (i.e., a probability space). First we recall some elementary facts about $n$-dimensional Euclidean space. Let
$$\mathbb{R}^n = \{x = (x_1, \dots, x_n) \mid x_i \in \mathbb{R},\ i = 1, \dots, n\}$$
with the usual metric
$$d(x, y) = \Big(\sum_{i=1}^{n} (x_i - y_i)^2\Big)^{1/2}.$$
A subset $V$ of $\mathbb{R}^n$ is said to be open if for each $x \in V$, there exists an $\varepsilon > 0$ such that $B(x, \varepsilon) \subseteq V$, where
$$B(x, \varepsilon) = \{y \in \mathbb{R}^n \mid d(x, y) < \varepsilon\}.$$
Any open set can be written as a countable union of open sets of the form
$$(a_1, b_1) \times \cdots \times (a_n, b_n), \quad a_i < b_i,$$
called open rectangles.

Definition 5.1. The $\sigma$-field generated by all open sets in $\mathbb{R}^n$ is called the Borel $\sigma$-field of subsets of $\mathbb{R}^n$ and is denoted by $\mathcal{B}_{\mathbb{R}^n}$.

Theorem 5.0.16 Let $\mathcal{I} = \{(-\infty, x_1] \times (-\infty, x_2] \mid x_1, x_2 \in \mathbb{R}\}$. Then
$$\sigma(\mathcal{I}) = \mathcal{B}_{\mathbb{R}^2}.$$

Proof. We prove the result for $n = 2$; for general $n$, it is similar. Note that each set $(-\infty, x_1] \times (-\infty, x_2]$ is closed, hence Borel. Hence from the definition of $\sigma(\mathcal{I})$, we have
$$\sigma(\mathcal{I}) \subseteq \mathcal{B}_{\mathbb{R}^2}.$$
Note that for $a_i < b_i$, $i = 1, 2$,
$$(a_1, b_1] \times (a_2, b_2] = \big((-\infty, b_1] \times (-\infty, b_2]\big) \setminus \Big(\big((-\infty, a_1] \times (-\infty, b_2]\big) \cup \big((-\infty, b_1] \times (-\infty, a_2]\big)\Big) \in \sigma(\mathcal{I}).$$
For each open rectangle $(a_1, b_1) \times (a_2, b_2)$, choosing $n_0$ such that $a_i + \frac{1}{n_0} < b_i$, we have
$$(a_1, b_1) \times (a_2, b_2) = \bigcup_{n \ge n_0} \Big(a_1,\, b_1 - \tfrac{1}{n}\Big] \times \Big(a_2,\, b_2 - \tfrac{1}{n}\Big] \in \sigma(\mathcal{I}).$$
Hence all open rectangles are in $\sigma(\mathcal{I})$. Since any open set in $\mathbb{R}^2$ can be rewritten as a countable union of open rectangles, all open sets are in $\sigma(\mathcal{I})$. Therefore from the definition of $\mathcal{B}_{\mathbb{R}^2}$, we get $\mathcal{B}_{\mathbb{R}^2} \subseteq \sigma(\mathcal{I})$. This completes the proof. (It is advised that the student try to write down the proof for general $n$.)

Definition 5.2. Let $(\Omega, \mathcal{F}, P)$ be a probability space. A map $X = (X_1, \dots, X_n) : \Omega \to \mathbb{R}^n$ is called a random vector if
$$X^{-1}(B) \in \mathcal{F} \quad \text{for all } B \in \mathcal{B}_{\mathbb{R}^n}.$$

From now on we set $n = 2$ (for simplicity).

Theorem 5.0.17 $X = (X_1, X_2)$ is a random vector iff $X_1$ and $X_2$ are random variables, where $X_i$ denotes the $i$-th component of $X$.

Proof: Let $X = (X_1, X_2)$ be a random vector. For $x \in \mathbb{R}$,
$$X_1^{-1}((-\infty, x]) = \{X_1 \le x,\ X_2 \in \mathbb{R}\} = X^{-1}\big((-\infty, x] \times \mathbb{R}\big) \in \mathcal{F},$$
since $(-\infty, x] \times \mathbb{R} \in \mathcal{B}_{\mathbb{R}^2}$. Therefore $X_1$ is a random variable. Similarly, we can show that $X_2$ is a random variable.

Suppose $X_1, X_2$ are random variables. Set
$$\mathcal{A} = \{B \in \mathcal{B}_{\mathbb{R}^2} \mid X^{-1}(B) \in \mathcal{F}\}. \tag{5.0.1}$$
Then $\mathcal{A}$ is a $\sigma$-field (exercise). By (5.0.1),
$$\mathcal{A} \subseteq \mathcal{B}_{\mathbb{R}^2}. \tag{5.0.2}$$
For $x_1, x_2 \in \mathbb{R}$, we have
$$X^{-1}\big((-\infty, x_1] \times (-\infty, x_2]\big) = X_1^{-1}((-\infty, x_1]) \cap X_2^{-1}((-\infty, x_2]) \in \mathcal{F}.$$
Hence $(-\infty, x_1] \times (-\infty, x_2] \in \mathcal{A}$. Thus $\mathcal{I} \subseteq \mathcal{A}$, and hence $\sigma(\mathcal{I}) \subseteq \mathcal{A}$. Therefore from Theorem 5.0.16 and (5.0.2), we have
$$\mathcal{B}_{\mathbb{R}^2} = \sigma(\mathcal{I}) \subseteq \mathcal{A} \subseteq \mathcal{B}_{\mathbb{R}^2}.$$
Hence $\mathcal{A} = \mathcal{B}_{\mathbb{R}^2}$, i.e., $X^{-1}(B) \in \mathcal{F}$ for all $B \in \mathcal{B}_{\mathbb{R}^2}$, so $X$ is a random vector. This completes the proof.

Theorem 5.0.18 Let $X = (X_1, X_2)$ be a random vector. On $\mathcal{B}_{\mathbb{R}^2}$, define $P_X$ as follows:
$$P_X(B) = P\big(X^{-1}(B)\big), \quad B \in \mathcal{B}_{\mathbb{R}^2}.$$
Then $P_X$ is a probability measure on $(\mathbb{R}^2, \mathcal{B}_{\mathbb{R}^2})$.

Proof. Since $X^{-1}(\mathbb{R}^2) = \Omega$, we have
$$P_X(\mathbb{R}^2) = P(\Omega) = 1.$$
Let $B_1, B_2, \dots$ be pairwise disjoint elements from $\mathcal{B}_{\mathbb{R}^2}$. Then $X^{-1}(B_1), X^{-1}(B_2), \dots$ are pairwise disjoint and are in $\mathcal{F}$. Hence
$$P_X\Big(\bigcup_{n=1}^{\infty} B_n\Big) = P\Big(\bigcup_{n=1}^{\infty} X^{-1}(B_n)\Big) = \sum_{n=1}^{\infty} P\big(X^{-1}(B_n)\big) = \sum_{n=1}^{\infty} P_X(B_n).$$
This completes the proof.
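The pushforward construction in Theorem 5.0.18 can be sketched on a finite probability space. All concrete choices below (the sample space, the uniform measure, and the map $X$) are hypothetical, made only to illustrate that $P_X(B) = P(X^{-1}(B))$ really is a probability measure:

```python
from fractions import Fraction

# Hypothetical finite setting: Omega = {1,...,6} with the uniform measure,
# and a random vector X(w) = (w mod 2, w mod 3) taking values in R^2.
P = {w: Fraction(1, 6) for w in range(1, 7)}   # P on Omega
X = {w: (w % 2, w % 3) for w in P}             # the map X : Omega -> R^2

def law(B):
    # P_X(B) = P({w : X(w) in B}) = P(X^{-1}(B))
    return sum(P[w] for w in P if X[w] in B)

# P_X has total mass 1 ...
assert law(set(X.values())) == 1
# ... and is additive on disjoint sets, as the theorem asserts
B1, B2 = {(0, 0)}, {(1, 1)}
assert law(B1 | B2) == law(B1) + law(B2)
```

Exact rational arithmetic (`Fraction`) keeps the additivity check free of rounding error.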

Definition 5.3. The probability measure $P_X$ is called the law of the random vector $X = (X_1, X_2)$ and is denoted by $P_X$.

Definition 5.4. (joint distribution function) Let $X = (X_1, X_2)$ be a random vector. Then the function $F : \mathbb{R}^2 \to [0, 1]$ given by
$$F(x_1, x_2) = P(X_1 \le x_1,\ X_2 \le x_2)$$
is called the joint distribution function of $(X_1, X_2)$.

Theorem 5.0.19 Let $F$ be the joint distribution function of a random vector $(X_1, X_2)$. Then $F$ satisfies the following.
(i) (a) $\lim_{x_1 \to -\infty} F(x_1, x_2) = \lim_{x_2 \to -\infty} F(x_1, x_2) = 0$;
    (b) $\lim_{x_1 \to \infty,\ x_2 \to \infty} F(x_1, x_2) = 1$.
(ii) $F$ is right continuous in each argument.
(iii) $F$ is nondecreasing in each argument.

The proof of the above theorem is an easy exercise for the student. Given a random vector $(X_1, X_2)$, the distribution function of $X_1$, denoted by $F_{X_1}$, is called the marginal distribution function of $X_1$. Similarly the marginal distribution function $F_{X_2}$ of $X_2$ is defined. Given the joint distribution function $F$, one can recover the corresponding marginal distributions as follows:
$$F_{X_1}(x_1) = \lim_{x_2 \to \infty} F(x_1, x_2).$$
Similarly,
$$F_{X_2}(x_2) = \lim_{x_1 \to \infty} F(x_1, x_2).$$

Given the marginal distribution functions of $X_1$ and $X_2$, in general it is impossible to construct the joint distribution function. Note that the marginal distribution functions don't contain information about the dependence of $X_1$ over $X_2$ and vice versa. One can characterize the independence of $X_1$ and $X_2$ in terms of their joint and marginal distributions as in the following theorem. The proof is beyond the scope of this course.

Theorem 5.0.20 Let $(X_1, X_2)$ be a random vector with distribution function $F$. Then $X_1$ and $X_2$ are independent iff
$$F(x_1, x_2) = F_{X_1}(x_1)\, F_{X_2}(x_2) \quad \text{for all } (x_1, x_2) \in \mathbb{R}^2.$$

Definition 5.5. (joint pmf of discrete random vector) Let $(X_1, X_2)$ be a discrete random vector, i.e., $X_1, X_2$ are discrete random variables. Define $f : \mathbb{R}^2 \to [0, 1]$ by
$$f(x_1, x_2) = P(X_1 = x_1,\ X_2 = x_2).$$
Then $f$ is called the joint pmf of $(X_1, X_2)$.
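A concrete joint pmf can be tabulated directly. The example below is a hypothetical choice, not one from these notes: for two fair dice, take $X_1 = \min$ and $X_2 = \max$ of the two faces, build the joint pmf $f(x_1, x_2)$, and recover a marginal by summing out the other coordinate:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of (X1, X2) = (min, max) of two fair dice (hypothetical example),
# stored as a dict {(x1, x2): P(X1 = x1, X2 = x2)}.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (min(d1, d2), max(d1, d2))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

# a joint pmf sums to 1 over all its support
assert sum(pmf.values()) == 1

# marginal pmf of X1 = min, obtained by summing out X2
marg_min = {i: sum(p for (a, b), p in pmf.items() if a == i)
            for i in range(1, 7)}
assert marg_min[1] == Fraction(11, 36)  # P(min = 1) = 11/36
```

Note that here $f(1, 1) = 1/36 \neq$ `marg_min[1]` $\cdot$ `marg_min[1]`, so $X_1$ and $X_2$ are dependent, which is typical of min/max pairs.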

Definition 5.6. (joint pdf of continuous random vector) Let $(X_1, X_2)$ be a continuous random vector (i.e., $X_1, X_2$ are continuous random variables) with joint distribution function $F$. If there exists a function $f : \mathbb{R}^2 \to [0, \infty)$ such that
$$F(x_1, x_2) = \int_{-\infty}^{x_1} \int_{-\infty}^{x_2} f(u, v)\, dv\, du,$$
then $f$ is called the joint pdf of $(X_1, X_2)$.
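A quick numerical sanity check makes the definition concrete. The density $f(x_1, x_2) = x_1 + x_2$ on the unit square is a hypothetical example (it is a valid joint pdf); a midpoint-rule Riemann sum confirms that it integrates to 1 and recovers a marginal pdf value by integrating out one coordinate:

```python
# Hypothetical joint pdf: f(x1, x2) = x1 + x2 on (0,1) x (0,1), else 0.
def f(x1, x2):
    return x1 + x2 if 0.0 < x1 < 1.0 and 0.0 < x2 < 1.0 else 0.0

n = 400
h = 1.0 / n

# midpoint-rule double Riemann sum over the unit square: total mass must be 1
total = sum(f((i + 0.5) * h, (j + 0.5) * h)
            for i in range(n) for j in range(n)) * h * h
assert abs(total - 1.0) < 1e-6

# marginal pdf of X1 at x1 = 0.5: integral of f(0.5, x2) dx2 over (0,1),
# which equals 0.5 + 1/2 = 1.0 analytically
marg = sum(f(0.5, (j + 0.5) * h) for j in range(n)) * h
assert abs(marg - 1.0) < 1e-6
```

The midpoint rule is exact for this linear integrand, so the tolerances only absorb floating-point rounding.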

Theorem 5.0.21 Let $(X_1, X_2)$ be a continuous random vector with joint pdf $f$. Then
$$P\big((X_1, X_2) \in B\big) = \iint_B f(x_1, x_2)\, dx_1\, dx_2 \quad \text{for all } B \in \mathcal{B}_{\mathbb{R}^2}.$$

Proof. Note that the L.H.S. of the equality corresponds to the law of $(X_1, X_2)$. Let $\mathcal{C}$ denote the set of all finite unions of rectangles in $\mathbb{R}^2$. Then $\mathcal{C}$ is a field (exercise for the student). Set
$$\mu_1(B) = P\big((X_1, X_2) \in B\big), \qquad \mu_2(B) = \iint_B f(x_1, x_2)\, dx_1\, dx_2.$$
Then $\mu_1$ and $\mu_2$ are probability measures on $\mathcal{B}_{\mathbb{R}^2}$, and $\mu_1 = \mu_2$ on $\mathcal{C}$. Hence, using the extension theorem, we have $\mu_1 = \mu_2$ on $\sigma(\mathcal{C}) = \mathcal{B}_{\mathbb{R}^2}$, i.e.,
$$P\big((X_1, X_2) \in B\big) = \iint_B f(x_1, x_2)\, dx_1\, dx_2, \quad B \in \mathcal{B}_{\mathbb{R}^2}.$$
This completes the proof.

Example 5.0.34 Let $X_1, X_2$ be two random variables with a given joint pdf $f$. If $f_{X_1}$ and $f_{X_2}$ denote the marginal pdfs of $X_1$ and $X_2$ respectively, then
$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f(x_1, x_2)\, dx_2,$$
and in this example the integral works out to a normal density. Here $X \sim N(\mu, \sigma^2)$ means $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$. Similarly,
$$f_{X_2}(x_2) = \int_{-\infty}^{\infty} f(x_1, x_2)\, dx_1$$
is again a normal density. Also note that $X_1$ and $X_2$ are dependent, since
$$f(x_1, x_2) \neq f_{X_1}(x_1)\, f_{X_2}(x_2);$$
see exercise.

Theorem 5.0.22 Let $X_1, X_2$ be independent random variables with joint pdf $f$. Then the pdf of $X_1 + X_2$ is given by
$$f_{X_1 + X_2} = f_{X_1} * f_{X_2},$$
where $f_{X_1} * f_{X_2}$ denotes the convolution of $f_{X_1}$ and $f_{X_2}$ and is defined as
$$(f_{X_1} * f_{X_2})(z) = \int_{-\infty}^{\infty} f_{X_1}(z - x)\, f_{X_2}(x)\, dx.$$

Proof. Let $F_{X_1 + X_2}$ denote the distribution function of $X_1 + X_2$. Set $B_z = \{(x_1, x_2) \mid x_1 + x_2 \le z\}$. Then, using independence,
$$F_{X_1 + X_2}(z) = P(X_1 + X_2 \le z) = \iint_{B_z} f_{X_1}(x_1)\, f_{X_2}(x_2)\, dx_1\, dx_2 = \int_{-\infty}^{\infty} \Big(\int_{-\infty}^{z - x_2} f_{X_1}(x_1)\, dx_1\Big) f_{X_2}(x_2)\, dx_2.$$
Therefore, differentiating with respect to $z$,
$$f_{X_1 + X_2}(z) = \int_{-\infty}^{\infty} f_{X_1}(z - x_2)\, f_{X_2}(x_2)\, dx_2 = (f_{X_1} * f_{X_2})(z).$$
This completes the proof.

Example 5.0.35 Let $X_1, X_2$ be independent exponential random variables with parameters $\lambda_1$ and $\lambda_2$ respectively, with $\lambda_1 \neq \lambda_2$. Then
$$f_{X_1}(x) = \lambda_1 e^{-\lambda_1 x}, \quad x > 0,$$
and $f_{X_2}$ is given similarly. Now for $z \le 0$, clearly $f_{X_1 + X_2}(z) = 0$. For $z > 0$,
$$f_{X_1 + X_2}(z) = \int_0^z \lambda_1 e^{-\lambda_1 (z - x)}\, \lambda_2 e^{-\lambda_2 x}\, dx = \frac{\lambda_1 \lambda_2}{\lambda_1 - \lambda_2}\, \big(e^{-\lambda_2 z} - e^{-\lambda_1 z}\big).$$

Conditional Densities. The notion of conditional densities is intended to quantify the dependence of one random variable over the other when the random variables are not independent.

Definition 5.7. Let $X_1, X_2$ be two discrete random variables with joint pmf $f$. Then the conditional density of $X_1$ given $X_2 = x_2$, denoted by $f_{X_1 \mid X_2}(\cdot \mid x_2)$, is defined as
$$f_{X_1 \mid X_2}(x_1 \mid x_2) = \frac{f(x_1, x_2)}{f_{X_2}(x_2)}, \quad \text{whenever } f_{X_2}(x_2) > 0.$$

Intuitively, $f_{X_1 \mid X_2}(\cdot \mid x_2)$ means the pmf of $X_1$ given the information about $X_2$. Here information about $X_2$ means knowledge about the occurrence (or non-occurrence) of $\{X_2 = x_2\}$ for each $x_2$. One can rewrite $f_{X_1 \mid X_2}$ in terms of the pmfs as follows:
$$f_{X_1 \mid X_2}(x_1 \mid x_2) = \frac{P(X_1 = x_1,\ X_2 = x_2)}{P(X_2 = x_2)} = P(X_1 = x_1 \mid X_2 = x_2).$$
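Definition 5.7 can be exercised on a small table. As a hypothetical illustration, take the joint pmf of the minimum and maximum of two fair dice and condition the minimum on the value of the maximum:

```python
from fractions import Fraction
from itertools import product

# Hypothetical joint pmf: (X1, X2) = (min, max) of two fair dice.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (min(d1, d2), max(d1, d2))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

def cond_min_given_max(x1, x2):
    # f_{X1|X2}(x1 | x2) = f(x1, x2) / f_{X2}(x2)
    f_x2 = sum(p for (a, b), p in pmf.items() if b == x2)
    return pmf.get((x1, x2), Fraction(0)) / f_x2

# given max = 4: min = 4 with probability 1/7, and min in {1,2,3} with 2/7 each
assert cond_min_given_max(4, 4) == Fraction(1, 7)
assert cond_min_given_max(2, 4) == Fraction(2, 7)
# a conditional pmf sums to 1 over its first argument
assert sum(cond_min_given_max(k, 4) for k in range(1, 7)) == 1
```

The last assertion reflects the general fact that $f_{X_1 \mid X_2}(\cdot \mid x_2)$ is itself a pmf for each fixed $x_2$ with $f_{X_2}(x_2) > 0$.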

Definition 5.8. Let $X_1, X_2$ be continuous random variables with joint pdf $f$. The conditional distribution of $X_1$ given $X_2 = x_2$ is defined as
$$F_{X_1 \mid X_2}(x_1 \mid x_2) = \frac{\int_{-\infty}^{x_1} f(u, x_2)\, du}{f_{X_2}(x_2)}, \quad \text{whenever } f_{X_2}(x_2) > 0.$$

Definition 5.9. If $X_1, X_2$ are continuous random variables and if $f_{X_1 \mid X_2}$ denotes the conditional density of $X_1$ given $X_2 = x_2$, then for $f_{X_2}(x_2) > 0$,
$$f_{X_1 \mid X_2}(x_1 \mid x_2) = \frac{f(x_1, x_2)}{f_{X_2}(x_2)}.$$

Example 5.0.36 Let $X_1$ be a uniform random variable over $(0, 1)$ and, given $X_1 = x_1$, let $X_2$ be a uniform random variable over $(0, x_1)$, i.e.,
$$f_{X_1}(x_1) = 1, \quad 0 < x_1 < 1.$$
Note that the pdf of $X_2$ given $X_1 = x_1$ is that of a uniform random variable over $(0, x_1)$, i.e.,
$$f_{X_2 \mid X_1}(x_2 \mid x_1) = \frac{1}{x_1}, \quad 0 < x_2 < x_1.$$
Also
$$f(x_1, x_2) = f_{X_2 \mid X_1}(x_2 \mid x_1)\, f_{X_1}(x_1) = \frac{1}{x_1}, \quad 0 < x_2 < x_1 < 1.$$
Hence
$$f_{X_2}(x_2) = \int_{x_2}^{1} \frac{1}{x_1}\, dx_1 = -\ln x_2, \quad 0 < x_2 < 1.$$
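A Monte Carlo simulation of the two-stage construction in Example 5.0.36 (as reconstructed above: $X_1 \sim U(0,1)$, then $X_2 \mid X_1 = x_1 \sim U(0, x_1)$) can be checked against the tower-property computation $E[X_2] = E\big[E[X_2 \mid X_1]\big] = E[X_1/2] = 1/4$. The seed and sample size are hypothetical choices:

```python
import random

random.seed(1)
n = 400_000
total = 0.0
for _ in range(n):
    x1 = random.random()            # X1 ~ U(0, 1)
    x2 = random.uniform(0.0, x1)    # X2 | X1 = x1 ~ U(0, x1)
    total += x2

mean_x2 = total / n
# E[X2] = E[X1 / 2] = 1/4; Monte Carlo standard error here is about 4e-4
assert abs(mean_x2 - 0.25) < 0.005
```

Simulating conditionals "from the top down" like this is exactly the factorization $f(x_1, x_2) = f_{X_2 \mid X_1}(x_2 \mid x_1)\, f_{X_1}(x_1)$ used in the example.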
