
Mathematical Tools For Finance

PROJECT 1

This project is an exercise in implementing the Metropolis algorithm and
estimating the expectation of a distribution with a given density, as
described in the lecture.

The aim of the project is to write MATLAB code implementing the Metropolis
algorithm, to plot the relevant graphs, and to analyse the variance of the
estimator.

The software used for this purpose is MATLAB.

MSc FINANCIAL MATHEMATICS

Rishika Saluja 1062723

Lin Helen Zhang 0710178


INTRODUCTION

Monte Carlo and Markov chain methods are used to value and analyse investment
options in the following ways:

1. Simulating the different sources of uncertainty affecting the value of a
portfolio and then averaging the value over the simulated outcomes (Monte Carlo)
2. Modelling a random process that moves from one state to another in a chain-like
manner, where the next state depends only on the current state and not on the past
(Markov chain)

Monte Carlo methods are stochastic methods, i.e. they rely on random numbers and
statistical techniques to find solutions: a large system is sampled at random and
the sampled data are used to describe the whole system. The advantage of this
approach grows as the dimension of the problem increases.
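As a minimal illustration of this idea (not part of the project code), the MATLAB
sketch below estimates E[X^2] for a standard normal X by averaging over random
draws; the sample size N is an arbitrary choice.

% Minimal Monte Carlo sketch (illustrative only): estimate E[X^2] for X ~ N(0,1).
% The true value is 1, so the error of the estimate can be checked directly.
N  = 1e5;                        % number of random samples (arbitrary choice)
x  = randn(N, 1);                % draw N standard normal variates
estimate = mean(x.^2);           % Monte Carlo estimate of the expectation
se       = std(x.^2) / sqrt(N);  % standard error of the estimator
fprintf('estimate = %.4f, standard error = %.4f\n', estimate, se);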

A Markov chain, on the other hand, is a sequence of random variables X1, X2, X3, ...
satisfying the Markov property

P(Xn+1 = x | X1 = x1, ..., Xn = xn) = P(Xn+1 = x | Xn = xn),

where the possible values of the Xi form a countable set S (the state space). The
chain can be drawn as a graph in which each edge is labelled with the probability
of going from one state to another.
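As an illustrative sketch (the transition matrix P is a made-up example, not taken
from the lecture), the following MATLAB lines simulate a two-state Markov chain,
where P(i,j) is the probability of moving from state i to state j.

% Illustrative sketch: simulate a two-state Markov chain.
% P is a hypothetical transition matrix; P(i,j) = probability of moving from state i to j.
P = [0.9 0.1;
     0.2 0.8];
nSteps   = 1000;
state    = zeros(nSteps, 1);
state(1) = 1;                                           % arbitrary starting state
for n = 1:nSteps-1
    state(n+1) = find(rand < cumsum(P(state(n), :)), 1); % sample the next state
end
% The fraction of time spent in each state approximates the stationary distribution.
disp(histcounts(state, 1:3) / nSteps);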

The combination of the two methods above is called Markov chain Monte Carlo (MCMC).
The basic difference from plain Monte Carlo sampling is that the generated points
form a sequence with the behaviour of a random walk, rather than each point being
independent of the others. The sequence still has the Markov property: the
probability of jumping from one point to another depends only on the last point and
not on the entire previous history.

The Metropolis algorithm is an MCMC method for obtaining a sequence of random
samples from a probability distribution for which direct sampling is difficult.
This sequence can be used to approximate the distribution (e.g. to generate
histograms) or to compute an integral (such as an expected value). The algorithm
can take essentially any probability distribution and generate samples from it;
the basic requirement is only that a function proportional to the density can be
calculated.
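For example (an illustrative choice, not the density from the lecture), such an
unnormalised density can be stored in MATLAB as a function handle and evaluated
pointwise:

% Illustrative unnormalised density: proportional to a standard normal; the
% normalising constant 1/sqrt(2*pi) is deliberately omitted.
f = @(x) exp(-x.^2 / 2);
f(0.5)   % the algorithm only ever needs such pointwise evaluations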

Using this algorithm, we can generate a Markov chain in which each state x_{t+1}
depends only on the previous state x_t. A proposal density q(x'|x_t), which depends
on the current state x_t, is used to generate a new proposed sample x'. A random
number u is then drawn from U(0,1), the uniform distribution on [0,1], and the
proposal is accepted if it satisfies u <= f(x')/f(x_t), where f is a function
proportional to the target density.
Step-by-Step Analysis:

1. Let the most recent value in the chain be x_t.

2. Generate a new proposed state x' from the proposal density q(x'|x_t).

3. Calculate the acceptance ratio a = f(x')/f(x_t).

4. The new state is then chosen accordingly: draw u from U(0,1); if u <= a, accept
the proposal and set x_{t+1} = x', otherwise reject it and set x_{t+1} = x_t (a
MATLAB sketch of these steps follows this list).

5. The algorithm works best if the proposal density matches the shape of the target
distribution, that is q(x'|x_t) is approximately f(x').
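A minimal MATLAB sketch of these five steps is given below. It assumes a standard
normal target density, as in the example above, and a Gaussian random-walk proposal
with standard deviation sigma; both choices are illustrative and are not the
distribution prescribed in the lecture.

% Minimal Metropolis sketch (illustrative assumptions: standard normal target,
% Gaussian random-walk proposal with standard deviation sigma).
f     = @(x) exp(-x.^2 / 2);   % function proportional to the target density
sigma = 1.0;                   % proposal standard deviation (tuning parameter)
nIter = 1e4;                   % chain length
x     = zeros(nIter, 1);       % storage for the chain
x(1)  = 0;                     % arbitrary starting point
nAccept = 0;
for t = 1:nIter-1
    xProp = x(t) + sigma * randn;   % step 2: propose x' ~ N(x_t, sigma^2)
    a = f(xProp) / f(x(t));         % step 3: acceptance ratio
    if rand <= a                    % step 4: accept with probability min(1, a)
        x(t+1) = xProp;
        nAccept = nAccept + 1;
    else
        x(t+1) = x(t);              % reject: chain stays at the current state
    end
end
fprintf('acceptance rate = %.2f\n', nAccept / (nIter-1));
histogram(x);                       % compare the samples with the target density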

"It has been shown theoretically that the ideal acceptance rate for a one-dimensional
Gaussian distribution is approximately 50%, decreasing to approximately 23% for an
N-dimensional Gaussian target distribution" (Wikipedia).

"If σ is too large, the acceptance rate will be very low because the proposals are
likely to land in regions of much lower probability density, so the acceptance ratio
will be very small and again the chain will converge very slowly" (Wikipedia).
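Here σ denotes the standard deviation of the Gaussian random-walk proposal. Its
effect can be checked numerically: the sketch below (again assuming a standard
normal target, as above) runs short chains for several values of σ and prints the
observed acceptance rate, which falls as σ grows.

% Illustrative check of how the proposal width sigma affects the acceptance rate
% (standard normal target assumed, as in the sketch above).
f = @(x) exp(-x.^2 / 2);
for sigma = [0.1 1 10]
    x = 0; nAccept = 0; nIter = 1e4;
    for t = 1:nIter
        xProp = x + sigma * randn;
        if rand <= f(xProp) / f(x)
            x = xProp; nAccept = nAccept + 1;
        end
    end
    fprintf('sigma = %5.1f  acceptance rate = %.2f\n', sigma, nAccept / nIter);
end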

ISSUES WITH MCMC


1. Each point generated in the sequence is correlated with the immediately
preceding points, so the chain moves through the distribution slowly and irregularly.

2. If a proposed point is not accepted, the chain remains at the same position in
the next step as in the previous one.

3. The initial part of the sequence is strongly influenced by the arbitrary starting
point, so this "burn-in" part must be removed (see the sketch after this list).
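As a small illustration of the third point (assuming the chain x produced by the
Metropolis sketch above), the lines below discard an initial burn-in portion before
estimating the expectation and the sample variance; the burn-in length of 1000 is an
arbitrary choice.

% Illustrative burn-in removal (assumes the chain x from the Metropolis sketch above).
nBurn   = 1000;               % arbitrary burn-in length
xKept   = x(nBurn+1:end);     % discard the part influenced by the starting point
estMean = mean(xKept);        % estimate of the expectation under the target density
estVar  = var(xKept);         % sample variance of the kept chain
fprintf('estimated mean = %.3f, sample variance = %.3f\n', estMean, estVar);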
