Claude Shannon
Gowshigan.S
Asian Institute of Technology Thailand
Information Theory and Coding 20000471
07.04.2014
Gowshigan.S (AIT)
Presentation
07.04.2014
1 / 20
Overview
1. Introduction
2. Communication System
Introduction
Claude Elwood Shannon was an American mathematician, electrical engineer, and cryptographer, known as the father of information theory.
Shannon is famous for having founded information theory with a landmark paper, "A Mathematical Theory of Communication", published in 1948.
Communication System
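The figure slides for this section did not survive extraction. As a minimal illustration of the source-to-destination chain a communication system models, the sketch below passes a bit sequence through a binary symmetric channel; the function name, flip probability, and message are assumptions chosen for the example, not content from the slides.

```python
import random

def transmit(bits, flip_prob=0.1, seed=0):
    """Pass a bit sequence through a binary symmetric channel:
    each bit is flipped independently with probability flip_prob."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < flip_prob) for b in bits]

# source -> channel (noise) -> receiver
message = [1, 0, 1, 1, 0, 0, 1, 0]
received = transmit(message, flip_prob=0.2)
errors = sum(m != r for m, r in zip(message, received))
print("received:", received, "bit errors:", errors)
```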
Let us define a quantity that will measure how much information is produced by a process.
Suppose we have a set of possible events whose probabilities of occurrence are p1, p2, ..., pn.
If there is such a measure, say H(p1, p2, ..., pn), it should satisfy:
H should be continuous in the pi.
If all the pi are equal, pi = 1/n, then H should be a monotonically increasing function of n. With equally likely events there is more choice, or uncertainty, when there are more possible events.
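The measure satisfying these requirements is Shannon's entropy, H = -Σ pi log pi. A short sketch (using log base 2, so H is in bits) checks the monotonicity requirement for equally likely events:

```python
import math

def H(probs):
    """Shannon entropy in bits: H = -sum p_i log2 p_i, with 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For n equally likely events, H(1/n, ..., 1/n) = log2 n, which
# grows with n: more possible events means more uncertainty.
for n in (2, 4, 8):
    print(n, H([1.0 / n] * n))  # prints 1.0, 2.0, 3.0 bits
```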
H = 0 if and only if all the pi but one are zero, that one having the value unity. Thus H vanishes only when we are certain of the outcome; otherwise H is positive.
For a given n, H is a maximum, equal to log n, when all the pi are equal (i.e., pi = 1/n). This is also intuitively the most uncertain situation.
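Both properties can be checked numerically; a minimal sketch (entropy in bits, n = 4 chosen for the example):

```python
import math

def H(probs):
    """Shannon entropy in bits: H = -sum p_i log2 p_i, with 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# H vanishes exactly when one outcome is certain...
assert H([1.0, 0.0, 0.0]) == 0.0
# ...and for a given n it is maximal, log2 n, at the uniform distribution.
n = 4
assert H([0.7, 0.1, 0.1, 0.1]) < math.log2(n)
assert H([1.0 / n] * n) == math.log2(n)
```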
Consider a discrete source of the finite state type considered above. For each possible state i there will be a set of probabilities pi(j) of producing the various possible symbols j.
Thus there is an entropy Hi for each state.
The entropy of the source is defined as the average of these Hi, weighted according to the probability of occurrence of the states in question:

H = Σi Pi Hi = −Σi,j Pi pi(j) log pi(j)

This is the entropy of the source per symbol of text.
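The weighted average above can be sketched directly. The two-state source below is hypothetical: the emission probabilities pi(j) and the state probabilities Pi are assumptions invented for the example (Pi would be the stationary distribution of the source's state process).

```python
import math

def state_entropy(dist):
    """Entropy H_i (bits) of one state's symbol distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical two-state source: from state i, symbol j is emitted
# with probability p[i][j].
p = [[0.9, 0.1],   # state 0: heavily biased toward symbol 0
     [0.5, 0.5]]   # state 1: a fair coin
P = [0.8, 0.2]     # assumed probabilities of being in each state

# H = sum_i P_i H_i : average entropy per symbol of the source
H_source = sum(Pi * state_entropy(pi) for Pi, pi in zip(P, p))
print(H_source)
```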
Discussion
The End