
Source Coding in Digital Communication Systems

Syeda Quratulain Ali

Digital Communication

Digital communication is communication that has a digital interface (the standard is binary) between source and destination.
Concept Q1: Why do we need a digital interface?

Standardization

Layering

Concept Q2: How is a digital interface practically realized?

By introducing a source encoder and a corresponding decoder at the other end of the channel.

Digital Communication Model

Fig 1.1

Fig 1.2a

Shannon's Views

Can we link probability with communication?

May we know ahead of time what is expected to come next, from the receiver's end?

Shannon introduced the concept of entropy.

In simple words: Shannon entropy is the expected value (average) of the information contained in each message.

The more certain you are of what you are going to receive, the lower the entropy of the communication system.

Information Theory

Source emits letters in a probabilistic fashion


$s(m) \in \{a_1, \ldots, a_K\} = A$

$0 \le \Pr(a_k) \le 1, \quad k = 1, \ldots, K$

$\sum_{k=1}^{K} \Pr(a_k) = 1$

Alphabet A known by source and sink
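
A minimal sketch (not from the slides) of such a probabilistic source in Python; the alphabet labels and probabilities below are illustrative assumptions:

import random

# Alphabet A, known to both source and sink, with assumed probabilities Pr(a_k)
A = ["a1", "a2", "a3", "a4"]
probs = [0.5, 0.25, 0.125, 0.125]  # must sum to 1

# The source emits a sequence of letters s(1), s(2), ..., drawn at random
sequence = random.choices(A, weights=probs, k=10)
print(sequence)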

Issues

How should letters be represented by bits? Any limit to compression?

Can we mitigate the errors introduced by the channel?

Fig 1.2b

Why: Source Coding Theorem

The theory provides answers to two fundamental questions (among others):

What is the irreducible complexity below which a signal cannot be compressed?

What is the ultimate transmission rate for reliable communication over a noisy channel?

Source Coding Theorem: Definitions

Define the source entropy H(A):

$H(A) = -\sum_{k} \Pr(a_k) \log_2 \Pr(a_k)$ bits

Entropy is maximized when letters are equally likely:

$\Pr(a_k) = \tfrac{1}{K} \;\Rightarrow\; H(A) = \log_2 K$

Entropy is minimized when one letter occurs with probability one:

$\Pr(a_k) = 1 \;\Rightarrow\; H(A) = 0$
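
As a quick sanity check, here is a small Python sketch (an illustration, not from the slides) of the entropy formula and its two extreme cases:

import math

def entropy(probs):
    # Shannon entropy in bits; terms with p = 0 contribute nothing (0 log 0 = 0)
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 4
print(entropy([1 / K] * K))           # 2.0 == log2(4): maximum, equally likely
print(entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0: minimum, one certain letter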

Entropy Calculation Example

$\Pr(a_1) = \tfrac{1}{2}, \quad \Pr(a_2) = \tfrac{1}{4}, \quad \Pr(a_3) = \tfrac{1}{8}, \quad \Pr(a_4) = \tfrac{1}{8}$

$H(A) = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{4}\log_2\tfrac{1}{4} - \tfrac{1}{8}\log_2\tfrac{1}{8} - \tfrac{1}{8}\log_2\tfrac{1}{8}$

$= \tfrac{1}{2}(1) + \tfrac{1}{4}(2) + \tfrac{1}{8}(3) + \tfrac{1}{8}(3) = 1.75$ bits
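
The same kind of entropy helper reproduces the hand calculation:

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75 bits, matching the worked example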

Practice Example

Find the entropy, using
$H(A) = -\sum_{k} \Pr(a_k) \log_2 \Pr(a_k)$ bits

Letter   Pr
a1       0.4
a2       0.2
a3       0.15
a4       0.15
a5       0.1
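
The answer can be checked numerically with the same entropy sketch as before (only the probabilities from the table matter, not the letter labels):

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Probabilities from the practice table
print(entropy([0.4, 0.2, 0.15, 0.15, 0.1]))  # about 2.15 bits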

Source Coding Theorem


$B(a_k)$ = number of bits assigned to $a_k$

$\bar{B}(A) = \sum_{k=1}^{K} \Pr(a_k)\, B(a_k)$ = average number of bits to represent A

Shannon (1948)
There exists a source coder and decoder that can represent and retrieve the source's alphabet without error if

$H(A) \le \bar{B}(A) \le H(A) + 1$

Lossless compression

Furthermore, no error-free source coder/decoder exists if

$\bar{B}(A) < H(A)$

Lossless compression is then impossible: the average length cannot fall below the entropy.
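
A small sketch of the theorem's quantities; the code lengths below are an assumed example (they anticipate the variable-length code shown a few slides later):

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [1/2, 1/4, 1/8, 1/8]   # Pr(a_k)
lengths = [1, 2, 3, 3]         # assumed B(a_k) for each letter

B_avg = sum(p * b for p, b in zip(probs, lengths))  # average bits per letter
H = entropy(probs)
print(H, B_avg, H <= B_avg <= H + 1)  # 1.75 1.75 True: within Shannon's bounds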

Compression Schemes

Lossless Compression:

LZW (Lempel-Ziv-Welch), zip, Huffman, simple codes

Lossy Compression:

JPEG, MPEG, MP3, ...
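
As a concrete taste of lossless compression, Python's standard zlib module (a DEFLATE implementation, the same family as zip) recovers the original data exactly; the input string here is an arbitrary example:

import zlib

data = b"abracadabra " * 100            # repetitive data compresses well
packed = zlib.compress(data)
print(len(data), len(packed))           # compressed size is much smaller
assert zlib.decompress(packed) == data  # lossless: exact recovery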

Compression Example

$\Pr(a_1) = \tfrac{1}{2}, \quad \Pr(a_2) = \tfrac{1}{4}, \quad \Pr(a_3) = \tfrac{1}{8}, \quad \Pr(a_4) = \tfrac{1}{8}, \qquad H(A) = 1.75$ bits

Shannon's bound: $1.75 \le \bar{B}(A) \le 2.75$ bits

Simple binary code: $B(a_k) = \log_2 K = \bar{B}(A)$

$a_1 \to 00, \quad a_2 \to 01, \quad a_3 \to 10, \quad a_4 \to 11, \qquad \bar{B}(A) = 2.0$ bits

Better binary code (variable-length code):

$a_1 \to 0, \quad a_2 \to 10, \quad a_3 \to 110, \quad a_4 \to 111, \qquad \bar{B}(A) = 1.75$ bits

But there is an issue: the letters run together in the transmitted bitstream, e.g.

$a_1 a_1 a_3\, a_1 a_2\, a_1 a_4 \to 001100100111$
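
The issue is how the receiver splits the concatenated bitstream back into letters. Because this particular code is prefix-free (no codeword is a prefix of another), reading bits left to right and emitting a letter whenever a complete codeword appears is unambiguous; a sketch:

# The variable-length code from the example above
decode = {"0": "a1", "10": "a2", "110": "a3", "111": "a4"}

bits, buffer, letters = "001100100111", "", []
for b in bits:
    buffer += b
    if buffer in decode:  # a full codeword has been read
        letters.append(decode[buffer])
        buffer = ""
print(letters)  # ['a1', 'a1', 'a3', 'a1', 'a2', 'a1', 'a4']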

Huffman Codes

Huffman Codes Example

Compress the word "zebra" using the Huffman algorithm, where the probabilities of the letters in "zebra" are:

Letter   Probability
z        1/16
e        1/2
b        1/8
r        1/16
a        1/4
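
A compact Python sketch of the Huffman algorithm applied to this table (the letter-to-probability pairing is reconstructed from the slide's row order):

import heapq
from itertools import count

freqs = {"z": 1/16, "e": 1/2, "b": 1/8, "r": 1/16, "a": 1/4}

# Heap of subtrees: (probability, tiebreaker, {letter: codeword-so-far})
tie = count()
heap = [(p, next(tie), {ch: ""}) for ch, p in freqs.items()]
heapq.heapify(heap)

while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)  # merge the two least probable subtrees,
    p1, _, c1 = heapq.heappop(heap)  # prefixing 0 to one side and 1 to the other
    merged = {ch: "0" + w for ch, w in c0.items()}
    merged.update({ch: "1" + w for ch, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, next(tie), merged))

codebook = heap[0][2]
print(codebook)  # e.g. {'e': '0', 'a': '10', 'b': '110', 'z': '1110', 'r': '1111'}
avg = sum(freqs[ch] * len(w) for ch, w in codebook.items())
print(avg)  # 1.875 bits: equals H(A), since the probabilities are powers of 1/2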

