
Imperial College London

BSc/MSci EXAMINATION May 2016


This paper is also taken for the relevant Examination for the Associateship

INFORMATION THEORY MOCK EXAM PAPER


For 4th-Year Physics Students
Friday, 13th May 2016: 14:00 to 15:00

The paper consists of two sections: A and B.


Section A contains one question [20 marks total].
Section B contains two questions [30 marks each].
Candidates are required to:
Answer ALL parts of Section A, and ONE QUESTION from Section B.
Marks shown on this paper are indicative of those the Examiners anticipate assigning.

General Instructions
Complete the front cover of each of the FOUR answer books provided.
If an electronic calculator is used, write its serial number at the top of the front cover of
each answer book.
USE ONE ANSWER BOOK FOR EACH QUESTION.
Enter the number of each question attempted in the box on the front cover of its corresponding answer book.
Hand in FOUR answer books even if they have not all been used.
You are reminded that Examiners attach great importance to legibility, accuracy and
clarity of expression.

© Imperial College London 2016



Note: these are MOCK problems!

Go to the next page for questions

SECTION A: COMPULSORY QUESTIONS [20 marks total]


This mock exam aims only to give you an idea of the exam structure!
1. How much information is gained when a die is thrown and...
(i) A four occurs

[5 marks]

(ii) A four or a six occurs


[You may assume that the die is fair and unbiased.]

[3 marks]
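A minimal Python sketch for parts (i) and (ii), assuming a fair six-sided die so that the information gained by an event of probability p is -log2(p):

```python
from math import log2

# Information gained from an event of probability p, in bits
def info_bits(p):
    return -log2(p)

# (i) A four occurs: p = 1/6 for a fair die
print(info_bits(1/6))   # ~2.585 bits

# (ii) A four or a six occurs: p = 2/6 = 1/3
print(info_bits(1/3))   # ~1.585 bits
```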

2. A newspaper picture consists of 50 rows and 40 columns of black and white dots. Calculate the information in the picture if black and white dots are
(iii) Equally probable

[3 marks]

(iv) The probability of white dots is three times that of black dots

[3 marks]
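A sketch for parts (iii) and (iv), assuming the 50 × 40 = 2000 dots are statistically independent so the picture carries 2000 times the per-dot entropy:

```python
from math import log2

def entropy(probs):
    # Shannon entropy in bits per symbol
    return -sum(p * log2(p) for p in probs)

n_dots = 50 * 40  # 2000 dots, assumed independent

# (iii) black and white equally probable
print(n_dots * entropy([0.5, 0.5]))      # 2000 bits

# (iv) white three times as likely as black: p(white)=3/4, p(black)=1/4
print(n_dots * entropy([0.75, 0.25]))    # ~1622.6 bits
```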

3. The Chinese alphabet consists of more than 10,000 characters, although only
6000 are in general use. How much information is gained when a single Chinese
character appears?
[6 marks]
[Total 20 marks]
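For question 3, a sketch assuming each of the 6000 characters in general use is equally likely:

```python
from math import log2

# With ~6000 equiprobable characters, the information per character is log2(6000)
print(log2(6000))   # ~12.55 bits
```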

Note: these are MOCK problems!

Please go to the next page

SECTION B: CHOOSE ONE QUESTION [30 marks total]


This mock exam aims only to give you an idea of the exam structure!
2. A message source produces two independent symbols A and B with probabilities
p(A)=1/5 and p(B)=4/5.
(i) Calculate the entropy of the source

[4 marks]

(ii) Calculate the redundancy

[4 marks]

(iii) What would the probabilities have to be to achieve the maximum entropy?
[6 marks]
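A sketch for parts (i)-(iii), taking redundancy as 1 - H/H_max relative to the 1 bit maximum of a binary source:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs)

# (i) entropy of the source with p(A)=1/5, p(B)=4/5
H = entropy([1/5, 4/5])
print(H)                    # ~0.722 bits/symbol

# (ii) redundancy relative to the 1 bit maximum of a binary source
H_max = log2(2)
print(1 - H / H_max)        # ~0.278, i.e. about 28%

# (iii) maximum entropy requires equiprobable symbols: p(A) = p(B) = 1/2
print(entropy([1/2, 1/2]))  # 1 bit/symbol
```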

A transmitter generates two symbols A and B with probabilities 1/4 and 3/4
respectively. Find the average information received every second if
(iv) A and B take 1s each to transmit

[3 marks]

(v) A takes 1s and B takes 2s

[3 marks]
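A sketch for parts (iv) and (v): the average information per second is the source entropy divided by the average symbol duration:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs)

pA, pB = 1/4, 3/4
H = entropy([pA, pB])          # ~0.811 bits per symbol

# (iv) both symbols take 1 s: average duration = 1 s
print(H / (pA * 1 + pB * 1))   # ~0.811 bits/s

# (v) A takes 1 s, B takes 2 s: average duration = 1/4*1 + 3/4*2 = 1.75 s
print(H / (pA * 1 + pB * 2))   # ~0.464 bits/s
```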

A transmitter produces four source symbols (A,B,C,D) with corresponding probabilities 0.5, 0.2, 0.2 and 0.1 at a rate of 1024 symbols/s.
(vi) Find the entropy

[4 marks]

(vii) Find the information rate and redundancy if the source produces 1024 symbols/s
and the symbols are statistically independent
[6 marks]
[Total 30 marks]
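A sketch for parts (vi) and (vii), again taking redundancy as 1 - H/H_max:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs)

probs = [0.5, 0.2, 0.2, 0.1]
H = entropy(probs)
print(H)                    # (vi) ~1.761 bits/symbol

# (vii) information rate for 1024 independent symbols per second
print(1024 * H)             # ~1803 bits/s

H_max = log2(len(probs))    # 2 bits for four symbols
print(1 - H / H_max)        # redundancy ~0.12, i.e. about 12%
```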

Note: these are MOCK problems!

Please go to the next page

3. A scanner converts a black and white document, line by line, into binary data for
transmission. The scanner produces source data consisting of symbols which represent runs of pixels of different lengths as indicated in the table below.
(i) Determine the entropy when the scanner traverses at 1000 pixels/s.

[7 marks]

(ii) Determine the average length of a run (in pixels) and the corresponding effective
information rate when the scanner traverses at 1000 pixels/s.
[7 marks]

No. of pixels:  -      -      -      -      -      -
Prob.:          0.2    0.4    0.15   0.1    0.06   0.09

Table 1
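A sketch for parts (i) and (ii); the run-length row of Table 1 is incomplete above, so the run_lengths values below are hypothetical placeholders and should be replaced by the table's actual values:

```python
from math import log2

# Probabilities from Table 1
probs = [0.2, 0.4, 0.15, 0.1, 0.06, 0.09]

# Run lengths in pixels: hypothetical placeholders, substitute the Table 1 values
run_lengths = [1, 2, 3, 4, 5, 6]

# (i) entropy per run symbol
H = -sum(p * log2(p) for p in probs)
print(H)                             # ~2.29 bits/symbol

# (ii) average run length (pixels) and effective information rate at 1000 pixels/s
avg_run = sum(p * n for p, n in zip(probs, run_lengths))
symbol_rate = 1000 / avg_run         # run symbols per second
print(avg_run, H * symbol_rate)      # average run (pixels), rate (bits/s)
```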

A deadly disease affects 1 in every 1000 people. A test is available which is
positive 95% of the time and negative 5% of the time if you have the disease. If
you do not have the disease, the test is positive only 5% of the time. Suppose
you test positive for the disease. What is the probability that you actually have
it?
[4 marks]
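A Bayes' theorem sketch for the disease-test question:

```python
# Bayes' theorem: P(disease | positive) =
#   P(positive | disease) * P(disease) / P(positive)
p_disease = 1 / 1000
p_pos_given_disease = 0.95
p_pos_given_healthy = 0.05

p_positive = (p_pos_given_disease * p_disease
              + p_pos_given_healthy * (1 - p_disease))
print(p_pos_given_disease * p_disease / p_positive)   # ~0.0187, i.e. about 1.9%
```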

Find a compact instantaneous code for an alphabet consisting of five symbols
S1-S5 with probabilities 1/2, 1/4, 1/8, 1/16 and 1/16. Show that the average code
length is equal to the source entropy.
[3 marks]
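A sketch for the compact code question, taking the probabilities as the dyadic set 1/2, 1/4, 1/8, 1/16, 1/16 so that a prefix-free code with lengths -log2(p) (e.g. 0, 10, 110, 1110, 1111) exists and the average length matches the entropy exactly:

```python
from math import log2

# Dyadic probabilities: an instantaneous code can use lengths exactly -log2(p)
probs = [1/2, 1/4, 1/8, 1/16, 1/16]
lengths = [int(-log2(p)) for p in probs]        # [1, 2, 3, 4, 4]

H = -sum(p * log2(p) for p in probs)            # source entropy
L = sum(p * l for p, l in zip(probs, lengths))  # average code length
print(H, L)                                     # both 1.875 bits
```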

A binary symmetric channel has a binary error probability of 1/3. Two equiprobable input symbols A and B are to be transmitted, coded as 11 and 00 respectively. Determine:
(iii) The channel capacity

[3 marks]

(iv) The information transfer through the channel, assuming that the receiver interprets 11 as A, 00 as B and makes no decision for 01 and 10
[2 marks]
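A sketch for parts (iii) and (iv); part (iv) is read here as the mutual information between the equiprobable inputs and the three receiver outcomes (decide A, decide B, no decision):

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

p = 1/3   # bit error probability of the binary symmetric channel

# (iii) BSC capacity per channel use: C = 1 - H_b(p)
print(1 - entropy([p, 1 - p]))   # ~0.082 bits per channel use

# (iv) A->11, B->00, equiprobable; 11->A, 00->B, no decision on 01 or 10
p_correct = (1 - p) ** 2         # both bits intact: 4/9
p_wrong = p ** 2                 # both bits flipped: 1/9
p_erase = 2 * p * (1 - p)        # exactly one bit flipped: 4/9

# Mutual information I(X;Y) = H(Y) - H(Y|X) with equiprobable inputs
p_decide_A = 0.5 * p_correct + 0.5 * p_wrong
p_decide_B = p_decide_A          # by symmetry
H_Y = entropy([p_decide_A, p_decide_B, p_erase])
H_Y_given_X = entropy([p_correct, p_wrong, p_erase])
print(H_Y - H_Y_given_X)         # ~0.15 bits per transmitted symbol
```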

If you remember mod operations...


(v) Show that 266^4 mod 115 = 41

[2 marks]

(vi) Show that 266 mod 11 is not 41

[2 marks]
[Total 30 marks]
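A quick check of parts (v) and (vi), reading part (v) as the modular power 266^4 mod 115:

```python
# (v) modular exponentiation: 266^4 mod 115
print(pow(266, 4, 115))   # 41

# (vi) by contrast, 266 mod 11
print(266 % 11)           # 2, which is not 41
```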

Note: these are MOCK problems!

End of examination paper
