Chapter 15

15.1-1  P₁ = 0.4, P₂ = 0.3, P₃ = 0.2 and P₄ = 0.1.

  H(m) = -(P₁ log₂ P₁ + P₂ log₂ P₂ + P₃ log₂ P₃ + P₄ log₂ P₄) = 1.846 bits (source entropy)

There are 10⁴ symbols/s. Hence, the rate of information generation is 1.846 × 10⁴ bits/s.

15.1-2  Information/element = log₂ 10 = 3.32 bits.
  Information/picture frame = 3.32 × 300,000 = 9.96 × 10⁵ bits.

15.1-3  Information/word = log₂ 10,000 = 13.3 bits.
  Information content of 1000 words = 13.3 × 1000 = 13,300 bits.
The information per picture frame was found in Problem 15.1-2 to be 9.96 × 10⁵ bits. Obviously, it is not possible, in general, to describe a picture completely by 1000 words. Hence, "a picture is worth 1000 words" very much understates reality.

15.1-4  (a) Both options are equally likely. Hence,
  I = log₂ 2 = 1 bit
(b) P(2 lanterns) = 0.1, so
  I(2 lanterns) = log₂ 10 = 3.322 bits

15.1-5  (a) With all 27 symbols equiprobable, P(xᵢ) = 1/27 and
  H₁(x) = 27 · (1/27) log₂ 27 = 4.755 bits/symbol
(b) Using the probability table, we compute
  H₂(x) = -Σᵢ P(xᵢ) log₂ P(xᵢ) = 4.127 bits/symbol
(c) Using Zipf's law, we compute the entropy/word:
  H(x) = -Σᵣ P(r) log₂ P(r) = 11.82 bits/word
  H/letter = 11.82/5.5 = 2.14 bits/symbol
The entropy obtained by Zipf's law is much closer to the real value than H₁(x) or H₂(x).

15.2-1
  H(m) = Σᵢ Pᵢ log₂(1/Pᵢ) = 63/32 = 1.969 bits

  Message   Probability   Code
  m₁        1/2           0
  m₂        1/4           10
  m₃        1/8           110
  m₄        1/16          1110
  m₅        1/32          11110
  m₆        1/64          111110
  m₇        1/64          111111

  L = Σᵢ Pᵢ Lᵢ = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/16)(4) + (1/32)(5) + (1/64)(6) + (1/64)(6) = 63/32 binary digits

  Efficiency η = [H(m)/L] × 100 = 100%
  Redundancy γ = 100 - η = 0%

15.2-2
  H(m) = -Σᵢ Pᵢ log₂ Pᵢ = 2.289 bits = 2.289/log₂ 3 = 1.444 3-ary units

  Message   Probability   Code
  m₁        1/3           0
  m₂        1/3           1
  m₃        1/9           20
  m₄        1/9           21
  m₅        1/27          220
  m₆        1/27          221
  m₇        1/27          222

  L = Σᵢ Pᵢ Lᵢ = (1/3)(1) + (1/3)(1) + (1/9)(2) + (1/9)(2) + (1/27)(3) + (1/27)(3) + (1/27)(3) = 13/9 = 1.444 3-ary digits

  Efficiency η = [H(m)/L] × 100 = 100%
  Redundancy γ = 100 - η = 0%
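These entropy and rate figures are easy to check numerically. A minimal sketch (the helper name `entropy` is my own, not from the text):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy -sum(p log p) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Problem 15.1-1: four-symbol source at 10^4 symbols/s
H = entropy([0.4, 0.3, 0.2, 0.1])     # ~1.846 bits/symbol
rate = 1e4 * H                        # ~1.846e4 bits/s

# Problem 15.2-1: dyadic source; H is exactly 63/32 bits, so the code
# with lengths 1,2,3,4,5,6,6 achieves L = H (100% efficiency)
H_dyadic = entropy([1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/64])
```

Whenever every probability is a negative power of the code radix (a dyadic source, for binary codes), the ideal length log₂(1/Pᵢ) is an integer for each message and zero redundancy is attainable.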
15.2-3
  H(m) = -Σᵢ Pᵢ log₂ Pᵢ = 1.685 bits

  Message   Probability   Code
  m₁        0.5           0
  m₂        0.3           10
  m₃        0.1           110
  m₄        0.1           111

  L = Σᵢ Pᵢ Lᵢ = 0.5(1) + 0.3(2) + 0.1(3) + 0.1(3) = 1.7 binary digits
  Efficiency η = (1.685/1.7) × 100 = 99.1%
  Redundancy γ = 100 - η = 0.9%

For ternary coding, we need one dummy message of probability 0. Thus,

  Message      Probability   Code
  m₁           0.5           0
  m₂           0.3           1
  m₃           0.1           20
  m₄           0.1           21
  m₅ (dummy)   0             22

  L = 0.5(1) + 0.3(1) + 0.1(2) + 0.1(2) = 1.2 3-ary digits
  H(m) = 1.685 bits = 1.685/log₂ 3 = 1.063 3-ary units
  Efficiency η = (1.063/1.2) × 100 = 88.6%
  Redundancy γ = 100 - η = 11.4%

15.2-4  Ternary coding of the source of Problem 15.2-1 (with 7 messages, no dummy message is needed):

  Message   Probability   Code
  m₁        1/2           0
  m₂        1/4           1
  m₃        1/8           20
  m₄        1/16          21
  m₅        1/32          220
  m₆        1/64          221
  m₇        1/64          222

  L = Σᵢ Pᵢ Lᵢ = (1/2)(1) + (1/4)(1) + (1/8)(2) + (1/16)(2) + (1/32)(3) + (1/64)(3) + (1/64)(3) = 21/16 = 1.3125 3-ary digits

From Problem 15.2-1, H(m) = 63/32 bits = 1.242 3-ary units.
  Efficiency η = (1.242/1.3125) × 100 = 94.6%
  Redundancy γ = 100 - η = 5.4%

15.2-5

  Message   Probability   Code
  m₁        1/3           1
  m₂        1/3           00
  m₃        1/9           011
  m₄        1/9           0100
  m₅        1/27          01010
  m₆        1/27          010110
  m₇        1/27          010111

  L = Σᵢ Pᵢ Lᵢ = 65/27 = 2.4074 binary digits
  H(m) = 2.289 bits (see Problem 15.2-2)
  Efficiency η = (2.289/2.4074) × 100 = 95.1%
  Redundancy γ = 100 - η = 4.9%

15.2-6  (a) H(m) = 3 · (1/3) log₂ 3 = log₂ 3 = 1.585 bits

(b) Ternary code:

  Message   Probability   Code
  m₁        1/3           0
  m₂        1/3           1
  m₃        1/3           2

  L = (1/3)(1) + (1/3)(1) + (1/3)(1) = 1 3-ary digit
  H(m) = 1.585 bits = 1.585/log₂ 3 = 1 3-ary unit
  Efficiency η = [H(m)/L] × 100 = 100%
  Redundancy γ = 100 - η = 0%

(c) Binary code:

  Message   Probability   Code
  m₁        1/3           0
  m₂        1/3           10
  m₃        1/3           11

  L = (1/3)(1) + (1/3)(2) + (1/3)(2) = 5/3 = 1.667 binary digits
  Efficiency η = (1.585/1.667) × 100 = 95.1%
  Redundancy γ = 100 - η = 4.9%

(d) Second extension, binary code: the nine message pairs each have probability 1/9. A binary Huffman code assigns 3-digit codewords to seven of the pairs and 4-digit codewords to the remaining two, so

  L = (1/9)[7(3) + 2(4)] = 29/9 binary digits/pair = 29/18 = 1.611 binary digits/symbol
  H(m) = 1.585 bits
  Efficiency η = (1.585/1.611) × 100 = 98.4%
  Redundancy γ = 100 - η = 1.6%
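The average lengths quoted in these problems can be reproduced with a small Huffman routine. A sketch for the binary case, applied to the source of Problem 15.2-5 (the helper `huffman_lengths` is mine, not from the text):

```python
import heapq
import math

def huffman_lengths(probs):
    """Return binary-Huffman codeword lengths, one per input symbol."""
    # heap entries: (probability, unique tiebreak, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1      # each symbol under the merge gets one digit deeper
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

probs = [1/3, 1/3, 1/9, 1/9, 1/27, 1/27, 1/27]
lengths = huffman_lengths(probs)
L = sum(p * n for p, n in zip(probs, lengths))   # average length = 65/27
H = -sum(p * math.log2(p) for p in probs)        # ~2.289 bits
efficiency = H / L                               # ~0.951
```

Different tie-breaking may yield a different set of codeword lengths than the table above, but every Huffman code for a given source attains the same minimum average length L.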
15.4-1  The channel is shown in Fig. S15.4-1: P(x₁) = 1/3, P(x₂) = 2/3, with transition probabilities P(y₁|x₁) = 2/3, P(y₂|x₁) = 1/3, P(y₁|x₂) = 1/10, P(y₂|x₂) = 9/10.

  P(y₁) = P(y₁|x₁)P(x₁) + P(y₁|x₂)P(x₂) = (2/3)(1/3) + (1/10)(2/3) = 13/45
  P(y₂) = 1 - P(y₁) = 32/45

  H(x) = Σᵢ P(xᵢ) log₂[1/P(xᵢ)] = (1/3) log₂ 3 + (2/3) log₂(3/2) = 0.918 bits

To compute H(x|y), we find

  P(x₁|y₁) = P(y₁|x₁)P(x₁)/P(y₁) = (2/9)/(13/45) = 10/13,  P(x₂|y₁) = 3/13
  P(x₁|y₂) = P(y₂|x₁)P(x₁)/P(y₂) = (1/9)/(32/45) = 5/32,   P(x₂|y₂) = 27/32

  H(x|y₁) = (10/13) log₂(13/10) + (3/13) log₂(13/3) = 0.779 bits
  H(x|y₂) = (5/32) log₂(32/5) + (27/32) log₂(32/27) = 0.625 bits

and

  H(x|y) = P(y₁)H(x|y₁) + P(y₂)H(x|y₂) = (13/45)(0.779) + (32/45)(0.625) = 0.670 bits

Thus,

  I(x;y) = H(x) - H(x|y) = 0.918 - 0.670 = 0.248 bits/binit

  H(y) = Σⱼ P(yⱼ) log₂[1/P(yⱼ)] = (13/45) log₂(45/13) + (32/45) log₂(45/32) = 0.867 bits/symbol

Also,

  H(y|x) = H(y) - I(x;y) = 0.867 - 0.248 = 0.619 bits/symbol

15.4-2  The channel matrix P(yⱼ|xᵢ) is (Fig. S15.4-2)

            y₁    y₂    y₃
  x₁     [  1     0     0  ]
  x₂     [  0     p    1-p ]
  x₃     [  0    1-p    p  ]

Also, P(x₁) = P and P(x₂) = P(x₃) = Q, with 2Q = 1 - P.
Now we use P(xᵢ|yⱼ) = P(yⱼ|xᵢ)P(xᵢ) / Σᵢ P(yⱼ|xᵢ)P(xᵢ) to obtain the P(xᵢ|yⱼ) matrix

            x₁    x₂    x₃
  y₁     [  1     0     0  ]
  y₂     [  0     p    1-p ]
  y₃     [  0    1-p    p  ]

  H(x) = Σᵢ P(xᵢ) log₂[1/P(xᵢ)] = -P log P - 2Q log Q   (with 2Q = 1 - P)
       = -P log P - (1-P) log[(1-P)/2] = Ω(P) + (1 - P)

where Ω(z) = -z log₂ z - (1-z) log₂(1-z) is the binary entropy function.

  H(x|y) = Σⱼ P(yⱼ) Σᵢ P(xᵢ|yⱼ) log₂[1/P(xᵢ|yⱼ)] = P·0 + QΩ(p) + QΩ(p) = (1-P)Ω(p)

  I(x;y) = H(x) - H(x|y) = Ω(P) + (1-P)[1 - Ω(p)]

To find the capacity, we set dI(x;y)/dP = 0:

  d/dP { Ω(P) + (1-P)[1 - Ω(p)] } = log₂[(1-P)/P] - [1 - Ω(p)] = 0

Let β = 2^Ω(p), i.e. Ω(p) = log₂ β. Then (1-P)/P = 2^(1-Ω(p)) = 2/β, so

  P = β/(β+2)  and  1 - P = 2/(β+2)

and substituting back,

  C = max I(x;y) = log₂[(β+2)/β]

(As a check: for p = 1, Ω(p) = 0, β = 1 and C = log₂ 3, a noiseless 3-input channel; for p = 1/2, Ω(p) = 1, β = 2 and C = 1 bit, since x₂ and x₃ then become indistinguishable.)
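The entropy chain of Problem 15.4-1 can be verified numerically from the joint distribution. A sketch, assuming the transition probabilities (2/3, 1/3) and (1/10, 9/10) read off Fig. S15.4-1:

```python
import math

def h(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

Px = [1/3, 2/3]                       # input distribution
Pyx = [[2/3, 1/3],                    # P(y|x1)
       [1/10, 9/10]]                  # P(y|x2)

# joint P(x,y) and output marginal P(y)
Pxy = [[Px[i] * Pyx[i][j] for j in range(2)] for i in range(2)]
Py = [Pxy[0][j] + Pxy[1][j] for j in range(2)]

Hx = h(Px)                            # ~0.918 bits
Hy = h(Py)                            # ~0.867 bits
Hjoint = h([p for row in Pxy for p in row])
I = Hx + Hy - Hjoint                  # mutual information, ~0.248 bits
Hx_given_y = Hx - I                   # ~0.670 bits
Hy_given_x = Hy - I                   # ~0.619 bits
```

Using I(x;y) = H(x) + H(y) - H(x,y) avoids computing the conditional entropies term by term, and agrees with the H(x) - H(x|y) route above.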
15.4-3  Consider the cascade of two BSCs shown in Fig. S15.4-3. In this case,

  P_c(1|1) = (1-P₁)(1-P₂) + P₁P₂ = 1 - P₁ - P₂ + 2P₁P₂
  P_c(0|1) = (1-P₁)P₂ + P₁(1-P₂) = P₁ + P₂ - 2P₁P₂

Hence, the channel matrix of the cascade is

  [ 1-P₁-P₂+2P₁P₂    P₁+P₂-2P₁P₂  ]   [ 1-P₁   P₁  ][ 1-P₂   P₂  ]
  [ P₁+P₂-2P₁P₂    1-P₁-P₂+2P₁P₂  ] = [ P₁   1-P₁  ][ P₂   1-P₂  ] = M₁M₂

This result will prove everything in this problem.

(a) With P₁ = P₂ = Pₑ, it follows from the above result that the channel matrix of the cascade is indeed M².

(b) We have already shown that the channel matrix of two cascaded channels is M₁M₂.

(c) Consider a cascade of k identical channels, broken up as the first k-1 channels cascaded with the k-th channel. If M_(k-1) is the channel matrix of the first k-1 channels in cascade, then from the result derived in part (b) the channel matrix of the k cascaded channels is M_k = M_(k-1)M. This is valid for any k, and we have already proved M₂ = M². Using the process of induction, it is clear that M_k = M^k.

(d) We now verify these results from the development in Example 10.7. From the results in Example 10.7, for a cascade of 3 channels,

  1 - P_e3 = (1-Pₑ)³ + 3Pₑ²(1-Pₑ) = 1 - 3Pₑ + 6Pₑ² - 4Pₑ³

and hence P_e3 = 3Pₑ - 6Pₑ² + 4Pₑ³. Now

  M³ = [ 1-(3Pₑ-6Pₑ²+4Pₑ³)      3Pₑ-6Pₑ²+4Pₑ³    ]
       [ 3Pₑ-6Pₑ²+4Pₑ³      1-(3Pₑ-6Pₑ²+4Pₑ³)    ]

Clearly P_e3 = 3Pₑ - 6Pₑ² + 4Pₑ³, which confirms the results in Example 10.7 for k = 3.

(e) From Equation 15.25,

  C_k = 1 - [P_ek log₂(1/P_ek) + (1 - P_ek) log₂(1/(1-P_ek))]

where P_ek is the error probability of the cascade of k identical channels. We have shown in Example 10.7 that

  P_ek = ½[1 - (1 - 2Pₑ)^k]

As k → ∞, P_ek → 1/2 and C_k → 0.
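Part (d)'s closed form can be checked by cubing the BSC matrix directly; a sketch with an illustrative crossover probability Pₑ = 0.1 (the value is my choice, not from the text):

```python
def matmul2(a, b):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

pe = 0.1                              # illustrative crossover probability
M = [[1 - pe, pe],
     [pe, 1 - pe]]                    # single-BSC channel matrix

M3 = matmul2(matmul2(M, M), M)        # cascade of three identical BSCs

# closed form from Example 10.7: Pe3 = 3Pe - 6Pe^2 + 4Pe^3
pe3 = 3 * pe - 6 * pe**2 + 4 * pe**3
```

The rows of M³ still sum to 1 (it is again a valid channel matrix), and its off-diagonal entry matches the closed-form cascade error probability.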
15.5-1  We wish to maximize H(x) = -∫ p log p dx over |x| ≤ M, subject to ∫ p dx = 1. Here

  F(x,p) = -p log p,  ∂F/∂p = -(1 + log p);  F₁(x,p) = p,  ∂F₁/∂p = 1

Substituting these quantities in Equation 15.37, we have

  -(1 + log p) + α₁ = 0  ⟹  p = e^(α₁-1), a constant

Hence, from the constraint, 2Mp = 1 and

  p(x) = 1/2M,  |x| ≤ M

Also,

  H(x) = -∫₋ᴹᴹ p log p dx = ∫₋ᴹᴹ (1/2M) log 2M dx = log 2M

15.5-2  We have H(x) = -∫₀^∞ p log p dx, with the constraints

  ∫₀^∞ p dx = 1  and  ∫₀^∞ x p dx = A

  F(x,p) = -p log p,  ∂F/∂p = -(1 + log p);  F₁ = p, ∂F₁/∂p = 1;  F₂ = xp, ∂F₂/∂p = x

Substituting these quantities in Equation 15.37, we have

  -(1 + log p) + α₁ + α₂x = 0  or  p = e^(α₁-1) e^(α₂x)

Substituting this relationship into the earlier constraints (convergence requires α₂ < 0) gives

  ∫₀^∞ e^(α₁-1) e^(α₂x) dx = -e^(α₁-1)/α₂ = 1  and  ∫₀^∞ x e^(α₁-1) e^(α₂x) dx = e^(α₁-1)/α₂² = A

Hence α₂ = -1/A and e^(α₁-1) = 1/A, so that

  p(x) = (1/A) e^(-x/A),  x ≥ 0

To obtain H(x):

  H(x) = -∫₀^∞ p log p dx = ∫₀^∞ p(x)[log A + (x/A) log e] dx = log A + log e = log(eA)

15.5-3  Information per picture frame = 9.96 × 10⁵ bits (see Problem 15.1-2). For 30 picture frames per second, we need a channel with capacity C given by

  C = 30 × 9.96 × 10⁵ = 2.988 × 10⁷ bits/s

But for white Gaussian noise,

  C = B log₂(1 + S/N)

We are given S/N = 50 dB = 100,000. Hence,

  B = C/log₂(1 + 100,000) ≈ 1.8 MHz

15.5-4  Consider a narrowband Δf, with Δf → 0, so that we may consider both the signal and noise power densities to be constant (bandlimited white) over the interval Δf. The signal and noise powers S and N over this band are, respectively,

  S = 2Sₛ(ω)Δf  and  N = 2Sₙ(ω)Δf

The maximum channel capacity over this band Δf is given by

  ΔC = Δf log₂[(S+N)/N] = Δf log₂[(Sₛ(ω)+Sₙ(ω))/Sₙ(ω)]

The capacity of the channel over the entire band (f₁, f₂) is

  C = ∫_{f₁}^{f₂} log₂[(Sₛ(ω)+Sₙ(ω))/Sₙ(ω)] df

We now wish to maximize C, where the constraint is that the signal power is constant:

  ∫_{f₁}^{f₂} 2Sₛ(ω) df = S (a constant)

Using Equation 15.37, we obtain

  ∂/∂Sₛ { log₂[(Sₛ+Sₙ)/Sₙ] + α Sₛ } = 0  ⟹  Sₛ(ω) + Sₙ(ω) = -(log₂ e)/α (a constant)

This shows that to attain the maximum channel capacity, the signal power density + noise power density must be a constant (white). Under this condition,

  C = ∫_{f₁}^{f₂} log₂[Sₛ(ω)+Sₙ(ω)] df - ∫_{f₁}^{f₂} log₂[Sₙ(ω)] df
    = B log₂[Sₛ(ω)+Sₙ(ω)] - ∫_{f₁}^{f₂} log₂[Sₙ(ω)] df,  where B = f₂ - f₁
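The bandwidth figure of Problem 15.5-3 follows directly from the Shannon capacity formula; a quick numerical check:

```python
import math

# Problem 15.5-3: 30 frames/s at 9.96e5 bits per frame
C = 30 * 9.96e5                 # required capacity, bits/s (2.988e7)
snr = 1e5                       # 50 dB = 10**(50/10) as a power ratio
B = C / math.log2(1 + snr)      # required bandwidth, Hz (~1.8e6)
```

At 50 dB each hertz of bandwidth carries log₂(1 + 10⁵) ≈ 16.6 bits/s, which is why roughly 30 Mbit/s fits into about 1.8 MHz.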
15.5-5  In this problem, we use the results of Problem 15.5-4. Under the best possible conditions,

  C = B log₂[Sₛ(ω)+Sₙ(ω)] - ∫_{f₁}^{f₂} log₂[Sₙ(ω)] df

We shall now show that the integral ∫_{f₁}^{f₂} log₂[Sₙ(ω)] df is maximum when Sₙ(ω) = constant, if the noise is constrained to have a given mean square value (power). Thus, we wish to maximize

  ∫_{f₁}^{f₂} log₂[Sₙ(ω)] df

under the constraint

  2 ∫_{f₁}^{f₂} Sₙ(ω) df = N (a constant)

Using Equation 15.37, we have

  (log₂ e)/Sₙ + 2α = 0  ⟹  Sₙ(ω) = a constant

Thus, we have shown that for a noise with a given power, the integral ∫ log₂[Sₙ(ω)] df is maximized when the noise is white. Since this integral enters the capacity expression with a negative sign, white noise minimizes C. This shows that white Gaussian noise is the worst possible kind of noise.
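The conclusion of Problem 15.5-5 can be illustrated numerically: among noise spectra with the same total power, the flat (white) one maximizes the discrete analogue of ∫ log Sₙ df. A sketch with arbitrary bin count, trial count, and seed (all my choices):

```python
import math
import random

random.seed(0)
N_total = 1.0                  # fixed total noise power (arbitrary units)
n_bins = 50                    # frequency bins standing in for (f1, f2)

# white noise: equal power density in every bin
white = [N_total / n_bins] * n_bins
log_white = sum(math.log(s) for s in white)

# random "colored" spectra with the same total power never do better
best_colored = -math.inf
for _ in range(500):
    s = [random.random() for _ in range(n_bins)]
    scale = N_total / sum(s)
    colored = [scale * v for v in s]        # enforce the power constraint
    best_colored = max(best_colored, sum(math.log(v) for v in colored))
```

This is just the AM-GM inequality at work: for a fixed sum, the sum of logarithms is maximized when all terms are equal, exactly the condition Sₙ(ω) = constant derived above.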
