
2102401 Random Processes for EE

Solution to Quiz 1, 2011

A binary transmission system transmits a signal X (-1 to send a 0 bit; +1 to send a 1 bit). The received signal is Y = X + N, where the noise N has a zero-mean Gaussian distribution with unit variance. Assume that 1 bits occur with probability p and 0 bits occur with probability 1 - p.

1. (4 points.) Find the conditional pdf of Y given the input value: f_Y(y | X = +1) and f_Y(y | X = -1).

Solution. If X is given and equal to +1, then Y = N + 1, i.e., Y is simply an affine function of N. Therefore, Y is also Gaussian, with mean E[N] + 1 = 1 and variance 1^2 var(N) = 1. From this, we can write down the conditional pdf of Y given X = +1 as

f_Y(y | X = +1) = \frac{1}{\sqrt{2\pi}} e^{-(y-1)^2/2}.

Similarly, if X = -1, then Y is Gaussian with mean -1 and variance 1. Hence,

f_Y(y | X = -1) = \frac{1}{\sqrt{2\pi}} e^{-(y+1)^2/2}.
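As a quick numerical illustration (not part of the original quiz), the sketch below evaluates both conditional pdfs with scipy.stats.norm and checks them against the closed-form expressions above; the variable names and the evaluation grid are chosen only for this example.

```python
import numpy as np
from scipy.stats import norm

y = np.linspace(-4.0, 4.0, 9)

# f_Y(y | X = +1) and f_Y(y | X = -1): a standard normal shifted to mean +1 / -1
f_plus = norm.pdf(y, loc=+1.0, scale=1.0)
f_minus = norm.pdf(y, loc=-1.0, scale=1.0)

# The same values from the closed-form expressions in Part 1
f_plus_direct = np.exp(-(y - 1.0) ** 2 / 2) / np.sqrt(2 * np.pi)
f_minus_direct = np.exp(-(y + 1.0) ** 2 / 2) / np.sqrt(2 * np.pi)

assert np.allclose(f_plus, f_plus_direct)
assert np.allclose(f_minus, f_minus_direct)
```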
2. (6 points.) The receiver decides a 0 was transmitted if the observed value y satisfies

f_Y(y | X = -1) P(X = -1) > f_Y(y | X = +1) P(X = +1),

and it decides a 1 was transmitted otherwise. Use the results from part 1) to show that this decision rule is equivalent to: if y < T decide 0; if y ≥ T decide 1. Here T is the threshold of the decision rule. Write down the values of T for p = 1/4 and p = 1/2.

Solution. The condition above is

\frac{1}{\sqrt{2\pi}} e^{-(y+1)^2/2} (1 - p) > \frac{1}{\sqrt{2\pi}} e^{-(y-1)^2/2} p,

which can be rewritten as follows:

e^{-(y-1)^2/2 + (y+1)^2/2} < \frac{1-p}{p}
\Longleftrightarrow e^{2y} < \frac{1-p}{p}
\Longleftrightarrow 2y < \ln \frac{1-p}{p}
\Longleftrightarrow y < \frac{1}{2} \ln \frac{1-p}{p} = T.

Hence, for p = 1/4 we have T = (1/2) ln 3, and for p = 1/2 we have T = 0.
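For reference, here is a minimal Python sketch (assuming numpy and scipy; the helper name threshold is just for this example) that evaluates T = (1/2) ln((1-p)/p) and numerically double-checks that the likelihood comparison above reduces to comparing y against T.

```python
import numpy as np
from scipy.stats import norm

def threshold(p):
    """Decision threshold T = (1/2) ln((1-p)/p) from Part 2."""
    return 0.5 * np.log((1 - p) / p)

print(threshold(0.25))  # 0.5 * ln 3, roughly 0.549
print(threshold(0.5))   # 0.0

# Numerical check of the rule: deciding "0" exactly when
# f_Y(y | X = -1) (1-p) > f_Y(y | X = +1) p should match y < T.
p = 0.25
T = threshold(p)
for y in np.linspace(-3, 3, 601):
    decide_zero = norm.pdf(y, loc=-1) * (1 - p) > norm.pdf(y, loc=+1) * p
    assert decide_zero == (y < T)
```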

3. (5 points.) Let P_{+1} and P_{-1} be the probabilities that the receiver makes an error given that a +1 was transmitted and that a -1 was transmitted, respectively. These probabilities depend on the parameter T obtained in the previous question. Write down P_{+1} and P_{-1} as functions of T using the definition of the cumulative distribution function of the normalized Gaussian:

\Phi(z) = \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}} e^{-t^2/2} \, dt.

Solution. The probability of error when +1 was transmitted is P(Y < T | X = +1), since for Y < T we decide that a 0 was transmitted. Therefore, we can write P_{+1} as

P_{+1} = P(Y < T | X = +1) = \int_{-\infty}^{T} f_Y(y | X = +1) \, dy = \int_{-\infty}^{T} \frac{1}{\sqrt{2\pi}} e^{-(y-1)^2/2} \, dy = \Phi(T - 1).

Similarly, the probability of error when -1 was transmitted is

P_{-1} = P(Y \ge T | X = -1) = \int_{T}^{\infty} f_Y(y | X = -1) \, dy = \int_{T}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(y+1)^2/2} \, dy = 1 - \Phi(T + 1) = \Phi(-(T + 1)).
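The closed forms above are easy to check numerically. The following sketch (again only an illustration; the helper error_probs and the sample size are arbitrary choices) evaluates P_{+1} and P_{-1} with the standard-normal CDF and cross-checks them by simulating Y = X + N for the threshold T = (1/2) ln 3 obtained from p = 1/4.

```python
import numpy as np
from scipy.stats import norm

def error_probs(T):
    """P_{+1} = Phi(T - 1) and P_{-1} = Phi(-(T + 1)) from Part 3."""
    return norm.cdf(T - 1.0), norm.cdf(-(T + 1.0))

# Monte Carlo cross-check: simulate Y = X + N for each transmitted symbol
# and count how often the threshold rule decides incorrectly.
rng = np.random.default_rng(0)
T = 0.5 * np.log(3)          # threshold for p = 1/4
n = 200_000

y_plus = +1.0 + rng.standard_normal(n)   # Y given X = +1
y_minus = -1.0 + rng.standard_normal(n)  # Y given X = -1

p_plus_mc = np.mean(y_plus < T)     # decided 0 although +1 was sent
p_minus_mc = np.mean(y_minus >= T)  # decided 1 although -1 was sent

print(error_probs(T))         # approximately (0.326, 0.061)
print(p_plus_mc, p_minus_mc)  # should be close to the values above
```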

This notation follows from the fact that, for a Gaussian random variable X with mean μ and variance σ^2, we can compute P(X < a) as

P(X < a) = \Phi\left( \frac{a - \mu}{\sigma} \right),

together with the property 1 - \Phi(a) = \Phi(-a).

4. (5 points.) Can you compare P_{+1} and P_{-1}? Which one is larger? Compare their values for p = 1/4 and p = 1/2. (No need to compute the actual values.)

Solution. We have T - 1 ≥ -(T + 1) for any T ≥ 0, with equality when T = 0. Since the cumulative distribution function Φ(·) is increasing, i.e., Φ(a) ≤ Φ(b) for a ≤ b, we can conclude that P_{+1} ≥ P_{-1} for any T ≥ 0. In conclusion, for p = 1/4, P_{+1} is larger than P_{-1}, and for p = 1/2, we have P_{+1} = P_{-1}. This result makes sense intuitively: when the bit 1 is less likely to occur (p < 1/2), the decision threshold T is greater than zero, so the probability that the receiver makes an error given that +1 was transmitted is larger than the error probability when -1 was sent.
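Although the question does not ask for the actual values, a short numerical check (illustrative only, reusing the formulas from Parts 2 and 3) confirms the ordering for both priors.

```python
import numpy as np
from scipy.stats import norm

for p in (0.25, 0.5):
    T = 0.5 * np.log((1 - p) / p)
    P_plus, P_minus = norm.cdf(T - 1.0), norm.cdf(-(T + 1.0))
    print(p, P_plus >= P_minus)  # True for both; equality holds only at p = 1/2
```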
