Multiuser IT
April-May 2010
Contents
Introduction
Typicality
The multiple-access channel
Point-to-point communication is well understood:
- One source, one channel, one receiver
- Capacity regions are well known
- Rate-distortion functions are well known
The presence of multiple users brings new elements to the communication problem:
- Instead of two main users, one has many...
- ...possibly a network of interconnected nodes (a graph)
- The single-source / single-receiver case is known since Shannon
- The multiuser case includes several important problems
- Some are well understood, some are not
Multiple-access channel
- There are several senders who wish to communicate with one receiver
- The channel is shared among all senders
- This problem is well understood
Broadcast channel
- Consider now a situation in which one sender transmits to many receivers
- In this case there are problems that still lack complete answers
Relay channel
- Has one source and one receiver...
- ...but also at least one intermediate receiver/sender (relay nodes)
- In this case there are also problems that lack complete answers
Typical sequences
C = {X_1, X_2, ..., X_k} are discrete r.v. with joint pmf p(x_1, x_2, ..., x_k)
S is an ordered subset of C
Consider n independent copies of S
Example: n independent copies of S = (X_a, X_b, X_c) would form an n × 3 matrix:
\[
\begin{pmatrix}
x_{1a} & x_{1b} & x_{1c} \\
x_{2a} & x_{2b} & x_{2c} \\
x_{3a} & x_{3b} & x_{3c} \\
\vdots & \vdots & \vdots \\
x_{na} & x_{nb} & x_{nc}
\end{pmatrix}
\]
The probability of a given (x_a, x_b, x_c) is
\[
p(x_a, x_b, x_c) = \prod_{i=1}^{n} p(x_{ia}, x_{ib}, x_{ic}) \qquad \text{($n$ independent copies)}
\]
Paulo J S G Ferreira (SPL), Multiuser IT, April-May 2010
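The n independent copies and the product form of the sequence probability can be sketched numerically. The joint pmf below is a made-up illustrative example, not one from the lecture:

```python
import random

# Hypothetical joint pmf for three binary variables (Xa, Xb, Xc);
# any joint distribution would do for this sketch.
pmf = {
    (0, 0, 0): 0.30, (0, 0, 1): 0.10, (0, 1, 0): 0.10, (0, 1, 1): 0.05,
    (1, 0, 0): 0.05, (1, 0, 1): 0.10, (1, 1, 0): 0.10, (1, 1, 1): 0.20,
}

def draw_copies(n, rng):
    """Draw n independent copies of (Xa, Xb, Xc): an n-by-3 matrix."""
    outcomes, weights = zip(*pmf.items())
    return [rng.choices(outcomes, weights=weights)[0] for _ in range(n)]

def sequence_probability(rows):
    """Probability of the whole matrix: the product of the row
    probabilities, by independence of the n copies."""
    p = 1.0
    for row in rows:
        p *= pmf[row]
    return p

rng = random.Random(0)
rows = draw_copies(5, rng)
print(len(rows), len(rows[0]))  # 5 3
print(sequence_probability(rows))
```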
Typical sequences
Consider n independent copies of subsets of C = {X_1, X_2, ..., X_k}
The typical sequences (x_1, ..., x_k) satisfy, for every S ⊆ C,
\[
\left| -\frac{\log p(s)}{n} - H(S) \right| < \epsilon
\]
There are 2^k possible subsets of C
The set of typical sequences is denoted by T (I should really use something like T(ε) or even T(n, ε), but let's keep it simple)
We are considering k random variables
Fix any subset A of the k variables
Consider n independent copies A_i of it
Then, by the law of large numbers,
\[
-\frac{\log p(A_1, A_2, \ldots, A_n)}{n} = -\frac{1}{n} \sum_{i=1}^{n} \log p(A_i) \to H(A)
\]
Compare with the definition of typical sequence
The law of large numbers guarantees that as n → ∞ the conditions are satisfied, no matter how small ε is
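The convergence promised by the law of large numbers is easy to see numerically. The Bernoulli(0.3) source below is an arbitrary choice for illustration:

```python
import math
import random

p = 0.3  # assumed Bernoulli parameter, purely for illustration
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # entropy in bits

def empirical_rate(n, rng):
    """-(1/n) log2 p(x_1, ..., x_n) for n i.i.d. Bernoulli(p) samples."""
    logp = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        logp += math.log2(p if x == 1 else 1 - p)
    return -logp / n

rng = random.Random(1)
for n in (10, 100, 10000):
    # As n grows, the empirical rate approaches the entropy H(A)
    print(n, round(empirical_rate(n, rng), 4), "target", round(H, 4))
```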
Notation
T(S) is the restriction of T to a subset S of the k vectors
Let for example S = (X_a, X_b)
In this case, the typicality condition would mean
\[
\left| -\frac{\log p(x_a, x_b)}{n} - H(X_a, X_b) \right| < \epsilon,
\quad
\left| -\frac{\log p(x_a)}{n} - H(X_a) \right| < \epsilon,
\quad
\left| -\frac{\log p(x_b)}{n} - H(X_b) \right| < \epsilon
\]
More notation
Consider the condition
\[
\left| -\frac{\log p(a)}{n} - H(A) \right| < \epsilon
\]
Rewrite it as
\[
-\frac{\log p(a)}{n} - H(A) = \theta\epsilon, \qquad |\theta| \le 1
\]
Solve for p(a):
\[
p(a) = 2^{-n(H(A) + \theta\epsilon)}
\]
If n is sufficiently large, then for any subset S of C = {X_1, X_2, ..., X_k},
\[
P(T(S)) \ge 1 - \epsilon
\]
This is a direct consequence of the law of large numbers
The probability of any sequence s in T(S) satisfies
\[
2^{-n(H(S)+\epsilon)} < p(s) < 2^{-n(H(S)-\epsilon)}
\]
This is a consequence of the definition
\[
\left| -\frac{\log p(s)}{n} - H(S) \right| < \epsilon
\]
Since the total probability is at most 1,
\[
1 \ge \sum_{s \in T(S)} p(s) \ge |T(S)| \, 2^{-n(H(S)+\epsilon)}
\quad\Longrightarrow\quad
|T(S)| \le 2^{n(H(S)+\epsilon)}
\]
Since P(T(S)) ≥ 1 − ε for sufficiently large n,
\[
1 - \epsilon \le \sum_{s \in T(S)} p(s) \le |T(S)| \, 2^{-n(H(S)-\epsilon)}
\]
This shows that
\[
|T(S)| \ge (1 - \epsilon) \, 2^{n(H(S)-\epsilon)} \ge 2^{-n\epsilon} \, 2^{n(H(S)-\epsilon)}
\]
Hence,
\[
2^{n(H(S)-2\epsilon)} \le |T(S)| \le 2^{n(H(S)+\epsilon)},
\]
as wanted
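The cardinality bounds can be checked by brute force on a small example. The Bernoulli(0.3) source, n = 12 and ε = 0.2 below are arbitrary illustrative choices:

```python
import itertools
import math

p, n, eps = 0.3, 12, 0.2   # assumed Bernoulli source and toy parameters
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def is_typical(seq):
    """Check |-(1/n) log2 p(seq) - H| < eps for an i.i.d. Bernoulli(p) source."""
    k = sum(seq)  # number of ones
    logp = k * math.log2(p) + (n - k) * math.log2(1 - p)
    return abs(-logp / n - H) < eps

# Enumerate all 2^n binary sequences and keep the typical ones
typical = [s for s in itertools.product((0, 1), repeat=n) if is_typical(s)]
prob = sum(p ** sum(s) * (1 - p) ** (n - sum(s)) for s in typical)
print("|T| =", len(typical), "vs upper bound 2^{n(H+eps)} =",
      round(2 ** (n * (H + eps))))
print("P(T) =", round(prob, 3))
```

Even at this small blocklength the upper bound |T| ≤ 2^{n(H+ε)} already holds; the lower bound and P(T) ≥ 1 − ε kick in as n grows.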
Consequences
Let S_1, S_2 be subsets of C = {X_1, X_2, ..., X_k}
A similar idea shows that the number of sequences that are jointly typical with a given (typical) sequence s_2 ∈ T(S_2) satisfies
\[
|T(S_1 \mid s_2)| \le 2^{n(H(S_1 \mid S_2) + 2\epsilon)}
\]
For three subsets of C, the probability of joint typicality satisfies
\[
P[(S_1', S_2', S_3) \in T] \le 2^{-n(I(S_1; S_2 \mid S_3) - 6\epsilon)}
\]
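A sketch of where the conditional cardinality bound comes from, combining the two-sided estimates for p(s_1, s_2) and p(s_2) on typical sequences:

```latex
% For jointly typical (s_1, s_2):
\[
p(s_1 \mid s_2) = \frac{p(s_1, s_2)}{p(s_2)}
\ge \frac{2^{-n(H(S_1, S_2)+\epsilon)}}{2^{-n(H(S_2)-\epsilon)}}
= 2^{-n(H(S_1 \mid S_2)+2\epsilon)}
\]
% Summing over the conditionally typical set:
\[
1 \ge \sum_{s_1 \in T(S_1 \mid s_2)} p(s_1 \mid s_2)
\ge |T(S_1 \mid s_2)| \, 2^{-n(H(S_1 \mid S_2)+2\epsilon)}
\quad\Longrightarrow\quad
|T(S_1 \mid s_2)| \le 2^{n(H(S_1 \mid S_2)+2\epsilon)}
\]
```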
Why?
S_1' and S_2' are conditionally independent given S_3, but have the same pairwise marginals as S_1 and S_2
Thus
\[
p(s_1', s_2', s_3) = p(s_1', s_2' \mid s_3)\, p(s_3) = p(s_1' \mid s_3)\, p(s_2' \mid s_3)\, p(s_3)
\]
\[
P = \sum p(s_3)\, p(s_1 \mid s_3)\, p(s_2 \mid s_3)
\le |T(S_1, S_2, S_3)| \, 2^{-n(H(S_3)-\epsilon)} \, 2^{-n(H(S_1|S_3)-2\epsilon)} \, 2^{-n(H(S_2|S_3)-2\epsilon)}
\]
\[
\le 2^{n(H(S_1,S_2,S_3)+\epsilon)} \, 2^{-n(H(S_3)-\epsilon)} \, 2^{-n(H(S_1|S_3)-2\epsilon)} \, 2^{-n(H(S_2|S_3)-2\epsilon)}
\]
The result follows from
\[
I(X; Y \mid Z) = -H(X, Y, Z) + H(Z) + H(X \mid Z) + H(Y \mid Z)
\]
Why?
\[
I(X; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z)
\]
\[
= H(X \mid Z) + H(Y \mid Z) - H(X, Y \mid Z)
\]
\[
= H(X \mid Z) + H(Y \mid Z) + H(Z) - H(X, Y, Z)
\]
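The identity can be sanity-checked numerically against the definition of conditional mutual information. The random joint pmf on three binary variables below is arbitrary:

```python
import itertools
import math
import random

rng = random.Random(2)
# Random joint pmf on three binary variables (X, Y, Z); purely illustrative.
keys = list(itertools.product((0, 1), repeat=3))
w = [rng.random() for _ in keys]
s = sum(w)
pxyz = {k: wi / s for k, wi in zip(keys, w)}

def marginal(idx):
    """Marginal pmf over the coordinates listed in idx (0=X, 1=Y, 2=Z)."""
    m = {}
    for k, q in pxyz.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + q
    return m

def H(idx):
    """Entropy in bits of the marginal over coordinates idx."""
    return -sum(q * math.log2(q) for q in marginal(idx).values() if q > 0)

def I_def():
    """I(X;Y|Z) computed straight from the definition."""
    pz, pxz, pyz = marginal((2,)), marginal((0, 2)), marginal((1, 2))
    return sum(q * math.log2(q * pz[(z,)] / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), q in pxyz.items() if q > 0)

# Entropy identity from the slide: I(X;Y|Z) = H(X|Z) + H(Y|Z) + H(Z) - H(X,Y,Z)
i_identity = (H((0, 2)) - H((2,))) + (H((1, 2)) - H((2,))) + H((2,)) - H((0, 1, 2))
print(round(I_def(), 6), round(i_identity, 6))  # the two values agree
```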
The multiple-access channel
- Simplest version: two senders, one receiver
- New difficulty: in addition to noise there is interference
- Arises in several practical problems:
  - One satellite receiver, many emitting ground stations
  - One base station, many cell phones within its range
[Diagram: senders X_1 and X_2 feed a shared channel p(y | x_1, x_2) with output Y]
Code
Codes for this channel must account for both x_1 and x_2
Notation: a (2^{nR_1}, 2^{nR_2}, n) code
Need:
- Two encoding functions, x_1(w_1) and x_2(w_2)
- One decoding function g(y)
Operation
Each sender picks an index w_1, w_2 in the appropriate range and sends the corresponding codeword
Probability of error:
\[
P_e^{(n)} = \frac{1}{2^{n(R_1+R_2)}} \sum_{w_1, w_2} P[\, g(Y) \ne (w_1, w_2) \mid (w_1, w_2) \text{ sent} \,]
\]
A pair (R_1, R_2) is achievable if there exists a sequence of (2^{nR_1}, 2^{nR_2}, n) codes such that P_e^{(n)} → 0
The capacity region is the closure of the set of achievable pairs
Capacity region
The capacity region is described by
\[
R_1 < I(X_1; Y \mid X_2)
\]
\[
R_2 < I(X_2; Y \mid X_1)
\]
\[
R_1 + R_2 < I(X_1, X_2; Y)
\]
[Figure: the capacity region C in the (R_1, R_2) plane]
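For a concrete instance, the three bounds can be evaluated for the binary adder channel Y = X_1 + X_2 with independent uniform inputs (an assumed input distribution, used only for illustration):

```python
import itertools
import math

# Binary adder MAC, Y = X1 + X2, with independent uniform inputs (assumed).
p = {(x1, x2): 0.25 for x1, x2 in itertools.product((0, 1), repeat=2)}

def H(dist):
    """Entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Joint pmf of (X1, X2, Y): Y is a deterministic function of the inputs.
pj = {(x1, x2, x1 + x2): q for (x1, x2), q in p.items()}

def marginal(idx):
    m = {}
    for k, q in pj.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + q
    return m

# Since H(Y | X1, X2) = 0:
i1 = H(marginal((1, 2))) - H(marginal((1,)))   # I(X1;Y|X2) = H(X2,Y) - H(X2)
i2 = H(marginal((0, 2))) - H(marginal((0,)))   # I(X2;Y|X1) = H(X1,Y) - H(X1)
isum = H(marginal((2,)))                       # I(X1,X2;Y) = H(Y)
print(i1, i2, isum)  # 1.0 1.0 1.5
```

This reproduces the pentagon discussed next: each sender alone is limited to 1 bit, while the sum rate is limited to 1.5 bits.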
- There is no interference
- The capacity region is bounded by the vertical and horizontal lines only
- In this case Y = X_1 X_2
- By setting X_1 = 1 one can achieve R_2 = 1 on the second channel
- By setting X_2 = 1 one can achieve R_1 = 1 on the first channel
- It is impossible to make R_1 + R_2 > 1 since Y is binary
- Time-sharing makes R_1 + R_2 = 1 possible
- The capacity region is bounded by the diagonal line R_1 + R_2 = 1
- Consider Y = X_1 + X_2 , with binary inputs and a ternary output
- The value Y = 1 can arise as a result of (0, 1) or (1, 0)
- By setting X_1 = 0 one can achieve R_2 = 1 on the second channel
- By setting X_2 = 0 one can achieve R_1 = 1 on the first channel
- If R_1 = 1, the channel looks like an erasure channel to the other sender
- The capacity of the erasure channel with p = 1/2 is 1/2
- Hence if R_1 = 1 one can still send an additional 1/2 bit of information on the other channel
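The erasure-channel view can be checked directly: with X_1 uniform and undecoded, sender 2 sees Y = 1 as an erasure. A small numerical check (uniform inputs assumed):

```python
import math

# Adder MAC Y = X1 + X2, with X1 a uniform, undecoded "interferer" (assumed).
# From sender 2's point of view: x2 = 0 gives Y in {0, 1}, x2 = 1 gives
# Y in {1, 2}, each value with probability 1/2 -- so Y = 1 acts as an erasure.
def H(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# With X2 uniform: p(y) = (1/4, 1/2, 1/4), and p(y | x2) is uniform on 2 values.
i_x2_y = H((0.25, 0.5, 0.25)) - H((0.5, 0.5))   # I(X2;Y) = H(Y) - H(Y|X2)
erasure_capacity = 1 - 0.5                       # BEC with erasure prob 1/2
print(i_x2_y, erasure_capacity)  # 0.5 0.5
```

The half bit computed here is exactly the extra rate claimed on the slide when R_1 = 1.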
Achievability proof
To construct the codebook, generate:
- 2^{nR_1} codewords X_1(i), i = 1, 2, ..., 2^{nR_1}
- 2^{nR_2} codewords X_2(j), j = 1, 2, ..., 2^{nR_2}

Each element is generated independently following p_1(x_1) (respectively p_2(x_2))
To send i, sender 1 sends X_1(i)
To send j, sender 2 sends X_2(j)
The decoder picks (i, j) such that (x_1(i), x_2(j), y) are typical, if this is possible (that is, if such a pair exists and is unique)
Otherwise there is an error
By symmetry, it is possible to assume w.l.o.g. that (i, j) = (1, 1)
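A toy sketch of this random-coding construction, using the deterministic adder channel so that "typical with y" reduces to exact consistency; the blocklength and codebook sizes are arbitrary toy values, and for a noisy channel the decoder would run a joint-typicality test instead:

```python
import random

rng = random.Random(3)
n = 16            # blocklength (toy value)
M1, M2 = 16, 16   # 2^{nR1} and 2^{nR2} codewords, i.e. R1 = R2 = 1/4 (toy rates)

# Random codebooks: every symbol drawn i.i.d. uniform on {0, 1}
book1 = [[rng.randint(0, 1) for _ in range(n)] for _ in range(M1)]
book2 = [[rng.randint(0, 1) for _ in range(n)] for _ in range(M2)]

def transmit(i, j):
    """Adder MAC: y_k = x1_k + x2_k (noiseless toy channel)."""
    return [a + b for a, b in zip(book1[i], book2[j])]

def decode(y):
    """List every (i, j) consistent with y; succeed only if the pair is
    unique, declaring an error otherwise (mirroring the slide's decoder)."""
    hits = [(i, j) for i in range(M1) for j in range(M2)
            if transmit(i, j) == y]
    return hits[0] if len(hits) == 1 else None

errors = sum(decode(transmit(i, j)) != (i, j)
             for i in range(M1) for j in range(M2))
print("error rate:", errors / (M1 * M2))
```

At these low rates ambiguous codeword pairs are rare, which is the random-coding effect the proof formalizes.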
Proof outline
- Let T be the set of typical (x_1, x_2, y) sequences
- There is an error if:
  (a) the correct codewords are not typical with the received sequence, or
  (b) there are incorrect codewords typical with the received sequence
- The error probability is at most the sum of the probabilities of (a) and (b)
Error probability
- The probabilities can be estimated using the previous results on typicality
- The probability of (a) is the probability that (x_1(1), x_2(1), y) ∉ T
- Roughly speaking, this goes to zero as n → ∞ by the law of large numbers and the definition of typicality
- See the basic results section
Error probability
- The probability of (b) is the probability that some pair (i, j) other than (1, 1) is typical with y
- This is bounded by the sum of the probabilities of (x_1(i), x_2(j), y) being typical, over (i, j) ≠ (1, 1)
- Roughly speaking, each term can be bounded by expressions similar to 2^{-n I(X_1, X_2; Y)}
- As seen before, these sums also tend to zero as n increases, provided the rate conditions hold
- See the consequences section
Convexity
The capacity region C is convex:
- Let (a, b) ∈ C, (c, d) ∈ C and λ ∈ [0, 1]
- Then z(λ) = λ(a, b) + (1 − λ)(c, d) is also in C

This means that:
- Source 1 may send at the rate λa + (1 − λ)c, if it may send at rates a and c
- Source 2 may send at the rate λb + (1 − λ)d, if it may send at rates b and d

Of course, geometrically z(λ) is a point on the line segment joining (a, b) to (c, d)
Convexity proof
Given: two codebooks at the rates (a, b) and (c, d)
Construct: a code at the rate (λa + (1 − λ)c, λb + (1 − λ)d)
How?
- Use codebook 1 for the first λn symbols
- Use codebook 2 for the remaining (1 − λ)n symbols

Note that each code is being used a fraction λ or 1 − λ of the time
The new code has rates λ(a, b) + (1 − λ)(c, d)
The error probability for the new code is bounded by the sum of the two error probabilities for the two segments
Hence, it also goes to zero: the new point is in the capacity region
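The time-sharing construction amounts to a convex combination of the two rate pairs; a minimal sketch, using as inputs the adder-channel corner points from the earlier example:

```python
# Time-sharing sketch: combine two achievable rate pairs of a MAC.
def time_share(r_ab, r_cd, lam):
    """Rate pair achieved by using codebook 1 for the first lam*n symbols
    and codebook 2 for the remaining (1 - lam)*n symbols."""
    (a, b), (c, d) = r_ab, r_cd
    return (lam * a + (1 - lam) * c, lam * b + (1 - lam) * d)

# Corner points of the binary adder MAC region, used here as an illustration:
print(time_share((1.0, 0.5), (0.5, 1.0), 0.5))  # (0.75, 0.75)
```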