homework


    A First Course in Information Theory

    Yuan Luo

    December 3, 2009

    1 Weak Typicality

    Problem 1 (p70.5) Let p and q be two probability distributions on the same alphabet $\mathcal{X}$. Denote $\delta = |H(p) - H(q)|$. Then, for all $\epsilon > 0$,

    $$\lim_{n\to\infty} \Pr\left\{ \left| -\tfrac{1}{n}\log P(X) - H(q) \right| < \epsilon \right\} =
    \begin{cases} 0 & \text{if } \epsilon < \delta, \\ 1 & \text{if } \epsilon > \delta, \end{cases} \tag{1.1}$$

    where X is composed of i.i.d. random variables with distribution $P = p^n$.

    Proof.

    For the case $\epsilon < \delta$,

    $$\begin{aligned}
    \left| -\tfrac{1}{n}\log P(X) - H(q) \right|
    &= \left| -\tfrac{1}{n}\log P(X) - H(p) + H(p) - H(q) \right| \\
    &\ge |H(p) - H(q)| - \left| -\tfrac{1}{n}\log P(X) - H(p) \right| \\
    &= \delta - \left| -\tfrac{1}{n}\log P(X) - H(p) \right|.
    \end{aligned}$$

    Then

    $$\lim_{n\to\infty} \Pr\left\{ \left| -\tfrac{1}{n}\log P(X) - H(q) \right| < \epsilon \right\}
    \le \lim_{n\to\infty} \Pr\left\{ \left| -\tfrac{1}{n}\log P(X) - H(p) \right| > \delta - \epsilon \right\}
    = 0,$$

    where the last equality follows from the weak AEP, since $\delta - \epsilon > 0$.

    For the case $\epsilon > \delta$,

    $$\begin{aligned}
    \left| -\tfrac{1}{n}\log P(X) - H(q) \right|
    &= \left| -\tfrac{1}{n}\log P(X) - H(p) + H(p) - H(q) \right| \\
    &\le \delta + \left| -\tfrac{1}{n}\log P(X) - H(p) \right|.
    \end{aligned}$$

    Then

    $$\lim_{n\to\infty} \Pr\left\{ \left| -\tfrac{1}{n}\log P(X) - H(q) \right| < \epsilon \right\}
    \ge \lim_{n\to\infty} \Pr\left\{ \left| -\tfrac{1}{n}\log P(X) - H(p) \right| < \epsilon - \delta \right\}
    = 1,$$

    again by the weak AEP, since $\epsilon - \delta > 0$.
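    As a numerical sanity check (not part of the original solution), the dichotomy in (1.1) can be seen by simulation: the weak AEP makes $-\tfrac{1}{n}\log P(X)$ concentrate around $H(p)$, so its distance from $H(q)$ concentrates around $\delta$. The Python sketch below assumes a binary alphabet and hypothetical distributions p and q chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distributions on a binary alphabet (chosen only for illustration).
p = np.array([0.2, 0.8])
q = np.array([0.5, 0.5])

def entropy_bits(d):
    """Shannon entropy of a distribution, in bits."""
    return float(-np.sum(d * np.log2(d)))

H_p, H_q = entropy_bits(p), entropy_bits(q)
delta = abs(H_p - H_q)

n, trials = 2000, 5000
# Draw `trials` sequences X ~ p^n and compute -(1/n) log2 P(X) for each.
samples = rng.choice(len(p), size=(trials, n), p=p)
neg_log_rate = -np.log2(p[samples]).mean(axis=1)

for eps in (0.5 * delta, 2.0 * delta):
    frac = np.mean(np.abs(neg_log_rate - H_q) < eps)
    # Expect frac ~ 0 when eps < delta and frac ~ 1 when eps > delta.
    print(f"eps = {eps:.3f} (delta = {delta:.3f}): empirical probability ~= {frac:.3f}")
```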

    Problem 2 (p70.6) Let p and q be two probability distributions on the same alphabet $\mathcal{X}$ with the same support. Prove that for any $\epsilon > 0$,

    $$\left| \left\{ x \in \mathcal{X}^n : \left| -\tfrac{1}{n}\log Q(x) - (H(p) + D(p\|q)) \right| < \epsilon \right\} \right| \le 2^{n(H(p)+D(p\|q)+\epsilon)},$$

    $$\lim_{n\to\infty} \Pr\left\{ \left| -\tfrac{1}{n}\log Q(X) - (H(p) + D(p\|q)) \right| < \epsilon \right\} = 1,$$

    where X is composed of i.i.d. random variables with generic random variable drawn according to distribution p, and Q, also denoted by $q^n$, is another n-dimensional distribution. Note that $x = (x_1, \ldots, x_n)$.

    Proof. Denote $W = \left\{ x \in \mathcal{X}^n : \left| -\tfrac{1}{n}\log Q(x) - (H(p) + D(p\|q)) \right| < \epsilon \right\}$. For $x \in W$, we have

    $$Q(x) \ge 2^{-n(H(p)+D(p\|q)+\epsilon)}.$$

    Then it follows from $\sum_{x \in W} Q(x) \le 1$ that $|W| \le 2^{n(H(p)+D(p\|q)+\epsilon)}$.

    The second part of this problem is easily verified by using

    $$E_P\left[ -\tfrac{1}{n}\log Q(X) \right] - H(p)
    = \sum_x \left[ -\tfrac{1}{n} P(x)\log Q(x) + \tfrac{1}{n} P(x)\log P(x) \right]
    = \tfrac{1}{n} D(P\|Q) = D(p\|q),$$

    where $P = p^n$. Since $-\tfrac{1}{n}\log Q(X) = \tfrac{1}{n}\sum_{i=1}^{n}\left(-\log q(X_i)\right)$ is an average of i.i.d. random variables with mean $H(p)+D(p\|q)$, the weak law of large numbers gives the convergence in probability.
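    A quick simulation (again not part of the original solution) illustrates the convergence: for X drawn i.i.d. from p, the empirical value $-\tfrac{1}{n}\log Q(X)$ clusters around $H(p)+D(p\|q)$. The distributions below are hypothetical, chosen only so that p and q share the same support.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical distributions with the same support (illustration only).
p = np.array([0.1, 0.3, 0.6])
q = np.array([0.4, 0.4, 0.2])

H_p = float(-np.sum(p * np.log2(p)))        # H(p) in bits
D_pq = float(np.sum(p * np.log2(p / q)))    # D(p||q) in bits
target = H_p + D_pq

n, trials = 5000, 2000
samples = rng.choice(len(p), size=(trials, n), p=p)   # X ~ p^n
neg_log_q = -np.log2(q[samples]).mean(axis=1)         # -(1/n) log2 Q(X)

eps = 0.05
frac = np.mean(np.abs(neg_log_q - target) < eps)
print(f"H(p) + D(p||q) = {target:.4f}")
print(f"sample mean of -(1/n) log2 Q(X) = {neg_log_q.mean():.4f}")
print(f"empirical Pr{{|...| < {eps}}} ~= {frac:.3f}")   # should be close to 1
```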

    Problem 3 (p70.6) Universal source coding ...

    1. Proof.

    $$\lim_{n\to\infty} \Pr\left\{ X^{(s)} \in A^n_\epsilon(S) \right\}
    \ge \lim_{n\to\infty} \Pr\left\{ X^{(s)} \in W^n_{[X^{(s)}]\epsilon} \right\} = 1,$$

    since $A^n_\epsilon(S) = \bigcup_{s \in S} W^n_{[X^{(s)}]\epsilon} \supseteq W^n_{[X^{(s)}]\epsilon}$ and the last equality is the weak AEP.

    2. Proof.

    $$\begin{aligned}
    |A^n_\epsilon(S)| = \Big| \bigcup_{s \in S} W^n_{[X^{(s)}]\epsilon} \Big|
    &\le \sum_{s \in S} \big| W^n_{[X^{(s)}]\epsilon} \big| \\
    &\le \sum_{s \in S} 2^{n(H(X^{(s)})+\epsilon)} \quad (n \text{ sufficiently large}) \\
    &\le \sum_{s \in S} 2^{n(\bar H+\epsilon)} \\
    &= |S| \, 2^{n(\bar H+\epsilon)} \le 2^{n(\bar H+\epsilon')} \quad (n \text{ sufficiently large}),
    \end{aligned}$$

    where $\bar H = \max_{s \in S} H(X^{(s)})$ and $\epsilon' > \epsilon$.
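    The union bound above can be checked concretely. The sketch below assumes a small hypothetical family of binary i.i.d. sources indexed by S (an illustration, not part of the problem) and shows that $|S|\,2^{n(\bar H+\epsilon)} \le 2^{n(\bar H+\epsilon')}$ once $n \ge \log_2|S|/(\epsilon'-\epsilon)$.

```python
import numpy as np

# Hypothetical family of binary i.i.d. sources indexed by S (illustration only):
# S lists P(X(s) = 1) for each source s.
S = [0.1, 0.25, 0.4, 0.5]

def h2(a):
    """Binary entropy in bits."""
    return float(-a * np.log2(a) - (1 - a) * np.log2(1 - a))

H_bar = max(h2(a) for a in S)          # \bar H = max_s H(X(s))
eps, eps_prime = 0.05, 0.08            # typicality parameters, eps' > eps

# |S| * 2^{n(H_bar + eps)} <= 2^{n(H_bar + eps')}  iff  n >= log2|S| / (eps' - eps)
n_min = np.log2(len(S)) / (eps_prime - eps)
print(f"H_bar = {H_bar:.4f}, |S| factor absorbed once n >= {n_min:.1f}")

for n in (20, 80):
    lhs = np.log2(len(S)) + n * (H_bar + eps)   # log2 of |S| * 2^{n(H_bar+eps)}
    rhs = n * (H_bar + eps_prime)               # log2 of 2^{n(H_bar+eps')}
    print(f"n = {n}: log2(LHS) = {lhs:.2f}, log2(RHS) = {rhs:.2f}, "
          f"bound holds: {lhs <= rhs}")
```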


    3. In the corresponding Shannon source coding scheme, we will show that, for all $\epsilon > 0$, there exists a set $A_n$ such that:

    (a) For each $s \in S$,

    $$\Pr\left\{ X^{(s)} \notin A_n \right\} < \epsilon \tag{1.2}$$

    when n is sufficiently large;

    (b)

    $$|R - \bar H| < \epsilon \tag{1.3}$$

    where $R = \frac{\log |A_n|}{n}$ and n is sufficiently large.

    Furthermore, if there exists a $\nu$ such that $0 < \nu < \bar H$ and a coding scheme $B_n$ such that $R < \bar H - \nu$, where $R = \frac{\log |B_n|}{n}$ and n is sufficiently large, then there is a source $X \in \mathcal{F}$ such that

    (c)

    $$\lim_{n\to\infty} \Pr\left\{ X \notin B_n \right\} = 1. \tag{1.4}$$

    Proof. For any $\epsilon > 0$, there exists $\delta > 0$ such that

    $$\delta < \epsilon, \qquad 1 - \delta > 2^{-(\epsilon-\delta)}.$$

    Let $A_n = A^n_\delta(S)$.

    a) Then for each $s \in S$, by using the first part of this problem,

    $$\Pr\left\{ X^{(s)} \notin A_n \right\} = \Pr\left\{ X^{(s)} \notin A^n_\delta(S) \right\} < \epsilon \tag{1.5}$$

    when n is sufficiently large. (1.2) is obtained.

    b) By using the second part of this problem, we have

    $$|A_n| = |A^n_\delta(S)| < 2^{n(\bar H+\epsilon)} \tag{1.6}$$

    when n is sufficiently large. On the other hand, when n is sufficiently large,

    $$|A_n| = |A^n_\delta(S)| \ge \big| W^n_{[X^{(s)}]\delta} \big| \ge (1-\delta)\, 2^{n(H(X^{(s)})-\delta)} \quad \text{for any } s \in S,$$

    and therefore, taking $s$ with $H(X^{(s)}) = \bar H$ and using $1-\delta > 2^{-(\epsilon-\delta)} \ge 2^{-n(\epsilon-\delta)}$ for $n \ge 1$,

    $$|A_n| \ge (1-\delta)\, 2^{n(\bar H-\delta)} > 2^{-n(\epsilon-\delta)}\, 2^{n(\bar H-\delta)} = 2^{n(\bar H-\epsilon)}. \tag{1.7}$$

    Thus, (1.3) follows from (1.6) and (1.7).

    c) Furthermore, assume that there exists a $\nu$ such that $0 < \nu < \bar H$ and a coding scheme $B_n$ such that $R < \bar H - \nu$, where $R = \frac{\log |B_n|}{n}$ and n is sufficiently large. Then for the source $X \in \mathcal{F}$ with the largest entropy $\bar H$, by using the converse part of Shannon's source coding theorem, we have

    $$\lim_{n\to\infty} \Pr\left\{ X \notin B_n \right\} = 1,$$

    which is (1.4).
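    As a small check on the choice of $\delta$ (a sketch built on the reconstructed conditions above, not part of the original solution), the code below searches for a $\delta$ satisfying $\delta < \epsilon$ and $1-\delta > 2^{-(\epsilon-\delta)}$, and verifies that this choice implies $1-\delta > 2^{-n(\epsilon-\delta)}$ for every $n \ge 1$, which is exactly what (1.7) uses.

```python
import numpy as np

def pick_delta(eps, grid=10_000):
    """Find a delta in (0, eps) with 1 - delta > 2**(-(eps - delta)).

    Any sufficiently small delta works, since the right-hand side tends to
    2**(-eps) < 1 as delta -> 0.
    """
    for delta in np.linspace(eps / grid, eps, grid, endpoint=False):
        if 1 - delta > 2 ** (-(eps - delta)):
            return float(delta)
    raise ValueError("no admissible delta found")

eps = 0.1
delta = pick_delta(eps)
print(f"eps = {eps}, chosen delta = {delta:.6f}")

# The condition 1 - delta > 2**(-(eps - delta)) implies, for every n >= 1,
# 1 - delta > 2**(-n * (eps - delta)), the inequality needed in (1.7).
for n in (1, 10, 100):
    assert 1 - delta > 2 ** (-n * (eps - delta))
print("1 - delta > 2^(-n(eps - delta)) holds for n = 1, 10, 100")
```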
