
3F1 Information Theory, Lecture 1

Jossy Sayir
University of Cambridge, Department of Engineering

Michaelmas 2013, 22 November 2013


Course Organisation

- 4 lectures
- Course material: lecture notes (4 sections) and slides
- 1 examples paper
- Exam material: notes and examples paper, past exams


Course contents

1. Introduction to Information Theory
2. Good Variable Length Codes
3. Higher Order Sources
4. Communication Channels
5. Continuous Random Variables


This lecture

- A bit of history...
- Hartley’s measure of information
- Shannon’s uncertainty / entropy
- Properties of Shannon’s entropy


Claude Elwood Shannon (1916-2001)


Shannon’s paper, 1948

Reprinted with corrections from The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948.

A Mathematical Theory of Communication

By C. E. SHANNON

INTRODUCTION

THE recent development of various methods of modulation such as PCM and PPM which exchange bandwidth for signal-to-noise ratio has intensified the interest in a general theory of communication. A basis for such a theory is contained in the important papers of Nyquist¹ and Hartley² on this subject. In the present paper we will extend the theory to include a number of new factors, in particular the effect of noise in the channel, and the savings possible due to the statistical structure of the original message and due to the nature of the final destination of the information.

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.

If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley the most natural choice is the logarithmic function. Although this definition must be generalized considerably when we consider the influence of the statistics of the message and when we have a continuous range of messages, we will in all cases use an essentially logarithmic measure.

The logarithmic measure is more convenient for various reasons:

1. It is practically more useful. Parameters of engineering importance such as time, bandwidth, number of relays, etc., tend to vary linearly with the logarithm of the number of possibilities. For example, adding one relay to a group doubles the number of possible states of the relays. It adds 1 to the base 2 logarithm of this number. Doubling the time roughly squares the number of possible messages, or doubles the logarithm, etc.

2. It is nearer to our intuitive feeling as to the proper measure. This is closely related to (1) since we intuitively measure entities by linear comparison with common standards. One feels, for example, that two punched cards should have twice the capacity of one for information storage, and two identical channels twice the capacity of one for transmitting information.

3. It is mathematically more suitable. Many of the limiting operations are simple in terms of the logarithm but would require clumsy restatement in terms of the number of possibilities.

The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information. N such devices can store N bits, since the total number of possible states is 2^N and log_2 2^N = N. If the base 10 is used the units may be called decimal digits. Since

log_2 M = log_10 M / log_10 2 = 3.32 log_10 M,

¹ Nyquist, H., “Certain Factors Affecting Telegraph Speed,” Bell System Technical Journal, April 1924, p. 324; “Certain Topics in Telegraph Transmission Theory,” A.I.E.E. Trans., v. 47, April 1928, p. 617.

² Hartley, R. V. L., “Transmission of Information,” Bell System Technical Journal, July 1928, p. 535.



Shannon’s paper, 1948

[Fig. 1: Schematic diagram of a general communication system. Blocks: information source, transmitter, receiver, destination, noise source; labels: message, signal, received signal, message.]

a decimal digit is about 3 1/3 bits. A digit wheel on a desk computing machine has ten stable positions and therefore has a storage capacity of one decimal digit. In analytical work where integration and differentiation are involved the base e is sometimes useful. The resulting units of information will be called natural units. Change from the base a to base b merely requires multiplication by log_b a.

By a communication system we will mean a system of the type indicated schematically in Fig. 1. It consists of essentially five parts:

1. An information source which produces a message or sequence of messages to be communicated to the receiving terminal. The message may be of various types: (a) A sequence of letters as in a telegraph or teletype system; (b) A single function of time f(t) as in radio or telephony; (c) A function of time and other variables as in black and white television — here the message may be thought of as a function f(x, y, t) of two space coordinates and time, the light intensity at point (x, y) and time t on a pickup tube plate; (d) Two or more functions of time, say f(t), g(t), h(t) — this is the case in “three-dimensional” sound transmission or if the system is intended to service several individual channels in multiplex; (e) Several functions of several variables — in color television the message consists of three functions f(x, y, t), g(x, y, t), h(x, y, t) defined in a three-dimensional continuum — we may also think of these three functions as components of a vector field defined in the region — similarly, several black and white television sources would produce “messages” consisting of a number of functions of three variables; (f) Various combinations also occur, for example in television with an associated audio channel.

2. A transmitter which operates on the message in some way to produce a signal suitable for transmission over the channel. In telephony this operation consists merely of changing sound pressure into a proportional electrical current. In telegraphy we have an encoding operation which produces a sequence of dots, dashes and spaces on the channel corresponding to the message. In a multiplex PCM system the different speech functions must be sampled, compressed, quantized and encoded, and finally interleaved properly to construct the signal. Vocoder systems, television and frequency modulation are other examples of complex operations applied to the message to obtain the signal.

3. The channel is merely the medium used to transmit the signal from transmitter to receiver. It may be a pair of wires, a coaxial cable, a band of radio frequencies, a beam of light, etc.

4. The receiver ordinarily performs the inverse operation of that done by the transmitter, reconstructing the message from the signal.

5. The destination is the person (or thing) for whom the message is intended.

We wish to consider certain general problems involving communication systems. To do this it is first necessary to represent the various elements involved as mathematical entities, suitably idealized from their physical counterparts.



R.V.L. Hartley (1888-1970)


Uncertainty (Entropy)

Shannon Entropy

For a discrete random variable X,

H(X) := −∑_{x ∈ supp P_X} P_X(x) log_b P_X(x) = E[−log_b P_X(X)]

- b = 2: entropy in bits
- b = e: entropy in nats
- b = 10: entropy in Hartleys
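As an aside (not from the original slides), a minimal Python sketch of this definition; the function name entropy and the example distribution below are my own choices:

import math

def entropy(pmf, base=2.0):
    # H(X) = -sum over the support of P_X(x) * log_b(P_X(x))
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

p = [0.5, 0.25, 0.125, 0.125]   # an example pmf
print(entropy(p, 2))            # 1.75 bits
print(entropy(p, math.e))       # ~1.213 nats
print(entropy(p, 10))           # ~0.527 Hartleys

The three values differ only by the constant change-of-base factor, exactly as in the reprint above.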


Properties of the Entropy

Extremes
If X is defined over an alphabet of size N,

0 ≤ H(X) ≤ log_b N

with equality on the left if and only if |supp P_X| = 1, and on the right if and only if P_X(x) = 1/N for all x.

Proof:

H(X) − log_b N = −∑_x P_X(x) log_b P_X(x) − ∑_x P_X(x) log_b N
               = (1/log_e b) ∑_x P_X(x) log_e [1/(N P_X(x))]
               ≤ (1/log_e b) ∑_x P_X(x) [1/(N P_X(x)) − 1]      (IT inequality)
               = (1/log_e b) (∑_x 1/N − ∑_x P_X(x)) = 0
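A quick numerical check of these extremes (my own sketch, not part of the notes): random distributions over an alphabet of size N stay inside [0, log_2 N], the deterministic case gives 0 and the uniform case gives log_2 N.

import math, random

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

N = 8
for _ in range(1000):
    w = [random.random() for _ in range(N)]
    pmf = [x / sum(w) for x in w]                    # a random pmf over N symbols
    assert 0.0 <= entropy(pmf) <= math.log2(N) + 1e-12

print(entropy([1.0] + [0.0] * (N - 1)))              # 0.0, since |supp P_X| = 1
print(entropy([1.0 / N] * N), math.log2(N))          # both 3.0, uniform P_X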


IT-Inequality

[Plot: ln(x) and x − 1, illustrating ln(x) ≤ x − 1 with equality only at x = 1.]

Lemma (IT-Inequality)

log_e(x) ≤ x − 1

with equality if and only if x = 1.
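A tiny numerical illustration of the lemma (my own sketch):

import math

# log_e(x) <= x - 1 for all x > 0, with equality only at x = 1
for i in range(1, 401):
    x = i / 100.0                       # samples in (0, 4]
    assert math.log(x) <= x - 1 + 1e-12

print(math.log(1.0) == 1.0 - 1)         # True: equality at x = 1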


Binary Entropy Function h(p)

[Plot: the binary entropy function h(p) for p ∈ [0, 1], rising from 0 at p = 0 to its maximum of 1 at p = 0.5 and falling back to 0 at p = 1.]

Good to remember:

h(0.11) ≈ 1/2
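A short sketch of h(p) confirming the rule of thumb (the function name h is mine):

import math

def h(p):
    # binary entropy in bits: h(p) = -p log2(p) - (1-p) log2(1-p)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(h(0.5))     # 1.0 bit, the maximum
print(h(0.11))    # ~0.4999 bits, i.e. roughly 1/2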


Block and Conditional Entropy

- Entropy naturally extends to “vectors” or “blocks”:

  H(X) = −∑_x P_X(x) log P_X(x)

  H(XY) = −∑_{x,y} P_XY(x, y) log P_XY(x, y)

- Entropy conditioned on an event:

  H(X|Y = y) = −∑_x P_X|Y(x|y) log P_X|Y(x|y)

Equivocation or conditional entropy

H(X|Y) = ∑_y P_Y(y) H(X|Y = y) = E[−log P_X|Y(X|Y)]
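These definitions translate directly into a few lines of Python. A minimal sketch; the joint distribution P below is a made-up example, not one from the course:

import math

def H(pmf):
    # entropy in bits of a collection of probabilities
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# hypothetical joint pmf P_XY(x, y) over {0,1} x {0,1}
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

print(H(P.values()))                                     # block entropy H(XY)

P_Y = {y: P[(0, y)] + P[(1, y)] for y in (0, 1)}         # marginal P_Y
for y, py in P_Y.items():
    print(y, H([P[(x, y)] / py for x in (0, 1)]))        # H(X|Y = y)

H_X_given_Y = sum(py * H([P[(x, y)] / py for x in (0, 1)])
                  for y, py in P_Y.items())              # equivocation H(X|Y)
print(H_X_given_Y)                                       # ~0.722 bits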


Chain rule of entropies

Two random variables

H(XY) = H(X) + H(Y|X) = H(Y) + H(X|Y)

Follows directly from our definition of H(Y|X).

Any number of random variables

H(X_1 X_2 … X_N) = H(X_1) + H(X_2|X_1) + … + H(X_N|X_1 … X_{N−1})

Follows from recursive application of the two-variable chain rule.
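A numerical sanity check of the two-variable chain rule, reusing the same kind of made-up joint pmf (my own example):

import math

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # hypothetical P_XY
P_X = {x: P[(x, 0)] + P[(x, 1)] for x in (0, 1)}
P_Y = {y: P[(0, y)] + P[(1, y)] for y in (0, 1)}

H_XY = H(P.values())
H_Y_given_X = sum(px * H([P[(x, y)] / px for y in (0, 1)]) for x, px in P_X.items())
H_X_given_Y = sum(py * H([P[(x, y)] / py for x in (0, 1)]) for y, py in P_Y.items())

# H(XY) = H(X) + H(Y|X) = H(Y) + H(X|Y): all three prints agree (~1.722 bits)
print(H_XY)
print(H(P_X.values()) + H_Y_given_X)
print(H(P_Y.values()) + H_X_given_Y)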


Does conditioning reduce uncertainty?

[Figure: a 10 × 10 grid of coloured balls, arranged in rows 1–10.]

- X: colour of a randomly picked ball
- Y: row of the picked ball (1–10)
- H(X) = h(1/4) = 2 − (3/4) log_2 3 ≈ 0.811 bits
- H(X|Y = 1) = 1 bit > H(X)
- H(X|Y = 2) = 0 bits < H(X)
- H(X|Y) = (5×1 + 5×0)/10 = 1/2 bit < H(X) (reproduced numerically in the sketch at the end of this slide)

Conditioning Theorem

0 ≤ H(X|Y) ≤ H(X)

Conditioning on a random variable never increases the uncertainty/entropy (but conditioning on a specific event can increase it).
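The ball example above can be reproduced numerically. The exact picture is not shown here, so the layout below is an assumption consistent with the quoted values: odd-numbered rows hold 5 balls of each of two colours, even-numbered rows hold 10 balls of a single colour.

import math

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# assumed layout: odd rows mixed 5 black / 5 white, even rows all white
rows = [("black",) * 5 + ("white",) * 5 if r % 2 == 1 else ("white",) * 10
        for r in range(1, 11)]

balls = [c for row in rows for c in row]                  # 100 equally likely balls
p_black = balls.count("black") / len(balls)               # 1/4
print(H([p_black, 1 - p_black]))                          # H(X) ~ 0.811 bits

H_given_row = [H([row.count("black") / 10, row.count("white") / 10]) for row in rows]
print(H_given_row[0], H_given_row[1])                     # H(X|Y=1) = 1.0, H(X|Y=2) = 0.0
print(sum(H_given_row) / 10)                              # H(X|Y) = 0.5 < H(X)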


Proof of conditioning theorem

H(X|Y) − H(X) = H(XY) − H(X) − H(Y)                              (chain rule)
             = ∑_{x,y} P(x, y) log [P(x)P(y) / P(x, y)]
             ≤ ∑_{x,y} P(x, y) [P(x)P(y) / P(x, y) − 1]          (IT-inequality)
             = ∑_{x,y} P(x)P(y) − ∑_{x,y} P(x, y) = 0


Mutual Information

Definition

I(X;Y) = H(X) − H(X|Y)

Mutual information is mutual:

I(X;Y) = H(X) + H(Y) − H(XY) = H(Y) − H(Y|X)
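Both identities can be checked numerically; a short sketch on a made-up joint pmf (my own example):

import math

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}   # hypothetical P_XY
P_X = {x: P[(x, 0)] + P[(x, 1)] for x in (0, 1)}
P_Y = {y: P[(0, y)] + P[(1, y)] for y in (0, 1)}

H_X, H_Y, H_XY = H(P_X.values()), H(P_Y.values()), H(P.values())
H_X_given_Y = sum(py * H([P[(x, y)] / py for x in (0, 1)]) for y, py in P_Y.items())
H_Y_given_X = sum(px * H([P[(x, y)] / px for y in (0, 1)]) for x, px in P_X.items())

# the three expressions for I(X;Y) agree (~0.278 bits here)
print(H_X - H_X_given_Y, H_X + H_Y - H_XY, H_Y - H_Y_given_X)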


Positivity of Mutual Information

Theorem

I(X;Y) ≥ 0

with equality if and only if X and Y are independent.

Equivalent to H(X|Y) ≤ H(X).
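And a quick check of the equality case (my own sketch): when P_XY(x, y) = P_X(x) P_Y(y), the mutual information vanishes.

import math

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

P_X, P_Y = {0: 0.3, 1: 0.7}, {0: 0.6, 1: 0.4}
P = {(x, y): px * py for x, px in P_X.items() for y, py in P_Y.items()}  # independent pair

I_XY = H(P_X.values()) + H(P_Y.values()) - H(P.values())
print(abs(I_XY) < 1e-12)     # True: I(X;Y) = 0 for independent X and Y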
