
Page 1:

COMM 1003: Information Theory
German University in Cairo

© Tallal Elshabrawy

Page 2: Channel Capacity in AWGN Channels

The channel input X is not restricted to be discrete (X ∈ (−∞, ∞)).
The noise N follows the Gaussian distribution N(0, σ_N²).
Rx signal: Y = X + N

Channel capacity:

C_{AWGN} = \max_{f_X(x)} I(X, Y)

I(X, Y) = H(Y) - H(Y|X)

H(Y|X) = -\int_{x}\int_{y} f_{X,Y}(x,y)\,\log f_{Y|X}(y|x)\, dy\, dx

f_{Y|X}(y|x) = f_N(y - x)

H(Y|X) = -\int_{x} f_X(x)\int_{y} f_N(y-x)\,\log f_N(y-x)\, dy\, dx

H(Y|X) = -\int_{x} f_X(x)\int_{n} f_N(n)\,\log f_N(n)\, dn\, dx

Page 3: Channel Capacity in AWGN Channels

The inner integral does not depend on x, so it factors out:

H(Y|X) = -\int_{x} f_X(x)\, dx \int_{n} f_N(n)\,\log f_N(n)\, dn

H(Y|X) = -\int_{n} f_N(n)\,\log f_N(n)\, dn \qquad \left(\text{since } \int_x f_X(x)\, dx = 1\right)

H(Y|X) = H(N)

Note: Y depends on the input X, while N does not.

⇒ For capacity, we need to find the maximum of H(Y).
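As a quick numerical check of this step, the sketch below (Python with SciPy; the value σ_N = 1.5 is an arbitrary illustration, not from the slides) integrates −f_N log₂ f_N directly and compares it with the closed-form Gaussian entropy ½ log₂(2πe σ_N²). Since H(Y|X) collapses to exactly this integral, it does not depend on the input pdf f_X at all.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# H(Y|X) reduces to H(N) = -∫ f_N(n) log2 f_N(n) dn, independent of f_X.
sigma_n = 1.5                                   # arbitrary example value

def neg_p_log_p(n):
    # integrand -f_N(n) log2 f_N(n), guarded against pdf underflow in the tails
    p = norm.pdf(n, 0, sigma_n)
    return -p * np.log2(p) if p > 0 else 0.0

h_numeric, _ = quad(neg_p_log_p, -np.inf, np.inf)
h_closed_form = 0.5 * np.log2(2 * np.pi * np.e * sigma_n**2)
print(h_numeric, h_closed_form)                 # both ≈ 2.63 bits
```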

Page 4: Channel Capacity in AWGN Channels

Theorem: Among all continuous random variables Y ∈ (−∞, ∞) with a given variance σ_Y², the maximum entropy H(Y) is uniquely achieved by the Gaussian distribution:

f_Y(y) \sim \mathcal{N}(0, \sigma_Y^2) \;\Rightarrow\; H(Y) = \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right)

Proof: Assume some arbitrary pdf f_Y(y) with variance σ_Y², and let φ_Y(y) denote the pdf of a Gaussian random variable with the same variance. Then

\int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy = \int f_Y(y)\,\log\left(\sqrt{2\pi\sigma_Y^2}\; e^{\,y^2 / 2\sigma_Y^2}\right) dy

\int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy = \int f_Y(y)\left(\log\sqrt{2\pi\sigma_Y^2} + \frac{y^2}{2\sigma_Y^2}\log e\right) dy

\int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy = \log\sqrt{2\pi\sigma_Y^2}\int f_Y(y)\, dy + \frac{\log e}{2\sigma_Y^2}\int y^2 f_Y(y)\, dy

Page 5: Channel Capacity in AWGN Channels

Since ∫ f_Y(y) dy = 1 and ∫ y² f_Y(y) dy = σ_Y²:

\int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy = \log\sqrt{2\pi\sigma_Y^2} + \frac{\log e}{2}

\int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy = \frac{1}{2}\log\left(2\pi\sigma_Y^2\right) + \frac{1}{2}\log e

\int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy = \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right)

Subtracting this from H(Y):

H(Y) - \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right) = \int f_Y(y)\,\log\frac{1}{f_Y(y)}\, dy - \int f_Y(y)\,\log\frac{1}{\phi_Y(y)}\, dy

H(Y) - \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right) = \int f_Y(y)\,\log\frac{\phi_Y(y)}{f_Y(y)}\, dy

H(Y) - \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right) \le \log e \int f_Y(y)\left(\frac{\phi_Y(y)}{f_Y(y)} - 1\right) dy

H(Y) - \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right) \le \log e \int \left(\phi_Y(y) - f_Y(y)\right) dy = 0

Remember: \log x \le \log e \times (x - 1)

Page 6: Channel Capacity in AWGN Channels

H(Y) - \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right) \le 0 \;\Rightarrow\; H(Y) \le \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right)

Therefore the maximum entropy is achieved when

H(Y) = \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right)

This value is exactly the entropy of a Gaussian-distributed random variable Y with variance σ_Y², so the bound is attained if and only if Y is Gaussian.
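To illustrate the theorem numerically, the sketch below (Python with SciPy; an illustrative check, not part of the slides) computes the differential entropy of three zero-mean pdfs that all share the same variance σ² = 1. The Gaussian comes out largest, matching H(Y) ≤ ½ log₂(2πe σ_Y²).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, uniform, laplace

def diff_entropy_bits(pdf, lo, hi):
    """Differential entropy -∫ p log2 p, guarded against pdf underflow."""
    return quad(lambda y: -pdf(y) * np.log2(pdf(y)) if pdf(y) > 0 else 0.0, lo, hi)[0]

sigma = 1.0
candidates = {
    "gaussian": (norm(0, sigma).pdf, -np.inf, np.inf),
    # uniform of width sqrt(12)*sigma has variance sigma^2
    "uniform":  (uniform(-np.sqrt(3) * sigma, 2 * np.sqrt(3) * sigma).pdf,
                 -np.sqrt(3) * sigma, np.sqrt(3) * sigma),
    # Laplace with scale sigma/sqrt(2) has variance sigma^2
    "laplace":  (laplace(0, sigma / np.sqrt(2)).pdf, -np.inf, np.inf),
}
for name, (pdf, lo, hi) in candidates.items():
    print(name, round(diff_entropy_bits(pdf, lo, hi), 3))
# gaussian ≈ 2.047 bits = 0.5*log2(2*pi*e): the largest of the three,
# uniform ≈ 1.793 bits, laplace ≈ 1.943 bits
```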

Page 7: Channel Capacity in AWGN Channels

Some notes on the theorem:

The maximum entropy of a continuous random variable Y is achieved when Y follows a Gaussian distribution.

Channel capacity is therefore achieved when the received signal Y is Gaussian distributed.

The theorem places no constraints on the distribution of the noise N.

A fact of life for additive channels Y = X + N: if Y is Gaussian distributed, then both X and N must be Gaussian distributed.

Fortunately, Gaussian noise is the common physical case, since thermal noise is what limits the reception sensitivity of receivers.

Channel capacity:

C_{AWGN} = \max_{f_X(x)} I(X, Y) \;\Rightarrow\; I(X, Y) = H(Y) - H(N)

Conclusion: The channel capacity of additive channels is achieved when the input signal X also follows the Gaussian distribution.

Page 8: Channel Capacity in AWGN Channels

We are now ready to derive the formula for the AWGN channel capacity.

X \sim \mathcal{N}(0, \sigma_X^2), \quad N \sim \mathcal{N}(0, \sigma_N^2)

Y = X + N, \quad Y \sim \mathcal{N}(0, \sigma_Y^2), \quad \sigma_Y^2 = \sigma_X^2 + \sigma_N^2

C_{AWGN} = H(Y) - H(N)

C_{AWGN} = \frac{1}{2}\log\left(2\pi e\,\sigma_Y^2\right) - \frac{1}{2}\log\left(2\pi e\,\sigma_N^2\right)

C_{AWGN} = \frac{1}{2}\log\frac{\sigma_Y^2}{\sigma_N^2}

C_{AWGN} = \frac{1}{2}\log\left(1 + \frac{\sigma_X^2}{\sigma_N^2}\right)

Baseband real signals:
C_{AWGN} = \frac{1}{2}\log\left(1 + \mathrm{SNR}\right)

Passband complex signals:
C_{AWGN} = \log\left(1 + \mathrm{SNR}\right)
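A minimal sketch of these two formulas in Python (the function name awgn_capacity is illustrative; log is taken as log₂ so the result is in bits per channel use):

```python
import numpy as np

def awgn_capacity(snr_db, complex_signal=False):
    """Shannon AWGN capacity in bits per channel use:
    0.5*log2(1+SNR) for baseband real signals, log2(1+SNR) for passband complex signals."""
    snr = 10.0 ** (np.asarray(snr_db) / 10.0)
    c = np.log2(1.0 + snr)
    return c if complex_signal else 0.5 * c

print(awgn_capacity([0, 10, 20]))                       # real:    ≈ [0.5, 1.73, 3.33]
print(awgn_capacity([0, 10, 20], complex_signal=True))  # complex: ≈ [1.0, 3.46, 6.66]
```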

Page 9: AWGN Channel Capacity of Modulated Signals

In digital modulation, the transmitted signal is drawn from a discrete set of modulated signals.

Inputs to the communication channel (be it AWGN or not) are therefore characterized as discrete random variables.

Received signals are impacted by noise, so they constitute continuous random variables.

We want to find the channel capacity of AWGN channels with this discrete-input, analog-output characteristic.

Page 10: AWGN Channel Capacity of Modulated Signals

The mutual information I(X, Y):

I(X, Y) = \sum_{i=1}^{M}\int_{-\infty}^{\infty} f_{X,Y}(x_i, y)\,\log\frac{f_{X,Y}(x_i, y)}{P_X(x_i)\, f_Y(y)}\, dy

I(X, Y) = \sum_{i=1}^{M} P_X(x_i)\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\,\log\frac{f_{Y|X}(y|x_i)}{f_Y(y)}\, dy

I(X, Y) = \sum_{i=1}^{M} P_X(x_i)\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\,\log\frac{f_{Y|X}(y|x_i)}{\sum_{j=1}^{M} P_X(x_j)\, f_{Y|X}(y|x_j)}\, dy

At channel capacity, the source generates equiprobable symbols, i.e.,

P_X(x_i) = \frac{1}{M}, \quad \forall\, x_i \in \{x_1, x_2, \ldots, x_M\}

Page 11: AWGN Channel Capacity of Modulated Signals

The capacity C:

C = \sum_{i=1}^{M}\frac{1}{M}\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\,\log\frac{M\, f_{Y|X}(y|x_i)}{\sum_{j=1}^{M} f_{Y|X}(y|x_j)}\, dy

C = \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\left(\log M + \log\frac{f_{Y|X}(y|x_i)}{\sum_{j=1}^{M} f_{Y|X}(y|x_j)}\right) dy

C = \frac{\log M}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\, dy + \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\,\log\frac{f_{Y|X}(y|x_i)}{\sum_{j=1}^{M} f_{Y|X}(y|x_j)}\, dy

Since each \int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\, dy = 1, the first term equals \log M:

C = \log M - \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} f_{Y|X}(y|x_i)\,\log\frac{\sum_{j=1}^{M} f_{Y|X}(y|x_j)}{f_{Y|X}(y|x_i)}\, dy

Page 12: AWGN Channel Capacity of Modulated Signals

Assuming one-dimensional modulation:

f_{Y|X}(y|x_i) = \frac{1}{\sqrt{2\pi\sigma_N^2}}\, e^{-\frac{(y - x_i)^2}{2\sigma_N^2}}

\Rightarrow C = \log M - \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma_N^2}}\, e^{-\frac{(y - x_i)^2}{2\sigma_N^2}} \,\log\frac{\sum_{j=1}^{M} e^{-\frac{(y - x_j)^2}{2\sigma_N^2}}}{e^{-\frac{(y - x_i)^2}{2\sigma_N^2}}}\, dy

Page 13: AWGN Channel Capacity of Modulated Signals

Let n = y − x_i:

\Rightarrow C = \log M - \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} \frac{e^{-\frac{n^2}{2\sigma_N^2}}}{\sqrt{2\pi\sigma_N^2}} \,\log\frac{\sum_{j=1}^{M} e^{-\frac{(n + x_i - x_j)^2}{2\sigma_N^2}}}{e^{-\frac{n^2}{2\sigma_N^2}}}\, dn

\Rightarrow C = \log M - \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty} \log\left(\sum_{j=1}^{M} e^{\frac{n^2 - (n + x_i - x_j)^2}{2\sigma_N^2}}\right) f_N(n)\, dn

\Rightarrow C = \log M - \frac{1}{M}\sum_{i=1}^{M} \mathrm{E}\left[\log\sum_{j=1}^{M} e^{\frac{n^2 - (n + x_i - x_j)^2}{2\sigma_N^2}}\right]

The capacity can be evaluated using Monte-Carlo simulation, where n is Gaussian with mean 0 and variance σ_N², and x_i ∈ {x_1, x_2, …, x_M} as well as x_j ∈ {x_1, x_2, …, x_M}.
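The Monte-Carlo evaluation mentioned above could look like the following sketch (Python/NumPy). The function name pam_capacity_mc, the unit-energy equally spaced M-PAM constellation, and the mapping from E_S/N_0 to σ_N² (taken from the page 15 convention) are assumptions for illustration, not definitions from the slides. It averages the log-sum term over Gaussian noise samples and subtracts the result from log₂ M.

```python
import numpy as np
from scipy.special import logsumexp

def pam_capacity_mc(M=4, es_n0_db=10.0, n_samples=200_000, rng=None):
    """Monte-Carlo estimate (bits/symbol) of the constrained capacity of M-PAM
    over a real AWGN channel, following the expectation form on the slide:
    C = log2 M - (1/M) * sum_i E[ log2 sum_j e^{(n^2 - (n+x_i-x_j)^2)/(2 sigma^2)} ]."""
    rng = np.random.default_rng() if rng is None else rng
    # Unit-energy, equally spaced, zero-mean M-PAM constellation (E[x_i^2] = 1).
    levels = np.arange(M) - (M - 1) / 2.0
    x = levels / np.sqrt(np.mean(levels**2))
    # Noise variance from Es/N0 using the slide's 1-D convention Es/N0 = E[x^2]/sigma^2.
    sigma2 = 1.0 / (10 ** (es_n0_db / 10.0))
    n = rng.normal(0.0, np.sqrt(sigma2), size=n_samples)
    total = 0.0
    for xi in x:
        d = xi - x                                      # x_i - x_j for every j
        # exponents (n^2 - (n + x_i - x_j)^2)/(2 sigma^2), one row per noise sample
        expo = (n[:, None]**2 - (n[:, None] + d[None, :])**2) / (2.0 * sigma2)
        # log2 of the sum over j, evaluated stably with logsumexp, then averaged over n
        total += np.mean(logsumexp(expo, axis=1) / np.log(2.0))
    return np.log2(M) - total / M

if __name__ == "__main__":
    for m in (2, 4, 8):
        print(f"{m}-PAM: {pam_capacity_mc(M=m, es_n0_db=10.0):.3f} bits/symbol")
```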

Page 14: AWGN Channel Capacity of Modulated Signals

Assuming two-dimensional modulation, with complex symbols and noise
x_i = x_{i,I} + j x_{i,Q}, y = y_I + j y_Q, n = n_I + j n_Q, and |·| denoting Euclidean distance:

f_{Y|X}(y|x_i) = \frac{1}{2\pi\sigma_N^2}\, e^{-\frac{|y - x_i|^2}{2\sigma_N^2}}

\Rightarrow C = \log M - \frac{1}{M}\sum_{i=1}^{M}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{e^{-\frac{|n|^2}{2\sigma_N^2}}}{2\pi\sigma_N^2} \,\log\frac{\sum_{j=1}^{M} e^{-\frac{|n + x_i - x_j|^2}{2\sigma_N^2}}}{e^{-\frac{|n|^2}{2\sigma_N^2}}}\, dn_I\, dn_Q

C = \log M - \frac{1}{M}\sum_{i=1}^{M} \mathrm{E}\left[\log\sum_{j=1}^{M} e^{\frac{|n|^2 - |n + x_i - x_j|^2}{2\sigma_N^2}}\right]
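The same Monte-Carlo recipe carries over to the two-dimensional case. A sketch for M-PSK follows (the constellation, function name psk_capacity_mc, and the per-dimension noise variance derived from the page 15 convention E_S/N_0 = E[|x_i|²]/(2σ_N²) are illustrative assumptions):

```python
import numpy as np
from scipy.special import logsumexp

def psk_capacity_mc(M=4, es_n0_db=10.0, n_samples=200_000, rng=None):
    """Monte-Carlo estimate (bits/symbol) of the constrained capacity of a
    two-dimensional constellation (here M-PSK) over complex AWGN, following
    C = log2 M - (1/M) sum_i E[ log2 sum_j e^{(|n|^2 - |n+x_i-x_j|^2)/(2 sigma^2)} ]."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.exp(2j * np.pi * np.arange(M) / M)       # unit-energy M-PSK points
    # Per-dimension noise variance from the slide's 2-D convention Es/N0 = E[|x|^2]/(2 sigma^2).
    sigma2 = 1.0 / (2.0 * 10 ** (es_n0_db / 10.0))
    n = (rng.normal(0.0, np.sqrt(sigma2), n_samples)
         + 1j * rng.normal(0.0, np.sqrt(sigma2), n_samples))
    total = 0.0
    for xi in x:
        d = xi - x                                  # x_i - x_j for every j
        expo = (np.abs(n[:, None])**2 - np.abs(n[:, None] + d[None, :])**2) / (2.0 * sigma2)
        total += np.mean(logsumexp(expo, axis=1) / np.log(2.0))
    return np.log2(M) - total / M
```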

Page 15: AWGN Channel Capacity of Modulated Signals

Notes:

For one-dimensional modulation (e.g., M-PAM):
\frac{E_S}{N_0} = \frac{\mathrm{E}\!\left[x_i^2\right]}{\sigma_N^2}

For two-dimensional modulation (e.g., M-PSK, M-QAM):
\frac{E_S}{N_0} = \frac{\mathrm{E}\!\left[|x_i|^2\right]}{2\sigma_N^2}
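In simulations these relations are typically used in reverse, to set σ_N² from a target E_S/N_0. A small helper following the slide's two conventions (the function name, arguments, and defaults are illustrative):

```python
import numpy as np

def noise_var_from_esn0(es_n0_db, es=1.0, two_dimensional=False):
    """Noise variance sigma_N^2 implied by Es/N0 under the slide's conventions:
    Es/N0 = E[x_i^2]/sigma_N^2 for 1-D modulation (e.g., M-PAM),
    Es/N0 = E[|x_i|^2]/(2*sigma_N^2) per dimension for 2-D modulation (e.g., M-PSK, M-QAM)."""
    es_n0 = 10.0 ** (es_n0_db / 10.0)
    return es / (2.0 * es_n0) if two_dimensional else es / es_n0

print(noise_var_from_esn0(10.0))                        # 1-D: 0.1
print(noise_var_from_esn0(10.0, two_dimensional=True))  # 2-D: 0.05 per dimension
```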

Pages 16–18: Modulated Signals vs Shannon Capacity Curves

(Figures: modulation-constrained capacity curves plotted against the Shannon AWGN capacity; plots not reproduced in this transcript.)

Page 19: Saturation Thresholds

Saturation thresholds are the SNR levels at which the channel capacity can accommodate the source entropy H(X).

A modulation scheme does not reach saturation until a certain operating point (i.e., its saturation threshold).

Below its saturation threshold, we cannot have 100% reliable communication with that practical modulation scheme.
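One rough way to bound a saturation threshold numerically: the sketch below (illustrative, not from the slides) finds the SNR at which the unconstrained Shannon capacity first equals H(X) = log₂ M. Any practical M-ary scheme needs at least this SNR before its constrained capacity (e.g., from the Monte-Carlo estimators above) can saturate at log₂ M.

```python
import numpy as np

def shannon_snr_floor_db(bits_per_symbol, complex_signal=True):
    """SNR (dB) at which the unconstrained Shannon capacity equals the source
    entropy H(X) = bits_per_symbol: a lower bound on the saturation threshold
    of any practical M-ary scheme carrying that rate."""
    if complex_signal:          # C = log2(1 + SNR)
        snr = 2.0 ** bits_per_symbol - 1.0
    else:                       # C = 0.5 * log2(1 + SNR)
        snr = 2.0 ** (2.0 * bits_per_symbol) - 1.0
    return 10.0 * np.log10(snr)

for m in (2, 4, 16, 64):
    print(f"{m}-ary (complex): >= {shannon_snr_floor_db(np.log2(m)):.1f} dB")
```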

Page 20: Additional Remarks

Practical modulations lag the Shannon capacity by some amount.

Among them, M-PSK modulation is the worst.

The gap between the Shannon capacity and the practical modulation capacity increases with increasing rate.

Some channel coding schemes have the potential to approach the AWGN Shannon capacity curve, at the expense of a lower data rate.