Transmitters Information
TRANSCRIPT
-
Outline
Transmitters (Chapters 3 and 4, Source Coding and Modulation) (weeks 1 and 2)
Receivers (Chapter 5) (weeks 3 and 4)
Received Signal Synchronization (Chapter 6) (week 5)
Channel Capacity (Chapter 7) (week 6)
Error Correction Codes (Chapter 8) (weeks 7 and 8)
Equalization (Bandwidth Constrained Channels) (Chapter 10) (week 9)
Adaptive Equalization (Chapter 11) (weeks 10 and 11)
Spread Spectrum (Chapter 13) (week 12)
Fading and Multipath (Chapter 14) (week 12)
-
Transmitters (weeks 1 and 2)
Information Measures
Vector Quantization
Delta Modulation
QAM
-
Digital Communication System:
Transmitter
Receiver
Information per bit increases
Noise immunity increases
Bandwidth efficiency increases
-
Transmitter Topics
Increasing information per bit
Increasing noise immunity
Increasing bandwidth efficiency
-
Increasing Information per Bit
Information in a source
Mathematical Models of Sources
Information Measures
Compressing information
Huffman encoding (Optimal Compression?)
Lempel-Ziv-Welch Algorithm (Practical Compression)
Quantization of analog data
Scalar Quantization
Vector Quantization
Model Based Coding
Practical Quantization
μ-law encoding
Delta Modulation
Linear Predictive Coding (LPC)
-
Increasing Noise Immunity
Coding (Chapter 8, weeks 7 and 8)
-
Increasing Bandwidth Efficiency
Modulation of digital data into analog waveforms
Impact of modulation on bandwidth efficiency
-
Increasing Information per Bit
Information in a source
Mathematical Models of Sources
Information Measures
Compressing information
Huffman encoding (Optimal Compression?)
Lempel-Ziv-Welch Algorithm (Practical Compression)
Quantization of analog data
Scalar Quantization
Vector Quantization
Model Based Coding
Practical Quantization
μ-law encoding
Delta Modulation
Linear Predictive Coding (LPC)
-
Mathematical Models of Sources
Discrete Sources
Discrete Memoryless Source (DMS): statistically independent letters from a finite alphabet
Stationary Source: statistically dependent letters, but joint probabilities of sequences of equal length remain constant
Analog Sources
Band Limited (signal spectrum confined to a finite range of frequencies |f|)
-
Discrete Sources
-
Discrete Memoryless Source (DMS)
Statistically independent letters from a finite alphabet
e.g., a normal binary data stream X might be a series of random events of either X = 1 or X = 0
P(X=1) = constant = 1 - P(X=0)
e.g., well compressed data, digital noise
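A minimal Python sketch of a DMS (the function name dms_stream and the probability value 0.7 are illustrative, not from the slides):

import random

def dms_stream(p1, n):
    # each letter is drawn independently: 1 with probability p1, else 0
    return [1 if random.random() < p1 else 0 for _ in range(n)]

stream = dms_stream(0.7, 100000)
print(sum(stream) / len(stream))  # estimate of P(X=1), close to the chosen 0.7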
-
Stationary Source
Statistically dependent letters, but joint probabilities of sequences of equal length remain constant
e.g., the probability that the sequence
ai, ai+1, ai+2, ai+3 = 1001
when aj, aj+1, aj+2, aj+3 = 1010
is always the same
An approximation for uncoded text
-
Analog Sources
Band Limited (signal spectrum confined to a finite range of frequencies |f|)
-
Information in a DMS Letter
If an event X denotes the arrival of a letter xi with probability P(X=xi) = P(xi), the information contained in the event is defined as:
I(X=xi) = I(xi) = -log2(P(xi)) bits
[Plot: Information I(xi) vs probability P(xi)]
-
Examples
e.g., an event X generates a random letter of value 1 or 0 with equal probability, P(X=0) = P(X=1) = 0.5
then I(X) = -log2(0.5) = 1,
or 1 bit of information each time X occurs
e.g., if X is always 1, then P(X=0) = 0, P(X=1) = 1, so I(X=0) = -log2(0) = ∞ and I(X=1) = -log2(1) = 0
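A minimal Python sketch of this letter-information calculation (the helper name info_bits is illustrative):

import math

def info_bits(p):
    # I(xi) = -log2(P(xi)); infinite for an impossible letter (p = 0)
    return float('inf') if p == 0 else -math.log2(p)

print(info_bits(0.5))  # 1.0 bit: equally likely binary letter
print(info_bits(1.0))  # 0.0 bits: a letter that always occurs
print(info_bits(0.0))  # inf: the P(X=0) = 0 case above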
-
Discussion
I(X=1) = -log2(1) = 0
means no information is delivered by X, which is consistent with X = 1 all the time.
I(X=0) = -log2(0) = ∞
means that if X = 0 then a huge amount of information arrives; however, since P(X=0) = 0, this never happens.
-
Average Information
To help deal with I(X=0) = ∞ when P(X=0) = 0, we need to consider how much information actually arrives with the event over time.
The average letter information for letter xi out of an alphabet of L letters, i = 1, 2, 3, ..., L, is
I(xi)P(xi) = -P(xi)log2(P(xi))
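A small Python sketch of the average letter information, locating its peak (the helper name avg_letter_info and the 1000-point scan are illustrative):

import math

def avg_letter_info(p):
    # -P(xi) * log2(P(xi)), taken as 0 at p = 0
    return 0.0 if p == 0 else -p * math.log2(p)

# the maximum is about 0.53 bits, near p = 1/e ≈ 0.368
print(max((avg_letter_info(i / 1000), i / 1000) for i in range(1001)))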
-
Average Information
Plotting this for 2 symbols (1, 0), we see that on average at most a little more than 0.5 bits of information arrive with a particular letter, and that low or high probability letters generally carry little information.
[Plot: Average letter information -P(xi)log2(P(xi)) vs probability P(xi)]
-
Average Information (Entropy)
Now let's consider the average information of the event X made up of the random arrival of all the letters xi in the alphabet. This is the sum of the average information arriving with each letter:
H(X) = Σ I(xi)P(xi) = -Σ P(xi)log2(P(xi))   (sum over i = 1, ..., L)
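A minimal Python sketch of this entropy sum (the function name entropy_bits is illustrative):

import math

def entropy_bits(probs):
    # H(X) = -sum of P(xi) * log2(P(xi)); zero-probability letters contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0 bit: equally likely binary letters
print(entropy_bits([0.9, 0.1]))  # about 0.47 bits: a skewed binary source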
-
Average Information (Entropy)
Plotting this for L = 2, we see that on average at most 1 bit of information is delivered per event, but only if both symbols arrive with equal probability.
[Plot: Average information of event with 2 letters vs probability of the first letter in the alphabet, x1]
-
Average Information (Entropy)
What is the best possible entropy for a multi-symbol code?
H(X) = -Σ P(xi)log2(P(xi)) ≤ log2(L)   (sum over i = 1, ..., L)

L     log2(L)
1     0
2     1
3     1.584963
4     2
5     2.321928
6     2.584963
7     2.807355
8     3
16    4
32    5
64    6
128   7
256   8
So multi-bit binary symbols built from equally probable random bits are the most efficient information carriers,
i.e., 256 symbols made from 8-bit bytes are fine from an information standpoint.
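As a quick check in Python (again with an illustrative entropy_bits helper), L equally likely symbols reach the log2(L) bound tabulated above:

import math

def entropy_bits(probs):
    # H(X) = -sum of P(xi) * log2(P(xi))
    return -sum(p * math.log2(p) for p in probs if p > 0)

for L in (1, 2, 3, 4, 8, 16, 256):
    print(L, entropy_bits([1.0 / L] * L))  # matches the log2(L) column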