The theoretical limits for coding
The setting of Shannon theorems
[Block diagram: Source → Encoder (rate r_c) → Serial-to-parallel (1:m) → Modulator (M = 2^m) → Channel (bandwidth B = N R_m / 2), with additive noise. The encoder and modulator together form the "codulator" (rate r).]

R_s [information bits/s], R_c [coded bits/s], R_m [channel symbols/s]

$$R_c = \frac{R_s}{r_c}, \qquad R_m = \frac{R_c}{m} = \frac{R_s}{r_c\, m}, \qquad r = \frac{R_s}{R_m} = r_c\, m$$

N is the dimensionality of the signal space.
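To make the rate bookkeeping concrete, a minimal Python sketch of these relations (the numeric values are illustrative only, not taken from the slides):

# Rate relations along the source -> encoder -> modulator -> channel chain.
# Symbol names follow the slide; the numbers are made up for illustration.
Rs = 1.0e6   # source rate R_s [information bits/s]
rc = 1/2     # encoder rate r_c [information bits per coded bit]
m  = 2       # coded bits per channel symbol (M = 2^m = 4)
N  = 1       # dimensionality of the signal space

Rc = Rs / rc     # coded bit rate R_c [coded bits/s]
Rm = Rc / m      # channel symbol rate R_m [channel symbols/s]
r  = rc * m      # codulator rate r [information bits per channel symbol]
B  = N * Rm / 2  # required channel bandwidth B = N * R_m / 2 [Hz]

assert abs(r - Rs / Rm) < 1e-12   # the two expressions for r agree
print(Rc, Rm, r, B)               # 2e6, 1e6, 1.0, 5e5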
The theoretical limits for coding
Channel capacity theorem (Shannon, 1948)
For any discrete-input memoryless channel, there exists a block codulator handling blocks of k information bits with rate r (information bits per channel symbol) for which the word error probability with maximum-likelihood decoding is bounded by

$$P_w(e) < 2^{-n_m E(r)}$$

where $n_m = k/r$ is the number of channel symbols per information block, and $E(r)$ is a convex-∪, decreasing, positive function of r for

$$0 \le r \le C$$

C being the channel capacity measured in bits/(channel symbol) (or bits/channel use).
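To get a numerical feel for how the bound tightens with block length, a small Python sketch (the error exponent value E(r) = 0.1 is a made-up placeholder, since E(r) depends on the channel):

# Random-coding bound: P_w(e) < 2^(-n_m * E(r)), with n_m = k / r.
def word_error_bound(k, r, E_r):
    n_m = k / r                  # channel symbols per information block
    return 2.0 ** (-n_m * E_r)   # upper bound on the word error probability

for k in (100, 200, 400):
    print(k, word_error_bound(k, r=0.5, E_r=0.1))
# doubling k doubles the exponent:
# 100 -> ~9.5e-07, 200 -> ~9.1e-13, 400 -> ~8.3e-25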
The theoretical limits for coding
From the channel coding theorem, there are three ways to improve performance:
1. Decrease the codulator rate r
For a given source rate, decreasing r means increasing the transmission
rate to the channel, and hence increasing the required channel bandwidth
[Figure: error exponent E(r) versus r, with two rates r_2 < r_1 marked; lowering the rate from r_1 to r_2 increases E(r).]
The theoretical limits for coding
2. Increase the channel capacity C
For a given codulator rate r, increasing C increases E(r). However, to increase the
channel capacity for a given channel and a given bandwidth,
we need to increase the signal power
$$C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N_0 B}\right)$$

[Figure: error exponent E(r) versus r; a larger capacity C shifts the E(r) curve upward at the fixed rate r.]
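Since capacity grows only logarithmically with power, buying performance this way gets expensive quickly. A short Python sketch (the P, N_0, and B values are illustrative):

import math

# Capacity of the bandlimited Gaussian channel at fixed bandwidth:
# C = (1/2) * log2(1 + P / (N0 * B)), in bits per channel symbol.
def capacity(P, N0=1.0, B=1.0):
    return 0.5 * math.log2(1.0 + P / (N0 * B))

for P in (1, 2, 4, 8, 16):
    print(P, round(capacity(P), 3))
# each doubling of the power buys less and less extra capacity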
The theoretical limits for coding
3. Keep E(r) fixed, so that both the bandwidth and the power remain the
same, and increase the code length (number of channel symbols)

Performance improves, at the expense of the decoding complexity.
In fact, for randomly chosen codes and ML decoding, the decoding
complexity D is of the order of the number of codewords
$$D \propto 2^{n_m r}$$

so that

$$P_w(e) < 2^{-n_m E(r)} = D^{-E(r)/r}$$

and the error probability decreases only algebraically with the decoding complexity.
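A quick Python illustration of the algebraic (rather than exponential) decrease of the bound with complexity; E(r) = 0.1 and r = 0.5 are again placeholder values:

# For random codes under ML decoding: D ~ 2^(n_m * r) codewords, and
# P_w(e) < 2^(-n_m * E(r)) = D^(-E(r)/r)  -- a power law in D.
E_r, r = 0.1, 0.5
for n_m in (100, 200, 400):
    D = 2.0 ** (n_m * r)          # decoding complexity ~ number of codewords
    bound = D ** (-E_r / r)       # same number as 2^(-n_m * E_r)
    print(f"n_m={n_m}  D={D:.2e}  bound={bound:.2e}")
# the bound falls only as D^(-0.2) here: each 10x in complexity buys
# only a factor of 10^0.2 ~ 1.6 in guaranteed error probability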
The theoretical limits for coding
Practical implications of Shannon Theorem
• The theorem proves the existence of good codes; it does not say how to construct them
(Folk theorem: “all codes are good, except those that we know of”)
• From the proof of the theorem, we learn that a randomly chosen code will turn out to be good with high probability; however, its decoding complexity is proportional to the number of code words
The theoretical limits for coding
Channel capacity converse theorem
From the converse to the coding theorem, we can derive the inequality

$$r \le \frac{C}{1 - H_b(e)}$$

where

$$H_b(e) = -P_b(e)\log P_b(e) - [1 - P_b(e)]\log[1 - P_b(e)]$$

is the entropy of a binary source with probabilities $P_b(e)$ and $1 - P_b(e)$, and $P_b(e)$ is the bit error probability.
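A small Python sketch of the converse: rearranged as H_b(e) ≥ 1 − C/r, it lower-bounds the bit error probability of any code whose rate exceeds capacity (the capacity value C = 0.5 is illustrative):

import math

def Hb(p):
    """Binary entropy of p, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

C = 0.5                      # illustrative capacity [bits/channel symbol]
for r in (0.6, 0.8, 1.0):    # rates above capacity
    h_min = 1 - C / r        # converse: H_b(e) >= 1 - C/r
    print(f"r={r}: H_b(e) >= {h_min:.3f}")
# numerically inverting Hb would give the floor on P_b(e) itself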
The theoretical limits for coding
Channel capacity bound
The unconstrained additive Gaussian channel with capacity

$$C = \frac{1}{2}\log_2\!\left(1 + \frac{2 r E_b}{N N_0}\right) \quad \text{bits/symbol}$$

yields

$$r \le \frac{1}{2\,[1 - H_b(e)]}\,\log_2\!\left(1 + \frac{2 r E_b}{N N_0}\right) \qquad (1)$$

which permits finding the pairs $(P_b(e), E_b/N_0)$ satisfying (1) with equality.
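Solving (1) with equality for E_b/N_0 can be done in closed form. A Python sketch (the function name min_ebn0_db and the chosen P_b(e) value are mine, not from the slides):

import math

def Hb(p):
    """Binary entropy of p, in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# From r * (1 - H_b(e)) = (1/2) * log2(1 + 2*r*Eb/(N*N0)) one gets
# Eb/N0 = N * (2^(2*r*(1 - H_b(e))) - 1) / (2*r).
def min_ebn0_db(r, pb, N=1):
    """Minimum E_b/N0 in dB at codulator rate r for bit error probability pb."""
    ebn0 = N * (2.0 ** (2 * r * (1 - Hb(pb))) - 1) / (2 * r)
    return 10 * math.log10(ebn0)

for r in (1/3, 1/2, 1.0):
    print(f"r={r:.3f}: Eb/N0 >= {min_ebn0_db(r, pb=1e-5):.2f} dB")
# as r -> 0 (and pb -> 0) this tends to ln 2, i.e. the -1.59 dB Shannon limit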
The theoretical limits for coding
Channel capacity bounds for the unconstrained additive Gaussian channel

Bit error probability as a function of the minimum required signal-to-noise ratio, for various codulator rates, in the case of a one-dimensional signal space (N = 1).

[Figure: P_b(e) versus E_b/N_0 curves, labeled down to r = 0.]
The theoretical limits for coding
Sphere-packing bound: signal-to-noise ratio versus block size

[Figure: minimum required signal-to-noise ratio as a function of the block size.]
The search for “good” codes
• For 45 years, code and information theorists looked for channel codes yielding performance close to that predicted by Shannon
• They invented several classes of codes offering good performance:
– Block (memoryless) codes, such as BCH, Reed-Solomon codes
– Convolutional (with memory) codes
– Concatenated codes, a mixture of the two above
• In the best case, those codes would require a signal-to-noise ratio E_b/N_0 some 2.5 dB above the Shannon limit
Parallel concatenated codes: Structure
Parallel Concatenated Convolutional Codes (PCCC), “Turbo” codes

[Figure: PCCC structure, built from parallel Constituent Codes (CC).]
The search for “good” codes
• The breakthrough appeared in 1993 (ICC ’93, Geneva, Switzerland)
• Rate-1/2 turbo code
• 16-state constituent encoders: systematic recursive convolutional encoders
• Interleaver size N = 65,536
• Bit error probability of 10⁻⁵ at E_b/N_0 = 0.7 dB
• Shannon limit at 0.5 dB
Main characteristics of turbo codes
• Maximum-likelihood (optimal) decoding is impossible in practice, since it would require the Viterbi algorithm applied to a finite-state machine with 2^N states = 2^{16,000} in the case of the original 1993 code!
• The decoding algorithm is an iterative (suboptimal) one, whose complexity grows with the code word block size
• Convergence of the algorithm is an issue:
– There are no general conditions that guarantee convergence
– When it converges, it is almost impossible to predict whether it will converge to a code word or to something different (“pseudo code words”)
LDPC codes
• Shortly after the invention of turbo codes (1996), there was the rediscovery of a class of codes studied by Gallager in his PhD thesis (1960): the so-called Low-Density Parity-Check (LDPC) codes
• Endowed with an iterative decoding algorithm borrowed from turbo codes and control theory (the “message passing” algorithm), LDPC codes were shown to yield performance very similar to that of turbo codes
• The best result known so far is a very long LDPC code that comes as close to the Shannon limit as 0.00004 dB