OPTIMIZATION of GENERALIZED LT CODES for PROGRESSIVE IMAGE TRANSFER
Suayb S. Arslan, Pamela C. Cosman and Laurence B. Milstein
Department of Electrical and Computer Engineering
University of California, San Diego
Page 2
Outline
Transmission problem and motivation.
Background:
– Fountain (Rateless) Codes
• Encoding and Decoding.
– Previous Unequal Error Protection (UEP) Rateless codes
Proposed algorithms: Generalized Unequal Error Protection LT Codes
– Selection of distributions
– Progressive transmission system
Simulation:
– Performance comparisons
References
Page 3
Transmission problem:
Point-to-point transmission:
– Transmit information from one sender to one receiver, where the erasure channel between them has a time-varying and unknown erasure probability.
• OBJECTIVE: a transmission rate close to the capacity of the channel.
Multicast transmission:
– Transmit information from one sender to multiple receivers, where the channel between the sender and each receiver is an erasure channel with unknown erasure probability.
• OBJECTIVE: a transmission rate close to capacity on all the transmission channels simultaneously.
Page 4
Motivation
Erasure channels: In a number of communications scenarios, data files sent over the internet are chopped into fixed or variable size packets, and each packet is either received without error or corrupted and therefore considered erased during the transmission.
Solution 1: Use forward error correction. This can solve the transmission problem for erasure channels, but may lead to inefficient use of network resources when channel information is missing.
Solution 2: Receivers acknowledge each received packet and senders retransmit the lost packets. This results in low efficiency: capacity is wasted on feedback messages and retransmissions, and the waste is exacerbated in a multicast scenario.
[Figure: Binary Erasure Channel]
Page 5
A solution: “Digital Fountain Idea”
A paradigm for data transmission that needs almost no feedback messages.
What is received or lost is of no importance. It only matters whether enough is received.
[Figure: the transmitter streams coded symbols to the receiver; after collecting ~n symbols, the receiver sends a single bit to make it stop.]
Page 6
Luby Transform (LT) codes (Luby ‘98)
First known fountain code design.
The basis for later fountain codes such as Raptor codes and Online codes.
Send k information symbols. Only n = (1 + ε)k received coded symbols are enough to recover all k information symbols, where ε is the overhead.
Asymptotically capacity achieving:
– The BEC has erasure probability p and capacity C(p) = 1 − p.
– If n symbols are transmitted, the expected number of reliably received symbols is ñ = n(1 − p).
– Decoding needs ñ = (1 + ε)k received symbols, so the number of symbols transmitted is n = (1 + ε)k / (1 − p).
– Rate of the transmission: R = k/n = (1 − p)/(1 + ε).
– As k gets larger, ε goes to 0, and R → 1 − p = C(p). Thus, LT codes achieve capacity asymptotically.
Low encoding/decoding complexity.
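The rate argument above can be checked numerically; this is a small sketch with illustrative values (the function name and numbers are ours, not from the talk):

```python
def lt_rate(k, eps, p):
    """Achieved rate when (1 + eps) * k coded symbols must be received
    over a BEC with erasure probability p (capacity C(p) = 1 - p)."""
    n = (1 + eps) * k / (1 - p)  # send n symbols so that n * (1 - p) arrive
    return k / n

# As the overhead eps shrinks (which happens as k grows), the rate
# approaches the capacity 1 - p.
for eps in (0.5, 0.1, 0.01, 0.0):
    print(f"eps={eps}: rate={lt_rate(1000, eps, 0.1):.4f}")
```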
[1] M. Luby, "LT-Codes," Proc. 43rd Annual IEEE Symposium on Foundations of Computer Science, pp. 271-280, 2002.
Page 7
Encoding…
First, the degree is selected according to the Degree Distribution; then the edges are selected according to the Selection Distribution.
Finally, we XOR the selected information symbols of x to produce the coded symbol.
This process is repeated every time a new coded symbol is desired.
Example: d = 2 for the 1st coded symbol.
[Figure: information symbols (top) connected to the coded symbols received and unerased (bottom).]
Page 8
Encoding… (continued)
Example: d = 2 for the 2nd coded symbol.
[Figure: information symbols and the coded symbols received and unerased.]
Page 9
Encoding… (continued)
Example: d = 1 for the 3rd coded symbol.
[Figure: information symbols and the coded symbols received and unerased.]
Page 10
Encoding… (continued)
Example: d = 3 for the 4th coded symbol.
[Figure: information symbols and the coded symbols received and unerased.]
Page 11
Encoding… (continued)
Example: d = 2 for the 5th coded symbol.
[Figure: information symbols and the coded symbols received and unerased.]
Page 12
Encoding… (continued)
[Figure: the completed encoding graph of information symbols and the coded symbols received and unerased.]
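The encoding steps illustrated on these slides can be sketched as follows. The toy degree distribution and symbol values are illustrative, and the uniform edge selection matches the original LT design:

```python
import random

def lt_encode_symbol(info, degree_dist, rng):
    """Produce one coded symbol: sample a degree d from the degree
    distribution, select d distinct information symbols uniformly at
    random (the selection distribution), and XOR them together."""
    degrees = list(degree_dist)
    d = rng.choices(degrees, weights=[degree_dist[x] for x in degrees], k=1)[0]
    neighbors = rng.sample(range(len(info)), d)  # edge selection
    value = 0
    for i in neighbors:
        value ^= info[i]  # XOR of the selected information symbols
    return value, neighbors

rng = random.Random(0)
info = [3, 1, 4, 1, 5, 9, 2, 6]    # toy information symbols
dd = {1: 0.2, 2: 0.5, 3: 0.3}      # toy degree distribution
value, neighbors = lt_encode_symbol(info, dd, rng)
```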
Page 13
Decoding…
Coded symbols are sent over a binary erasure channel. The decoder uses the Belief Propagation (BP) algorithm.
[Figure: decoding graph with n = 5 coded symbols.]
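A minimal sketch of BP decoding over the erasure channel, where it reduces to peeling: repeatedly use degree-1 check nodes and XOR each recovered symbol out of the remaining check nodes (function and variable names are ours):

```python
def peel_decode(k, coded):
    """BP (peeling) decoder. `coded` is a list of (value, neighbor_list)
    pairs that survived the erasure channel; returns the recovered
    information symbols, with None for any symbol left undecoded."""
    recovered = [None] * k
    coded = [(v, set(nbrs)) for v, nbrs in coded]
    progress = True
    while progress:
        progress = False
        for v, nbrs in coded:
            if len(nbrs) == 1:               # degree-1 check node (the ripple)
                i = next(iter(nbrs))
                if recovered[i] is None:
                    recovered[i] = v
                    progress = True
        remaining = []
        for v, nbrs in coded:
            for i in [j for j in nbrs if recovered[j] is not None]:
                v ^= recovered[i]            # XOR out every decoded neighbor
                nbrs.discard(i)
            if nbrs:
                remaining.append((v, nbrs))
        coded = remaining
    return recovered

# A tiny hand-built example: the degree-1 symbol starts the ripple and
# decoding then proceeds along the chain.
info = [5, 7, 9]
coded = [(5, [0]), (5 ^ 7, [0, 1]), (7 ^ 9, [1, 2])]
out = peel_decode(3, coded)
```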
Page 14
How to choose SD and DD?
The ripple is defined as the set of degree-1 check nodes at each iteration of BP.
Thus, for BP not to terminate prematurely, the ripple must contain at least one element at each iteration.
– Luby proposed the Ideal Soliton distribution, which achieves an expected ripple size of 1 at each iteration of BP (POOR in practice).
– Robust Soliton distribution (GOOD in practice: expected ripple size > 1).
In the original LT coding, the SD is assumed to be the uniform distribution.
OBJECTIVE of the original design:
– Given the uniform SD, find the DD that minimizes the number of received unerased coded symbols needed to decode the whole information block with negligible failure probability.
The original design does not provide Unequal Error Protection (UEP).
In multimedia communications, the OBJECTIVE is not necessarily that of the original design.
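The Robust Soliton distribution can be built directly from Luby's definition; a sketch below, using the slide deck's c and δ but k = 1000 so that the spike position k/R falls inside 1…k:

```python
import math

def robust_soliton(k, c=0.01, delta=0.01):
    """Robust Soliton distribution: Ideal Soliton rho plus correction tau,
    normalized (Luby 2002). Assumes R = c*ln(k/delta)*sqrt(k) >= 1."""
    R = c * math.log(k / delta) * math.sqrt(k)
    spike = int(round(k / R))
    rho = {1: 1.0 / k}
    for d in range(2, k + 1):
        rho[d] = 1.0 / (d * (d - 1))          # Ideal Soliton
    tau = {d: R / (d * k) for d in range(1, spike)}
    tau[spike] = R * math.log(R / delta) / k  # the spike at d = k/R
    Z = sum(rho.values()) + sum(tau.values())
    return {d: (rho.get(d, 0.0) + tau.get(d, 0.0)) / Z for d in range(1, k + 1)}

dist = robust_soliton(1000)
```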
Page 15
Previous UEP rateless codes:
There are two major studies in the literature:
– (1) Weighted approach: modification of the SD (skewed SD).
Example (r = 2): chunk s1 of size k1 = 4 is selected with weight ω1 = 0.7; chunk s2 of size k2 = 3 with ω2 = 0.3.
[Figure: bipartite graph with chunks s1 and s2.]
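The skewed SD of the weighted approach can be sketched as a two-stage draw; the chunk sizes and weights follow the r = 2 example as we read it from the slide:

```python
import random

def weighted_select(chunks, omega, rng):
    """Weighted-approach SD: pick an information chunk with probability
    omega_j, then a symbol uniformly within that chunk."""
    j = rng.choices(range(len(chunks)), weights=omega, k=1)[0]
    return rng.choice(chunks[j])

rng = random.Random(0)
s1, s2 = [0, 1, 2, 3], [4, 5, 6]    # chunk index sets, |s1| = 4, |s2| = 3
omega = [0.7, 0.3]                  # selection weights for the two chunks
hits_s1 = sum(weighted_select([s1, s2], omega, rng) < 4 for _ in range(10000))
```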
Page 16
Previous UEP rateless codes:
– (2) Expanding Window Fountain (EWF) codes: nested windows with a window-specific DD, i.e., r different DDs can be used (MORE FLEXIBLE than the weighted approach).
Example (r = 2): |W1| = k1 = 4 nested inside |W2| = k2 = 7; window selection probabilities Γ1 = 0.4, Γ2 = 0.6.
[Figure: nested windows W1 and W2.]
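EWF selection differs in that the "chunks" are nested windows, so symbols in the inner window also receive part of the outer window's mass; a sketch with the r = 2 example values:

```python
import random

def ewf_select(window_sizes, gamma, rng):
    """EWF selection: choose window j with probability Gamma_j, then a
    symbol uniformly within it; window j spans the first window_sizes[j]
    information symbols (nested windows)."""
    j = rng.choices(range(len(window_sizes)), weights=gamma, k=1)[0]
    return rng.randrange(window_sizes[j])

rng = random.Random(0)
sizes, gamma = [4, 7], [0.4, 0.6]   # |W1| = 4 nested inside |W2| = 7
hits_w1 = sum(ewf_select(sizes, gamma, rng) < 4 for _ in range(10000))
# each symbol in W1 gets probability 0.4/4 + 0.6/7; the rest get 0.6/7
```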
Page 17
An observation:
Let us observe the following:
– Decoding stage 1: A degree-1 check node decodes an information symbol.
– Decoding stage 2: Some of the degree-2 check nodes decode two information symbols.
– Decoding stage 3: A degree-3 check node decodes an information symbol.
Conclusion: low-degree coded symbols decode information symbols in earlier iterations of BP.
This can be used for prioritized decoding.
Page 18
Proposed algorithms: Generalized Unequal Error Protection LT Codes
Idea: Degree-Dependent Selection Distributions
"Degree-dependent selection" idea (WEIGHTED APPROACH):
– After selecting the degree of the coded symbol, select the edge connections based on that degree.
– If the degree is d_m, the information chunks {s_j}, j = 1, …, r, are selected according to {p_1,d_m, p_2,d_m, …, p_r,d_m}, where p_i,j is the conditional probability of selecting s_i given that the coded symbol has degree j.
– Since these probabilities must sum to 1 and 1 ≤ d_m ≤ k, there are (r − 1)k parameters subject to optimization.
"Degree-dependent selection" idea (EWF APPROACH):
– After selecting the degree of the coded symbol, select the window based on that degree.
– If the degree is d_m, the windows {W_j}, j = 1, …, r, are selected according to {Γ_1,d_m, Γ_2,d_m, …, Γ_r,d_m}, where Γ_i,j is the conditional probability of selecting W_i given that the coded symbol has degree j.
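A sketch of the proposed degree-dependent selection. The conditional table below is a toy illustration (not the paper's optimized values), chosen so that low-degree coded symbols favor the important chunk, per the observation on the previous slide:

```python
import random

def gd_select(d, chunks, p, rng):
    """Proposed degree-dependent SD: given the sampled degree d, choose
    chunk j with conditional probability p[j][d], then a symbol uniformly
    within that chunk."""
    j = rng.choices(range(len(chunks)), weights=[p_j[d] for p_j in p], k=1)[0]
    return rng.choice(chunks[j])

# toy conditional probabilities p[j][d]; for each degree d they sum to 1
p = [{1: 1.0, 2: 0.6, 3: 0.4},   # chunk 0: the important chunk
     {1: 0.0, 2: 0.4, 3: 0.6}]   # chunk 1
chunks = [[0, 1, 2], [3, 4, 5, 6]]
rng = random.Random(0)
picks = [gd_select(1, chunks, p, rng) for _ in range(100)]
```

With this toy table, degree-1 coded symbols always land in the important chunk, so they feed the ripple with high-priority symbols first.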
Page 19
Selection of distributions:
Instead of designing all (r − 1)k parameters, we introduce a functional dependence to reduce the parameter count.
The number of parameters is reduced to 3(r − 1).
Page 20
Selection of distributions:
[Equations: the Ideal Soliton and Robust Soliton degree distributions (Luby), and the proposed degree-dependent selection distributions.]
Page 21
Progressive transmission system:
The reason for using this demultiplexing methodology is that the proposed coding scheme is most powerful when the source bits within each segment have unequal importance.
With demultiplexing, for example, the information bits of the first block, the most important information block, are shared equally by the segments.
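One demultiplexing rule with this property is round-robin assignment, which spreads the earliest (most important) bits of the progressive bitstream evenly over the k segments; the exact interleaver used in the paper is not shown on the slide, so this is a sketch:

```python
def demultiplex(bits, k):
    """Round-robin demultiplexer: bit t of the progressive bitstream goes
    to segment t mod k, so each prefix of the stream (the most important
    bits) is shared equally by all segments."""
    segments = [[] for _ in range(k)]
    for t, b in enumerate(bits):
        segments[t % k].append(b)
    return segments

segs = demultiplex(list(range(12)), k=4)
# the first 4 (most important) bits land one per segment
```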
Page 22
Optimization of the rateless code:
Let the BP algorithm iterate M times. The optimization problem is to find the distributions that minimize the expected distortion, where P = {p_i,j} and Λ = {γ_i,j} are the selection and degree distributions of the proposed LT code. This implies that we optimize both distributions for a minimum-distortion criterion.
The maximum for M is set to M_max = 70 in this study.
The minimization can be done at any iteration index m, 1 ≤ m ≤ M_max.
This way, we can present performance as a function of the iteration index. This property may be particularly important for portable devices constrained to low-complexity receiver structures. We call this the Unequal Iteration Time (UIT) property.
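The distribution optimization can be organized as a search over the reduced parameter set. The sketch below uses a toy stand-in for the objective; in the paper the objective would be a Monte-Carlo estimate of the expected image distortion after m BP iterations:

```python
import itertools

def optimize(grids, distortion):
    """Exhaustive search over candidate (A1, B1, C1) values, keeping the
    parameters with the smallest estimated distortion."""
    best, best_d = None, float("inf")
    for params in itertools.product(*grids):
        d = distortion(params)
        if d < best_d:
            best, best_d = params, d
    return best, best_d

# toy stand-in objective (NOT the paper's distortion function)
toy = lambda q: (q[0] - 0.2) ** 2 + (q[2] - 2.0) ** 2
grids = [[0.0, 0.2, 0.4], [0.6, 0.8], [1.0, 2.0, 3.0]]
best, best_d = optimize(grids, toy)
```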
Page 23
Packetization methodology: Fixed packet size
[Figure: k information symbols are encoded into ñ coded symbols; a fixed number of coded symbols (c1, c2, …, c6) plus CRC code symbols form PACKET 1, which is sent over the channel.]
PACKETS are EITHER RECEIVED or LOST.
Page 24
Packetization methodology: Fixed packet size
[Figure: the next coded symbols (c7, c8, …, c12) plus CRC code symbols form PACKET 2, sent over the channel.]
PACKETS are EITHER RECEIVED or LOST.
Page 25
Packetization methodology: Fixed packet size
[Figure: the process continues through PACKET n.]
Therefore, each LT codeword experiences the same erasure pattern.
PACKETS are EITHER RECEIVED or LOST.
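The fixed-size packetization and the "received or lost" channel behavior can be sketched as follows (function names are ours):

```python
def packetize(coded, per_packet):
    """Group the coded-symbol stream into fixed-size packets; in the talk
    each packet also carries CRC symbols so corruption is detected and the
    whole packet is treated as erased."""
    return [coded[i:i + per_packet] for i in range(0, len(coded), per_packet)]

def surviving_symbols(packets, lost):
    """Packets are either received or lost: losing packet j erases every
    coded symbol it carried."""
    return [s for j, pkt in enumerate(packets) if j not in lost for s in pkt]

packets = packetize(list(range(1, 13)), per_packet=6)  # c1..c12, 2 packets
received = surviving_symbols(packets, lost={1})        # second packet lost
```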
Page 26
Alternative methodology: Variable packet size
[Figure: the k information symbols are shown at left; coded symbols (c1, c2, …, c161, c162, …) are grouped into packets of different sizes, each packet carrying its own CRC code symbols and sent over the channel.]
PACKETS are EITHER RECEIVED or LOST.
Page 27
Alternative methodology: Variable packet size
[Figure: a second variable-size example with different packet boundaries (coded symbols c1, …, c41, c43, …, c50, …), each packet with CRC code symbols, sent over the channel.]
PACKETS are EITHER RECEIVED or LOST.
Page 28
Alternative methodology: Overhead Allocation
[Figure: the overhead budget (B/k)b for the k information symbols is split between a part of size X and a part of size B/k − X: (B/k)b = X b1 + (B/k − X) b2, with b1 > 0 and b2 < 0.]
Page 29
Alternative methodology: Overhead Allocation
[Figure: with three parts of sizes X, Y and B/k − X − Y: (B/k)b = X b1 + Y b2 + (B/k − X − Y) b3, with b1 > 0, b2 > 0 and b3 < 0.]
Page 30
Numerical Results: Comparisons with "weighted approach"
Simulation set-up/parameters:
– Take B = 50000 source bits (512 × 512 Lena image coded with SPIHT).
– Chop the bitstream into k equal segments, each containing B/k bits.
– Treat the segments as information symbols and encode them using the proposed codes to produce coded symbols.
We first compare the "weighted approach" with the proposed "UEP GLT":
– Robust Soliton distribution with c = 0.01, δ = 0.01.
– r = 2: the first αk information symbols are treated as the first information chunk, the rest as the second information chunk.
– Both systems optimize their design parameters:
• "weighted approach": only one parameter, ω1, which we optimize.
• "UEP GLT" (GLTexp): three parameters, A1, B1, C1. For simplicity, we set A1 + B1 = 1 and optimize two parameters.
Page 31
Numerical Results: Comparisons with "weighted approach"
Page 32
Numerical Results: Comparisons with "weighted approach"
[Figure illustrating the URT property.]
Page 33
Numerical Results: Comparisons with "weighted approach"
[Figure illustrating the URT and UEP properties.]
Page 34
Numerical Results: Comparisons with "weighted approach"
Page 35
Numerical Results: Comparisons with "weighted approach"
[Figure: average PSNR in dB (21 to 30) versus the iteration index of the BP algorithm (2 to 20), for GLTexp (optimized for M = 70) and the "weighted approach" (optimized for M = 70); a 1.7 dB gap is marked, along with the optima for M = 6 and M = 70.]
Page 36
Numerical Results: Comparisons with "weighted approach"
[Figure: average PSNR in dB versus the iteration index of the BP algorithm, adding GLTexp (A1 = 0.55, B1 = 0.45, C1 = 1.9); gaps of 1.7 dB and 3.32 dB are marked, along with the optima for M = 6 and M = 70, illustrating the UIT property.]
Page 37
Numerical Results: Comparisons with EWF codes
We compare "EWF codes" and the proposed "UEP GLT":
– Number of chunks/windows: r = 2.
– Both systems optimize their design parameters:
• "EWF codes":
– Two Robust Soliton distributions (RSDs) with c = 0.01, δ = 0.01. Since the window sizes are different, the RSDs are different.
– Two parameters to optimize: the window selection probability Γ1 and the chunk size parameter α. Let (α*, Γ1*) denote the optimal parameters of the EWF code.
• "UEP GLT":
– Uses the compound degree distribution obtained as a convex combination of the two RSDs used for the EWF codes above.
– GLTexp: uses (α*, Γ1*) and the constraint A1 + B1 = 1.
– GLTexpOpt: uses only (α*, Γ1*).
– GLTexpFullOpt: no constraints; five parameters to optimize: α, Γ1, A1, B1, C1.
Page 38
Numerical Results: Comparisons with EWF codes
Page 39
Numerical Results: Comparisons with EWF codes
[Figure illustrating the URT and UEP properties.]
Page 40
Numerical Results: Optimal parameters

k = 100:                        A1     B1     C1     PSNR (dB)
"weighted approach"      0.3    0.55   0.0    N/A    27.61
"weighted approach"      0.5    0.95   0.0    N/A    28.05
GLTexp                   0.3    0.19   0.81   2.0    29.19
GLTexp                   0.1    0.06   0.94   0.9    29.82

k = 1000:                       A1     B1     C1     PSNR (dB)
"weighted approach"      0.3    0.45   0.0    N/A    29.63
"weighted approach"      0.6    0.76   0.0    N/A    30.39
GLTexp                   0.3    0.17   0.83   1.9    32.23
GLTexp                   0.25   0.25   0.75   1.2    32.46

k = 100, α = 0.4:        A1     B1     C1     Γ1            PSNR (dB)
UEPEWF                   0.5    N/A    N/A    N/A    0.97   29.63
GLTexpOpt                0.24   -0.62  1.62   1.1    0.55   30.94
Page 41
References:
[1] M. Luby, "LT-Codes," Proc. 43rd Annual IEEE Symposium on Foundations of Computer Science, pp. 271-280, 2002.
[2] A. Shokrollahi, "Raptor codes," IEEE Trans. Inf. Theory, vol. 52, no. 6, pp. 2410-2423, Jun. 2006.
[3] N. Rahnavard and F. Fekri, "Finite-Length Unequal Error Protection Rateless Codes: Design and Analysis," IEEE Globecom, 2005.
[4] N. Rahnavard, B. N. Vellambi and F. Fekri, "Rateless Codes with Unequal Error Protection Property," IEEE Trans. Inf. Theory, vol. 53, no. 4, pp. 1521-1532, April 2007.
[5] D. Sejdinovic, D. Vukobratovic, A. Doufexi, V. Senk and R. Piechocki, "Expanding Window Fountain Codes for Unequal Error Protection," Proc. 41st Asilomar Conf., Pacific Grove, pp. 1020-1024, 2007.
[6] D. Vukobratovic, V. Stankovic, D. Sejdinovic, L. Stankovic and Z. Xiong, "Scalable Video Multicast Using Expanding Window Fountain Codes," IEEE Trans. on Multimedia, vol. 11, no. 6, pp. 1094-1104, Oct. 2009.
[7] M. Luby, M. Mitzenmacher and A. Shokrollahi, "Analysis of Random Processes via And-Or Tree Evaluation," Proc. 9th Ann. ACM-SIAM Symp. Discrete Algorithms, pp. 364-373, 1998.
Page 42
Questions ?