Tutorial GFqDecoding Part 1
TRANSCRIPT
Status of Knowledge on Non-Binary LDPC Decoders
Part I: From Binary to Non-Binary Belief Propagation Decoding

D. Declercq
ETIS - UMR 8051, ENSEA / Cergy-University / CNRS, France

IEEE SSC SCV Tutorial, Santa Clara, October 21st, 2010
D. Declercq (ETIS - UMR8051) 1 / 59
Outline
1 Introduction
2 Belief Propagation on a General Graph
3 Binary Belief Propagation Decoder
4 Non-Binary Belief Propagation Decoding
Outline
1 Introduction
2 Belief Propagation on a General Graph
3 Binary Belief Propagation Decoder
4 Non-Binary Belief Propagation Decoding
Small History of Binary LDPC Codes: Landmarks
Gallager 1962: regular LDPC codes, proof of convergence (MLD), algorithm A (bit flipping), algorithm B,
Tanner 1981: composite codes on graphs, link with product codes and LDPC codes,
MacKay 1995: Belief Propagation (BP) decoding, link with iterative turbo-decoding, irregular LDPC codes,
Richardson and Urbanke 2001: proof of convergence (BP), optimization of the irregularity, codes approaching capacity (BEC, BI-AWGN),
Since then: optimization for other types of channels (frequency selective, multilevel, multi-user, turbo-equalization, joint source-channel coding), finding good matrices for small sizes, lowering the error floor.
⇒ Golden age of LDPC codes, application in many standards.
Small History of Non-Binary LDPC Codes: Landmarks
Gallager 1963: LDPC codes in Galois fields, iterative hard-decision decoding algorithm B for dv = 3,
MacKay 1998: advantages for small blocks / high rates, ultra-sparse dv = 2 LDPC codes in high-order fields,
2003-2006: development of practical decoders for non-binary LDPC codes,
2006-2010: attempts to find applications where NB-LDPC codes outperform binary LDPC codes,
2010-xxx: golden age of non-binary LDPC codes?

[Davey 98] M. Davey and D.J.C. MacKay, "Low Density Parity Check Codes over GF(q)", IEEE Communications Letters, vol. 2, pp. 165-167, June 1998.
[MacKay 99] D.J.C. MacKay and M. Davey, "Evaluation of Gallager Codes for Short Block Length and High Rate Applications", Proc. of IMA Workshop on Codes, Systems and Graphical Models, 1999.
Definitions and Quantities
LDPC code (parity-check form):

C_H = { c ∈ GF(q)^N | H · c = 0 in GF(q) }

LDPC code (generator form):

C_G = { c = G · u, ∀ u ∈ GF(q)^K }

with H (M × N) the parity-check matrix and G (N × K) the generator matrix.

Sizes: N is the codeword length, K the information length, M the redundancy; the rate is R = K/N (if H is full rank).

Density of H:

d_H = (number of nonzero elements in H) / (M · N);   LDPC: d_H → 0 as N → +∞.
Tanner Graph Representation of a Binary LDPC Code
Tanner Graph is a Bi-partite graph with Adjacency Matrix H
LDPC: Low Density Parity Check Codes
[Figure: Tanner graph of a length-8 binary LDPC code; eight codeword bitnodes c0…c7 connect through an interleaver Π to four parity checks, and the corresponding 4 × 8 binary adjacency matrix H is shown alongside.]
Tanner Graph Representation of a Non-Binary LDPC Code

Tanner graph of an irregular non-binary LDPC code in GF(8)

[Figure: same bipartite structure as the binary case, with eight codeword symbol nodes c0…c7, an interleaver Π, and four parity checks; the 4 × 8 matrix H now carries nonzero entries from GF(8).]
Parameters for Non-Binary LDPC Code Irregularity
Irregularity distribution, irregularity profile

1. Edge proportions:
λ_i : proportion of nonzero values {H_kl} in degree-i columns,
ρ_j : proportion of nonzero values {H_kl} in degree-j rows,

2. Node proportions:
λ̄_i : proportion of columns in H with degree i,
ρ̄_j : proportion of rows in H with degree j,

λ(x) = Σ_{i=2}^{dv,max} λ_i x^{i−1}      ρ(x) = Σ_{j=2}^{dc,max} ρ_j x^{j−1}

3. Nonzero value distribution:
h_ij(ω) : uniformly distributed in GF(q)\{0}.
Notion of Code Family
An LDPC code family is defined by (λ(x), ρ(x), N):
used for characterization, proofs, theoretical studies.

A fixed LDPC code is defined by (λ(x), ρ(x), N, Π, {h_ij}):
used for practical application.

[Figure: example Tanner graph with interleaver Π.]

λ(x) = (4/24) x + (3/24) x² + (8/24) x³ + (9/24) x⁸      ρ(x) = x⁵
Outline
1 Introduction
2 Belief Propagation on a General Graph
3 Binary Belief Propagation Decoder
4 Non-Binary Belief Propagation Decoding
Concept of Iterative Decoder on a Graph
From local computation to global optimization

The general concept of LDPC decoding is message passing between the nodes of the Tanner graph of the code, so that iterative updates of the messages lead to a stable state: convergence to a fixed point.

Messages represent probability density functions of the random variables. For a discrete random variable taking values in a set of q elements:

μ(0) = Prob(x_n = 0 | ·)   …   μ(q − 1) = Prob(x_n = q − 1 | ·),   with Σ_{k=0}^{q−1} μ(k) = 1.

The decoding result is the a posteriori probability of each random variable x_n:

Prob(x_n | y_0, y_1, …, y_{N−1})

A particular scheduling of the message computations defines a decoding iteration.
Terminology and Some History
Belief Propagation: BP

1. Artificial Intelligence:
statistical learning: Pearl's Belief Propagation (1981-86),
neural networks: sum-product algorithm (ΣΠ) (1985-86),

2. Information Theory:
Gallager iterative decoders for LDPC (1963),
Viterbi (1967), BCJR (1974): can be analysed as BP on factor graphs,

3. Statistical Physics:
BP = Bethe approximation of the global free energy of complex systems (1935),
Generalized BP = Kikuchi approximation of the free energy (1951).
Example
The Tanner graph (for parity-check codes) is a special case of factor graph

Let A(ω), B(ω), C(ω), D(ω) be dependent random variables, and let A′, B′, C′, D′ be their noisy observations.

[Figure: tree-shaped factor graph connecting A, B, C, D to their observations A′, B′, C′, D′; the quantity of interest is p(B | A′, C′, D′).]

With belief propagation on a tree, we get the exact a posteriori density: the optimal solution.
Notations for Bipartite Graphs

[Figure: a variable node x_n and a function node F(·) joined by an edge, with messages μ_{x→f} and μ_{f→x} travelling along it.]

(μ_{x→f}, μ_{f→x}): messages = p.d.f.s

The graph is not oriented: messages are needed in both directions.
2 types of nodes = 2 types of local updates: data-node update and function-node update.
Concept of Computational Tree
Expansion of the graph from a symbol/check node

[Figure: tree expansion rooted at a symbol node, showing the nodes seen after 1 iteration and after 2 iterations, with channel LLRs at the leaves and function nodes F1, F2 below the root.]
Concept of Computational Tree
Past of the graph = set of nodes

[Figure: the same tree expansion; the subtrees S_F1 and S_F2 below F1 and F2 mark the "past" of F1 and the "past" of F2.]
Concept of Computational Tree
Independence assumption

[Figure: the same tree expansion; the pasts S_F1 and S_F2 of F1 and F2 are disjoint, hence the corresponding messages are independent.]
Variable Node Update: Bayesian Merging
⇔ bitnode/symbol updates for LDPC codes

[Figure: variable node x1 connected to function nodes F1(·), F2(·), F3(·), F4(·); outgoing message μ^1_{x→f}, incoming messages μ^2_{f→x}, μ^3_{f→x}, μ^4_{f→x}.]

μ^1_{x→f}[k] ∝ Π_{i=2}^{4} μ^i_{f→x}[k]   ∀ k = 0 … q − 1

ASSUMPTION: the input messages μ^i_{f→x} are independent,
ASSUMPTION: the sets of noisy symbols leading to the μ^i_{f→x} are disjoint.

Note: the update equation is NOT normalized.
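The variable-node update above can be sketched in a few lines of Python. This is a minimal illustration, not the tutorial's implementation: the function name and the toy GF(4) messages are mine, and the normalization is added only because the slide's update is deliberately left unnormalized.

```python
import numpy as np

def variable_node_update(incoming):
    """Bayesian merging at a variable node: the outgoing message is the
    componentwise product of the incoming messages on the other edges.
    The slide's update is unnormalized; we normalize here so the result
    stays a probability mass function."""
    out = np.prod(incoming, axis=0)  # out[k] = prod_i mu_i[k]
    return out / out.sum()

# toy GF(4) example: three incoming messages, each a pmf over q = 4 symbols
mu_f_to_x = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.4, 0.4, 0.1, 0.1],
    [0.5, 0.2, 0.2, 0.1],
])
mu_x_to_f = variable_node_update(mu_f_to_x)
```

Note how the product sharpens the belief: each input only mildly favors symbol 0, but the merged message concentrates most of its mass there.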
Function Node Update: Bayesian Marginalization
⇔ checknode updates for LDPC codes

[Figure: function node F(x1, x2, x3, x4) with incoming messages μ^2_{x→f}, μ^3_{x→f}, μ^4_{x→f} and outgoing message μ^1_{f→x}.]

μ^1_{f→x}[k1] = Σ_{k2,k3,k4} F(x1 = k1, x2 = k2, x3 = k3, x4 = k4) Π_{i=2}^{4} μ^i_{x→f}[ki]   ∀ k1 = 0 … q − 1

ASSUMPTION: the input messages μ^i_{x→f} are independent,
ASSUMPTION: the sets of noisy symbols leading to the μ^i_{x→f} are disjoint.
Function Node Update: Bayesian Marginalization
Non-binary parity-check case in GF(q)

μ^1_{f→x}[k1] = Σ_{k2,k3,k4} F(x1 = k1, x2 = k2, x3 = k3, x4 = k4) Π_{i=2}^{4} μ^i_{x→f}[ki]

Let αk ∈ GF(q) = {0, 1, α, α², …, α^{q−2}}. In the parity-check case, the function node reduces to an indicator function:

F(x1 = α1, x2 = α2, x3 = α3, x4 = α4) = 1 if α1 + α2 + α3 + α4 = 0,
F(x1 = α1, x2 = α2, x3 = α3, x4 = α4) = 0 if α1 + α2 + α3 + α4 ≠ 0.

The parity-check constraint removes one dimension from the marginalization sum:

μ^1_{f→x}[α1] = Σ_{α2,α3} μ^2_{x→f}[α2] μ^3_{x→f}[α3] μ^4_{x→f}[α1 + α2 + α3].
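The indicator-function marginalization above can be checked by brute force in Python. This is a sketch under an assumption the slides make later: in a binary-extension field GF(2^p), addition of symbols is XOR of their integer (binary-map) labels. The function name is mine.

```python
import itertools
import numpy as np

def checknode_marginalization(mu2, mu3, mu4):
    """Bayesian marginalization through the parity check
    x1 + x2 + x3 + x4 = 0 over GF(2^p), where field addition is XOR
    of the integer symbol labels: the indicator F(.) selects exactly
    the triples with k1 = k2 ^ k3 ^ k4."""
    q = len(mu2)
    mu1 = np.zeros(q)
    for k2, k3, k4 in itertools.product(range(q), repeat=3):
        mu1[k2 ^ k3 ^ k4] += mu2[k2] * mu3[k3] * mu4[k4]
    return mu1
```

With deterministic inputs the output is deterministic too: if the neighbors are known to be 1, 2 and 3 in GF(4), the parity check forces x1 = 1 ^ 2 ^ 3 = 0.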
Scheduling and Definition of Iteration

This ordering of the messages is called the flooding schedule:
one decoding iteration = all μ_{x→f} updates → all μ_{f→x} updates → APP computation.
Concept of Computational Tree (cont'd)

[Figure: tree expansion showing the nodes seen after 1 and 2 iterations, the channel LLRs at the leaves, and the APP at the root.]

Computational span of L iterations: in L iterations, a maximum of dv · (dv − 1)^{L−1} · (dc − 1)^L nodes are seen from the top of the tree.

As a consequence, a usual assumption is that the BP decoder needs at least L = log(N) iterations to converge (to see all the LLRs).

Also as a consequence, the independence assumption of the BP decoder breaks after at most L = log(N) iterations.
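The growth of the computational span, and the resulting L ≈ log(N) rule of thumb, are easy to make concrete. A small sketch (function names are mine, and the (3,6)-regular example is only illustrative):

```python
def computational_span(dv, dc, L):
    """Maximum number of nodes seen from the root of the computational
    tree after L iterations of a (dv, dc)-regular LDPC code:
    dv * (dv - 1)^(L - 1) * (dc - 1)^L."""
    return dv * (dv - 1) ** (L - 1) * (dc - 1) ** L

def iterations_to_see_all(dv, dc, N):
    """Smallest L whose computational span covers all N bitnodes:
    this grows logarithmically in N since the span grows geometrically."""
    L = 1
    while computational_span(dv, dc, L) < N:
        L += 1
    return L
```

For a (3,6)-regular code the span is 3 · 2^{L−1} · 5^L, so even a length-10000 code is fully "seen" after only a handful of iterations.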
Concept of Computational Tree (cont'd)
Breaking the independence assumption

[Figure: the same tree expansion, now containing a repeated node due to a cycle: this produces wrong updates and hence a wrong APP at the root.]

A crucial parameter of the graph is its girth g, i.e. the size of the smallest closed path (cycle).

As a consequence, only ⌊g/4⌋ decoding iterations correspond to an exact inference!
Alternate Scheduling of Messages (1)
Layered BP or shuffled scheduling
Alternate Scheduling of Messages (2)
Layered BP or shuffled scheduling
Alternate Scheduling of Messages (3)
Layered BP or shuffled scheduling
Alternate Scheduling of Messages (4)
Layered BP or shuffled scheduling

Advantage: for bitnodes with degree dv ≥ 3, messages are recomputed several times during ONE iteration ⇒ faster convergence.
Outline
1 Introduction
2 Belief Propagation on a General Graph
3 Binary Belief Propagation Decoder
4 Non-Binary Belief Propagation Decoding
Binary Belief Propagation Algorithm in the Log-Domain

Definition of the messages in the log-domain:

u_k = log( μ^k_{f→x}[0] / μ^k_{f→x}[1] )      v_k = log( μ^k_{x→f}[0] / μ^k_{x→f}[1] )

Message update through the 2 types of nodes:

[Figure: a bitnode with channel LLR U0 and messages U_k, V_m, and a checknode C with messages V_m, U_k.]

bitnode:   v_m = u_0 + Σ_{k=1, k≠m}^{dv} u_k

checknode: tanh(u_k / 2) = Π_{m=1, m≠k}^{dc} tanh(v_m / 2)
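The two log-domain updates above fit in a few lines of Python. This is a minimal sketch (function names are mine; messages are plain LLR arrays, with no clipping or fixed-point concerns):

```python
import numpy as np

def bitnode_update(u0, u):
    """v_m = u0 + sum_{k != m} u_k for every output edge m at once;
    u0 is the channel LLR, u the incoming checknode-to-bitnode LLRs."""
    u = np.asarray(u, dtype=float)
    return (u0 + u.sum()) - u  # leave-one-out sums

def checknode_update(v):
    """tanh rule: tanh(u_k / 2) = prod_{m != k} tanh(v_m / 2)."""
    v = np.asarray(v, dtype=float)
    t = np.tanh(v / 2.0)
    return np.array([2.0 * np.arctanh(np.prod(np.delete(t, k)))
                     for k in range(len(v))])
```

The extrinsic principle is visible in both functions: the message sent on edge k never uses the message that arrived on edge k.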
From the Probability Domain to the Log-Domain (1)
Bitnode update

μ^{dv}_{x→f}[k] = Π_{i=1}^{dv−1} μ^i_{f→x}[k]   ∀ k = 0, 1

Let us consider a dv = 3 bitnode with v3 as output message:

v3 = log( μ^3_{x→f}[0] / μ^3_{x→f}[1] )
   = log( μ_0[0] μ^1_{f→x}[0] μ^2_{f→x}[0] / ( μ_0[1] μ^1_{f→x}[1] μ^2_{f→x}[1] ) )
   = u_0 + u_1 + u_2
From the Probability Domain to the Log-Domain (2)
Checknode update

μ^{dc}_{f→x}[α_{dc}] = Σ_{α1,…,α_{dc−1}} ( Π_{i=1}^{dc−1} μ^i_{x→f}[αi] ) · 1( Σ_k αk = 0 ),   αk ∈ {0, 1}

Let us consider a dc = 3 checknode with u3 as output message:

μ^3_{f→x}[0] = μ^1_{x→f}[0] μ^2_{x→f}[0] + μ^1_{x→f}[1] μ^2_{x→f}[1]
μ^3_{f→x}[1] = μ^1_{x→f}[0] μ^2_{x→f}[1] + μ^1_{x→f}[1] μ^2_{x→f}[0]
From the Probability Domain to the Log-Domain (3)
Checknode update, intermediate step: decoding in the Fourier domain

Now compute the factorization of the sum ...

μ^3_{f→x}[0] + μ^3_{f→x}[1] = ( μ^1_{x→f}[0] + μ^1_{x→f}[1] ) ( μ^2_{x→f}[0] + μ^2_{x→f}[1] )

... and the factorization of the difference:

μ^3_{f→x}[0] − μ^3_{f→x}[1] = ( μ^1_{x→f}[0] − μ^1_{x→f}[1] ) ( μ^2_{x→f}[0] − μ^2_{x→f}[1] )

We can write this in vector form, with ⊗ the componentwise product:

[1 1; 1 −1] [ μ^3_{f→x}[0] ; μ^3_{f→x}[1] ] = ( [1 1; 1 −1] [ μ^1_{x→f}[0] ; μ^1_{x→f}[1] ] ) ⊗ ( [1 1; 1 −1] [ μ^2_{x→f}[0] ; μ^2_{x→f}[1] ] )
From the Probability Domain to the Log-Domain (4)
Checknode update, intermediate step: decoding in the Fourier domain

With the definitions of the Fourier transform and its inverse,

F = [1 1; 1 −1]      F^{−1} = (1/2) [1 1; 1 −1]

we obtain the checknode update in the Fourier domain:

[ μ^3_{f→x}[0] ; μ^3_{f→x}[1] ] = F^{−1} × ( ( F × [ μ^1_{x→f}[0] ; μ^1_{x→f}[1] ] ) ⊗ ( F × [ μ^2_{x→f}[0] ; μ^2_{x→f}[1] ] ) )
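The Fourier-domain update above is easy to verify numerically against the direct marginalization. A small sketch (function names and the toy messages are mine):

```python
import numpy as np

F = np.array([[1.0, 1.0], [1.0, -1.0]])   # binary Fourier (Hadamard) kernel
F_INV = 0.5 * F                            # F^{-1} = F / 2

def checknode_fourier(mu1, mu2):
    """mu3 = F^{-1}((F mu1) * (F mu2)), '*' being the componentwise product."""
    return F_INV @ ((F @ np.asarray(mu1)) * (F @ np.asarray(mu2)))

def checknode_direct(mu1, mu2):
    """Direct binary checknode marginalization for comparison."""
    return np.array([mu1[0] * mu2[0] + mu1[1] * mu2[1],
                     mu1[0] * mu2[1] + mu1[1] * mu2[0]])
```

Both routes give the same output message; the Fourier route turns the convolution-like sum into a single componentwise product.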
From the Probability Domain to the Log-Domain (5)

Link between the probability domain and the log-domain:

μ^3_{f→x}[0] = e^{u3} / (e^{u3} + 1)      μ^3_{f→x}[1] = 1 / (e^{u3} + 1)

From the previous equations, we have:

μ^3_{f→x}[0] − μ^3_{f→x}[1] = ( μ^1_{x→f}[0] − μ^1_{x→f}[1] ) ( μ^2_{x→f}[0] − μ^2_{x→f}[1] )

⇒ (e^{u3} − 1)/(e^{u3} + 1) = ( (e^{v1} − 1)/(e^{v1} + 1) ) · ( (e^{v2} − 1)/(e^{v2} + 1) )

⇒ (e^{u3/2} − e^{−u3/2})/(e^{u3/2} + e^{−u3/2}) = ( (e^{v1/2} − e^{−v1/2})/(e^{v1/2} + e^{−v1/2}) ) · ( (e^{v2/2} − e^{−v2/2})/(e^{v2/2} + e^{−v2/2}) )

⇒ tanh(u3/2) = tanh(v1/2) × tanh(v2/2)
Final Step: Remove All Products

Let's compute the BP checknode update in the log-domain:

log tanh(|u3|/2) = log tanh(|v1|/2) + log tanh(|v2|/2)

The sign of the message is computed in a parallel stream:

sign(tanh(u3/2)) = sign(tanh(v1/2)) × sign(tanh(v2/2))

sign(u3) = sign(v1) × sign(v2)
Binary Belief Propagation Algorithm in the Log-Domain

Definition of the messages in the log-domain:

u_k = log( μ^k_{f→x}[0] / μ^k_{f→x}[1] )      v_k = log( μ^k_{x→f}[0] / μ^k_{x→f}[1] )

Message update through the 2 types of nodes:

bitnode:   v_m = u_0 + Σ_{k=1, k≠m}^{dv} u_k

checknode: log tanh(|u_k|/2) = Σ_{m=1, m≠k}^{dc} log tanh(|v_m|/2)

           sign(u_k) = Π_{m=1, m≠k}^{dc} sign(v_m)
From the Log-Domain BP to the Min-Sum Decoder

From the previous equations:

μ^3_{f→x}[0] = μ^1_{x→f}[0] μ^2_{x→f}[0] + μ^1_{x→f}[1] μ^2_{x→f}[1]
μ^3_{f→x}[1] = μ^1_{x→f}[0] μ^2_{x→f}[1] + μ^1_{x→f}[1] μ^2_{x→f}[0]

u3 = log( μ^3_{f→x}[0] / μ^3_{f→x}[1] )
   = log( (e^{v1}/(e^{v1}+1)) (e^{v2}/(e^{v2}+1)) + (1/(e^{v1}+1)) (1/(e^{v2}+1)) )
     − log( (e^{v1}/(e^{v1}+1)) (1/(e^{v2}+1)) + (1/(e^{v1}+1)) (e^{v2}/(e^{v2}+1)) )
   = log( e^{v1+v2} + 1 ) − log( e^{v1} + e^{v2} )
   = max*(v1 + v2, 0) − max*(v1, v2)

where max*(x, y) = log(e^x + e^y) denotes the Jacobian logarithm.
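The max* identity above can be checked against the tanh rule numerically. A minimal sketch (function names are mine; numpy's `logaddexp` is used as a stable reference for log(e^x + e^y)):

```python
import numpy as np

def max_star(x, y):
    """Jacobian logarithm max*(x, y) = log(e^x + e^y), computed in the
    numerically stable form max(x, y) + log(1 + e^{-|x - y|})."""
    return max(x, y) + np.log1p(np.exp(-abs(x - y)))

def checknode_llr(v1, v2):
    """Exact log-domain checknode output for two incoming LLRs:
    u3 = max*(v1 + v2, 0) - max*(v1, v2)."""
    return max_star(v1 + v2, 0.0) - max_star(v1, v2)
```

For any pair of LLRs this agrees with 2·arctanh(tanh(v1/2)·tanh(v2/2)), since both expressions compute the same exact BP update.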
From the Log-Domain BP to the Min-Sum Decoder

After some transformations:

u3 = max*(v1 + v2, 0) − max*(v1, v2)
   = max(v1 + v2, 0) − max(v1, v2) + log( (1 + e^{−|v1+v2|}) / (1 + e^{−|v1−v2|}) )
   = sign(v1) sign(v2) min(|v1|, |v2|) + log( (1 + e^{−|v1+v2|}) / (1 + e^{−|v1−v2|}) )

The additional term log( (1 + e^{−|v1+v2|}) / (1 + e^{−|v1−v2|}) ) can be replaced by a constant value, noting that this term is negative when v1 and v2 have the same sign, and positive when v1 and v2 have different signs.
We Finally Get the Corrected Min-Sum Decoder

1. Bitnode update (same as for BP):

v_m = u_0 + Σ_{k=1, k≠m}^{dv} u_k

2. Checknode update:

u_k = ( Π_{m=1, m≠k}^{dc} sign(v_m) ) · min_{m≠k} |v_m|

3. Compensation/correction:

u_k ← max(0, u_k − γ) if u_k > 0
u_k ← min(0, u_k + γ) if u_k < 0
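The three steps above fit in one short routine. A sketch under stated assumptions: the function name is mine, γ = 0.5 is only an illustrative offset value (the slides do not fix one), and ties/zero inputs are not handled specially.

```python
import numpy as np

def corrected_min_sum_checknode(v, gamma=0.5):
    """Offset (corrected) Min-Sum checknode. For each output edge k:
    u_k = (prod_{m != k} sign(v_m)) * min_{m != k} |v_m|,
    then the magnitude is shrunk by gamma, never crossing zero."""
    v = np.asarray(v, dtype=float)
    u = np.empty_like(v)
    for k in range(len(v)):
        rest = np.delete(v, k)                      # extrinsic inputs only
        u[k] = np.prod(np.sign(rest)) * np.min(np.abs(rest))
    # compensation: max(0, u - gamma) if u > 0, min(0, u + gamma) if u < 0
    return np.sign(u) * np.maximum(0.0, np.abs(u) - gamma)
```

The final line is just the slide's two-case correction written as a single magnitude shrinkage toward zero.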
Comments on the different Decoding Algorithms
Shuffled scheduling can be parallelized if the LDPC code is properly designed ⇒ increased throughput,
Shuffled scheduling converges approximately 2 to 3 times faster than the flooding schedule ⇒ reduced latency,
Bit-flipping, Gallager-A and Gallager-B: easier to obtain theorems on theoretical performance,
Min-Sum with a proper offset correction approaches BP for regular or slightly irregular LDPC codes,
In some particular cases, the Min-Sum decoder can surpass the BP decoder in the error-floor region.
Outline
1 Introduction
2 Belief Propagation on a General Graph
3 Binary Belief Propagation Decoder
4 Non-Binary Belief Propagation Decoding
Belief Propagation in the Probability Domain
Check node equations: augmenting the factor-graph representation

Now the code is defined from non-binary parity-check equations:

Σ_{j=1}^{dc} h_ij · c_j = 0 in GF(q),   with GF(q) = {0, α^0, α^1, …, α^{q−2}}

[Figure: factor graph of the check h_1 c_1 + h_2 c_2 + h_3 c_3 = 0; variable nodes c_1, c_2, c_3 are linked through permutation nodes (one per multiplication h_j c_j) to a "+" check node, with messages μ_{v→p}, μ_{p→c}, μ_{c→p}, μ_{p→v} circulating between them.]
Belief Propagation in the Probability Domain
Variable node equations

Now the code is defined from non-binary parity-check equations:

Σ_{j=1}^{dc} h_ij · c_j = 0 in GF(q),   with GF(q) = {0, α^0, α^1, …, α^{q−2}}

The variable-node update is the componentwise product of all incoming messages:

μ^{dv}_{v→p}[k] = Π_{i=1}^{dv−1} μ^i_{p→v}[k]   ∀ k = 0, …, q − 1

Or, in vector form (termwise product):

μ^{dv}_{v→p} = μ^0_{p→v} ⊗ … ⊗ μ^{dv−1}_{p→v}
Explaining the Permutation Step
Cyclic permutation / rotation

The multiplicative group of GF(q) is cyclic; as such, multiplication by a nonzero h_ij acts on the symbols as a cyclic permutation (rotation) of the field elements:

μ^i_{p→c}[k′] = μ^i_{v→p}[k]   with α^{k′} = h_ij · α^k,   ∀ k = 0, …, q − 1

[Figure: the eight elements {0, α^0, …, α^6} of GF(8) before and after the internal operation h_ij · c_i: multiplication by α² rotates each nonzero element α^k to α^{k+2}, while 0 stays fixed.]
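The permutation step above can be demonstrated with explicit GF(8) arithmetic. A sketch under one assumption not stated on the slide: GF(8) is built from the primitive polynomial x³ + x + 1 (so α³ = α + 1); the table and function names are mine.

```python
# GF(8) generated by x^3 + x + 1 (alpha^3 = alpha + 1); this polynomial
# choice is an assumption for the example. Symbols are labeled by their
# binary map read as an integer 0..7 (bit i = coefficient of x^i).
EXP = [1, 2, 4, 3, 6, 7, 5]               # EXP[k] = alpha^k
LOG = {a: k for k, a in enumerate(EXP)}   # LOG[alpha^k] = k

def gf8_mul(a, b):
    """GF(8) multiplication via log/antilog tables (exponents add mod 7)."""
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 7]

def permutation_node(mu, h):
    """mu_{p->c}[h * a] = mu_{v->p}[a]: multiplying by a nonzero h only
    relabels the field elements, so the message is permuted, never mixed."""
    out = [0.0] * len(mu)
    for a, p in enumerate(mu):
        out[gf8_mul(h, a)] = p
    return out
```

A sanity check: the zero symbol stays in place, the probability mass on α^0 = 1 moves to h, and the multiset of message values is unchanged, confirming a pure permutation.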
Belief Propagation in the Probability Domain
Check node equations

Now the code is defined from non-binary parity-check equations:

Σ_{j=1}^{dc} h_ij · c_j = 0 in GF(q),   with GF(q) = {0, α^0, α^1, …, α^{q−2}}

The check-node update is still a Bayesian marginalization. Case of dc = 3 and GF(4), with the GF(4) addition table:

c1 \ c2 |  0  1  2  3
   0    |  0  1  2  3
   1    |  1  0  3  2
   2    |  2  3  0  1
   3    |  3  2  1  0
Belief Propagation in the Probability Domain
Check node equations

The check-node update is still a Bayesian marginalization. Case of dc = 3 and GF(4):

μ^3_{c→p}[0] = μ^1_{p→c}[0] μ^2_{p→c}[0] + μ^1_{p→c}[1] μ^2_{p→c}[1] + μ^1_{p→c}[2] μ^2_{p→c}[2] + μ^1_{p→c}[3] μ^2_{p→c}[3]
μ^3_{c→p}[1] = μ^1_{p→c}[0] μ^2_{p→c}[1] + μ^1_{p→c}[1] μ^2_{p→c}[0] + μ^1_{p→c}[2] μ^2_{p→c}[3] + μ^1_{p→c}[3] μ^2_{p→c}[2]
μ^3_{c→p}[2] = μ^1_{p→c}[0] μ^2_{p→c}[2] + μ^1_{p→c}[2] μ^2_{p→c}[0] + μ^1_{p→c}[1] μ^2_{p→c}[3] + μ^1_{p→c}[3] μ^2_{p→c}[1]
μ^3_{c→p}[3] = μ^1_{p→c}[0] μ^2_{p→c}[3] + μ^1_{p→c}[3] μ^2_{p→c}[0] + μ^1_{p→c}[1] μ^2_{p→c}[2] + μ^1_{p→c}[2] μ^2_{p→c}[1]

The number of terms in the above equations grows as q².
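The four GF(4) equations above collapse into one line once we use the GF(4) addition table (XOR of the 2-bit symbol labels). A minimal sketch, with a function name of my own choosing:

```python
def gf4_checknode(mu1, mu2):
    """dc = 3 checknode over GF(4). Since GF(4) addition is XOR of the
    2-bit labels, c3 = c1 + c2 gives
    mu3[k] = sum_a mu1[a] * mu2[a ^ k], i.e. q^2 = 16 products in total."""
    return [sum(mu1[a] * mu2[a ^ k] for a in range(4)) for k in range(4)]
```

Deterministic inputs make the marginalization visible: if c1 = 2 and c2 = 3 with certainty, the output puts all its mass on 2 + 3 = 1.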
Belief Propagation in the Probability Domain
How to simplify the checknode update?

[Figure: the same factor graph with variable nodes, permutation nodes and a "+" check node, annotated "Fourier?".]
Tensorial Notation of Messages

In the case of binary extension fields GF(2^p), a symbol c ∈ GF(q) can be represented by a binary map, or by a polynomial:

c = [c_1, …, c_p]   with {c_1, …, c_p} ∈ {0, 1}

c(x) = Σ_{i=1}^{p} c_i x^{i−1}   with {c_1, …, c_p} ∈ {0, 1}

Let us put the probability weights μ(c = α_k) into a size-2, p-dimensional tensor indexed by the binary values {c_1, …, c_p}. For GF(4):

C[i, j]:   C[0,0] = Prob(c(x) = 0)      C[1,0] = Prob(c(x) = 1)
           C[0,1] = Prob(c(x) = x)      C[1,1] = Prob(c(x) = 1 + x)
Tensorial Notation of Messages
A GF(8) example

C[i, j, k]:   C[0,0,0] = Prob(c(x) = 0)         C[1,0,0] = Prob(c(x) = 1)
              C[0,1,0] = Prob(c(x) = x)         C[0,0,1] = Prob(c(x) = x²)
              C[1,1,0] = Prob(c(x) = 1 + x)     C[1,0,1] = Prob(c(x) = 1 + x²)
              C[0,1,1] = Prob(c(x) = x + x²)    C[1,1,1] = Prob(c(x) = 1 + x + x²)
Fast Fourier Transform Applied to Tensors

Expression of the Fourier transform:

Ĉ = F(C) = C ⊗_1 F ⊗_2 F … ⊗_p F,      F = [1 1; 1 −1]

where ⊗_k denotes the tensor product applied in the k-th dimension of the tensor C(i_1, …, i_p).

In the k-th dimension, ∀ (i_1, …, i_{k−1}, i_{k+1}, …, i_p) ∈ {0, 1}^{p−1}:

Ĉ(i_1, …, i_{k−1}, 0, i_{k+1}, …, i_p) = C(i_1, …, i_{k−1}, 0, i_{k+1}, …, i_p) + C(i_1, …, i_{k−1}, 1, i_{k+1}, …, i_p)
Ĉ(i_1, …, i_{k−1}, 1, i_{k+1}, …, i_p) = C(i_1, …, i_{k−1}, 0, i_{k+1}, …, i_p) − C(i_1, …, i_{k−1}, 1, i_{k+1}, …, i_p)

For the Fourier transform in one dimension, we perform 2^p = q operations; the total number of operations for F(·) is then p · 2^p = q log₂(q):
⇒ Fast Fourier Transform.
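The dimension-by-dimension butterfly above, applied to a message stored as a flat length-q vector, is exactly a fast Walsh-Hadamard transform. A sketch (function names are mine; the checknode in the Fourier domain is the application the slides build toward):

```python
import numpy as np

def fwht(mu):
    """Fast transform of a length-q = 2^p message: one (a + b, a - b)
    butterfly pass per tensor dimension, i.e. q * log2(q) operations."""
    mu = np.array(mu, dtype=float)
    h = 1
    while h < len(mu):
        for i in range(0, len(mu), 2 * h):
            for j in range(i, i + h):
                mu[j], mu[j + h] = mu[j] + mu[j + h], mu[j] - mu[j + h]
        h *= 2
    return mu

def checknode_fft(mu1, mu2):
    """GF(2^p) checknode in the Fourier domain:
    mu3 = F^{-1}((F mu1) * (F mu2)), with F^{-1} = F / q."""
    q = len(mu1)
    return fwht(fwht(mu1) * fwht(mu2)) / q
```

Since the transform is its own inverse up to the factor 1/q, applying `fwht` twice recovers the input, and the Fourier-domain checknode reproduces the direct q² marginalization, e.g. certain symbols 2 and 3 in GF(4) yield certainty on 2 + 3 = 1.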
Illustration of the FFT in Multiple Dimensions

[Figure: butterfly diagrams built from the 2 × 2 kernel [1 1; 1 −1]: two stages for GF(4) (2 dimensions), three stages for GF(8) (3 dimensions).]
Belief Propagation Decoding Steps in the Fourier Domain

[Figure: full decoder structure: information symbols feed Fourier-transform nodes F, then permutation nodes, the interleaver Π, and product nodes; the messages U_{vp}, V_{pv}, U_{cp}, V_{pc} circulate through the product / permutation / Fourier stages.]
Belief Propagation in the Log-Domain (1)
Recursive use of the max* operator

Quantization impacts on the performance are very strong in the probability domain, hence the interest of a log-domain formulation:

u(k) = log( μ_{c→p}[k] / μ_{c→p}[0] )      v(k) = log( μ_{p→c}[k] / μ_{p→c}[0] )      ∀ k = 0, …, q − 1

Case of dc = 3 and GF(4):

μ^3_{c→p}[0] = μ^1_{p→c}[0] μ^2_{p→c}[0] + μ^1_{p→c}[1] μ^2_{p→c}[1] + μ^1_{p→c}[2] μ^2_{p→c}[2] + μ^1_{p→c}[3] μ^2_{p→c}[3]
μ^3_{c→p}[1] = μ^1_{p→c}[0] μ^2_{p→c}[1] + μ^1_{p→c}[1] μ^2_{p→c}[0] + μ^1_{p→c}[2] μ^2_{p→c}[3] + μ^1_{p→c}[3] μ^2_{p→c}[2]
μ^3_{c→p}[2] = μ^1_{p→c}[0] μ^2_{p→c}[2] + μ^1_{p→c}[2] μ^2_{p→c}[0] + μ^1_{p→c}[1] μ^2_{p→c}[3] + μ^1_{p→c}[3] μ^2_{p→c}[1]
μ^3_{c→p}[3] = μ^1_{p→c}[0] μ^2_{p→c}[3] + μ^1_{p→c}[3] μ^2_{p→c}[0] + μ^1_{p→c}[1] μ^2_{p→c}[2] + μ^1_{p→c}[2] μ^2_{p→c}[1]
Belief Propagation in the Log-Domain (2)
Limitations

After some manipulations:

u3(1) = max*( v1(1), v2(1), v1(2) + v2(3), v1(3) + v2(2) ) − K
u3(2) = max*( v1(2), v2(2), v1(1) + v2(3), v1(3) + v2(1) ) − K
u3(3) = max*( v1(3), v2(3), v1(1) + v2(2), v1(2) + v2(1) ) − K
K = max*( 0, v1(1) + v2(1), v1(2) + v2(2), v1(3) + v2(3) )

The number of max* operators grows in O(q²),
It is a recursive implementation: approximations (e.g. the use of max instead of max*, small LUTs) rapidly become catastrophic,
The log-domain implementation and the FFT complexity reduction O(q²) → O(q log q) are not compatible.
Conclusion on Non-Binary Belief Propagation Decoding

The bottleneck of the decoder complexity is the check-node update.

1. Belief Propagation in the time/probability domain:
[Davey, 1998] M. Davey and D.J.C. MacKay, "Low Density Parity Check Codes over GF(q)", IEEE Communications Letters, vol. 2, pp. 165-167, June 1998.

2. Belief Propagation in the time/log domain (limited to about GF(16)):
[Wymeersch, 2004] H. Wymeersch, H. Steendam and M. Moeneclaey, "Log-Domain Decoding of LDPC Codes over GF(q)", Proceedings of IEEE ICC, Paris, France, June 2004.

3. Belief Propagation in the frequency/probability domain:
[Davey, 1998] M. Davey and D.J.C. MacKay, "Low Density Parity Check Codes over GF(q)", IEEE Communications Letters, vol. 2, pp. 165-167, June 1998.
[Barnault, 2003] L. Barnault and D. Declercq, "Fast Decoding Algorithm for LDPC Codes over GF(2^q)", Proceedings of the IEEE Information Theory Workshop, Paris, France, March 2003.

4. Belief Propagation in the frequency/log domain (partially; limited to about GF(16)):
[Song, 2003] H. Song and J.R. Cruz, "Reduced-Complexity Decoding of Q-ary LDPC Codes for Magnetic Recording", IEEE Transactions on Magnetics, vol. 39(2), March 2003.