Reporter: 林煜星  Advisor: Prof. Y.M. Huang


1

Iterative Joint Source-Channel Soft-Decision Sequential Decoding Algorithms for

Parallel Concatenated Variable Length Code and Convolutional Code

Reporter: 林煜星  Advisor: Prof. Y.M. Huang

2

Outline

• Introduction

• Related Research

– Transmission Model for BCJR

– Simulation for BCJR Algorithm

• Proposed Methodology

– Transmission Model for Sequential

– Simulation for Soft-Decision Sequential Algorithm

• Conclusion

3

Introduction

[Figure: transmission chain. Discrete source → source encoder (data compression) → channel encoder (error-correction code) → modulator → channel → demodulator → channel decoder → source decoder → user. The channel decoder and source decoder together form the joint decoder.]

4

Related Research

• [1] L. Guivarch, J.C. Carlach and P. Siohan

• [2] M. Jeanne, J.C. Carlach, P. Siohan and L. Guivarch

• [3] M. Jeanne, J.C. Carlach and P. Siohan

5

Transmission Model for BCJR

[Figure: transmission model. An independent or first-order Markov source emits P symbols c_p; Huffman coding maps them to K bits d_k; turbo coding (parallel concatenation) produces the coded streams u_k and v_k; after the additive white Gaussian noise channel the receiver observes x_k and y_k; turbo decoding using the SUBMAP algorithm, aided by a priori information, delivers the K decoded bits d̂_k; Huffman decoding recovers the P symbols.]

6

Transmission Model for BCJR-Independent Source or first order Markov Source

[Figure: the transmission-model block diagram of slide 5, with the source stage (independent or first-order Markov source) highlighted.]

7

Transmission Model for BCJR-Independent Source or first order Markov Source(1)

Symbol Probability

A 0.75

B 0.125

C 0.125

8

Transmission Model for BCJR-Independent Source or first order Markov Source(2)

9

Transmission Model for BCJR-Independent Source or first order Markov Source(3)

Transition probabilities P(Y | X), with X the previous symbol (columns) and Y the current symbol (rows):

        X = a    X = b    X = c
Y = a   0.94     0.18     0.18
Y = b   0.03     0.712    0.108
Y = c   0.03     0.108    0.712

Example: P(a) = P(a)P(a|a) + P(b)P(a|b) + P(c)P(a|c) = 0.75·0.94 + 0.125·0.18 + 0.125·0.18 = 0.75
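As a quick numerical check (not part of the original slides), the transition matrix above indeed leaves the symbol probabilities (0.75, 0.125, 0.125) unchanged; the short Python sketch below, with assumed variable names, verifies this.

```python
# Verify that (0.75, 0.125, 0.125) is stationary under the transition matrix above.
# Columns index the previous symbol X, rows the current symbol Y: T[y][x] = P(Y=y | X=x).
T = [
    [0.94, 0.18, 0.18],    # Y = a
    [0.03, 0.712, 0.108],  # Y = b
    [0.03, 0.108, 0.712],  # Y = c
]
p = [0.75, 0.125, 0.125]   # P(a), P(b), P(c)

p_next = [sum(T[y][x] * p[x] for x in range(3)) for y in range(3)]
print(p_next)  # ≈ [0.75, 0.125, 0.125]: the symbol probabilities are stationary
```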

10

Transmission Model for BCJR-Huffman Coding

[Figure: the transmission-model block diagram of slide 5, with the Huffman coding stage highlighted.]

11

VLC   Symbol   Probability
0     A        0.75
10    B        0.125
11    C        0.125

Bit-level a priori probabilities derived from the code table:

P(y = 1 at the first bit of a codeword) = 0.125 + 0.125 = 0.25

P(y = 1 at the second bit | first bit = 1) = 0.125 / 0.25 = 0.5

Transmission Model for BCJR-Huffman Coding
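A minimal sketch, not taken from the slides, of how the bit-level a priori probabilities quoted above can be derived from the Huffman table; the helper name bit_priors is an illustrative assumption.

```python
# Derive per-position bit a priori probabilities from the VLC table 0->A, 10->B, 11->C.
code = {"A": "0", "B": "10", "C": "11"}
prob = {"A": 0.75, "B": 0.125, "C": 0.125}

def bit_priors(code, prob):
    """P(bit = 1 at position i of a codeword), conditioned on the codeword reaching position i."""
    max_len = max(len(c) for c in code.values())
    priors = []
    for i in range(max_len):
        reach = sum(prob[s] for s, c in code.items() if len(c) > i)
        ones = sum(prob[s] for s, c in code.items() if len(c) > i and c[i] == "1")
        priors.append(ones / reach)
    return priors

print(bit_priors(code, prob))  # [0.25, 0.5], matching the values on the slide
```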

12

Transmission Model for BCJR-Turbo Coding parallel concatenation

[Figure: the transmission-model block diagram of slide 5, with the turbo coding (parallel concatenation) stage highlighted.]

13

[Figure: a non-systematic convolutional (NSC) encoder takes the input sequence d = (11101) and produces the two output streams u and v.]

Transmission Model for BCJR-Turbo Coding parallel concatenation(1)

14

[Figure: trellis of the NSC encoder, levels 0 to 5 with four states per level. The path for the input d = (11101) produces the coded output (11, 01, 10, 01, 00).]

Transmission Model for BCJR-Turbo Coding parallel concatenation(2)
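The trellis output above is consistent with a 4-state non-systematic convolutional encoder with generator polynomials (7, 5) in octal; that specific choice is an assumption, but the sketch below does reproduce the coded sequence shown for d = (11101).

```python
# Minimal 4-state NSC encoder sketch; generators (7, 5) octal are an assumption
# that reproduces the slide's output (11, 01, 10, 01, 00) for d = (1,1,1,0,1).
def nsc_encode(d, gens=(0b111, 0b101), memory=2):
    state = 0  # shift register holding the previous `memory` input bits
    out = []
    for bit in d:
        reg = (bit << memory) | state                       # newest bit plus memory
        out.append(tuple(bin(reg & g).count("1") % 2 for g in gens))  # parity per generator
        state = reg >> 1                                     # shift: drop the oldest bit
    return out

print(nsc_encode([1, 1, 1, 0, 1]))  # [(1,1), (0,1), (1,0), (0,1), (0,0)]
```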

15

Recursive Systematic Convolutional (RSC) code

[Figure: RSC encoder with input d_k, systematic output u_l^1 and parity output u_l^2; generator matrix G(D) = [1, (1 + D + D^2 + D^4)/(1 + D^3 + D^4)].]

Transmission Model for BCJR-Turbo Coding parallel concatenation(3)

Rate=1/2
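A minimal encoder sketch for the RSC generator reconstructed above, G(D) = [1, (1+D+D^2+D^4)/(1+D^3+D^4)]; the polynomials are only partly legible on the slide, so treat them (and the function name) as assumptions.

```python
# Sketch of a 16-state RSC encoder; feedback 1+D^3+D^4 and feedforward 1+D+D^2+D^4
# are the polynomials assumed from the slide.
def rsc_encode(d):
    s = [0, 0, 0, 0]                      # a_{k-1}, a_{k-2}, a_{k-3}, a_{k-4}
    out = []
    for bit in d:
        a = bit ^ s[2] ^ s[3]             # feedback taps D^3, D^4
        parity = a ^ s[0] ^ s[1] ^ s[3]   # feedforward taps 1, D, D^2, D^4
        out.append((bit, parity))         # (systematic u_l^1, parity u_l^2)
        s = [a] + s[:3]                   # shift the register
    return out

print(rsc_encode([1, 0, 1, 1, 0]))        # systematic/parity pairs for a test input
```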

16

Rate = 1/4

[Figure: parallel concatenation of two RSC encoders. The input d_l produces the outputs u_l^1, u_l^2 from the first encoder and v_l^1, v_l^2 from the second (interleaved) encoder, for an overall rate of 1/4.]

Transmission Model for BCJR-Turbo Coding parallel concatenation(4)

17

The input is written row by row into a 4x4 array:

1  2  3  4
5  6  7  8
9  10 11 12
13 14 15 16

Interleaver

and read out column by column:

1 5 9 13
2 6 10 14
3 7 11 15
4 8 12 16

Transmission Model for BCJR-Turbo Coding parallel concatenation(5)
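A minimal sketch of the 4x4 block interleaver illustrated above (write row by row, read column by column); the function name and the fixed dimensions are illustrative assumptions.

```python
# Block interleaver: write the sequence row-wise into a rows x cols array,
# then read it out column-wise.
def block_interleave(seq, rows=4, cols=4):
    assert len(seq) == rows * cols
    matrix = [seq[r * cols:(r + 1) * cols] for r in range(rows)]      # write rows
    return [matrix[r][c] for c in range(cols) for r in range(rows)]   # read columns

print(block_interleave(list(range(1, 17))))
# [1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15, 4, 8, 12, 16]
```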

18

Rate = 1/4

[Figure: the same parallel concatenation as slide 16, with input d_l and outputs u_l^1, u_l^2, v_l^1, v_l^2.]

Transmission Model for BCJR-Turbo Coding parallel concatenation(6)

Turbo code rate = 1/3: the systematic stream is transmitted once, together with the two parity streams.

19

Turbo code rate = 1/2

[Figure: the rate-1/3 turbo code punctured to rate 1/2: the systematic stream d_l is always transmitted, while the parity streams u_l^2 and v_l^2 are transmitted alternately.]

Transmission Model for BCJR-Turbo Coding parallel concatenation(7)
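A minimal sketch of how the rate-1/3 turbo code can be punctured to rate 1/2 by alternately transmitting the two parity streams; the exact puncturing pattern is not legible on the slide, so the alternating pattern and the function name below are assumptions.

```python
# Rate-1/3 -> rate-1/2 puncturing: keep every systematic bit, alternate the parities.
def puncture_rate_half(systematic, parity1, parity2):
    out = []
    for k, d in enumerate(systematic):
        out.append(d)                                          # always send the systematic bit
        out.append(parity1[k] if k % 2 == 0 else parity2[k])   # alternate the two parity streams
    return out

print(puncture_rate_half([1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 0, 1]))
# [1, 0, 0, 1, 1, 1, 1, 1] -> 8 coded bits for 4 information bits (rate 1/2)
```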

20

[Figure: the transmission-model block diagram of slide 5, with the additive white Gaussian noise channel highlighted.]

Transmission Model for BCJR-AWGN

21

Transmission Model for BCJR-AWGN(1)
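Since the slide's equations for the AWGN block did not survive extraction, here is a minimal BPSK-over-AWGN sketch (assumed mapping 0 -> +1, 1 -> -1 and unit symbol energy) illustrating how the noisy received values could be generated.

```python
# BPSK modulation plus additive white Gaussian noise at a given Eb/N0 and code rate.
import numpy as np

def awgn_channel(bits, ebn0_db, rate):
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = np.sqrt(1.0 / (2 * rate * ebn0))   # noise std dev for unit symbol energy
    symbols = 1.0 - 2.0 * np.asarray(bits)     # BPSK: 0 -> +1, 1 -> -1
    return symbols + sigma * np.random.randn(len(bits))

r = awgn_channel([1, 0, 0, 1, 1, 0], ebn0_db=1.0, rate=0.5)
print(r)  # noisy received values, later fed to the SUBMAP / sequential decoder
```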

22

[Figure: the transmission-model block diagram of slide 5, with the turbo decoding (SUBMAP) stage highlighted.]

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP

The decoder input at time k is r_k = (x_k, y_k).

23

[Figure: iterative turbo decoder. The two constituent decoders BCJR1 and BCJR2 exchange a priori (extrinsic) information.]

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP

24

MAP Decoder

APP(d_k) = P(d_k | R_1^K),  d_k ∈ {0, 1},  R_1^K = (r_1, ..., r_K)

Define

α_k(m) = P(S_k = m, R_1^k)
β_k(m) = P(R_{k+1}^K | S_k = m)
γ_i(r_k, m', m) = P(d_k = i, S_k = m, r_k | S_{k-1} = m'),  i ∈ {0, 1}

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(1)
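For reference, the forward and backward recursions that these quantities satisfy (a standard BCJR formulation; the slide itself does not spell them out) can be written as follows.

```latex
% Standard BCJR recursions implied by the definitions of alpha, beta, gamma above.
\[
\begin{aligned}
\alpha_k(m)  &= \sum_{m'} \sum_{i=0}^{1} \gamma_i(r_k, m', m)\,\alpha_{k-1}(m') \\
\beta_k(m)   &= \sum_{m'} \sum_{i=0}^{1} \gamma_i(r_{k+1}, m, m')\,\beta_{k+1}(m') \\
APP(d_k = i) &\propto \sum_{m} \sum_{m'} \gamma_i(r_k, m', m)\,\alpha_{k-1}(m')\,\beta_k(m)
\end{aligned}
\]
```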

25

( 1)( 0)( ) ln k

k

APP dAPP dkd

Logarithm of Likelihood Ratio(LLR)

11

0 0

01

0 0

, ,

, ,ln

M M

k k k km mM M

k k k km m

r m m m m

r m m m m

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(2)

Recall

1

1ln( ) maxn

ii ne e
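A tiny numerical illustration, not from the slides, of the max-log approximation recalled above.

```python
# ln(sum_i e^{a_i}) is well approximated by max_i a_i when one term dominates.
import math

a = [0.2, -1.5, 4.0]
exact = math.log(sum(math.exp(x) for x in a))
approx = max(a)
print(exact, approx)  # ≈ 4.03 vs 4.0; the gap shrinks as one exponent dominates
```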

26

[Figure: trellis sections at levels 4 and 5 with branch labels 10, 01, 00, 11, illustrating the forward and backward recursions.]

Boundary conditions: α_0(0) = 1 and α_0(m) = 0 for m ≠ 0; β_N(m) = 1/M for every state m.

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(3)

27

[Figure: worked example on the trellis, using the path through (0,0), (2,3), (3,3) and (5,0). For the branch from state 3 at level 2 to state 3 at level 3, the contributing term is α_2(3) · γ_1(r_3, 3, 3) · β_3(3), with α_2(3) = P(S_2 = 3, R_1^2), γ_1(r_3, 3, 3) = P(d_3 = 1, S_3 = 3, r_3 | S_2 = 3) and β_3(3) = P(R_4^5 | S_3 = 3).]

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(4)

APP(d_k = 1) = ?

28

[Figure: the iterative turbo decoder of slide 23. BCJR1 and BCJR2 exchange a priori (extrinsic) information.]

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(5)

29

Λ(d_k) = ln [ Σ_m Σ_m' γ_1(r_k, m', m) α_{k-1}(m') β_k(m) / Σ_m Σ_m' γ_0(r_k, m', m) α_{k-1}(m') β_k(m) ]

       = La(d_k) + L_c x_k + Le(d_k)

where La(d_k) = ln [ Pr(d_k = 1) / Pr(d_k = 0) ] is the a priori term (obtained from the a priori probabilities P_i), L_c = 2/σ^2 is the channel reliability, and Le(d_k) is the extrinsic information passed to the other constituent decoder.

Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP(6)
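A minimal sketch, with assumed names, of splitting the decoder output LLR into its a priori, channel and extrinsic parts as in the decomposition above; the extrinsic part is what would be handed to the other constituent decoder.

```python
# Le = Lambda - La - Lc*x, with channel reliability Lc = 2/sigma^2.
import numpy as np

def extrinsic(llr_total, la, x, sigma2):
    lc = 2.0 / sigma2
    return llr_total - la - lc * np.asarray(x)

# Toy numbers: total LLRs, a priori LLRs, received systematic values, noise variance.
le = extrinsic(np.array([2.3, -1.1]), np.array([0.4, 0.0]), [0.9, -0.7], sigma2=1.0)
print(le)  # extrinsic values passed to the other decoder as its new a priori input
```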

30

Simulation for BCJR Algorithm

• The transmission ends when either the maximum number of bit errors, fixed at 1000, or the maximum number of transmitted bits, 10,000,000, is reached.

• The input data are grouped into blocks of 4096 bits (a sketch of this stopping rule follows below).
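A minimal sketch of that stopping rule; the helper simulate_block, standing in for the full encode/channel/decode chain, is hypothetical.

```python
# Simulate 4096-bit blocks until 1000 bit errors or 10,000,000 transmitted bits.
import numpy as np

MAX_ERRORS, MAX_BITS, BLOCK = 1000, 10_000_000, 4096

def run_point(simulate_block, ebn0_db):
    errors = bits = 0
    while errors < MAX_ERRORS and bits < MAX_BITS:
        tx = np.random.randint(0, 2, BLOCK)
        rx = simulate_block(tx, ebn0_db)   # encode + channel + decode (hypothetical helper)
        errors += int(np.sum(tx != rx))
        bits += BLOCK
    return errors / bits                   # estimated BER at this Eb/N0

# ber = run_point(my_turbo_link, ebn0_db=1.0)   # my_turbo_link is a placeholder
```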

31

Simulation for BCJR Algorithm(1)

[Figure: BER versus Eb/N0 (0 to 2 dB); curves 1NP, 2NP, 3NP, 4NP and 1P, 2P, 3P, 4P.]

1NP: 1 iteration, independent source, a priori probability not used
1P: 1 iteration, independent source, a priori probability used
(the leading digit gives the number of iterations)

32

Simulation for BCJR Algorithm(2)

[Figure: BER versus Eb/N0 (-2 to 2 dB); curves 1NP, 2NP, 3NP, 4NP and 1MP, 2MP, 3MP, 4MP.]

1NP: 1 iteration, Markov source, a priori probability not used
1MP: 1 iteration, Markov source, Markov a priori probability used

33

Simulation for BCJR Algorithm(3)

[Figure: BER versus Eb/N0 (0 to 2 dB); curves 12D, 22D, 32D, 42D and 13D, 23D, 33D, 43D.]

12D: 1 iteration, independent source, a priori probability used; 2-D state (bit time (level) x convolutional-code state)
13D: 1 iteration, independent source, a priori probability used; 3-D state (bit time (level) x tree state x convolutional-code state)

34

Proposed Methodology

• [4] C. Lamy and L. Perros-Meilhac

35

[Figure: proposed iterative decoder. The first constituent decoder is replaced by a soft-decision sequential decoder, which exchanges a priori (extrinsic) information with BCJR2.]

Transmission Model for Sequential-Sequential Decoding

36

APP(d_k) = Pr(d_k | R_1^N),  d_k ∈ {0, 1}

Path metric: for a candidate path p with codeword bits c_k^p, accumulate the branch log-likelihoods ln Pr(r_k | c_k^p). Up to a constant, this is equivalent to adding |r_k| for every codeword bit where the hard decision disagrees with the path bit, with

y_k = 1 if r_k < 0, 0 otherwise

c_k^p : codeword bits of path p

Transmission Model for Sequential-Sequential Decoding(1)
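A minimal sketch, with an assumed function name, of the branch cost implied by the metric above: sum |r_k| over the codeword bits whose hard decision disagrees with the candidate path.

```python
# Branch cost for the sequential search: pay the reliability |r| of every mismatched bit.
def branch_cost(r_segment, path_bits):
    cost = 0.0
    for r, c in zip(r_segment, path_bits):
        y = 1 if r < 0 else 0          # hard decision on the received value
        if y != c:                     # disagreement with the candidate bit
            cost += abs(r)
    return cost

# First branch of the slide-37 example: r = (-1, 3), candidate output (1, 1).
print(branch_cost([-1, 3], [1, 1]))    # 3.0, one of the two first-step costs (3 and 1) on that slide
```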

37

[Figure: step-by-step sequential search over the trellis for the example r = (-1, 3, 2, 1, -2, -1, -3, -1, 1, 2), y = (1, 0, 0, 0, 1, 1, 1, 1, 0, 0). Starting from the origin node (0,0), the decoder keeps an Open list of candidate nodes ordered by accumulated cost and a Close list of already expanded nodes; at each step the cheapest Open node is expanded, each successor's cost grows by the |r_k| of its mismatched bits, and the search stops when a node at level 5 is selected.]

Transmission Model for Sequential-Sequential Decoding(2)
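A minimal best-first search sketch in the spirit of the Open/Close procedure above, using the 4-state NSC from the earlier sketch as a stand-in constituent code; the slides' example searches a larger joint source/channel state space, so its intermediate numbers differ from this toy run.

```python
# Best-first (stack-like) sequential search over a small convolutional trellis.
import heapq

G = (0b111, 0b101)  # (7, 5) octal, as in the earlier encoder sketch (an assumption)

def branch(state, bit, memory=2):
    reg = (bit << memory) | state
    out = tuple(bin(reg & g).count("1") % 2 for g in G)
    return out, reg >> 1

def best_first_decode(r, n_info):
    """A path's cost is the sum of |r_k| over coded bits that disagree with the hard decisions."""
    y = [1 if v < 0 else 0 for v in r]
    rel = [abs(v) for v in r]
    open_list = [(0, 0, 0, ())]          # (cost, level, state, decoded bits)
    while open_list:
        cost, level, state, bits = heapq.heappop(open_list)
        if level == n_info:              # cheapest complete path found
            return list(bits), cost
        for b in (0, 1):
            out, nxt = branch(state, b)
            i = 2 * level
            add = sum(rel[i + j] for j in range(2) if out[j] != y[i + j])
            heapq.heappush(open_list, (cost + add, level + 1, nxt, bits + (b,)))

r = [-1, 3, 2, 1, -2, -1, -3, -1, 1, 2]   # received values from the slide's example
print(best_first_decode(r, 5))            # ([0, 0, 1, 0, 1], 2) with this stand-in code
```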

38

Λ(d_1) = ln [ APP(d_1 = 1) / APP(d_1 = 0) ] ≈ 3 - 2 = 1

Λ(d_2) = ln [ APP(d_2 = 1) / APP(d_2 = 0) ] ≈ 4 - 2 = 2

(the heuristic soft-output value of each bit is taken as the difference between the best accumulated metrics of the two competing decisions)

Transmission Model for Sequential-Sequential Decoding(2)

39

Simulation for Sequential Algorithm

[Figure: BER versus Eb/N0 (0 to 2 dB); curves 3D1, 3D2, 3D3, 3D4 and 2D1, 2D2, 2D3, 2D4.]

2D1: 1 iteration, independent source, a priori probability used; 2-D state (bit time (level) x convolutional-code state)
3D1: 1 iteration, independent source, a priori probability used; 3-D state (bit time (level) x convolutional-code state x tree state)

40

[Figure: BER versus Eb/N0 (0 to 2 dB); the sequential curves 2D1, 2D2, 2D3, 2D4 compared with the BCJR curves 1NP and 1P.]

41

• Using a heuristic method to obtain the soft-output value of the sequential decoder within the iterative decoding architecture reduces errors and saves computation time, but the decoding performance cannot yet approach that of the turbo decoder. Future work will continue to investigate better methods for computing the sequential decoder soft-output value so that the decoding performance comes closer to that of the turbo decoder.

Conclusion
