1. Coding Theory

CODING THEORY: A Bird’s Eye View

Uploaded by mailstonaik, 21-Oct-2015

Page 1: 1.Coding Theory

CODING THEORY

A Bird’s Eye View

Page 2: 1.Coding Theory

Text Books

Shu Lin and Daniel J. Costello Jr., “Error Control Coding: Fundamentals and Applications”, Prentice Hall Inc.

R. E. Blahut, “Theory and Practice of Error Control Coding”, MGH.

References:

Rolf Johannesson and Kamil Sh. Zigangirov, “Fundamentals of Convolutional Coding”, Universities Press (India) Ltd., 2001.

Proakis, “Digital Communications”, MGH.

Page 3: 1.Coding Theory


Introduction

Role of Channel Coding in Digital Communication System

Block Codes and Convolutional Codes

Channel Models

Decoding Rules

Error Correction Schemes

References

[Pages 4-13: slides/figures not reproduced in the transcript]

Page 14: 1.Coding Theory

Shannon’s Theorem (1948)

Noisy Coding Theorem, due to Shannon. Roughly: consider a channel with capacity C. If we are willing to settle for a rate of transmission strictly below C, then there is an encoding scheme for the source data that will reduce the probability of a decision error to any desired level.

Problem: the proof is not constructive! To this day, no one has found a way to construct the coding schemes promised by Shannon’s theorem.

Page 15: 1.Coding Theory

Shannon’s Theorem (1948), contd.

Additional concerns: Is the coding scheme easy to implement, both in encoding and decoding?

It may require extremely long codes.

Page 16: 1.Coding Theory

The Shannon-Hartley Theorem

Gives us a theoretical maximum bit-rate that can be transmitted with an arbitrarily small bit-error rate (BER), at a given average signal power, over a channel with bandwidth B Hz which is affected by AWGN.

For any given BER, however small, we can find a coding technique that achieves this BER; the smaller the given BER, the more complicated the coding technique will be.

Page 17: 1.Coding Theory

Shannon-Hartley Theorem, contd.

Let the channel bandwidth be B Hz and the signal-to-noise ratio be S/N (not in dB). Then the capacity is

    C = B log2(1 + S/N)  bits/sec
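The capacity formula is straightforward to evaluate numerically. A minimal sketch (the bandwidth and SNR values below are illustrative, not from the slides):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits/sec."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative telephone-line numbers: B = 3000 Hz, SNR = 30 dB.
snr_linear = 10 ** (30 / 10)
print(f"C = {channel_capacity(3000, snr_linear):.0f} bits/sec")  # about 29902
```

Note that S/N must be converted from dB to a linear ratio before applying the formula, as the slide warns.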

Page 18: 1.Coding Theory

Shannon-Hartley Theorem, contd.

For a given bandwidth B and a given S/N, we can find a way of transmitting data at a bit-rate R bits/second, with a bit-error rate (BER) as low as we like, as long as R ≤ C.

Now assume we wish to transmit at an average energy/bit of Eb, and the AWGN has two-sided power spectral density N0/2 Watts per Hz. It follows that the signal power is S = Eb R and the noise power is N = N0 B Watts.

Page 19: 1.Coding Theory

Shannon-Hartley Theorem, contd.

The R/B ratio is called bandwidth efficiency, in bits/sec/Hz: how many bits per second do I get for each Hz of bandwidth? We want this to be as high as possible.

Eb/N0 is the normalised average energy/bit, where the normalisation is with respect to the one-sided PSD of the noise.

The law gives the following bounds:

Page 20: 1.Coding Theory

Shannon-Hartley Theorem, contd.

Substituting S = Eb R and N = N0 B into the capacity formula and setting R = C gives

    R/B = log2(1 + (Eb/N0)(R/B))

and hence the minimum normalised energy per bit at a given bandwidth efficiency:

    (Eb/N0)min = (2^(R/B) - 1) / (R/B)
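The bound is easy to evaluate numerically. A small sketch (the R/B grid is my choice for illustration) showing that (Eb/N0)min approaches ln 2 ≈ 0.693, i.e. about -1.6 dB, as R/B shrinks:

```python
import math

def ebn0_min(spectral_eff: float) -> float:
    """Minimum Eb/N0 (linear) at bandwidth efficiency R/B = spectral_eff."""
    return (2 ** spectral_eff - 1) / spectral_eff

for rb in (4.0, 2.0, 1.0, 0.5, 0.1, 0.01):
    lin = ebn0_min(rb)
    print(f"R/B = {rb:5.2f}   (Eb/N0)min = {lin:.4f} ({10 * math.log10(lin):+.2f} dB)")
# As R/B -> 0, (Eb/N0)min -> ln 2 = 0.693, i.e. about -1.6 dB.
```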

Page 21: 1.Coding Theory

Shannon Limit

The bound gives the minimum possible normalised energy per bit satisfying the Shannon-Hartley law.

If we draw a graph of (Eb/N0)min against R/B, we observe that (Eb/N0)min never goes below about 0.69 (that is, ln 2), which is about -1.6 dB.

Therefore if our normalised energy per bit is less than -1.6 dB, we can never satisfy the Shannon-Hartley law, however inefficient (in terms of bits/sec/Hz) we are prepared to be.

Page 22: 1.Coding Theory

Shannon Limit, contd.

There exists a limiting value of Eb/N0 below which there cannot be error-free communication at any transmission rate.

The curve R = C divides the achievable and non-achievable regions.

[Page 23: figure not reproduced in the transcript]

Page 24: 1.Coding Theory

Modulation-Coding Modulation-Coding trade-offtrade-off

For For PPbb=10=10-5-5, , BPSKBPSK modulation requiresmodulation requires EEbb/N/N00 = = 9.6dB9.6dB (optimum un-coded binary modulation) (optimum un-coded binary modulation)

For this case, Shannon’s work promised a performance For this case, Shannon’s work promised a performance improvement of improvement of 11.2dB11.2dB over the performance of un- over the performance of un-coded binary modulation, through the use of coding coded binary modulation, through the use of coding techniques.techniques.

Today, Today, “Turbo Codes”,“Turbo Codes”, are capable of achieving are capable of achieving an improvement close to this.an improvement close to this.

““Turbo Codes” are Near Shannon limit error correcting Turbo Codes” are Near Shannon limit error correcting codescodes

Page 25: 1.Coding Theory

Coding Theory: Introduction

Main problem: A stream of source data, in the form of 0’s and 1’s, is being transmitted over a communication channel, such as a telephone line. Occasionally, disruptions can occur in the channel, causing 0’s to turn into 1’s and vice versa.

Question: How can we tell when the original data has been changed, and when it has, how can we recover the original data?

Page 26: 1.Coding Theory

Coding Theory: Introduction

Easy things to try:

Do nothing. If a channel error occurs with probability p, then the probability of making a decision error is p.

Send each bit 3 times in succession. The bit that occurs the majority of the time gets picked (e.g. 010 => 0).

Repetition codes!!
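The 3-times repetition scheme can be sketched in a few lines (the function names are mine, not from the slides):

```python
def repetition_encode(bits, n=3):
    """Repeat each source bit n times."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(received, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

code = repetition_encode([1, 0])   # [1, 1, 1, 0, 0, 0]
noisy = [1, 1, 1, 0, 1, 0]         # one channel error in the second block
print(repetition_decode(noisy))    # [1, 0] -- the single error is corrected
```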

Page 27: 1.Coding Theory

Coding Theory: Introduction

Generalize the above: send each bit n times and choose the majority bit. In this way we can make the probability of a decision error arbitrarily small, but this is inefficient in terms of transmission rate.

As n increases, the achievable BER reduces, at the expense of increased codeword length (reduced code rate).

Repetition coding is inefficient…
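The trade-off can be quantified: for odd n, a majority-vote decision fails exactly when more than half of the n copies are flipped. A sketch (the channel error probability p = 0.01 is an illustrative value):

```python
from math import comb

def repetition_error_prob(n: int, p: float) -> float:
    """P(majority decision error) for an n-fold repetition code, n odd."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01
for n in (1, 3, 5, 7):
    print(f"n = {n}   rate = {1 / n:.3f}   P(error) = {repetition_error_prob(n, p):.2e}")
# The decision-error probability shrinks rapidly, but so does the code rate 1/n.
```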

Page 28: 1.Coding Theory

Coding Theory: Introduction (cont’d)

Encode the source information by adding additional information (redundancy) that can be used to detect, and perhaps correct, errors in transmission. The more redundancy we add, the more reliably we can detect and correct errors, but the less efficient we become at transmitting the source data.

Page 29: 1.Coding Theory

Error Control Applications

Data communication networks (Ethernet, FDDI, WAN, Bluetooth)

Satellite and deep-space communications

Cellular mobile communications

Modems

Computer buses

Magnetic disks and tapes

CDs, DVDs. Digital sound needs ECC!

Page 30: 1.Coding Theory

Error Control Categories

The error control problem can be classified in several ways:

Types of error control coding: detection vs. correction

Types of errors: how much clustering (random, burst, etc.)

Types of codes: block vs. convolutional

Page 31: 1.Coding Theory

Error Control Strategies

Error detection. Goal: avoid accepting faulty data. Lost data may be unfortunate; wrong data may be disastrous.

(Forward) error correction (FEC or ECC). Use redundancy in the encoded message to estimate, from the received data, what message was actually sent.

The best estimate is usually the “closest” message. The optimal estimate is the message that is most probable given what is received.
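“Closest” here is minimum Hamming distance; for a binary symmetric channel with crossover probability below 1/2, the nearest codeword is also the most probable one. A toy sketch (the two-word codebook is invented for illustration):

```python
def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(received, codebook):
    """Pick the codeword at minimum Hamming distance from the received word."""
    return min(codebook, key=lambda c: hamming_distance(received, c))

# Toy codebook: two messages encoded as 5-bit repetition codewords.
codebook = {(0, 0, 0, 0, 0): "message A", (1, 1, 1, 1, 1): "message B"}
r = (1, 0, 1, 1, 0)  # two bits flipped from B's codeword
print(codebook[nearest_codeword(r, codebook)])  # -> message B
```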

[Pages 32-33: figures not reproduced in the transcript]

Page 34: 1.Coding Theory

Types of Channel Codes

Block Codes (codes with a strong algebraic flavor), ~1950---

Hamming Code (single error correction)

All codes in the 50’s were too weak compared to the codes promised by Shannon.

Major breakthrough……..1960

BCH Codes… Reed-Solomon Codes… capable of correcting multiple errors.
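The Hamming code mentioned above corrects any single bit error. A minimal (7,4) sketch over GF(2); the systematic generator and parity-check matrices below are a standard textbook choice, my assumption since the slides do not give them:

```python
# Systematic (7,4) Hamming code: codeword = data bits d1..d4, then parity p1..p3.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def hamming_encode(data):
    """Multiply the 4 data bits by G over GF(2)."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

def hamming_decode(received):
    """A nonzero syndrome matches the H column of the single flipped bit."""
    r = list(received)
    syndrome = [sum(h * b for h, b in zip(row, r)) % 2 for row in H]
    if any(syndrome):
        for i, col in enumerate(zip(*H)):
            if list(col) == syndrome:
                r[i] ^= 1  # flip the located bit back
                break
    return r[:4]  # systematic: the data bits come first

c = hamming_encode([1, 0, 1, 1])  # -> [1, 0, 1, 1, 0, 1, 0]
c[2] ^= 1                         # inject one channel error
print(hamming_decode(c))          # -> [1, 0, 1, 1]
```

Every nonzero 3-bit pattern appears exactly once as a column of H, which is why any single error is locatable.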

[Pages 35-37: figures not reproduced in the transcript]

Page 38: 1.Coding Theory

Convolutional Codes

Codes with a probabilistic flavor.

Introduced in the late 1950’s, but gained popularity after the introduction of the Viterbi algorithm in 1967.

Developed from the idea of sequential decoding.

Non-block codes: codewords are generated by a convolution operation on the information sequence.
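The convolution operation can be illustrated with the common rate-1/2, constraint-length-3 encoder with generators (7, 5) in octal; this particular encoder is my example, the slides do not specify one:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3.

    Each input bit is shifted into a 3-bit register; the two output bits
    are parities of the register taps selected by generators g1 and g2.
    """
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111         # shift the new bit in
        out.append(bin(state & g1).count("1") % 2)  # parity over g1's taps
        out.append(bin(state & g2).count("1") % 2)  # parity over g2's taps
    return out

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

Unlike a block code, each output pair depends on the current input bit and the previous ones still in the register, so the code has memory.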

[Pages 39-40: figures not reproduced in the transcript]

Page 41: 1.Coding Theory

Coding Schemes: Trend…

Since the 1970’s, the two avenues of research have been working together.

This resulted in development towards the codes promised by Shannon.

Today, “Turbo Codes” are capable of achieving an improvement close to the Shannon limit.

Page 42: 1.Coding Theory

Coding Schemes

Applications demand a wide range of data rates, block sizes, and error rates.

No single error protection scheme works for all applications. Some require the use of multiple coding techniques.

A common combination uses an inner convolutional code and an outer Reed-Solomon code.

[Pages 43-51: slides not reproduced in the transcript]