unit-1.ppt
TRANSCRIPT
IT2303 INFORMATION THEORY AND CODING
SYLLABUS

UNIT I INFORMATION THEORY
Information – Entropy, Information rate, classification of codes, Kraft McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding - Joint and conditional entropies, Mutual information - Discrete memoryless channels – BSC, BEC – Channel capacity, Shannon limit.
UNIT II SOURCE CODING: TEXT, AUDIO AND SPEECH
Text: Adaptive Huffman Coding, Arithmetic Coding, LZW algorithm – Audio: Perceptual coding, Masking techniques, Psychoacoustic model, MEG Audio layers I,II,III, Dolby AC3 - Speech: Channel Vocoder, Linear Predictive Coding
UNIT III SOURCE CODING: IMAGE AND VIDEO
Image and Video Formats – GIF, TIFF, SIF, CIF, QCIF – Image compression: READ, JPEG – Video Compression: Principles-I,B,P frames, Motion estimation, Motion compensation, H.261, MPEG standard
UNIT IV ERROR CONTROL CODING: BLOCK CODES
Definitions and Principles: Hamming weight, Hamming distance, Minimum distance decoding - Single parity codes, Hamming codes, Repetition codes - Linear block codes, Cyclic codes - Syndrome calculation, Encoder and decoder - CRC
UNIT V -ERROR CONTROL CODING: CONVOLUTIONAL CODES
Convolutional codes – code tree, trellis, state diagram - Encoding – Decoding: Sequential search and Viterbi algorithm – Principle of Turbo coding
TEXT BOOKS:
R. Bose, "Information Theory, Coding and Cryptography", TMH, 2007
Fred Halsall, "Multimedia Communications: Applications, Networks, Protocols and Standards", Pearson Education Asia, 2002

REFERENCES:
K. Sayood, "Introduction to Data Compression", 3/e, Elsevier, 2006
S. Gravano, "Introduction to Error Control Codes", Oxford University Press, 2007
Amitabha Bhattacharya, "Digital Communication", TMH, 2006
UNIT I INFORMATION THEORY
Contents
Information – Entropy, Information rate, classification of codes, Kraft-McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended Huffman coding, Joint and conditional entropies, Mutual information, Discrete memoryless channels – BSC, BEC, Channel capacity, Shannon limit.
Communication system
Information is closely related to uncertainty or surprise. When the message from the source is already known, there is no surprise and hence no information. When the probability of a message is low, there is more surprise and more information. The amount of information is inversely related to the probability of occurrence.
What is information theory?
◦ Information theory enables a communication system to carry information (signals) from sender to receiver over a communication channel.
◦ It deals with the mathematical modelling and analysis of a communication system.
◦ Its major task is to answer questions about signal compression and transfer rate.
◦ Those answers are provided by entropy and channel capacity.
Uncertainty, surprise & Information
Before the event X = x_i occurs, there is an amount of uncertainty.
When the event X = x_i occurs, there is an amount of surprise.
After the occurrence of X = x_i, there is a gain in the amount of information.
The amount of information is inversely related to the probability of occurrence.
Entropy
Property of entropy
Entropy is bounded by
0 ≤ H(X) ≤ log2 K
• The entropy is maximum for a uniform distribution and minimum (zero) when there is only one possible value.
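Both bounds can be checked numerically; a minimal sketch of H(X) = −Σ p log2 p for a K = 4 symbol source (the distributions are illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits; 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 4
uniform = [1 / K] * K            # all symbols equally likely
certain = [1.0, 0.0, 0.0, 0.0]   # only one possible value

print(entropy(uniform))   # 2.0 = log2(K), the maximum
print(entropy(certain))   # 0.0, the minimum
```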
Source Coding Theorem
Source coding is an efficient representation of the data generated by a discrete source.
◦ The representation is produced by a source encoder.
◦ The statistics of the source must be known (e.g. if coding priorities exist).
• Two types of coding:
1) Fixed-length code
2) Variable-length code (e.g. Morse code)
• In Morse code, letters and numerals are encoded as dots "." and dashes "-".
• Frequently occurring source symbols (e.g. "e") get short codes; rare source symbols (e.g. "q") get long codes.
• An efficient source code should satisfy two conditions:
i. The code words produced by the encoder are in binary form.
ii. The source code should be uniquely decodable.
Shannon’s first Theorem
L represents the average code word length, and Lmin represents the minimum possible value of L, so L ≥ Lmin.
Coding efficiency is defined as η = Lmin / L.
According to the source coding theorem, H(X) is the fundamental limit on the average number of bits per source symbol, so we can equate Lmin to H(X):
η = H(X) / L
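A worked check of η = H(X)/L for a hypothetical 4-symbol source with dyadic probabilities (the code lengths below correspond to the illustrative codewords 0, 10, 110, 111):

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source statistics and variable-length codeword lengths.
probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]            # codewords 0, 10, 110, 111

L_avg = sum(p * l for p, l in zip(probs, lengths))   # average codeword length
eta = entropy(probs) / L_avg                          # coding efficiency

print(L_avg)  # 1.75 bits/symbol
print(eta)    # 1.0 -- this code exactly meets the entropy limit
```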
Data Compaction
Data compaction (lossless data compression) means that redundant information is removed from the signal prior to transmission.
◦ Basically this is achieved by assigning short descriptions to the most frequent outcomes of the source output, and vice versa.
Source-coding schemes used in data compaction include prefix coding, Huffman coding, Lempel-Ziv, and Shannon-Fano.
Prefix Coding
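The prefix-coding slide itself is image-only in this transcript; as a related sketch, the Kraft-McMillan inequality from the syllabus (a prefix code with codeword lengths l_i exists iff Σ 2^(−l_i) ≤ 1) can be checked for candidate length sets (the examples are illustrative):

```python
def kraft_sum(lengths):
    """Kraft-McMillan sum: a binary prefix code with these codeword
    lengths exists iff the sum is <= 1."""
    return sum(2.0 ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> a prefix code exists (e.g. 0, 10, 110, 111)
print(kraft_sum([1, 1, 2]))     # 1.25 -> no uniquely decodable code with these lengths
```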
Huffman Coding
Contd.
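The Huffman construction shown on these slides (repeatedly merge the two least-probable symbols) can be sketched in a few lines; the symbol probabilities below are illustrative, and the exact bit assignments depend on tie-breaking:

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Huffman coding: repeatedly merge the two least-probable subtrees,
    prefixing one side's codewords with '0' and the other's with '1'."""
    tiebreak = count()  # unique counter so ties never compare the dicts
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # least probable subtree
        p2, _, c2 = heapq.heappop(heap)   # second least probable
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # codeword lengths 1, 2, 3, 3 -- e.g. {'a': '0', 'b': '10', ...}
```

Rare symbols end up deepest in the tree and get the longest codewords, matching the short-code/long-code rule stated earlier.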
Discrete memoryless channels
Entropy
Contd.
Conditional entropy (equivocation) is the amount of uncertainty remaining about the channel input after the channel output is observed.
The marginal probability distribution of the output random variable Y is obtained by averaging out the dependence on the input: p(y_k) = Σ_j p(y_k | x_j) p(x_j).
Binary symmetric channel
BSC (contd.)
For conditional probability of error p, the channel capacity is C = 1 − H(p).
C varies with the probability of error p in a convex manner, symmetric about p = 1/2.
When the channel is noise-free (p = 0), C attains its maximum value of one bit per channel use, and H(p) attains its minimum value of zero.
When p = 1/2, C attains its minimum value of zero, whereas the entropy H(p) attains its maximum value of unity, and the channel is said to be useless.
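The BSC capacity curve C(p) = 1 − H(p) described above can be checked numerically at the two extreme points (the intermediate value of p is illustrative):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 -- noise-free: one bit per channel use
print(bsc_capacity(0.5))   # 0.0 -- useless channel
print(bsc_capacity(0.11))  # ~0.5 -- partially noisy channel
```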
Mutual information
I(X;Y) = H(X) − H(X|Y)
Properties of Mutual information
• Symmetric: I(X;Y) = I(Y;X)
• Non-negative: I(X;Y) ≥ 0
• The mutual information of a channel is related to the joint entropy of the channel input and output by
I(X;Y) = H(X) + H(Y) − H(X,Y)
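These properties can be verified on a small joint distribution; the p(x, y) table below is a hypothetical noisy binary channel, not taken from the slides:

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y): rows are x, columns are y.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]             # marginal of X
py = [sum(col) for col in zip(*joint)]       # marginal of Y
H_xy = H([p for row in joint for p in row])  # joint entropy H(X,Y)

I = H(px) + H(py) - H_xy                     # I(X;Y) = H(X) + H(Y) - H(X,Y)
print(I)

# Symmetry and non-negativity: swapping X and Y changes nothing.
assert abs(I - (H(py) + H(px) - H_xy)) < 1e-12 and I >= 0
```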
Channel Capacity
Definition – channel capacity
The channel capacity C of a discrete memoryless channel is the maximum mutual information I(X;Y) in any single use of the channel (i.e., signaling interval), where the maximization is over all possible input probability distributions.
C is measured in bits per channel use, or bits per transmission.
Channel coding theorem
“If a discrete memoryless source with an alphabet S has an entropy H(S) and produces symbols once every Ts seconds, and a discrete memoryless channel has a capacity C = max I(X;Y) and is used once every Tc seconds, then if

H(S) / Ts ≤ C / Tc

there exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error.”
The parameter C/Tc is called the critical rate. When this condition is satisfied with the equality sign, the system is said to be signaling at the critical rate.
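The condition can be illustrated with hypothetical numbers (all parameter values below are made up for the sketch):

```python
# Hypothetical source and channel parameters.
H_S = 1.75   # source entropy, bits per symbol
T_s = 1e-3   # one source symbol every 1 ms
C   = 2.0    # channel capacity, bits per channel use
T_c = 1e-3   # one channel use every 1 ms

source_rate   = H_S / T_s   # bits/second produced by the source
critical_rate = C / T_c     # C/Tc, bits/second the channel supports

# Reliable transmission is possible iff H(S)/Ts <= C/Tc.
print(source_rate <= critical_rate)   # True: a suitable coding scheme exists
```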
Conversely, if

H(S) / Ts > C / Tc

it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error.