

Jawaharlal Nehru Engineering College

Laboratory Manual

Information Theory and Coding

For

Third Year Students

Manual made by

Dr. V. B. Malode and Prof. V. A. Kulkarni

JNEC, Aurangabad


ETC 324: Information Theory and coding 2018


MGM's

Jawaharlal Nehru Engineering College

N-6, CIDCO, Aurangabad

Department of Electronics & Telecommunication

Vision of the Department:

To develop GREAT technocrats and to establish a centre of excellence in the field of Electronics and Telecommunications.

Global technocrats with human values
Research and lifelong learning attitude
Excellent ability to tackle challenges
Awareness of the needs of society
Technical expertise

Mission of the Department:

1. To provide good technical education and enhance technical competency by providing good infrastructure, resources, an effective teaching-learning process, and competent, caring and committed faculty.

2. To provide various platforms to students for cultivating a professional attitude and ethical values.

3. To create a strong foundation among students which will enable them to pursue their career choice.


Jawaharlal Nehru Engineering College

Technical Document

This technical document is one of a series of laboratory manuals of Electronics & Telecommunication and is a certified document of Jawaharlal Nehru Engineering College. Care has been taken to make the document error-free; still, if any error is found, kindly bring it to the notice of the subject teacher and the HOD.

Recommended by,

HOD

Approved by,

Principal

Copies:

• Departmental Library

• Laboratory

• HOD

• Principal


FOREWORD

It is my great pleasure to present this laboratory manual for third-year engineering students for the subject of Information Theory and Coding, keeping in view the vast coverage required for visualization of the concepts of ITC.

As students, many of you may have questions in mind regarding the subject, and this manual tries to answer exactly those.

Faculty members are also advised to cover these aspects at the initial stage itself. It will greatly relieve them later, as much of the load will be carried by the enthusiasm of the students once they are conceptually clear. Students are advised to go through this manual thoroughly rather than only the topics mentioned in the syllabus, as practical aspects are the key to understanding and conceptual visualization of the theoretical aspects covered in the books.

Good luck for your enjoyable laboratory sessions.

H.O.D.


SUBJECT INDEX

Sr. No.  Title                                                               Page No.

1   Dos and Don'ts, Instructions to Lab Teachers                               6
2   Lab Course Objectives                                                      7
3   Lab Exercises
    I    Pre-Requisite 1
    II   Pre-Requisite 2: Introduction to MATLAB                               8
    1    Determination of entropy of a given source                           14
    2    Determination of various entropies and mutual information of a
         given channel (noise-free channel)                                   15
    3    Determination of various entropies and mutual information of a
         given channel (binary symmetric channel)                             18
    4    Generation and evaluation of variable length source coding
         using MATLAB (Huffman coding and decoding)                           21
    5    Coding & decoding of linear block codes                              24
    6    Coding & decoding of cyclic codes                                    26
    7    Coding and decoding of convolutional codes                           28
    8    Coding and decoding of BCH codes                                     30
    I    Post-requisite                                                       33
2   Questions based on the subject                                            39
3   Conduction of viva voce examination                                       44
4   Evaluation and marking scheme                                             44


1. DOs and DON’Ts in Laboratory:

1. Do not handle software and PCs without reading the instructions/instruction manuals.

2. Refer to Help for debugging the program.

3. Go through the demos of the Communications Toolbox in MATLAB.

4. Strictly observe the instructions given by the teacher/lab instructor.

2. Instructions for Laboratory Teachers:

1. Lab work completed during prior session should be corrected during the

next lab session.

2. Students should be guided and helped whenever they face difficulties.

3. The promptness of submission should be encouraged by way of marking

and evaluation patterns that will benefit the sincere students.


Laboratory Course Objectives:

1. To expose students to knowledge about information and various entropies.

2. To make students understand the working of various codes such as linear block, cyclic, convolutional and BCH codes.

3. To explore source coding for text, audio and speech.

Course Outcomes:

Students will be able to

1. Demonstrate various entropies and information.

2. Apply source coding techniques.

3. Construct codes using different coding techniques.

4. Explain various coding schemes for text, speech and audio.


INTRODUCTION TO MATLAB - PREREQUISITE

Q. Solve the following using Matlab commands:

1) Generate and Display a sequence of 10 numbers.

COMMAND: V = 1:1:10

OUTPUT: V =

1 2 3 4 5 6 7 8 9 10

2) Display a sequence of 20 numbers with an interval of 2

COMMAND: B = 1:2:40

OUTPUT: B =

1 3 5 7 9 11 13 15 17 19 21 23 25 27 29 31 33 35 37 39

3) Write a row vector

COMMAND: C = [1,2,3,4]

OUTPUT: C =

1 2 3 4

4) Write a Column vector

COMMAND: D = [1;2;3;4]

OUTPUT: D = 1

2

3

4

5) Write a 3x3 matrix

COMMAND: E = [1,2,3;4,5,6;7,8,9]

OUTPUT: E =

1 2 3

4 5 6

7 8 9

6) Add two matrices.

COMMAND: X = [1,2,3;4,5,6] ; Y = [6,5,4;3,2,1]

F = X+Y


OUTPUT: X =

1 2 3

4 5 6

Y =

6 5 4

3 2 1

F =

7 7 7

7 7 7

7) X is row vector, change it to column vector.

COMMAND: Z = X'

OUTPUT: Z =

1 4

2 5

3 6

8) Generate the following matrices

COMMAND: A=[1,2,3;4,5,6;7,8,9]

C=[1^3,2+sqrt(3),3*sin(1);exp(2),17/3,pi+3;1/3,2-sqrt(3),-7*cos(pi/7)]

x=[1 2 3 4 5 6]'

OUTPUT: A =

1 2 3

4 5 6

7 8 9


C =

1.0000 3.7321 2.5244

7.3891 5.6667 6.1416

0.3333 0.2679 -6.3068

x =

1

2

3

4

5

6

9) Generate 5x3 matrix whose first three rows are rows of A and last two rows are all ones.

COMMAND: w=[A;ones(2,3)]

OUTPUT: w =

1 2 3

4 5 6

7 8 9

1 1 1

1 1 1

What happens in the following cases?

COMMAND:

(1) B = A; B(3,3) = 10;

(2) x = [1:50]'; x(length(x))

(3) D = [1 1; 2 2]
E = [5 10; 5 10]
F = [D E D; E D E]
G = [F, rand(4,1)]

OUTPUT: (1) B =

1 2 3

4 5 6

7 8 10

(2) ans =

50

(3) D =

1 1

2 2

E =

5 10

5 10


F =

1 1 5 10 1 1

2 2 5 10 2 2

5 10 1 1 5 10

5 10 2 2 5 10

G =

1.0000 1.0000 5.0000 10.0000 1.0000 1.0000 0.8147

2.0000 2.0000 5.0000 10.0000 2.0000 2.0000 0.9058

5.0000 10.0000 1.0000 1.0000 5.0000 10.0000 0.1270

5.0000 10.0000 2.0000 2.0000 5.0000 10.0000 0.9134

(10) Using matrix A, find 1. A(1) 2. A(2) 3. A(9) 4. A(1,1) 5. A(2,1) 6. A(1,2) 7. A(1:3, 2)

8. A([1 3], [1 3]) 9. a = A(:, 2) 10. a = A(:) 11. A(2,end)

OUTPUT:

1 ans =

1

2.ans =

4

3.ans =

9

4.ans =

1

5.ans =

4

6.ans =

2

7.ans =

2

5

8

8.ans =

1 3

7 9

9. a =

2

5

8

10. a =

1

4

7

2

5

8

3

6

9

11. ans =

6


11) Create a 3x4 matrix for the following

1. random matrix 2. zero matrix 3. ones matrix 4. identity matrix

COMMAND: 1. rand(3,4) 2.zeros(3,4) 3.ones(3,4) 4. eye(3,4)

OUTPUT: 1. ans =

0.6324 0.5469 0.1576 0.4854

0.0975 0.9575 0.9706 0.8003

0.2785 0.9649 0.9572 0.1419

2.ans =

0 0 0 0

0 0 0 0

0 0 0 0

3.ans =

1 1 1 1

1 1 1 1

1 1 1 1

4.ans =

1 0 0 0

0 1 0 0

0 0 1 0

12) Create a magic matrix of 3x3.

COMMAND: magic(3)

OUTPUT: 8 1 6

3 5 7

4 9 2

13) Create a 3D 3x3x3 ones matrix.

COMMAND: y=ones(3,3,3)

OUTPUT: y(:,:,1) =

1 1 1

1 1 1

1 1 1

y(:,:,2) =

1 1 1

1 1 1

1 1 1

y(:,:,3) =

1 1 1

1 1 1

1 1 1

14) Write a command to delete the second row of A.

COMMAND: A(2,:)=[]

OUTPUT: A =

1 2 3

7 8 9

15) Write a command to delete 3rd through 5th column of 6x6 matrix. For this create a 6x6

matrix.

COMMAND:

Z=[1,2,3,4,5,6;7,8,9,10,11,12;13,14,15,16,17,18;19,20,21,22,23,24;25,26,27,28,29,30;31,32,33,34,35,36]

Z(:,3:5)=[]


OUTPUT: Z =

1 2 3 4 5 6

7 8 9 10 11 12

13 14 15 16 17 18

19 20 21 22 23 24

25 26 27 28 29 30

31 32 33 34 35 36

Z =

1 2 6

7 8 12

13 14 18

19 20 24

25 26 30

31 32 36

16) Write the commands/programs and calculations wherever applicable:

1. Area = pi*r^2, where r = pi^(1/3) - 1
2. Calculate and compare: 2.1 exp((pi/2)*i) 2.2 exp(pi/2i)
3. e^3  4. ln(e^3)  5. log10(e^3)  6. log10(10^5)  7. e^(pi*sqrt(163))
8. Solve 3^x = 17

COMMAND: 1. Area = pi*((pi^(1/3))-1)^2  2.1 exp((pi/2)*i)  2.2 exp(pi/2i)

3. exp(3)  4. log(exp(3))  5. log10(exp(3))  6. log10(10^5)  7. exp(pi*sqrt(163))

8. x=log(17)/log(3)

OUTPUT:

1. Area = 0.6781
2.1 ans = 0.0000 + 1.0000i
2.2 ans = 0.0000 - 1.0000i
3. ans = 20.0855
4. ans = 3
5. ans = 1.3029
6. ans = 5
7. ans = 2.6254e+017
8. x = 2.5789

17) Create a vector x whose entries are the squares of the natural numbers 1 to 10 (use a 'for' statement)

COMMAND:

for i=1:10
    x(i)=i^2
end

OUTPUT: x=

1 4 9 16 25 36 49 64 81 100


INFORMATION THEORY AND CODING TECHNIQUES

EXPERIMENT NO. 1

Determination of Entropy

Aim: To find information and entropy of a given source.

Apparatus: PC, MATLAB software

Theory: (1) What is a discrete memoryless source?

(2) Write definition, formula and units for the following

i) Information

ii) Entropy

iii) Information rate.

(3) What are the different types of entropies?

Algorithm:

1. Enter no. of symbols.

2. Input the probabilities of symbols resp.

3. Calculate the entropy of the channel input, i.e. H(X), using the formula:

H(X) = Σ p(i) log2(1/p(i))  bits/symbol
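As a quick cross-check outside the MATLAB exercise, the same sum can be sketched in Python (illustrative only):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = sum over i of p(i)*log2(1/p(i)), in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair binary source carries exactly 1 bit/symbol
print(entropy([0.5, 0.5]))
```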

Conclusion:

Program:

(1)% Find entropy of the source

clc;

clear all;

close all;

i=input('Enter no. of elements=');

p=input('Enter probabilities=');

sum=0;

for n=1:i

H=sum+(p(n)*log2(1/p(n)));

sum=H;

end

disp('H(x): ');

disp(H);

output:

Enter no. of elements=2

Enter probabilities=[2/3,4/5]

H(x):

0.6475


Experiment no.2

Determination of various entropies and mutual information of the given channel.

Aim: Write a program for determination of various entropies and mutual information of a

given channel. Test various types of channel such as

a) Noise free channel. b) Error free channel

c) Binary symmetric channel

Compare channel capacity of above channels.

Apparatus: PC, MATLAB/C

Theory:

1. Explain discrete memoryless channel.

2. Explain various types of channels with their equations and neat diagrams.

3. Explain in detail mutual information with its equation.

4. Explain relations between various entropies and mutual information, giving their equations.

Algorithm:

I) Entropies:

1. Input the no. of inputs of a channel.

2. Input the no. of outputs of a channel.

3. Input the channel matrix. Test the condition that sum of all the entries in each row

should be equal to 1.

4. Input the channel input probabilities. i.e. P[X].

5. Calculate the entropy of the channel input. i.e. H(X)

6. Calculate output probability matrix P[Y], by multiplying input probability matrix by

channel matrix.

7. Also calculate entropy of channel output. i.e. H(Y).

8. Convert input probability matrix into diagonal matrix.i.e. P[X]d

9. Calculate the joint probability matrix by multiplying input probability matrix in

diagonal form by channel matrix.

10. Calculate the joint entropy using the formula H(X,Y) = Σ Σ P(x,y) log2(1/P(x,y)).

11. Calculate conditional entropies H(Y/X)&H(X/Y).

12. Also, mutual information can be calculated as

I(X;Y) = H(X) - H(X/Y)  or  I(X;Y) = H(Y) - H(Y/X)
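The twelve steps above can be condensed into a short sketch. This Python/NumPy version (an illustration, not the lab's MATLAB program) derives every quantity from the joint probability matrix:

```python
import numpy as np

def channel_entropies(joint):
    """Entropies and mutual information from a joint probability matrix P(X,Y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)          # marginal P(X): row sums
    py = joint.sum(axis=0)          # marginal P(Y): column sums
    def H(p):
        p = p[p > 0]                # ignore zero-probability entries
        return float(-(p * np.log2(p)).sum())
    Hx, Hy, Hxy = H(px), H(py), H(joint.ravel())
    Hy_given_x = Hxy - Hx           # chain rule: H(X,Y) = H(X) + H(Y|X)
    mi = Hx + Hy - Hxy              # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return Hx, Hy, Hy_given_x, mi

# Noise-free channel from this experiment: diagonal joint matrix
print(channel_entropies([[.2, 0, 0], [0, .4, 0], [0, 0, .4]]))
```

For the noise-free case this reproduces the program's run: H(X) = H(Y) = I(X;Y) = 1.5219 and H(Y/X) = 0.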

II) Various channels

1) Noise free channel

1. For a noise-free channel enter only the diagonal elements of the joint probability matrix. The row-sum condition mentioned in step 3 should be satisfied.

2. Repeat steps from 4 to 10.

2)Error free channel

1. A channel is said to be error free if capacity of the channel is greater than entropy of the

channel. So at first calculate the capacity of the channel using the formula

Capacity C = log2 M bits/symbol

Where M:- No. of inputs of the channel.


2. Calculate entropy of the input.

3.Compare capacity of the channel with channel input entropy.

3) Binary Symmetric Channel

1. A BSC is characterized by no. of inputs = no. of outputs = 2.

2. The conditional probability matrix is

P(Y/X) = [1-p  p ; p  1-p]

3. Derive the joint probability matrix by multiplying P(Y/X) by the input probability matrix in diagonal form, P[X]d. The matrix taken as input from the user is therefore

P(X,Y) = [P(x1)(1-p)  P(x1)p ; P(x2)p  P(x2)(1-p)]

where p should be entered by the user.

4. Then repeat steps 4 to 8 to calculate all the required quantities.

Conclusion:

Program:

% Program for entropy and MI for a noise-free channel
clc;
clear all;
close all;
i=input('Enter no. of elements=');
q=input('Enter joint probabilities matrix=');
sum=0;
% probability P(x)
for n=1:i
    w=0;
    for m=1:i
        p(n)=w+q(n,m)
        w=p(n);
    end
end
disp('P(x):');
disp(p);
% entropy H(x)
for n=1:i
    H=sum+(p(n)*log2(1/p(n)));
    sum=H;
end
disp('H(x): ');
disp(H);
% conditional probability matrix
for n=1:i
    for m=1:i
        a(n,m)=q(n,m)/p(n);
    end
end
disp('P(Y/X):');
disp(a);
% entropy H(Y/X)
d=0;
for n=1:i


    for m=1:i
        if(a(n,m)>0)
            H1=d+(q(n,m)*log2(1/a(n,m)));
            d=H1
        end
    end
end
disp('H(Y/X):');
disp(H1);
% mutual information
m=H-H1;
disp('MI=');
disp(m);

% probability P(Y)
for n=1:i
    w=0;
    for m=1:i
        s(n)=w+q(m,n);
        w=s(n);
    end
end
disp('P(Y):');
disp(s);

% entropy H(Y)
k=0;
for n=1:i
    H2=k+(s(n)*log2(1/s(n)));
    k=H2;
end
disp('H(Y): ');
disp(H2);

Output:

Enter no. of elements=3
Enter joint probabilities matrix=[.2 0 0;0 .4 0;0 0 .4]

p =

0.2000 0.4000 0.4000

P(x):

0.2000 0.4000 0.4000

H(x):

1.5219

P(Y/X):

1 0 0

0 1 0

0 0 1

H(Y/X):

0

MI=

1.5219

P(Y):

0.2000 0.4000 0.4000

H(Y):

1.5219


Experiment no.3

Determination of various entropies and mutual information of the given BSC channel.

Aim: Write a program for determination of various entropies and mutual information of a given channel. (Binary symmetric channel).

Apparatus: PC, MATLAB/C

Theory:

1. Explain in detail BSC with neat diagram.

2. Find the capacity of a BSC channel.

Algorithm:

I) Entropies:

1. Input the no. of inputs of a channel.

2. Input the no. of outputs of a channel.

3. Input the channel matrix. Test the condition that sum of all the entries in each row should

be equal to 1.

4. Input the channel input probabilities. i.e. P[X].

5. Calculate the entropy of the channel input. i.e. H(X)

6. Calculate output probability matrix P[Y], by multiplying input probability matrix by

channel matrix.

7. Also calculate entropy of channel output. i.e. H(Y).

8. Convert input probability matrix into diagonal matrix.i.e. P[X]d

9. Calculate the joint probability matrix by multiplying input probability matrix in diagonal

form by channel matrix.

10. Calculate the joint entropy using the formula H(X,Y) = Σ Σ P(x,y) log2(1/P(x,y)).

11. Calculate conditional entropies H(Y/X)&H(X/Y).

12. Also, mutual information can be calculated as

I(X;Y) = H(X) - H(X/Y)  or  I(X;Y) = H(Y) - H(Y/X)

Binary Symmetric Channel

1. A BSC is characterized by no. of inputs = no. of outputs = 2.

2. The conditional probability matrix is

P(Y/X) = [1-p  p ; p  1-p]

3. Derive the joint probability matrix by multiplying P(Y/X) by the input probability matrix in diagonal form, P[X]d. The matrix taken as input from the user is therefore

P(X,Y) = [P(x1)(1-p)  P(x1)p ; P(x2)p  P(x2)(1-p)]

where p should be entered by the user.

4. Then repeat steps 4 to 8 to calculate all the required quantities.

Conclusion:


% %prgm for entropy and MI for BSC channel

clc;
clear all;
close all;
i=input('Enter no. of elements=');
p=input('Enter probability=');
q=input('Enter conditional probabilities matrix=');
sum=0;

% entropy H(x)
for n=1:i
    H=sum+(p(n)*log2(1/p(n)));
    sum=H;
end
disp('H(x): ');
disp(H);

% joint probability matrix
for n=1:i
    for m=1:i
        a(n,m)=q(n,m)*p(n);
    end
end
disp('P(X,Y):');
disp(a);

% entropy H(Y/X)
d=0;
for n=1:i
    for m=1:i
        H1=d+(a(n,m)*log2(1/q(n,m)));
        d=H1;
    end
end
disp('H(Y/X):');
disp(H1);

% probability P(Y)
for n=1:i
    w=0;
    for m=1:i
        s(n)=w+a(m,n);
        w=s(n);
    end
end
disp('P(Y):');
disp(s);

% entropy H(Y)
k=0;
for n=1:i
    H2=k+(s(n)*log2(1/s(n)));
    k=H2;
end
disp('H(Y): ');
disp(H2);

% Find mutual information
m=H2-H1;
disp('MI=');
disp(m);


Output:

Enter no. of elements=2
Enter probability=[3/4 1/4]
Enter conditional probabilities matrix=[1/3 2/3;2/3 1/3]
H(x):
    0.8113
P(X,Y):
    0.2500    0.5000
    0.1667    0.0833
H(Y/X):
    0.9183
P(Y):
    0.4167    0.5833
H(Y):
    0.9799
MI=
    0.0616


INFORMATION THEORY AND CODING TECHNIQUES

Experiment. No.4

Encoding and decoding of Huffman code

(Variable length source coding)

Aim: Write a program for generation and evaluation of variable length source coding using

Huffman Coding and decoding.

Calculate the entropy, average length and efficiency of Huffman Coding.

Apparatus: MATLAB/C

Theory:

1. Explain variable length coding.

2. Explain Huffman coding techniques.

3. Solve theoretically and verify using matlab program the given example.

4. Explain the commands: 1. huffmandict 2. huffmanenco 3. huffmandeco

Algorithm:

1. Start.

2. Input the total number of probabilities.

3. Arrange the messages in decreasing order of probabilities.

4. Add last two probabilities.

5. Assign them ‘0’ and ‘1’.

6. With addition & other probabilities again sort out the total probabilities.

7. If the addition result is equal to the probability of a symbol, put it on top.

8. Repeat from step 4 until the addition is 1.

9. To find the code for a particular symbol, take the path of the probability of the symbol and write the code in reverse fashion.

10. Find out entropy, avg. code word length and efficiency.

11. Stop.
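The greedy procedure above can be sketched compactly in Python (an illustration only; the lab program itself uses MATLAB's huffmandict):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a binary Huffman code; probs maps symbol -> probability."""
    tiebreak = count()  # makes heap entries comparable when probabilities tie
    heap = [(p, next(tiebreak), {sym: ''}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # pop the two least probable nodes
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: '0' + code for s, code in c0.items()}
        merged.update({s: '1' + code for s, code in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

probs = {1: 0.1, 2: 0.1, 3: 0.4, 4: 0.3, 5: 0.1}  # same source as the MATLAB example
codes = huffman_code(probs)
avglen = sum(probs[s] * len(c) for s, c in codes.items())
print(codes, avglen)
```

For this source the average code word length comes out to 2.1 bits/symbol, matching the MATLAB run below.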

Conclusion:


Program:

1. % Write a MATLAB based program for encoding and decoding of Huffman code
% (variable length source coding)

clc;

clear all;

close all;

symbol =[1:5]; % Distinct data symbols appearing in sig

p = [0.1 0.1 0.4 .3 .1]; % Probability of each data symbol

[dict,avglen]=huffmandict(symbol,p)

samplecode = dict{5,2} % Codeword for fifth signal value

dict{1,:}

dict{2,:}

dict{3,:}

dict{4,:}

dict{5,:}

hcode = huffmanenco(symbol,dict); % Encode the data.

dhsig = huffmandeco(hcode,dict); % Decode the code.

disp('encoded msg:');

disp(hcode);

disp('decoded msg:');

disp(dhsig);

code_length=length(hcode)
Hx=0; % initialize the entropy accumulator
for m=1:5
    H=Hx+(p(m)*log2(1/p(m)));
    Hx=H;
end
disp('Hx=');
disp(H);

Efficiency=(Hx/avglen)*100

Output:

dict =

[1] [1x4 double]

[2] [1x4 double]

[3] [ 1]

[4] [1x2 double]

[5] [1x3 double]

avglen =

2.1000

samplecode =

0 0 1

ans =

1

ans =

0 0 0 1

ans =

2

ans =

0 0 0 0

ans =

3

ans =

1

ans =

4

ans =

0 1

ans =

5

ans =

0 0 1

encoded msg:

0 0 0 1 0 0 0

0 1 0 1 0 0 1

decoded msg:

1 2 3 4 5

code_length =

14

Hx=

2.0464

Efficiency =

97.4495


II: Huffman coding of a string

%Write a MATLAB based program for encoding and decoding of Huffman code

clc;

clear all;

close all;

msg='TEECT'

symbol ={'T' 'E' 'C'}; % Distinct data symbols appearing in sig

p = [2/5 2/5 1/5]; % Probability of each data symbol

[dict,avglen]=huffmandict(symbol,p)

dict{1,:}

dict{2,:}

dict{3,:}

hcode = huffmanenco(msg,dict); % Encode the data.

dhsig = huffmandeco(hcode,dict); % Decode the code.

disp('encoded msg:');

disp(hcode);

disp('decoded msg:');

disp(dhsig);

Output:

msg =

TEECT

dict =

'T' [1x2 double]

'E' [ 1]

'C' [1x2 double]

avglen =

1.6000

ans =

T

ans =

0 0

ans =

E

ans =

1

ans =

C

ans =

0 1

encoded msg:

0 0 1 1 0 1 0 0

decoded msg:

'T' 'E' 'E' 'C' 'T'


Expt no. 5

Title: Write a program for coding of linear block codes.

Aim: Error detection and correction using linear block codes.

Apparatus: MATLAB/C

THEORY: (1) Explain linear block codes in detail.

(2) Explain Generator matrix and Parity check matrix giving their relation.

(3) Explain syndrome in LBC.

(4) Explain with an example coding and decoding in LBC.

Algorithm:

1. Start

2. Accept the size of the block code in terms of n and k

3. Accept the parity matrix P of size k x (n-k)

4. Generate the generator matrix G = [Ik | P] of size k x n, in which Ik is an identity matrix.

5. Generate the parity check matrix H = [P' | In-k] of size (n-k) x n, in which P' is the transpose of the P matrix.

6. Generate the message vector.

7. Generate the code vector by the formula C = MG.

8. Display it.

9. Also calculate the Hamming weight of each code word, i.e. the total number of ones in the code vector. Display it.

10. Calculate the error detecting capability td = dmin - 1, where dmin is the minimum Hamming distance.

11. Calculate the error correcting capability tc = (dmin - 1)/2.

12. Display the parity check matrix H.

13. Calculate the syndrome vector for each error pattern E: S = E*H'.

14. Compare the syndrome of the received vector with the syndrome of each error pattern; where S matches, the error is in the respective bit of that error pattern. Display the bit position where the error is. If the syndrome of the received vector is 0, display that the received vector is correct.

Conclusion:


Program:

% LINEAR BLOCK CODE
% Generate the codeword using LBC

clc;

close all;

n=6;

k=3;

p=[0 1 1 ; 1 0 1; 1 1 0]; % Parity Matrix

d=input('enter three bit message=');

ik=eye(k);

g=cat(2,ik,p);

disp('Generator Matrix:');

disp(g);

c1=mtimes(d,g);

c=mod(c1,2);

disp('The codeword for given message is:');

disp(c);

OUTPUT:

enter three bit message=[1 0 0]
Generator Matrix:
     1     0     0     0     1     1
     0     1     0     1     0     1
     0     0     1     1     1     0
The codeword for given message is:
     1     0     0     0     1     1
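The run above can be cross-checked with a short Python/NumPy sketch of the same (6,3) code (illustrative only; the syndrome step sketches item 13 of the algorithm):

```python
import numpy as np

# The (6,3) code from the program above, with the same parity matrix P
P = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
k, n = 3, 6
G = np.hstack([np.eye(k, dtype=int), P])        # generator matrix G = [Ik | P]
H = np.hstack([P.T, np.eye(n - k, dtype=int)])  # parity check matrix H = [P' | In-k]

def encode(msg):
    return (np.array(msg) @ G) % 2              # code vector C = MG (mod 2)

def syndrome(received):
    return (np.array(received) @ H.T) % 2       # syndrome S = R*H' (mod 2)

c = encode([1, 0, 0])
print(c, syndrome(c))     # a valid codeword gives the all-zero syndrome
```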


Expt. No. 6

Aim: Write a program for coding & decoding of cyclic codes.

Objective:

Error detecting and correcting using Cyclic code.

Software Requirement: MATLAB/C

THEORY: (1) Explain coding and decoding of cyclic codes in detail.

Cyclic Code:

Cyclic codes are a sub-class of linear block codes. They have the property that a cyclic shift of one code word produces another code word. Suppose there is an n-bit code vector

X = (xn-1, xn-2, ..., x1, x0)

where xn-1, xn-2, ..., x1, x0 represent the individual bits of the code vector X. If the code vector is shifted cyclically, another code vector X1 is obtained:

X1 = (xn-2, xn-3, ..., x1, x0, xn-1)

Algorithm:

1. Start

2. Get the values of n & k.

3. Get the generator polynomial i.e. its coefficient from user.

4. Get the message vector.

5. Get the message generator matrix.

6. Multiply the message polynomial by x^(n-k), i.e. shift the message bits by n-k.

7. Divide this term, x^(n-k) d(x), by g(x).

8. To get the code word polynomial, add x^(n-k) d(x) to the remainder of the division.

9. Display the code word.

10. Generate the error patterns and the corresponding syndromes, and display them.

11. Enter the received code vector.

12. Divide the received code vector polynomial by generator polynomial.

13. The remainder of the division will be the syndrome polynomial.

14. From syndrome detect the corresponding error pattern.

15. Stop.

Conclusion:


Program:

% Encoding for (7,4) cyclic code
clc;

clear all;

%Encoding

n=7; k=4;

p=[1 1 0 ; 1 1 1; 0 0 1 ; 1 0 1]; % Parity Matrix

d=[1 1 0 1]; % Message word

ik=eye(k);

g=cat(2,ik,p);

disp ('Generator Matrix:');

disp(g);

g1=cyclpoly(n,k,'max');

disp('g1=');

disp(g1);

gp=poly2sym(g1);

disp('Generator Polynomial:');

disp(gp);

c1=mtimes(d,g);

c=mod(c1,2);

disp('The codeword for given message is:');

disp(c);

OUTPUT:

g1 =
     1     1     0     1

Generator Polynomial:
     1     0     0     1     0     1     1
     0     1     0     1     1     1     0
     0     0     1     0     1     1     1

The codeword for given message is:
     1     1     0     1     1     0     0
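The division step of the algorithm (steps 6-8) can be sketched in Python. The generator x^3 + x + 1 below is one valid (7,4) generator chosen for illustration; cyclpoly in the program may return a different one:

```python
def gf2_remainder(dividend, divisor):
    """Remainder of polynomial division over GF(2); coefficients highest degree first."""
    out = list(dividend)
    for i in range(len(out) - len(divisor) + 1):
        if out[i]:                      # leading term present: subtract (XOR) the divisor
            for j, dj in enumerate(divisor):
                out[i + j] ^= dj
    return out[-(len(divisor) - 1):]    # remainder has degree < deg(g)

def cyclic_encode(msg, g):
    """Systematic cyclic encoding: append rem(x^(n-k) d(x) / g(x)) to the message."""
    shifted = list(msg) + [0] * (len(g) - 1)   # multiply d(x) by x^(n-k)
    return list(msg) + gf2_remainder(shifted, g)

g = [1, 0, 1, 1]                 # g(x) = x^3 + x + 1, an assumed (7,4) generator
print(cyclic_encode([1, 0, 0, 0], g))
```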


Expt no. 7

Title: Write a program for coding and decoding of convolutional codes.

Objective:

Error detecting and correcting using convolutional code.

Software Requirement: MATLAB/C

THEORY:

Explain convolutional codes in detail.

Convolutional Code:

Convolutional coding is an alternative to block coding. It differs from block codes in that the encoder contains memory: the encoder output at any given time depends on present as well as past inputs.

Convolutional codes are commonly specified by three parameters (n, k, m), where n is the number of output (coded) bits, k is the number of input (message) bits, and m is the memory order.

Algorithm:

1. Start

2. Get the values of n & k.

3. Get the generator polynomial i.e. its coefficient from user.

4. Get the message vector.

5. Get the message generator matrix.
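Note that the program below reuses cyclic-code routines; for reference, a textbook rate-1/2, constraint-length-3 convolutional encoder (generators 7 and 5 in octal, a standard example chosen here for illustration, not the manual's program) can be sketched as:

```python
def conv_encode(bits, g1=0b111, g2=0b101, K=3):
    """Rate-1/2 convolutional encoder: two parity streams from a K-bit shift register."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)   # shift the new bit into the register
        out.append(bin(state & g1).count('1') % 2)    # parity from generator polynomial g1
        out.append(bin(state & g2).count('1') % 2)    # parity from generator polynomial g2
    return out

print(conv_encode([1, 0, 1, 1]))   # 4 message bits -> 8 coded bits (rate 1/2)
```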

Program: %cyclic convolution

clc;

clear all;

k=input('enter the no. of message bits k=');

q=input('given data q=');

fprintf('data polynomial is')

d=poly2sym(q)

n=input('enter the no. of information bits n=');

w=[1,0,0,0];

x=poly2sym(w);


e=cyclpoly(n,k);
fprintf('generator polynomial ')
g=poly2sym(e)
z=conv(w,q);
r=poly2sym(z)
[m,v]=gfdeconv(z,e);
fprintf('polynomial')
p=poly2sym(v)
b=r+p;
fprintf('codeword is=')
c=sym2poly(b)

OUTPUT:


Expt. No.8

Title: Write a program for coding and decoding of BCH codes.

Objective:

Error detection and correction using BCH codes.

Software Requirement: MATLAB/C

THEORY: Explain BCH codes

BCH Code:

The BCH codes are among the most powerful and widely used random-error-correcting cyclic codes. These codes were discovered by Hocquenghem in 1959 and independently by Bose and Ray-Chaudhuri in 1960.

An (n,k) binary BCH code is specified as:

Block length: n = 2^m - 1
Parity check bits: n - k <= m*tc
Minimum distance: dmin >= 2*tc + 1

where m >= 3 is any integer and tc is the number of errors the code is capable of correcting.

Procedure :

1. Given the code length n and error correcting capability tc.

2. Find m = log2(n + 1), since n = 2^m − 1.

3. Find k = 2^m − 1 − m·tc = n − m·tc.

4. Take a primitive polynomial of degree m.

5. Construct the finite field GF(2^m) using the primitive polynomial p(x).

6. Find the minimal polynomial of each element in GF(2^m).

7. Find g(x) = LCM[m1(x), m3(x), m5(x), …, m2tc−1(x)].

8. Find the codeword using c(x) = d(x) · g(x).
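Steps 1-3 of the procedure can be sketched in Python as a quick check (the MATLAB program that follows uses the Communications Toolbox instead). The example parameters n = 15, tc = 2 are illustrative; note that n − k = m·tc is only an upper bound on the parity bits, with the exact k fixed by the degree of g(x):

```python
# Sketch of steps 1-3: derive m and the (bound on) message length k
# from the code length n and error-correcting capability tc.
import math

def bch_params(n, tc):
    m = int(math.log2(n + 1))            # step 2: n = 2^m - 1
    assert 2 ** m - 1 == n, "n must be of the form 2^m - 1"
    k = n - m * tc                       # step 3: k = n - m*tc (bound)
    return m, k

print(bch_params(15, 2))   # (4, 7): the classic double-error-correcting (15, 7) BCH code
```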

Conclusion:


Program:

% Program for encoding and decoding BCH code
clc;
clear all;
close all;
m=4;
n=2^m-1;    % codeword length
k=5;        % message length
x=input('enter msg of length 5=');   % e.g. [1 1 1 0 1]
msg=gf(x);
disp('message=');
disp(msg);
% Find t, the error correction capability
[genpoly,t]=bchgenpoly(n,k);
disp('Error correction capability=');
disp(t);
% Encode the message
code=bchenc(msg,n,k);
disp('Encoded message=');
c=gf(code);
disp(c);
% Add up to t random errors to the codeword
noisycode=code+randerr(1,n,1:t);
disp('received codeword =');
disp(noisycode);
% Decode the noisy code
[newmsg,err,ccode]=bchdec(noisycode,n,k);
disp('decoded message=');
disp(newmsg);
if msg==newmsg
    disp('message recovered perfectly')
else
    disp('Error in message recovered');
end

Output:

enter msg of length 5=[1 0 0 0 1]

message=

gf object: 1-by-5

Error correction capability=

3

Encoded message=

gf object: 1-by-15

received codeword =

gf object: 1-by-15

decoded message=

gf object: 1-by-5


% Implementation of algorithms for RS Coding & Decoding
% RS CODING
clear all;
clc;
n=input('accept n=');
k=input('accept k=');
m=input('accept message=');
msg=gf(m,k);        % message symbols as a Galois array in GF(2^k)
c=rsenc(msg,n,k);   % codeword is a Galois array
disp(c)

OUTPUT:

accept n=6

accept k=4

accept message=[1 1 1 1]

gf object: 1-by-6


Post Requisite

% Create Integrator model using Simulink Matlab

Create a New Model

Before creating a model, you need to start MATLAB® and then start Simulink.

1. Start the Simulink software. In the MATLAB Command Window, enter simulink.

The Simulink Library Browser opens.

2. From the Simulink Library Browser menu, select File > New > Model.

A Simulink Editor window opens with an empty canvas in the right-hand pane

Select File > Save as. The Save As dialog box opens.

In the File name box, enter a name for your model. For example, enter simple_model. Then

click Save.

Your model is saved with the file name simple_model.

To create this simple model, you need four Simulink blocks:

• Sine Wave — Generates an input signal for the model.

• Integrator — Processes the input signal.

• Bus Creator — Combines the input signal and processed signal into one signal.

• Scope — Visualizes the signals.


Simulating this model integrates a sine wave signal and then displays the result, along with

the original signal, in a scope window.

Open the Simulink Library Browser

Browse or Search for Specific Blocks

1 Search for a Sine Wave block. In the search box on the browser toolbar, enter sine, and

then press the Enter key. Simulink searches the libraries for blocks with sine in their

name, and then displays the blocks.


2 Get detailed information about a block. Right-click a block, and then select Help for the

<block name>. The Help browser opens with the reference page for the block.

3 View block parameters. Right-click a block, and then select Block Parameters. The block

parameters dialog box opens.

Add Blocks to a Model

To build a model, begin by copying blocks from the Simulink Library Browser to the

Simulink Editor.

1 In the Simulink Library Browser, select the Sources library.

2 From the right pane, select the Sine Wave block.


3 Drag the Sine Wave block to the Simulink Editor. A copy of the Sine Wave block

appears in your model.

4 Similarly, add the Integrator, Bus Creator, and Scope blocks to the model.

5 Draw signal lines between the blocks.


Define Simulation Parameters

Before you simulate the behavior of a model, define the simulation parameters. Simulation

parameters include the type of numerical solver, the start and stop times, and the maximum

step size. You can open the Configuration Parameters dialog by clicking the Parameters button on the Simulink Editor toolbar.

1 From the Simulink Editor menu, select Simulation > Model Configuration Parameters.

The Configuration Parameters dialog box opens to the Solver pane.

2 In the Stop time field, enter 20. In the Max step size field, enter 0.2.

3 Click OK.

Run Simulation

After you define the Model Configuration Parameters, you are ready to simulate your

model.

1 From the Simulink Editor menu bar, select Simulation > Run.

The simulation runs, and then stops when it reaches the stop time specified in the

Model Configuration Parameters dialog box.

Observe Simulation Results

After simulating a model you can view the simulation results in a Scope window.

1 Double-click the Scope block. The Scope window opens and displays the simulation results. The plot shows the sine wave signal together with the resulting integrated signal.


3 Change the appearance of the display. For example, select white for the display color

and axes background color (icons with a pitcher).

4 Select black for the ticks, labels, and grid colors (icon with a paintbrush).

5 Change signal line colors for the Sine Wave to blue and the Integrator to red. To see

your changes, click OK or Apply.


Questions based on Syllabus

1. What is information?

Ans: The information conveyed by a symbol can be defined as the logarithm of the inverse of

its probability of occurrence: Ik = log2(1/pk) bits.

2. What are different units of information?

Ans: Bits, nats, and decits (Hartleys).

3. What is entropy?

Ans: Entropy can be defined as the average amount of information per source symbol.

4. What is discrete source?

Ans: If a source emits symbols = {s0, s1, s2 ,… s k-1} from a fixed finite alphabet then

the source is said to be discrete source.

5. State Shannon’s first theorem.

Ans: Distortionless source coding is possible when L ≥ H(X), where L represents the average

codeword length and H(X) represents the source entropy.
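The bound L ≥ H(X) can be checked numerically with a small Python sketch of Huffman coding. The probabilities used here are illustrative (dyadic, so L equals H exactly):

```python
# Check of Shannon's first theorem: average Huffman codeword length L >= H(X).
import heapq
import math

def huffman_lengths(probs):
    # Heap entries: (probability, unique id, list of (symbol, depth)).
    heap = [(p, i, [(i, 0)]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, i2, s2 = heapq.heappop(heap)
        merged = [(sym, d + 1) for sym, d in s1 + s2]  # merging deepens both subtrees
        heapq.heappush(heap, (p1 + p2, i2, merged))
    return dict(heap[0][2])            # symbol -> code length

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
L = sum(p * lengths[i] for i, p in enumerate(probs))
H = -sum(p * math.log2(p) for p in probs)
print(L, H)   # 1.75 1.75 for these dyadic probabilities
```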

6. What is data compaction?

Ans: Data compaction is used to remove redundant information so that the decoder

reconstructs the original data with no loss of information.

7. What is decision tree? Where it is used?

Ans: The decision tree is a tree that has an initial state and terminal states corresponding to

source symbols s0, s1, s2 ,… s k-1. Once each terminal state emits its symbol, the

decoder is reset to its initial state.

Decision tree is used for decoding operation of prefix codes.

8. What is instantaneous code?

Ans: If no codeword is a prefix of any other codeword, the code is said to be an

instantaneous code.

9. What is discrete channel?

Ans: The channel is said to be discrete when both the input and output alphabets have finite sizes.

10. What is memoryless channel?

Ans: The channel is said to be memoryless when the current output symbol depends only on

the current input symbol and not on any of the previous inputs.

11. What is the important property while using the joint probability p(xj, yk)?

Ans: The sum of all the elements in the matrix is equal to 1.

12. What is the important property while using the conditional probability p(xj / yk)?

Ans: The sum of all the elements along each column should be equal to 1.


13. What is the important property while using the conditional probability (yk / xj)?

Ans: The sum of all the elements along each row should be equal to 1.

14. What is prefix coding?

Ans: Prefix coding is a variable-length coding algorithm. It assigns binary digits to the messages

as per their probabilities of occurrence. In a prefix code, no codeword is the prefix of any

other codeword.

15. State the channel coding theorem for a discrete memoryless channel.

Ans: Given a source of M equally likely messages, with M >> 1, generating information at a

rate R, and given a channel with capacity C: if R ≤ C, there exists a coding technique

such that the output of the source may be transmitted over the channel with a probability

of error in the received message that may be made arbitrarily small.

16. Explain channel capacity theorem.

Ans: The channel capacity of a discrete memoryless channel is the maximum average mutual

information, where the maximization is taken with respect to the input probabilities P(xi).

For a band-limited Gaussian channel, C = B log2(1 + S/N) bits/sec, where B is the channel bandwidth.
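The Shannon-Hartley formula above can be evaluated directly; this short Python sketch computes the capacity for an illustrative channel (3.1 kHz bandwidth, 30 dB SNR, both values assumed for the example, roughly a telephone line):

```python
# Shannon-Hartley sketch: capacity of a band-limited AWGN channel.
import math

def capacity(B_hz, snr_linear):
    # C = B * log2(1 + S/N), with S/N as a linear ratio (not dB)
    return B_hz * math.log2(1 + snr_linear)

snr_db = 30
snr_linear = 10 ** (snr_db / 10)          # 30 dB -> 1000
print(round(capacity(3100, snr_linear)))  # roughly 30.9 kbit/s
```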

17. Define mutual information?

Ans: Mutual information of the channel is the average amount of information gained by the

transmitter when the state of the receiver is known.

I(X;Y) = H(Y) − H(Y/X) = H(X) − H(X/Y)

18. Define channel capacity?

Ans: Channel capacity of a discrete memoryless channel is defined as the maximum

value of the mutual information I(X;Y), where the maximization is carried out over all

possible input probability distributions {p(xj)}.

19. What is the use of error control coding?

Ans: The main use of error control coding is to reduce the overall probability of error; this

process is also known as channel coding.

20. What is the difference between systematic code and non-systematic code?

Ans: • If the message bits appear unaltered in the codeword, followed by the parity bits, it is said to be a systematic code.

• If the message bits and parity check bits are interspersed in the codeword, it is said to be a non-systematic code.

21. What is a Repetition code?

Ans: A single message bit is encoded into a block of 'n' identical bits, producing an

(n, 1) block code. There are only two codewords in the code: the all-zero codeword

and the all-one codeword.
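A repetition code is simple enough to sketch completely; this Python example (illustrative, n = 3) encodes by repetition and decodes by majority vote, correcting any single bit error per block:

```python
# (n, 1) repetition code: repeat each bit n times, decode by majority vote (n odd).
def rep_encode(bits, n=3):
    return [b for b in bits for _ in range(n)]

def rep_decode(coded, n=3):
    blocks = [coded[i:i + n] for i in range(0, len(coded), n)]
    return [1 if sum(blk) > n // 2 else 0 for blk in blocks]

coded = rep_encode([1, 0])    # [1, 1, 1, 0, 0, 0]
coded[1] ^= 1                 # flip one bit; majority vote still recovers it
print(rep_decode(coded))      # [1, 0]
```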


22. What is forward acting error correction method?

Ans: The method of controlling errors at the receiver by attempting to correct noise-induced

errors is called the forward acting error correction method.

23. What is error detection?

Ans: The decoder accepts the received sequence and checks whether it matches a valid

message sequence. If not, the decoder discards the received sequence and notifies the

transmitter that errors have occurred and the received message must be retransmitted.

This method of error control is called error detection.

24. Give the properties of syndrome in linear block code.

Ans: • The syndrome depends only on the error pattern and not on the transmitted codeword.

• All error patterns that differ by a codeword have the same syndrome.

25. What is Hamming code?

Ans: This is a family of (n, k) linear block codes with:

Block length: n = 2^m − 1

Number of message bits: k = 2^m − m − 1

Number of parity bits: n − k = m

where m ≥ 3 is any positive integer.

26. What is the linear property of a code word?

Ans: Linearity property: the sum of any two codewords in the code is also a codeword.

27. What is the cyclic property of a code word?

Ans: Cyclic property: any cyclic shift of a codeword in the code is also a codeword.

28. Give the difference between linear block code and cyclic code.

Ans: • A linear block code can be simply represented in matrix form.

• A cyclic code can be represented in polynomial form.

29. Define Hamming distance (HD)?

Ans: The number of bit positions in which two code vectors differ is known as the

Hamming distance. E.g., if c1 = 1 0 0 1 0 1 1 0 and c2 = 1 1 0 0 1 1 0 1, then HD = 5.

30. Define Weight of a code vector?

Ans: The number of non-zero components in a code vector is known as the weight of the

code vector. E.g., if c1 = 1 0 0 1 0 1 1 0, then W(c1) = 4.

31. Define minimum distance?

Ans: The minimum distance of a linear block code is the smallest Hamming distance between

any pair of codewords in the code. E.g., if c1 = 0 0 1 1 1 0, c2 = 0 1 1 0 1 1,

c3 = 1 1 0 1 1 0, then dmin = 3.
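The three worked examples in Q29-Q31 can be reproduced with a short Python sketch:

```python
# Hamming distance, weight, and minimum distance of a small code.
from itertools import combinations

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def weight(c):
    return sum(c)                 # number of non-zero components

def d_min(codewords):
    # smallest pairwise Hamming distance over the whole code
    return min(hamming_distance(a, b) for a, b in combinations(codewords, 2))

print(hamming_distance([1,0,0,1,0,1,1,0], [1,1,0,0,1,1,0,1]))   # 5
print(weight([1,0,0,1,0,1,1,0]))                                # 4
print(d_min([[0,0,1,1,1,0], [0,1,1,0,1,1], [1,1,0,1,1,0]]))     # 3
```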

32. What is convolutional code?

Ans: A convolutional code is a code in which parity bits are continuously interleaved with the

information (message) bits.


33. How does compression take place in text and audio?

Ans: In text, the large volume of information is reduced, whereas in audio, the bandwidth is

reduced.

34. Specify the various compression principles?

Ans: • Source encoders and destination decoders

• Lossless and lossy compression

• Entropy encoding

• Source encoding

35. Define run-length encoding?

Ans: This can be used for long substrings of the same character or binary digit.

E.g., 000000011111111110000011 can be represented in run-length form as: 0,7, 1,10, 0,5, 1,2

36. Define transform coding?

Ans: This is used to transform the source information from a spatial or time-domain

representation into a frequency-domain representation.

37. Define code redundancy.

Ans: It is the measure of the redundancy of bits in the encoded message sequence. It is given as

Redundancy = 1 − code efficiency, i.e. 1 − η. It should be as low as possible.

38. What is the capacity of the channel having infinite bandwidth?

Ans: The capacity of such a channel is given as C = 1.44 (S/N0) bits/sec, where N0 is the noise power spectral density.

39. Give differences between Arithmetic coding and Huffman coding.

Ans: • In arithmetic coding a single code word is used for each encoded string of characters.

• In Huffman coding a separate codeword is used for each character.

40. An alphabet set contains 3 letters A, B, C transmitted with probabilities of 1/3, 1/4,

and 1/4. Find the entropy.

Ans: p1 = 1/3, p2 = 1/4, p3 = 1/4.

H = ∑ pk log2(1/pk) = 1/3 log2 3 + 1/4 log2 4 + 1/4 log2 4 = 1.52832 bits/symbol
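The arithmetic in Q40 can be checked with a short Python sketch (using the probabilities exactly as given in the question):

```python
# Entropy H = sum(p * log2(1/p)), checked against the worked answer in Q40.
import math

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs)

print(round(entropy([1/3, 1/4, 1/4]), 5))   # 1.52832
```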

41. What is meant by linear code?

Ans: A code is linear if the modulo-2 sum of any two code vectors produces another code vector.

This means any code vector can be expressed as a linear combination of other code

vectors.

42. For M equally likely messages, the average amount of information H is____.

Ans: log2M

43. The capacity of a binary symmetric channel, given H(P) is binary entropy function

is _________

Ans: 1-H(P)
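The BSC capacity formula in Q43 can be evaluated with a short Python sketch; the two printed cases are the standard extremes (noiseless and useless channel):

```python
# Capacity of a binary symmetric channel: C = 1 - H(p),
# where H is the binary entropy function and p the crossover probability.
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0                    # limit value: 0*log(0) is taken as 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0  (noiseless channel)
print(bsc_capacity(0.5))    # 0.0  (useless channel)
```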


44. For M equally likely messages, M>>1, if the rate of information R ≤ C, the

probability of error is ____.

Ans: very small

45. The code rate r, with k information bits and n total bits, is defined as

Ans: r = k/n

46. The information rate R for a given average information H = 2.0, for an analog signal

band limited to B Hz, is

Ans: R = 2B × H = 4B bits/sec (the signal is sampled at the Nyquist rate of 2B samples/sec)

47. The expected information contained in a message is called

Ans: Entropy

48. The capacity of Gaussian channel is

Ans: C = B log2(1 + S/N) bits/sec

49. According to Shannon Hartley theorem,

Ans: The channel capacity does not become infinite with infinite bandwidth, and there is a

tradeoff between bandwidth and signal-to-noise ratio.

50. The negative statement for Shannon's theorem states that

Ans: If R > C, the error probability tends towards unity.


Conduction of Viva-Voce Examinations:

Teachers should conduct oral exams of the students with full preparation. Objective questions

inviting guesswork are to be avoided. To make the examination meaningful, the questions

should be such that the depth of the students in the subject is tested. Oral examinations are to

be conducted in a cordial environment amongst the teachers taking the examination. Teachers

taking such examinations should not harbour ill thoughts about each other, and courtesies

should be offered to each other in case of a difference of opinion, which should not be aired

in front of the students.

4. Evaluation and marking system:

Basic honesty in the evaluation and marking system is absolutely essential, and impartiality

on the part of the evaluator is required in the examination system. It is a wrong approach to

reward students with easy marking in order to gain cheap popularity among students who do

not deserve it. It is a primary responsibility of the teacher that the right students, who are

really putting in a lot of hard work with the right kind of intelligence, are correctly

rewarded.

The marking pattern should be justifiable to the students without any ambiguity, and the

teacher should ensure that students are treated fairly.