Information Theory Detour



Hayder Radha
Presented by: Kiran Misra

Department of Electrical and Computer Engineering, Michigan State University

October 8, 2008


No Professor Radha!!! My name is “Kiran”


Why learn about information theory?

Basic elements of information theory are necessary for many aspects of Multimedia Coding, Communication and Networking.


List of things we cover

- Entropy: definition, example
- Joint entropy: definition
- Conditional entropy: motivation, definition, graphical representation
- Mutual information: definition, graphical representation, inequalities

Some of you may have already seen this in ECE 867.


Entropy

It is a measure of the uncertainty associated with a random variable.

Example: Assume you live in a desert where it rains once a year. Random variable of interest: the weather report.

Today’s weather forecast: No Rain → Little Uncertainty → Little Information.

Today’s weather forecast: Rain → Lot of Uncertainty → Lot of Information.

A more precise definition was formulated by Shannon in 1948.
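
Concretely, Shannon defined the entropy of a discrete random variable X as H(X) = -∑_x p(x) log₂ p(x), measured in bits. As an illustrative sketch (not part of the original slides), the Python snippet below evaluates this for the desert weather report, assuming a toy model in which it rains on exactly one day out of 365:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_x p(x) * log2(p(x))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed toy model: rain on 1 day out of 365, no rain otherwise.
p_rain = 1 / 365
print(f"H(weather)   = {entropy([p_rain, 1 - p_rain]):.4f} bits")  # ~0.0273 bits
# Contrast with a fair coin, the maximally uncertain binary source.
print(f"H(fair coin) = {entropy([0.5, 0.5]):.4f} bits")            # 1.0000 bits
```

The near-zero entropy of the desert forecast matches the intuition above: an outcome that is almost certain carries almost no information.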



Alternatively: X → Sender → msg → Receiver.

The entropy of X is the “minimum number of bits” needed (“on average”) for coding the outcomes of X.

The most likely outcome will require the fewest bits.
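
An optimal prefix code, such as a Huffman code, realizes exactly this behavior. As a sketch using assumed toy probabilities (not from the original slides), the snippet below computes Huffman code lengths for a four-symbol source and compares the average code length with the entropy; because the chosen probabilities are dyadic, the two coincide exactly:

```python
import heapq
import math

def huffman_lengths(probs):
    """Code lengths of an optimal (Huffman) binary prefix code for a pmf."""
    # Heap entries: (subtree probability, tiebreaker, symbols in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each merge adds one bit to these symbols' codewords
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]  # assumed toy source (dyadic probabilities)
lengths = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
print(f"code lengths   = {lengths}")  # [1, 2, 3, 3]: likeliest symbol is shortest
print(f"average length = {avg_len} bits, entropy = {H} bits")  # both 1.75
```

In general a Huffman code achieves H(X) ≤ L < H(X) + 1 for the average codeword length L, which is why entropy is the fundamental limit on the average number of bits per outcome.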

