

    What is Information Theory?

    The Basics

    Sensor Reading Group

    10 October 2003


    Motivation

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.

    - C.E. Shannon, 1948

Not only for communication, but for sensing, classification, economics, computer science, mathematics, networking, …


    The Main Idea

    Info theory examines the uncertain world.

Can't assume perfect communication (or sensing, classification, etc.), but we can characterize the uncertainty.

Quantities possess information, but are corrupted by noise.

C.E. Shannon wrote "A Mathematical Theory of Communication"

    Rigorous analysis of the essence of information


    Entropy: What is H(X)?

Entropy is a measure of the uncertainty in a random variable. This is a function of the probability distribution, not of the individual samples of that distribution.

    Examples

20 Questions: Entropy is the minimum expected # of yes/no questions needed to determine X.

Classification: Entropy represents the inherent uncertainty of the class of the target.


    Entropy, H(X)

Given a r.v. X with probability (mass) function p(x), the entropy is defined as:

H(X) = -\sum_x p(x) \log p(x)

Example calculation

Given two classes (e.g. friend or foe) that are distributed uniformly, the entropy, or uncertainty, in the class of the target is:

H(X) = -p(X = friend) \log_2 p(X = friend) - p(X = foe) \log_2 p(X = foe)
     = -\frac{1}{2} \log_2 \frac{1}{2} - \frac{1}{2} \log_2 \frac{1}{2}
     = \log_2 2 = 1 bit
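As a sanity check of the calculation above, here is a minimal sketch in Python (the pmf dictionary and the entropy helper are illustrative choices, not something from the slides):

```python
import math

def entropy(pmf, base=2):
    """Shannon entropy H(X) = -sum_x p(x) log p(x), for a pmf given as a dict."""
    return -sum(p * math.log(p, base) for p in pmf.values() if p > 0)

# Uniform friend/foe example from the slide: two equally likely classes.
pmf = {"friend": 0.5, "foe": 0.5}
print(entropy(pmf))  # 1.0 bit
```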


    Mutual Information: What is I(X;Y)?

Mutual information is a measure of the amount of information shared between random variables. What does knowing information about one thing tell you about another thing?

Examples

Wheel of Fortune: You can guess what the missing words/letters are, based on the ones showing.

Jeopardy: You're given the answer; you have to determine what it tells you about the question.


    Mutual Information, I(X;Y)

Given two (discrete) r.v.s X, Y with marginal pmfs p(x), p(y) and joint pmf p(x,y), the mutual information is defined as:

I(X;Y) = H(X) - H(X|Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\, p(y)}

Jeopardy example:

H(X) = I(X;Y) + H(X|Y)

"What is X?" is the total uncertainty H(X); "The answer is ..." supplies I(X;Y); resolving the remaining uncertainty H(X|Y) is the contestant's job.
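As a small numerical illustration of the definition above, a sketch in Python; the joint pmf below is a toy distribution invented for this example, not anything from the slides:

```python
import math
from collections import defaultdict

def mutual_information(joint, base=2):
    """I(X;Y) = sum_{x,y} p(x,y) log[ p(x,y) / (p(x) p(y)) ], joint pmf as a dict."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log(p / (px[x] * py[y]), base)
               for (x, y), p in joint.items() if p > 0)

# Toy joint pmf where X and Y agree 80% of the time, so they share information.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(mutual_information(joint))  # ~0.278 bits
```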


Relationship between Entropy and Mutual Information

[Figure: diagram relating H(X) and the mutual information]
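For reference, the standard identities such a diagram depicts (they follow directly from the definitions on the preceding slides):

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y)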


Fano's Inequality

Probability of error of classification:

P_e \geq \frac{H(X|Y) - 1}{\log |\mathcal{X}|} = \frac{H(X) - I(X;Y) - 1}{\log |\mathcal{X}|}

This is the reason why some of us are interested in MAXIMIZING mutual information!
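A minimal sketch of the bound in Python; the number of classes and the entropy/mutual-information values plugged in are hypothetical, chosen only to show how the error bound tightens as I(X;Y) grows:

```python
import math

def fano_lower_bound(H_X, I_XY, num_classes, base=2):
    """Simplified Fano bound: P_e >= (H(X) - I(X;Y) - 1) / log|X|."""
    return (H_X - I_XY - 1.0) / math.log(num_classes, base)

# Hypothetical 16-class target (H(X) = 4 bits if uniform): the more mutual
# information between class X and sensor reading Y, the lower the error bound.
for i_xy in (0.0, 1.0, 2.0, 3.0):
    print(i_xy, fano_lower_bound(4.0, i_xy, 16))
```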