L7: Probabilistic Reasoning


Knowledge Engineering
(IT4362E)

Quang Nhat NGUYEN
([email protected])

Hanoi University of Science and Technology
School of Information and Communication Technology


Basic probability concepts

Suppose we have an experiment (e.g., a dice roll) whose outcome is subject to chance.

Sample space S: the set of all possible outcomes.
E.g., S = {1, 2, 3, 4, 5, 6} for a dice roll

Event E: a subset of the sample space.
E.g., E = {1}: the result of the roll is one
E.g., E = {1, 3, 5}: the result of the roll is an odd number

Event space W: the set of possible worlds in which the outcome can occur.
E.g., W includes all dice rolls

Random variable A: a random variable represents an event, and there is some degree of chance (probability) that the event occurs.
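These definitions map directly onto code. Below is a minimal Python sketch (the names prob, E_one, E_odd are mine, not from the slides) that models the dice-roll sample space and computes event probabilities under the assumption of a fair die:

```python
from fractions import Fraction

# Sample space S: all possible outcomes of a dice roll
S = {1, 2, 3, 4, 5, 6}

# Events are subsets of the sample space
E_one = {1}          # the result of the roll is one
E_odd = {1, 3, 5}    # the result of the roll is an odd number

def prob(event, sample_space):
    """P(E) = |E| / |S|, assuming all outcomes are equally likely (a fair die)."""
    return Fraction(len(event & sample_space), len(sample_space))

print(prob(E_one, S))  # 1/6
print(prob(E_odd, S))  # 1/2
```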

Visualizing probability

P(A): the fraction of possible worlds in which A is true.

[Figure: the event space of all possible worlds, drawn with total area 1; the worlds in which A is true form a region whose area is P(A).]

  • 8/12/2019 l7 Probabilistic Reasoning

    5/32

    Boolean random variablesA Boolean random variable can take either of the two

    ,

    The axioms

    P( t r ue) = 1

    P( A V B) = P( A) + P( B) - P( A B)

    e coro ar es P( not A) P( ~A) = 1 - P( A)

    ~

    5Knowledge Engineering
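As a quick sanity check, here is a small sketch (the events A and B are my own examples) verifying the last axiom and the first corollary on the fair-die sample space, with set union playing the role of ∨ and set intersection the role of ∧:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}

def prob(event):
    return Fraction(len(event), len(S))

A = {1, 3, 5}   # the roll is odd
B = {1, 2}      # the roll is at most two

# Axiom: P(A v B) = P(A) + P(B) - P(A ^ B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

# Corollary: P(~A) = 1 - P(A), where ~A is the complement of A in S
assert prob(S - A) == 1 - prob(A)
print("axioms hold on this example")
```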


Conditional probability (1)

P(A|B) is the fraction of worlds in which A is true, given that B is true.

Example:
A: I will go to the football match tomorrow
B: It will not be raining tomorrow
P(A|B): the probability that I will go to the football match if (given that) it will not be raining tomorrow

Conditional probability (2)

Definition: P(A|B) = P(A, B) / P(B)

[Figure: the worlds in which B is true; P(A|B) is the fraction of them in which A is also true.]

Corollaries:
P(A, B) = P(A|B) . P(B)
P(A|B) + P(~A|B) = 1
∑_{i=1}^{k} P(A=v_i | B) = 1, for a random variable A with k possible values v_1, ..., v_k
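The definition and both corollaries can be verified on any concrete joint distribution. A short sketch follows; the four joint probabilities are invented purely for illustration:

```python
# Joint distribution P(A=a, B=b) over two Boolean variables,
# invented for illustration; the four entries sum to 1.
joint = {
    (True,  True):  0.30,
    (True,  False): 0.20,
    (False, True):  0.10,
    (False, False): 0.40,
}

def p_b(b):
    """Marginal P(B=b), summing A out of the joint."""
    return sum(p for (a2, b2), p in joint.items() if b2 == b)

def p_cond(a, b):
    """P(A=a | B=b) = P(A=a, B=b) / P(B=b)."""
    return joint[(a, b)] / p_b(b)

# Corollary: P(A, B) = P(A|B) . P(B)
assert abs(joint[(True, True)] - p_cond(True, True) * p_b(True)) < 1e-12

# Corollary: P(A|B) + P(~A|B) = 1
assert abs(p_cond(True, True) + p_cond(False, True) - 1) < 1e-12
```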

Independent variables (1)

Two events A and B are statistically independent if the probability of A is the same:
when B occurs, or
when B does not occur, or
when nothing is known about the occurrence of B

Example:
A: I will play a football match tomorrow
B: Bob will play the football match
P(A|B) = P(A)
Whether Bob will play the football match tomorrow does not influence my decision of going to the football match.

Independent variables (2)

From the definition of independent variables, P(A|B) = P(A), we can derive:

P(~A|B) = P(~A)
P(B|A) = P(B)
P(A, B) = P(A) . P(B)
P(~A, B) = P(~A) . P(B)
P(A, ~B) = P(A) . P(~B)
P(~A, ~B) = P(~A) . P(~B)
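One way to see these rules in code: build the joint distribution from the marginals under independence, then confirm that conditioning on B leaves the probability of A unchanged. The marginals 0.7 and 0.4 below are arbitrary illustrative values:

```python
# Build the joint P(A, B) = P(A) . P(B) from the marginals (independence),
# then confirm that conditioning on B does not change the probability of A.
p_a, p_b = 0.7, 0.4   # arbitrary illustrative marginals

joint = {(a, b): (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
         for a in (True, False) for b in (True, False)}

p_a_given_b = joint[(True, True)] / (joint[(True, True)] + joint[(False, True)])
assert abs(p_a_given_b - p_a) < 1e-12                       # P(A|B) = P(A)
assert abs(joint[(False, True)] - (1 - p_a) * p_b) < 1e-12  # P(~A, B) = P(~A) . P(B)
```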

Conditional probability (>2 variables)

P(A|B, C) is the probability of A given B and C.

Example:
A: I will walk along the river tomorrow morning
B: The weather will be beautiful tomorrow morning
C: I will get up early tomorrow morning
P(A|B, C): the probability that I will walk along the river tomorrow morning if (given that) the weather will be beautiful and I will get up early tomorrow morning

[Figure: a small diagram with B and C influencing A.]

Conditional independence

Two variables A and C are conditionally independent given B if the probability of A given B is the same as the probability of A given B and C:
P(A|B, C) = P(A|B)

Example:
A: I will play the football match tomorrow
B: The football match will take place indoors
C: It will not be raining tomorrow
P(A|B, C) = P(A|B): given knowing that the match will take place indoors, the probability that I will play the match does not depend on the weather.
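Conditional independence can likewise be checked numerically. The sketch below constructs a joint over (A, B, C) that factorizes as P(B) . P(A|B) . P(C|B), so A and C are conditionally independent given B by construction; all numbers are invented for illustration:

```python
from itertools import product

p_b = 0.6                              # P(B=true), invented
p_a_given_b = {True: 0.9, False: 0.2}  # P(A=true | B=b), invented
p_c_given_b = {True: 0.7, False: 0.3}  # P(C=true | B=b), invented

def pr(p_true, value):
    """Probability of a Boolean value, given P(true)."""
    return p_true if value else 1 - p_true

# Factorized joint: A and C are conditionally independent given B by construction.
joint = {(a, b, c): pr(p_b, b) * pr(p_a_given_b[b], a) * pr(p_c_given_b[b], c)
         for a, b, c in product((True, False), repeat=3)}

def cond(a, b, c=None):
    """P(A=a | B=b), or P(A=a | B=b, C=c) when c is given."""
    if c is None:
        num = sum(p for (a2, b2, _), p in joint.items() if a2 == a and b2 == b)
        den = sum(p for (_, b2, _), p in joint.items() if b2 == b)
    else:
        num = joint[(a, b, c)]
        den = sum(p for (_, b2, c2), p in joint.items() if b2 == b and c2 == c)
    return num / den

# P(A | B, C) = P(A | B): knowing C adds nothing once B is known.
assert abs(cond(True, True, True) - cond(True, True)) < 1e-12
assert abs(cond(True, False, True) - cond(True, False)) < 1e-12
```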

Probability: important rules

Chain rule:
P(A, B) = P(A|B) . P(B) = P(B|A) . P(A)
P(A|B) = P(A, B) / P(B) = P(B|A) . P(A) / P(B)
P(A, B|C) = P(A, B, C) / P(C) = P(A|B, C) . P(B, C) / P(C) = P(A|B, C) . P(B|C)

(Conditional) independence:
P(A|B) = P(A); if A and B are independent
P(A, B|C) = P(A|C) . P(B|C); if A and B are conditionally independent given C
P(A1, ..., An|C) = P(A1|C) ... P(An|C); if A1, ..., An are conditionally independent given C
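Since the chain rule is pure algebra, it can be confirmed on any joint distribution. A small sketch with an invented joint over three Boolean variables:

```python
# A small invented joint over (A, B, C); the eight entries sum to 1.
joint = {
    (True, True, True): 0.10,  (True, True, False): 0.15,
    (True, False, True): 0.05, (True, False, False): 0.20,
    (False, True, True): 0.10, (False, True, False): 0.05,
    (False, False, True): 0.15, (False, False, False): 0.20,
}

def p(a=None, b=None, c=None):
    """Joint/marginal probability; unspecified variables are summed out."""
    return sum(q for (a2, b2, c2), q in joint.items()
               if (a is None or a2 == a)
               and (b is None or b2 == b)
               and (c is None or c2 == c))

# Chain rule: P(A, B | C) = P(A | B, C) . P(B | C)
lhs = p(a=True, b=True, c=True) / p(c=True)
rhs = (p(a=True, b=True, c=True) / p(b=True, c=True)) * (p(b=True, c=True) / p(c=True))
assert abs(lhs - rhs) < 1e-12
```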

Bayes theorem

P(h|D) = P(D|h) . P(h) / P(D)

P(h): prior probability of hypothesis (e.g., a classification) h
P(D): prior probability of observing the data D
P(D|h): probability of observing the data D given hypothesis h
P(h|D): probability of hypothesis h given the observed data D
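Bayes theorem itself is a one-liner in code. A minimal sketch (the function name and the numbers are mine, chosen only so the arithmetic is easy to follow):

```python
def posterior(p_d_given_h, p_h, p_d):
    """Bayes theorem: P(h|D) = P(D|h) . P(h) / P(D)."""
    return p_d_given_h * p_h / p_d

# Illustrative numbers only (not from the slides):
# if P(D|h) = 0.5, P(h) = 0.2 and P(D) = 0.25, then P(h|D) = 0.4
print(posterior(0.5, 0.2, 0.25))   # 0.4
```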

Bayes theorem: Example (1)

Assume that we have the following data (of a person):

Day  Outlook   Temperature  Humidity  Wind    PlayTennis
D1   Sunny     Hot          High      Weak    No
D2   Sunny     Hot          High      Strong  No
D3   Overcast  Hot          High      Weak    Yes
D4   Rain      Mild         High      Weak    Yes
D5   Rain      Cool         Normal    Weak    Yes
D6   Rain      Cool         Normal    Strong  No
D7   Overcast  Cool         Normal    Strong  Yes
D8   Sunny     Mild         High      Weak    No
D9   Sunny     Cool         Normal    Weak    Yes
D10  Rain      Mild         Normal    Weak    Yes
D11  Sunny     Mild         Normal    Strong  Yes
D12  Overcast  Mild         High      Strong  Yes

[Mitchell, 1997]

Bayes theorem: Example (2)

Dataset D: the data of the days when the outlook is sunny and the wind is strong
Hypothesis h: the person plays tennis
Prior probability P(h): probability that the person plays tennis (i.e., irrespective of the outlook and the wind)
Prior probability P(D): probability that the outlook is sunny and the wind is strong
P(D|h): probability that the outlook is sunny and the wind is strong, given knowing that the person plays tennis
P(h|D): probability that the person plays tennis, given knowing that the outlook is sunny and the wind is strong
We are interested in this posterior probability!
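All four of these quantities can be read off the 12-day table by counting. The sketch below encodes the table from the previous slide and computes them exactly with fractions:

```python
from fractions import Fraction

# (Outlook, Temperature, Humidity, Wind, PlayTennis) for days D1..D12 [Mitchell, 1997]
data = [
    ("Sunny", "Hot", "High", "Weak", "No"),      ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"),  ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"),   ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"), ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),  ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"), ("Overcast", "Mild", "High", "Strong", "Yes"),
]

def frac(rows, test):
    """Fraction of rows satisfying the given test."""
    return Fraction(sum(1 for r in rows if test(r)), len(rows))

plays = [r for r in data if r[4] == "Yes"]
sunny_strong = lambda r: r[0] == "Sunny" and r[3] == "Strong"

p_h   = Fraction(len(plays), len(data))   # P(h)   = 8/12 = 2/3
p_d   = frac(data, sunny_strong)          # P(D)   = 2/12 = 1/6
p_d_h = frac(plays, sunny_strong)         # P(D|h) = 1/8
p_h_d = p_d_h * p_h / p_d                 # P(h|D) = 1/2, by Bayes theorem
print(p_h, p_d, p_d_h, p_h_d)
```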

Maximum a posteriori (MAP)

Given a set H of possible hypotheses (e.g., possible classifications), the learner finds the most probable hypothesis h (∈ H) given the observed data D.

Such a maximally probable hypothesis is called a maximum a posteriori (MAP) hypothesis:

h_MAP = argmax_{h ∈ H} P(h|D)
h_MAP = argmax_{h ∈ H} P(D|h) . P(h) / P(D)   (by Bayes theorem)
h_MAP = argmax_{h ∈ H} P(D|h) . P(h)   (P(D) is a constant, independent of h)

MAP hypothesis: Example

The set H contains two hypotheses:
h1: The person will play tennis
h2: The person will not play tennis

Compute the two posterior probabilities: P(h1|D), P(h2|D)

The MAP hypothesis: h_MAP = h1 if P(h1|D) ≥ P(h2|D); otherwise h_MAP = h2

Because P(D) = P(D, h1) + P(D, h2) is the same for both h1 and h2, we ignore it. So, we compute the two values P(D|h1) . P(h1) and P(D|h2) . P(h2), and make the conclusion:
If P(D|h1) . P(h1) ≥ P(D|h2) . P(h2), the person will play tennis;
Otherwise, the person will not play tennis
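Plugging in the counts from the 12-day table makes the rule concrete. The likelihoods below are the ones given on the ML example slide, and the priors come from the 8 Yes / 4 No split; note that on this particular dataset the two scores happen to be equal, so the ≥ convention decides:

```python
from fractions import Fraction

# Values derivable from the 12-day table: likelihoods (also quoted on the
# ML example slide) and priors from the Yes/No counts.
p_d_h1, p_h1 = Fraction(1, 8), Fraction(8, 12)   # P(D|h1), P(h1): plays tennis
p_d_h2, p_h2 = Fraction(1, 4), Fraction(4, 12)   # P(D|h2), P(h2): does not play

score_h1 = p_d_h1 * p_h1   # proportional to P(h1|D); P(D) cancels out
score_h2 = p_d_h2 * p_h2   # proportional to P(h2|D)

# The MAP rule from the slide: pick h1 if its score is >= the score of h2.
# On this 12-day table both scores equal 1/12, so the >= convention picks h1.
h_map = "h1: will play tennis" if score_h1 >= score_h2 else "h2: will not play tennis"
print(score_h1, score_h2, h_map)
```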

Maximum likelihood estimation (MLE)

Recall of MAP: given a set H of possible hypotheses, find the hypothesis that maximizes P(D|h) . P(h)

Assumption in MLE: every hypothesis in H is equally probable a priori: P(hi) = P(hj), ∀hi, hj ∈ H

MLE finds the hypothesis that maximizes P(D|h), where P(D|h) is called the likelihood of the data D given h.

The maximum likelihood (ML) hypothesis:
h_ML = argmax_{h ∈ H} P(D|h)

ML hypothesis: Example

The set H contains two hypotheses:
h1: The person will play tennis
h2: The person will not play tennis
D: the data of the dates when the outlook is sunny and the wind is strong

Compute the two likelihood values of the data given the two hypotheses: P(D|h1) and P(D|h2):
P(Outlook=Sunny, Wind=Strong|h1) = 1/8
P(Outlook=Sunny, Wind=Strong|h2) = 1/4

The ML hypothesis: h_ML = h1 if P(D|h1) ≥ P(D|h2); otherwise h_ML = h2

Because P(Outlook=Sunny, Wind=Strong|h1) < P(Outlook=Sunny, Wind=Strong|h2), we conclude that h_ML = h2: the person will not play tennis.
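The two likelihoods quoted above fall out of the same counting. A closing sketch that keeps only the columns the example needs (outlook, wind, play tennis):

```python
from fractions import Fraction

# (Outlook, Wind, PlayTennis) for days D1..D12, from the table above [Mitchell, 1997]
days = [("Sunny", "Weak", "No"),     ("Sunny", "Strong", "No"),
        ("Overcast", "Weak", "Yes"), ("Rain", "Weak", "Yes"),
        ("Rain", "Weak", "Yes"),     ("Rain", "Strong", "No"),
        ("Overcast", "Strong", "Yes"), ("Sunny", "Weak", "No"),
        ("Sunny", "Weak", "Yes"),    ("Rain", "Weak", "Yes"),
        ("Sunny", "Strong", "Yes"),  ("Overcast", "Strong", "Yes")]

def likelihood(play):
    """P(Outlook=Sunny, Wind=Strong | PlayTennis=play), estimated by counting."""
    rows = [d for d in days if d[2] == play]
    hits = sum(1 for o, w, _ in rows if o == "Sunny" and w == "Strong")
    return Fraction(hits, len(rows))

p_d_h1, p_d_h2 = likelihood("Yes"), likelihood("No")
print(p_d_h1, p_d_h2)                        # 1/8 and 1/4, as on the slide
h_ml = "h1" if p_d_h1 >= p_d_h2 else "h2"    # ML rule: pick the larger likelihood
print("h_ML =", h_ml)                        # h2: the person will not play tennis
```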