INC 551 Artificial Intelligence, Lecture 8: Models of Uncertainty
INC 551 Artificial Intelligence
Lecture 8
Models of Uncertainty
Inference by Enumeration
Bayesian Belief Network Model
Causes -> Effect
The graph structure shows the dependencies between variables.
Burglar Alarm Example

My house has a burglar alarm, but sometimes it rings because of an earthquake. My neighbors, John and Mary, promise to call me if they hear the alarm. However, their ears are not perfect.
One Way to Create BBN
Computing Probability
P(j, m, a, ¬b, ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e) = 0.90 × 0.70 × 0.001 × 0.999 × 0.998 ≈ 0.00063
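The product above can be checked with a few lines of Python; this is a minimal sketch using the CPT entries from the lecture's burglar-alarm example:

```python
# Joint probability P(j, m, a, ¬b, ¬e) via the chain rule over the network:
# each node contributes one CPT entry, conditioned on its parents.
p_j_given_a     = 0.90   # P(JohnCalls=true | Alarm=true)
p_m_given_a     = 0.70   # P(MaryCalls=true | Alarm=true)
p_a_given_nb_ne = 0.001  # P(Alarm=true | Burglary=false, Earthquake=false)
p_nb            = 0.999  # P(Burglary=false)
p_ne            = 0.998  # P(Earthquake=false)

p_joint = p_j_given_a * p_m_given_a * p_a_given_nb_ne * p_nb * p_ne
print(round(p_joint, 5))  # 0.00063
```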
BBN Construction
There are many ways to construct a BBN for a problem, because the events depend on one another (they are related).
Consequently, the network you obtain depends on the order in which you consider the events.
The simplest network is the most compact.
Not compact
Inference Problem

Find P(X_i | E = e)

For example, find P(Burglary | JohnCalls = true, MaryCalls = true)
Inference by Enumeration

P(X | e) = α P(X, e) = α Σ_y P(X, e, y)

Note: α = 1/P(e) is called the "normalization constant", and y ranges over the other (hidden) events.
P(Burglary | JohnCalls = true, MaryCalls = true) = α Σ_e Σ_a P(Burglary, e, a, j, m)
(summation over all values of the hidden events Earthquake and Alarm)
Calculation Tree
Inefficient: it recomputes P(j|a) P(m|a) for every value of e.
P(b | j, m) = α P(b, j, m)
= α × 0.001 × (0.002 × (0.95 × 0.9 × 0.7 + 0.05 × 0.05 × 0.01) + 0.998 × (0.94 × 0.9 × 0.7 + 0.06 × 0.05 × 0.01))

Next, we have to find P(¬b | j, m) and the constant α = 1/P(j, m).
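The whole enumeration, including the normalization step, can be sketched in Python. The CPT entries P(Alarm | ¬Burglary, Earthquake) are not shown in the transcript, so the value 0.29 below is an assumption taken from the standard textbook version of this network:

```python
# Inference by enumeration for P(Burglary | JohnCalls=true, MaryCalls=true).
P_B = {True: 0.001, False: 0.999}   # P(Burglary)
P_E = {True: 0.002, False: 0.998}   # P(Earthquake)
P_A = {                             # P(Alarm=true | Burglary, Earthquake)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,   # 0.29 assumed (textbook value)
}
P_J = {True: 0.90, False: 0.05}     # P(JohnCalls=true | Alarm)
P_M = {True: 0.70, False: 0.01}     # P(MaryCalls=true | Alarm)

def p_joint(b):
    """P(b, j, m) = P(b) * sum over e, a of P(e) P(a|b,e) P(j|a) P(m|a)."""
    total = 0.0
    for e in (True, False):
        for a in (True, False):
            p_a = P_A[(b, e)] if a else 1 - P_A[(b, e)]
            total += P_E[e] * p_a * P_J[a] * P_M[a]
    return P_B[b] * total

alpha = 1 / (p_joint(True) + p_joint(False))  # normalization constant 1/P(j, m)
print(round(alpha * p_joint(True), 3))        # P(b | j, m) ≈ 0.284
```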
Approximate Inference
Idea: estimate probabilities by counting over real examples.
We call this procedure "sampling".
Sampling = drawing examples from the world model.
Sampling Example
Cloudy = there are clouds; Sprinkler = a water sprayer
Each variable is sampled in turn, in topological order:

(Cloudy, Sprinkler, Rain, WetGrass) = (T, ?, ?, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, ?, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, T)

1 sample
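The per-variable procedure above is "prior sampling"; here is a minimal Python sketch. CPT values not visible in the transcript (Sprinkler, Rain, WetGrass given their parents) are assumed from the standard textbook network:

```python
import random

def sample(p):
    """Return True with probability p."""
    return random.random() < p

def prior_sample():
    # Sample each variable in topological order from its CPT.
    c = sample(0.5)                # P(Cloudy=true) = 0.5
    s = sample(0.1 if c else 0.5)  # P(Sprinkler=true | Cloudy), assumed
    r = sample(0.8 if c else 0.2)  # P(Rain=true | Cloudy), assumed
    p_w = 0.99 if (s and r) else 0.90 if (s or r) else 0.0
    w = sample(p_w)                # P(WetGrass=true | Sprinkler, Rain), assumed
    return (c, s, r, w)

print(prior_sample())
```

Each call produces one complete sample, like the (T, F, T, T) trace above.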
Rejection Sampling
Idea: to find P(X_i | e), count only the samples that agree with e.
Drawback: there are not many samples that agree with e.
From the example above, out of 100 samples only 27 are usable.
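A rejection-sampling sketch, for the illustrative query P(Rain = true | WetGrass = true) (the query choice and the unstated CPT values are assumptions, following the standard textbook network):

```python
import random

def prior_sample():
    # Same prior-sampling network as before (assumed textbook CPTs).
    c = random.random() < 0.5
    s = random.random() < (0.1 if c else 0.5)
    r = random.random() < (0.8 if c else 0.2)
    p_w = 0.99 if (s and r) else 0.90 if (s or r) else 0.0
    w = random.random() < p_w
    return c, s, r, w

random.seed(0)
kept = rain = 0
for _ in range(100_000):
    c, s, r, w = prior_sample()
    if w:               # reject any sample that disagrees with the evidence
        kept += 1
        rain += r
print(kept, rain / kept)  # note how many samples were thrown away
```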
Likelihood Weighting
Idea: generate only samples that are consistent with e.
However, we must use "weighted sampling".
e.g. Find P(Rain | Sprinkler = true, WetGrass = true)
Weighted Sampling

Fix Sprinkler = true and WetGrass = true.

Sample from P(Cloudy = true) = 0.5; suppose we get "true".
Sprinkler already has the value true, so we multiply the weight by P(Sprinkler = true | Cloudy = true) = 0.1.
Sample from P(Rain = true | Cloudy = true) = 0.8; suppose we get "true".
WetGrass already has the value true, so we multiply the weight by P(WetGrass = true | Sprinkler = true, Rain = true) = 0.99.
Finally, we get the sample (t, t, t, t) with weight 0.1 × 0.99 = 0.099.
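The full procedure, repeated over many samples, can be sketched as follows: evidence variables are fixed rather than sampled, and each sample carries the weight of its evidence, exactly as in the 0.1 × 0.99 = 0.099 trace. Unstated CPT values are assumed from the standard textbook network:

```python
import random

def weighted_sample():
    # One likelihood-weighted sample for P(Rain | Sprinkler=t, WetGrass=t).
    weight = 1.0
    c = random.random() < 0.5                  # sample Cloudy
    weight *= 0.1 if c else 0.5                # Sprinkler fixed to true
    r = random.random() < (0.8 if c else 0.2)  # sample Rain given Cloudy
    weight *= 0.99 if r else 0.90              # WetGrass fixed to true
    return r, weight

random.seed(0)
w_rain = w_total = 0.0
for _ in range(100_000):
    r, w = weighted_sample()
    w_total += w
    if r:
        w_rain += w
print(w_rain / w_total)  # weighted estimate of P(Rain=true | s, w)
```

Every sample is usable here, unlike rejection sampling; the weights do the work of discarding inconsistent worlds.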
Temporal Model (Time)
When the events are tagged with timestamps:

Rain_day1 -> Rain_day2 -> Rain_day3

Each node is considered a "state".
Markov Process
Let X_t = the state at time t. In a Markov process, X_t depends only on a finite number of previous states X_{t-1}, ..., X_{t-k}.
Hidden Markov Model (HMM)
Each state has an observation, E_t.
We cannot see the state directly; we see only the observation.
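These definitions can be made concrete with a toy HMM in the spirit of the Rain states above: the hidden state evolves as a first-order Markov chain, and only a noisy observation E_t is visible each step. All numbers below are illustrative assumptions, not values from the lecture:

```python
import random

P_RAIN0 = 0.5                        # prior P(Rain on day 0), assumed
P_TRANS = {True: 0.7, False: 0.3}    # P(Rain_t = true | Rain_{t-1}), assumed
P_OBS   = {True: 0.9, False: 0.2}    # P(E_t = true | Rain_t), assumed

def simulate(days):
    states, observations = [], []
    rain = random.random() < P_RAIN0
    for _ in range(days):
        rain = random.random() < P_TRANS[rain]              # Markov step
        states.append(rain)                                 # hidden state
        observations.append(random.random() < P_OBS[rain])  # what we see
    return states, observations

random.seed(1)
states, obs = simulate(5)
print(obs)  # an observer sees only this sequence, never `states`
```

Inference in an HMM means recovering beliefs about the hidden `states` from the `obs` sequence alone.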