CS 573: Advanced AI - Probabilities & the Monty Hall Problem
TRANSCRIPT
In a perfect, simple world …
- Actions have deterministic effects
- States are completely observable
But this world is not perfect …

For example, a robot in the field faces:
- Actions -> uncertain effects
- Observations -> noise & errors
- Unpredictable events (rocks fall from the sky; robots gain intelligence and rebel)

Uncertainties => Probabilities
Notation

P(VAR = value)
The probability that variable VAR takes on the given value.
Example: P(STUDENTS = 20) = 0.93

P(Propositional_sentence) = P(Propositional_sentence = true)
The probability that the given propositional sentence is true.
Examples: P(HAPPY) = 0.40, P(HAPPY AND HEALTHY) = 0.39
Conditional probabilities

P(Rainy) = 0.36
P(No Stars) = 0.44
P(Rainy & No Stars) = 0.30
We know it's rainy today. What's the probability that there were no stars in the sky last night?

P(No Stars | Rainy) = P(Rainy & No Stars) / P(Rainy)
                    = 0.30 / 0.36
                    ≈ 0.83
Conditional probabilities
P(A | B) = P(A & B) / P(B)
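The definition can be checked numerically. A minimal Python sketch, using the rain/stars numbers from the earlier slide:

```python
# Conditional probability by the definition P(A | B) = P(A & B) / P(B),
# with A = No Stars and B = Rainy.
p_rainy = 0.36
p_rainy_and_no_stars = 0.30

p_no_stars_given_rainy = p_rainy_and_no_stars / p_rainy
print(round(p_no_stars_given_rainy, 3))  # 0.833
```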
Axioms of probabilities
1. 0 <= P(A) <= 1
2. P(true) = 1; P(false) = 0
3. P(A or B) = P(A) + P(B) - P(A & B)
(Venn diagram: sets A and B overlapping in region A & B)
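Axiom 3 (inclusion-exclusion) can be sanity-checked with the same rain/stars numbers; a small Python sketch:

```python
# Axiom 3: P(A or B) = P(A) + P(B) - P(A & B),
# with A = Rainy, B = No Stars, using the earlier slide's numbers.
p_rainy = 0.36
p_no_stars = 0.44
p_both = 0.30

p_either = p_rainy + p_no_stars - p_both
print(round(p_either, 2))  # 0.5
```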
Example derivation

Claim: P(A) + P(NOT A) = 1

Start from axiom 3:
P(A or B) = P(A) + P(B) - P(A & B)

Let B be NOT A:
P(A or NOT A) = P(A) + P(NOT A) - P(A & NOT A)

A or NOT A is always true and A & NOT A is always false, so by axiom 2:
1 = P(A) + P(NOT A) - 0
Joint probability distribution

Happy  Healthy  P
true   true     0.39
true   false    0.01
false  true     0.06
false  false    0.54

P(NOT Healthy) = P(Healthy = false)
             = P(Happy & NOT Healthy) + P(NOT Happy & NOT Healthy)
             = 0.01 + 0.54 = 0.55
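The marginalization above can be sketched in Python, storing the joint distribution as a dictionary (a representation chosen here just for illustration):

```python
# Joint distribution over (Happy, Healthy), values from the slide's table.
joint = {
    (True, True): 0.39,
    (True, False): 0.01,
    (False, True): 0.06,
    (False, False): 0.54,
}

# Sanity check: a joint distribution must sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Marginalize out Happy to get P(Healthy = false).
p_not_healthy = sum(p for (happy, healthy), p in joint.items() if not healthy)
print(round(p_not_healthy, 2))  # 0.55
```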
Bayes rule

P(A|B) = P(A) * P(B|A) / P(B)

Derivation from the definition of conditional probability:

P(A|B) = P(A & B) / P(B)  ->  P(A|B) * P(B) = P(A & B)   (multiply both sides by P(B))
P(B|A) = P(A & B) / P(A)  ->  P(B|A) * P(A) = P(A & B)   (multiply both sides by P(A))

Equating the two right-hand sides:
P(A|B) * P(B) = P(B|A) * P(A)

Dividing both sides by P(B) gives Bayes rule.
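The equality can be verified numerically on the Happy/Healthy joint distribution from the earlier slides; a minimal sketch with A = Happy and B = Healthy:

```python
# Verify P(A|B) = P(A) * P(B|A) / P(B) on the Happy/Healthy table.
p_happy_and_healthy = 0.39
p_happy = 0.39 + 0.01    # marginal over Healthy
p_healthy = 0.39 + 0.06  # marginal over Happy

# Direct definition of conditional probability.
p_happy_given_healthy = p_happy_and_healthy / p_healthy

# The same quantity via Bayes rule.
p_healthy_given_happy = p_happy_and_healthy / p_happy
bayes = p_happy * p_healthy_given_happy / p_healthy

assert abs(p_happy_given_healthy - bayes) < 1e-12
print(round(p_happy_given_healthy, 3))  # 0.867
```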
Bayes rule: a diagnosis example

How can we determine whether a patient has a disease?
- Open the abdomen -> painful but accurate
- Diagnose from symptoms -> less painful but less accurate

P(disease | symptom) = P(disease) * P(symptom | disease) / P(symptom)

P(disease) and P(symptom) can be estimated by population sampling;
P(symptom | disease) can be estimated by sampling patients known to have the disease.
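A worked version of this calculation, with illustrative numbers that are NOT from the slides (1% prevalence, 90% sensitivity, 10% false-positive rate):

```python
# Hypothetical numbers, chosen only for illustration.
p_disease = 0.01                    # prevalence in the population
p_symptom_given_disease = 0.9       # sensitivity
p_symptom_given_no_disease = 0.1    # false-positive rate

# Total probability: P(symptom) mixes the diseased and healthy populations.
p_symptom = (p_symptom_given_disease * p_disease
             + p_symptom_given_no_disease * (1 - p_disease))

# Bayes rule: P(disease | symptom) = P(disease) * P(symptom | disease) / P(symptom)
p_disease_given_symptom = p_disease * p_symptom_given_disease / p_symptom
print(round(p_disease_given_symptom, 3))  # 0.083
```

With these numbers the posterior is only about 8% despite a sensitive test: the classic base-rate effect.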
The "Monty Hall" Problem

Your selection: DOOR 2
P(You-get-prize) = 1/3

The host opens DOOR 1, revealing no prize.

Before DOOR 1 is opened:
P(prize=1) = P(prize=2) = P(prize=3) = 1/3

After DOOR 1 is opened, intuition suggests:
P(prize=1) = 0, P(prize=2) = P(prize=3) = 1/2
But this intuition is wrong, as the case analysis and Bayes-rule argument below show.
The "Monty Hall" Problem: case analysis

There are three objects behind three doors (DOOR-1, DOOR-2, DOOR-3):
1. Empty-A
2. Empty-B
3. Prize

The three equally likely cases for your initial choice:
1. You choose Empty-A -> host reveals Empty-B -> switch -> win
2. You choose Empty-B -> host reveals Empty-A -> switch -> win
3. You choose Prize   -> host reveals Empty-A or Empty-B -> switch -> lose

Switching wins in 2 of the 3 equally likely cases, so P(win | switch) = 2/3.
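The case analysis can be checked empirically with a Monte Carlo simulation (a sketch; the function name is ours):

```python
import random

random.seed(0)  # reproducible runs

def monty_hall_trial(switch: bool) -> bool:
    """One round: returns True if the player wins the prize."""
    doors = [1, 2, 3]
    prize = random.choice(doors)
    choice = random.choice(doors)
    # Host opens a door that is neither the player's choice nor the prize.
    opened = random.choice([d for d in doors if d != choice and d != prize])
    if switch:
        choice = next(d for d in doors if d != choice and d != opened)
    return choice == prize

trials = 100_000
win_rate_switch = sum(monty_hall_trial(True) for _ in range(trials)) / trials
win_rate_stay = sum(monty_hall_trial(False) for _ in range(trials)) / trials
print(f"switch: {win_rate_switch:.3f}")  # close to 2/3
print(f"stay:   {win_rate_stay:.3f}")    # close to 1/3
```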
The "Monty Hall" Problem: Bayes rule explanation

Notation:
Oi : host opens DOOR i
Xi : prize is behind DOOR i

Suppose you choose DOOR 3. Before the host acts:
P(X1) = P(X2) = P(X3) = 1/3
P(O1) = P(O2) = 1/2

The host never opens your door or the prize door:
- X1 -> he must open DOOR 2 (O2)
- X2 -> he must open DOOR 1 (O1)
- X3 -> he opens DOOR 1 or DOOR 2 (each with probability 1/2)

The host opens DOOR 1. What is P(X2 | O1)?

P(X2 | O1) = P(O1 | X2) * P(X2) / P(O1)
           = 1 * (1/3) / (1/2)
           = 2/3

So switching from DOOR 3 to DOOR 2 wins with probability 2/3.
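The same posterior can be computed exactly by enumerating the likelihood P(O1 | Xi) for each prize location; a sketch using exact fractions:

```python
from fractions import Fraction

# You chose DOOR 3. Prior over the prize location.
prior = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

# Likelihood of the host opening DOOR 1 under each hypothesis.
likelihood_O1 = {
    1: Fraction(0),     # prize behind 1: host never opens the prize door
    2: Fraction(1),     # prize behind 2: host must open DOOR 1
    3: Fraction(1, 2),  # prize behind 3 (yours): host opens 1 or 2
}

# Normalizing constant P(O1) and the full posterior via Bayes rule.
p_O1 = sum(likelihood_O1[i] * prior[i] for i in prior)
posterior = {i: likelihood_O1[i] * prior[i] / p_O1 for i in prior}
print(posterior[2])  # 2/3
```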