Lifted Probabilistic Inference in Relational Models (web.cs.ucla.edu/~guyvdb/slides/UAI14.pdf)
TRANSCRIPT
Lifted Probabilistic Inference
in Relational Models
Guy Van den Broeck (KU Leuven) and Dan Suciu (U. of Washington)
Tutorial UAI 2014 1
About the Tutorial
Slides available online.
Bibliography is at the end.
Your speakers:
• Guy Van den Broeck (works in AI): http://www.guyvdb.eu/
• Dan Suciu (works in DB): https://homes.cs.washington.edu/~suciu/
About the Tutorial
• The tutorial is about
– deep connections between AI and DBs
– a unified view on probabilistic reasoning
– a logical approach to Lifted Inference
• The tutorial is NOT an exhaustive overview of lifted algorithms for graphical models (see references at the end)
Outline
• Part 1: Motivation
• Part 2: Probabilistic Databases
• Part 3: Weighted Model Counting
• Part 4: Lifted Inference for WFOMC
• Part 5: The Power of Lifted Inference
• Part 6: Conclusion/Open Problems
Part 1: Motivation
• Why do we need relational representations of uncertainty?
• Why do we need lifted inference algorithms?
Why Relational Data?
• Our data is already relational!
– Companies run relational databases
– Scientific data is relational:
• Large Hadron Collider generated 25PB in 2012
• LSST Telescope will produce 30TB per night
• Big data is big business:
– Oracle: $7.1BN in sales
– IBM: $3.2BN in sales
– Microsoft: $2.6BN in sales
≈ GDP of Iceland! [Gartner'06]
Why Probabilistic Relational Data?
• Relational data is increasingly probabilistic
– NELL machine reading (>50M tuples)
– Google Knowledge Vault (>2BN tuples)
– DeepDive (>7M tuples)
• Data is inferred from unstructured information using statistical models
– Learned from the web, large text corpora, ontologies, etc.
– The learned/extracted data is relational
[Carlson'10, Dong'14, Niu'12]
Representation:
Probabilistic Databases
• Tuple-independent probabilistic databases
• Query: SQL or First Order Logic
Actor:

| Name | Prob |
|---|---|
| Brando | 0.9 |
| Cruise | 0.8 |
| Coppola | 0.1 |

WorkedFor:

| Actor | Director | Prob |
|---|---|---|
| Brando | Coppola | 0.9 |
| Coppola | Brando | 0.2 |
| Cruise | Coppola | 0.1 |
Q(x) = ∃y (Actor(x) ∧ WorkedFor(x,y))

SELECT Actor.name
FROM Actor, WorkedFor
WHERE Actor.name = WorkedFor.actor
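The answer probabilities for this query follow directly from the tuple probabilities, since tuples are independent. A minimal sketch in Python (our own illustration, not code from the tutorial):

```python
# Tuple-independent database from the slide: every tuple is an
# independent Bernoulli event with the listed marginal probability.
actor = {"Brando": 0.9, "Cruise": 0.8, "Coppola": 0.1}
worked_for = {("Brando", "Coppola"): 0.9,
              ("Coppola", "Brando"): 0.2,
              ("Cruise", "Coppola"): 0.1}

def prob_answer(x):
    """P(Q(x)) for Q(x) = ∃y (Actor(x) ∧ WorkedFor(x,y)).

    By tuple independence: P(Actor(x)) times the probability that
    at least one WorkedFor(x, y) tuple is present in the database.
    """
    p_no_director = 1.0
    for (a, d), p in worked_for.items():
        if a == x:
            p_no_director *= 1.0 - p   # tuple (x, d) absent
    return actor[x] * (1.0 - p_no_director)

for name in actor:
    # Brando: 0.9 * 0.9 = 0.81; Cruise: 0.8 * 0.1 = 0.08; Coppola: 0.1 * 0.2 = 0.02
    print(name, round(prob_answer(name), 2))
```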
Summary
[Figure: 2×2 map of representations, axes Logical/Statistical by Propositional/Relational, with Databases in the logical-relational corner, Uncertainty in AI in the statistical-propositional corner, and Prob. DBs in the statistical-relational corner]
Representations in AI and ML
[Figure: 2×2 grid of representations, axes Logical/Statistical by Propositional/Relational]

Statistical-propositional example, a CPT:

| Cloudy | Rain | P(Rain\|Cloudy) |
|---|---|---|
| T | T | 0.80 |
| T | F | 0.20 |
| F | T | 0.01 |
| F | F | 0.99 |

Logical-propositional example: Rain ⇒ Cloudy
Relational Representations
Example: First-Order Logic
Logical variables have a domain of constants:
x, y range over domain People = {Alice, Bob}

Formula (logical variables, atoms):
∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)

A ground formula has no logical variables:
Smokes(Alice) ∧ Friends(Alice,Bob) ⇒ Smokes(Bob)
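Grounding substitutes every constant for every logical variable. A small sketch of the grounding step over the two-person domain (illustrative code, not from the slides):

```python
from itertools import product

people = ["Alice", "Bob"]   # the domain from the slide

def groundings(domain):
    """All ground instances of ∀x,y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y):
    substitute every pair of constants for the logical variables x, y."""
    return [f"Smokes({x}) ∧ Friends({x},{y}) ⇒ Smokes({y})"
            for x, y in product(domain, repeat=2)]

for g in groundings(people):   # |domain|^2 = 4 ground formulas
    print(g)
```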
Why Statistical Relational Models?
Probabilistic graphical models:
– Quantify uncertainty and noise
– Not very expressive (rules of chess in ~100,000 pages)

First-order logic:
– Very expressive (rules of chess in 1 page)
– Good match for abundant relational data
– Hard to express uncertainty and noise
Example: Markov Logic
Weighted First-Order Logic
Ground atom/tuple = random variable in {true,false}
e.g., Smokes(Alice), Friends(Alice,Bob), etc.
Ground formula = factor in propositional factor graph
Friends(Alice,Bob)
Smokes(Alice) Smokes(Bob)
Friends(Bob,Alice)
f1 f2 Friends(Alice,Alice) Friends(Bob,Bob)
f3 f4
Weight (or probability) attached to a FOL formula:
3.14  Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)
[Richardson'06]
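In Markov logic, every satisfied grounding of a weighted formula multiplies a world's unnormalized weight by e^w. A hedged sketch (the normalization constant Z and other formulas are omitted; the example world is our own):

```python
import math
from itertools import product

W = 3.14                      # the formula weight from the slide
people = ["Alice", "Bob"]

def world_weight(smokes, friends):
    """Unnormalized Markov logic weight exp(W * n), where n counts the
    satisfied groundings of Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y).
    (A full MLN multiplies over all formulas and then normalizes by Z.)"""
    n = sum(1 for x, y in product(people, repeat=2)
            if not (smokes[x] and friends[(x, y)]) or smokes[y])
    return math.exp(W * n)

# A hypothetical world: Alice smokes and is friends with Bob, who does
# not smoke, so exactly one of the four groundings is violated.
smokes = {"Alice": True, "Bob": False}
friends = {(x, y): (x, y) == ("Alice", "Bob") for x, y in product(people, repeat=2)}
print(world_weight(smokes, friends))   # exp(3 * 3.14)
```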
Representations in AI and ML
[Figure: the 2×2 grid with all four corners filled. Statistical-propositional: the Rain/Cloudy CPT. Logical-propositional: Rain ⇒ Cloudy. Logical-relational: ∀x,∀y, Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y). Statistical-relational: 3.14 Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)]
Summary
[Figure: the 2×2 map again; the statistical-relational corner, combining Uncertainty in AI and Databases, is StarAI]
Lifted Inference
• Main idea: exploit the high-level relational representation to speed up reasoning
• Let's see an example…
A Simple Reasoning Problem
52 playing cards
Let us ask some simple questions
A Simple Reasoning Problem
Probability that Card1 is Q? 1/13
A Simple Reasoning Problem
Probability that Card1 is Hearts? 1/4
A Simple Reasoning Problem
Probability that Card1 is Hearts given that Card1 is red? 1/2
A Simple Reasoning Problem
Probability that Card52 is Spades given that Card1 is QH? 13/51
Automated Reasoning
Let us automate this:
1. Probabilistic graphical model (e.g., factor graph)
2. Probabilistic inference algorithm (e.g., variable elimination or junction tree)
Reasoning in Propositional Models
A key result: treewidth. Why? Conditional independence!
[Figure: three graphs over variables A–F: a tree, a sparse graph, a dense graph]
Tree: P(A|C,E) = P(A|C)
Sparse graph: P(A|B,E,F) = P(A|B,E)
Dense graph: P(A|B,E,F) ≠ P(A|B,E)
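The tree case can be checked by brute force on a tiny chain A → B → C: conditioning on B renders A irrelevant to C. A sketch by enumeration (the CPT numbers are made up for illustration; only the structure matters):

```python
from itertools import product

# A tiny chain A -> B -> C (a tree, treewidth 1).
pA = {True: 0.3, False: 0.7}
pB = {True: {True: 0.8, False: 0.2},      # P(B | A)
      False: {True: 0.1, False: 0.9}}
pC = {True: {True: 0.6, False: 0.4},      # P(C | B)
      False: {True: 0.25, False: 0.75}}

def joint(a, b, c):
    return pA[a] * pB[a][b] * pC[b][c]

def cond_c(c, a=None, b=None):
    """P(C = c | evidence) by brute-force enumeration over all 8 worlds."""
    num = den = 0.0
    for aa, bb, cc in product([True, False], repeat=3):
        if (a is not None and aa != a) or (b is not None and bb != b):
            continue
        w = joint(aa, bb, cc)
        den += w
        if cc == c:
            num += w
    return num / den

# Conditioning on B makes A irrelevant to C: P(C|B) = P(C|A,B).
print(cond_c(True, b=True), cond_c(True, a=True, b=True))
```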
Is There Conditional Independence?
P(Card52 | Card1) ≟ P(Card52 | Card1, Card2)
13/51 ≠ 12/50, so P(Card52 | Card1) ≠ P(Card52 | Card1, Card2)
P(Card52 | Card1, Card2) ≟ P(Card52 | Card1, Card2, Card3)
12/50 ≠ 12/49, so P(Card52 | Card1, Card2) ≠ P(Card52 | Card1, Card2, Card3)
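These non-independence claims can be verified by exhaustive enumeration on a scaled-down deck (8 cards instead of 52, so all 8! deals can be listed; the full deck behaves analogously):

```python
from fractions import Fraction
from itertools import permutations

# Scaled-down deck: 2 ranks x 4 suits = 8 cards.
deck = [(rank, suit) for rank in ("Q", "K") for suit in "SHDC"]

def p_last_spade(*fixed):
    """P(last card is a spade | the first cards are exactly `fixed`),
    computed by counting over all permutations of the deck."""
    num = den = 0
    for deal in permutations(deck):
        if deal[:len(fixed)] == fixed:
            den += 1
            if deal[-1][1] == "S":
                num += 1
    return Fraction(num, den)

qh, kh = ("Q", "H"), ("K", "H")
print(p_last_spade(qh))        # 2/7
print(p_last_spade(qh, kh))    # 1/3: conditioning on Card2 changed it
```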
Automated Reasoning
(artist's impression)
Let us automate this:
1. Probabilistic graphical model (e.g., factor graph) is fully connected!
2. Probabilistic inference algorithm (e.g., variable elimination or junction tree) builds a table with 13^52 rows
What's Going On Here?
Probability that Card52 is Spades given that Card1 is QH? 13/51
What's Going On Here?
Probability that Card52 is Spades given that Card2 is QH? 13/51
What's Going On Here?
Probability that Card52 is Spades given that Card3 is QH? 13/51
Tractable Probabilistic Inference
Which property makes inference tractable?
Traditional belief: independence (conditional/contextual)
What's going on here? ⇒ Lifted Inference: high-level reasoning, symmetry, exchangeability (see AAAI talk on Tuesday!)
[Niepert'14]
Automated Reasoning
Let us automate this:
1. Relational model
2. Lifted probabilistic inference algorithm
∀p, ∃c, Card(p,c)
∀c, ∃p, Card(p,c)
∀p, ∀c, ∀c', Card(p,c) ∧ Card(p,c') ⇒ c = c'
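These three sentences (every position holds a card, every card is somewhere, no position holds two cards) can be checked mechanically against a candidate Card relation. A sketch on a scaled-down domain of 4 positions and 4 cards (our own illustration):

```python
from itertools import product

# Scaled-down domain: 4 positions and 4 cards.
positions = [1, 2, 3, 4]
cards = ["AS", "AH", "AD", "AC"]

def is_model(card_rel):
    """Check the three sentences on a relation card_rel of (position, card)
    pairs: every position has a card, every card appears somewhere, and
    no position holds two different cards."""
    every_p = all(any((p, c) in card_rel for c in cards) for p in positions)
    every_c = all(any((p, c) in card_rel for p in positions) for c in cards)
    functional = all(c1 == c2
                     for (p1, c1), (p2, c2) in product(card_rel, repeat=2)
                     if p1 == p2)
    return every_p and every_c and functional

deal = {(1, "AH"), (2, "AS"), (3, "AC"), (4, "AD")}
print(is_model(deal))                  # True: a valid deal
print(is_model(deal | {(1, "AS")}))    # False: position 1 holds two cards
```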
Other Examples of Lifted Inference
First-order resolution:
∀x, Human(x) ⇒ Mortal(x)
∀x, Greek(x) ⇒ Human(x)
implies
∀x, Greek(x) ⇒ Mortal(x)
Reasoning about populations: "We are investigating a rare disease. The disease is more rare in women, presenting only in one in every two billion women and one in every billion men. Then, assuming there are 3.4 billion men and 3.6 billion women in the world, the probability that more than five people have the disease is"
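The slide's transcript cuts off before the number; a back-of-the-envelope Poisson approximation (our own calculation, not from the tutorial: with n huge and p tiny, each group's case count is approximately Poisson(n·p), and independent Poissons add) gives roughly 0.42:

```python
import math

# men: 3.4e9 * 1e-9 = 3.4 expected cases; women: 3.6e9 * 0.5e-9 = 1.8
lam = 3.4e9 * 1e-9 + 3.6e9 * 0.5e-9    # total mean 5.2

def p_more_than(k, lam):
    """P(X > k) for X ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                     for i in range(k + 1))

print(round(p_more_than(5, lam), 3))   # ≈ 0.419
```

Note this never reasons about individual people, only about counts per group; that is the lifted, population-level view.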
Lifted Inference in SRL
Statistical relational model (e.g., MLN):
3.14  FacultyPage(x) ∧ Linked(x,y) ⇒ CoursePage(y)
As a probabilistic graphical model:
– 26 pages: 728 variables; 676 factors
– 1000 pages: 1,002,000 variables; 1,000,000 factors
Highly intractable? Lifted inference in milliseconds!
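The grounded-model sizes quoted above follow from simple counting: n FacultyPage atoms, n CoursePage atoms, n² Linked atoms, and one factor per (x, y) grounding of the formula. A quick check (our own arithmetic):

```python
def grounding_size(n):
    """Variables and factors when grounding
    FacultyPage(x) ∧ Linked(x,y) ⇒ CoursePage(y) over n pages:
    n FacultyPage + n CoursePage + n*n Linked atoms, and one
    factor per (x, y) grounding of the formula."""
    return n + n + n * n, n * n

print(grounding_size(26))      # (728, 676), as on the slide
print(grounding_size(1000))    # (1002000, 1000000)
```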
Summary of Motivation
• Relational data is everywhere:
– Databases in industry
– Databases in sciences
– Knowledge bases
• Lifted inference:
– Use relational structure during reasoning
– Very efficient where traditional methods break
This tutorial: Lifted Inference in Relational Models
Outline
• Part 1: Motivation
• Part 2: Probabilistic Databases
• Part 3: Weighted Model Counting
• Part 4: Lifted Inference for WFOMC
• Part 5: The Power of Lifted Inference
• Part 6: Conclusion/Open Problems
What Everyone Should Know about
Databases
• Database = several relations (a.k.a. tables)
• SQL Query = FO Formula
• Boolean Query = FO Sentence
What Everyone Should Know about Databases

Database: relations (= tables), D =

Smoker:

| X | Y |
|---|---|
| Alice | 2009 |
| Alice | 2010 |
| Bob | 2009 |
| Carol | 2010 |

Friend:

| X | Z |
|---|---|
| Alice | Bob |
| Alice | Carol |
| Bob | Carol |
| Carol | Bob |

Query: First Order Formula ("Find friends of smokers in 2009")
Q(z) = ∃x (Smoker(x,'2009') ∧ Friend(x,z))
Query answer: Q(D) = {Bob, Carol}

Boolean Query: FO Sentence
Q = ∃x (Smoker(x,'2009') ∧ Friend(x,'Bob'))
Query answer: Q(D) = TRUE

Conjunctive Queries CQ = FO(∃, ∧)
Union of Conjunctive Queries UCQ = FO(∃, ∧, ∨)
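The FO semantics can be evaluated directly over the tables. A minimal sketch (relations as Python sets of tuples; our own illustration, not a real query engine):

```python
# The database D as sets of tuples.
smoker = {("Alice", "2009"), ("Alice", "2010"),
          ("Bob", "2009"), ("Carol", "2010")}
friend = {("Alice", "Bob"), ("Alice", "Carol"),
          ("Bob", "Carol"), ("Carol", "Bob")}

def q(z):
    """Q(z) = ∃x (Smoker(x,'2009') ∧ Friend(x,z))."""
    return any((x, "2009") in smoker and (x, z) in friend
               for x in {t[0] for t in smoker})

answers = sorted({z for _, z in friend if q(z)})
print(answers)     # ['Bob', 'Carol']
print(q("Bob"))    # the Boolean query: True
```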
What Everyone Should Know about
Databases
Declarative Query Query Plan
“what” “how”
Q(z) = ∃x (Smoker(x,'2009') ∧ Friend(x,z))
Πz
⋈x
σy='2009'
Smoker(x,y) Friend(x,z)
Logical Query Plan
Index-scan
Merge-join
Hash-based
deduplication
Physical Query Plan
Πz
⋈x
σy='2009'
Smoker(x,y) Friend(x,z)
Logical Query Plan
Optimize
![Page 67: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/67.jpg)
What Every Researcher Should
Know about Databases
Problem: compute Q(D)
Moshe Vardi [Vardi'82]
2008 ACM SIGMOD Contribution Award
Tutorial UAI 2014 41 [Vardi'82]
![Page 68: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/68.jpg)
What Every Researcher Should
Know about Databases
Problem: compute Q(D)
• Data complexity:
fix Q, complexity = f(D)
![Page 69: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/69.jpg)
What Every Researcher Should
Know about Databases
Problem: compute Q(D)
• Data complexity:
fix Q, complexity = f(D)
• Query complexity (expression complexity):
fix D, complexity = f(Q)
• Combined complexity:
complexity = f(D,Q)
![Page 70: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/70.jpg)
Probabilistic Databases
• A probabilistic database = relational
database where each tuple has an
associated probability
• Semantics = probability distribution over
possible worlds (deterministic databases)
• In this talk: tuples are independent events
Tutorial UAI 2014 42
[Suciu'11]
![Page 71: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/71.jpg)
Example
x y P
A B p1
A C p2
B C p3
Probabilistic database D: Friend
Tutorial UAI 2014 43
![Page 72: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/72.jpg)
x y
A B
A C
B C
Example
x y P
A B p1
A C p2
B C p3
Possible worlds semantics:
p1p2p3
Probabilistic database D: Friend
![Page 73: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/73.jpg)
x y
A B
A C
B C
Example
x y P
A B p1
A C p2
B C p3
Possible worlds semantics:
p1p2p3
(1-p1)p2p3
Probabilistic database D: Friend
x y
A C
B C
![Page 74: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/74.jpg)
x y
A B
A C
B C
Example
x y P
A B p1
A C p2
B C p3
Possible worlds semantics:
p1p2p3
(1-p1)p2p3
(1-p1)(1-p2)(1-p3)
Probabilistic database D: Friend
x y
A C
B C
x y
A B
A C
x y
A B
B C
x y
A B
x y
A C
x y
B C
x y
(the empty world)
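The possible-worlds semantics above can be enumerated directly: every subset of the tuples is a world, and under tuple independence its probability multiplies p for present tuples and 1−p for absent ones. A minimal Python sketch with illustrative values for p1, p2, p3:

```python
from itertools import product

# Probabilistic table Friend: tuple -> marginal probability (illustrative p1, p2, p3)
friend = {("A", "B"): 0.5, ("A", "C"): 0.6, ("B", "C"): 0.7}

# Enumerate all 2^3 subsets of tuples; each subset is one possible world.
worlds = []
for bits in product([True, False], repeat=len(friend)):
    prob = 1.0
    world = []
    for (tup, p), present in zip(friend.items(), bits):
        prob *= p if present else 1.0 - p   # p for present, 1-p for absent
        if present:
            world.append(tup)
    worlds.append((world, prob))

total = sum(pr for _, pr in worlds)  # world probabilities sum to 1
```

As expected, the eight world probabilities form a distribution summing to 1.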
![Page 75: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/75.jpg)
Query Semantics
Fix a Boolean query Q
Fix a probabilistic database D:
P(Q | D) = marginal probability of Q
on the possible worlds of D
Tutorial UAI 2014 44
![Page 76: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/76.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) =
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
![Page 77: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/77.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) = 1-(1-q1)*(1-q2)
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
![Page 78: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/78.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) = p1*[1-(1-q1)*(1-q2)]
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
![Page 79: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/79.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) = p1*[1-(1-q1)*(1-q2)]
1-(1-q3)*(1-q4)*(1-q5)
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
![Page 80: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/80.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) = p1*[1-(1-q1)*(1-q2)]
p2*[1-(1-q3)*(1-q4)*(1-q5)]
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
![Page 81: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/81.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) = 1 - {1 - p1*[1-(1-q1)*(1-q2)]} *
{1 - p2*[1-(1-q3)*(1-q4)*(1-q5)]}
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
![Page 82: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/82.jpg)
x y P
A D q1 Y1
A E q2 Y2
B F q3 Y3
B G q4 Y4
B H q5 Y5
Friend
x P
A p1 X1
B p2 X2
C p3 X3
Smoker
P(Q | D) = 1 - {1 - p1*[1-(1-q1)*(1-q2)]} *
{1 - p2*[1-(1-q3)*(1-q4)*(1-q5)]}
An Example
Q = ∃x∃y Smoker(x)∧Friend(x,y)
One can compute P(Q | D) in PTIME
in the size of the database D
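The PTIME claim is easy to check concretely: the closed-form expression above is a product with one factor per constant, so it needs only one pass over the database. A Python sketch with made-up marginals standing in for p1..p3 and q1..q5:

```python
# Illustrative marginals; p1..p3 and q1..q5 play the roles from the slide.
smoker = {"A": 0.3, "B": 0.4, "C": 0.5}
friend = {"A": [0.1, 0.2],          # q1, q2
          "B": [0.3, 0.4, 0.5]}     # q3, q4, q5

# P(Q) = 1 - Prod_x ( 1 - p_x * [1 - Prod_y (1 - q_xy)] ), linear in |D|
def prob_q(smoker, friend):
    no_witness = 1.0
    for x, p in smoker.items():
        none = 1.0
        for q in friend.get(x, []):
            none *= 1.0 - q                   # P(x has no friend tuple at all)
        no_witness *= 1.0 - p * (1.0 - none)  # P(x fails to witness Q)
    return 1.0 - no_witness

p_q = prob_q(smoker, friend)
```

With these numbers, p_q = 1 − (1 − 0.3·0.28)(1 − 0.4·0.79) = 0.373456.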
![Page 83: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/83.jpg)
An Example Q = ∃x∃y Smoker(x)∧Friend(x,y)
x y P
A D q1
A E q2
B F q3
B G q4
B H q5
x P
A p1
B p2
C p3
⋈
ΠΦ
Friend(x,y)
Πx
Smoker(x)
Use the SQL engine
to compute the query!
Aggregate on probabilities.
![Page 84: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/84.jpg)
An Example Q = ∃x∃y Smoker(x)∧Friend(x,y)
x y P
A D q1
A E q2
B F q3
B G q4
B H q5
x P
A p1
B p2
C p3
⋈
ΠΦ
Friend(x,y)
Πx
x P
A 1-(1-q1)(1-q2)
B 1-(1-q3)(1-q4)(1-q5)
Smoker(x)
Use the SQL engine
to compute the query!
Aggregate on probabilities.
![Page 85: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/85.jpg)
An Example Q = ∃x∃y Smoker(x)∧Friend(x,y)
x y P
A D q1
A E q2
B F q3
B G q4
B H q5
x P
A p1
B p2
C p3
⋈
ΠΦ
Friend(x,y)
Πx
x P
A 1-(1-q1)(1-q2)
B 1-(1-q3)(1-q4)(1-q5)
1-{1-p1[1-(1-q1)(1-q2)]}*
{1-p2[1-(1-q3)(1-q4)(1-q5)]}
Smoker(x)
Use the SQL engine
to compute the query!
Aggregate on probabilities.
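The extensional plan above can be mimicked outside a SQL engine too; the sketch below (illustrative probabilities, dict-based relations) runs the same three steps — independent project on x over Friend, a join with Smoker, then the final projection to the Boolean query — aggregating probabilities at each step under independence:

```python
# Q = ∃x∃y Smoker(x) ∧ Friend(x,y), evaluated plan-style (illustrative values)
smoker = {"A": 0.3, "B": 0.4, "C": 0.5}
friend = [("A", "D", 0.1), ("A", "E", 0.2),
          ("B", "F", 0.3), ("B", "G", 0.4), ("B", "H", 0.5)]

# Independent project Πx: P(∃y Friend(x,y)) = 1 - Π_y (1 - q)
none = {}
for x, _, q in friend:
    none[x] = none.get(x, 1.0) * (1.0 - q)
exists_friend = {x: 1.0 - v for x, v in none.items()}

# Join ⋈x: independent events, so probabilities multiply
joined = {x: smoker[x] * exists_friend[x]
          for x in exists_friend if x in smoker}

# Final independent project ΠΦ: P(Q) = 1 - Π_x (1 - P(x))
fail = 1.0
for p in joined.values():
    fail *= 1.0 - p
p_q = 1.0 - fail
```

Each aggregate is exactly the "noisy-or" a probabilistic-SQL engine would compute per group.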
![Page 86: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/86.jpg)
Problem Statement
Given: probabilistic database D, query Q
Compute: P(Q | D)
Data complexity: fix Q, complexity = f(|D|)
Tutorial UAI 2014 47
![Page 87: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/87.jpg)
Approaches to Compute P(Q | D)
• Propositional inference:
– Ground the query Q → FQ,D, compute P(FQ,D)
– This is Weighted Model Counting (later…)
– Works for every query Q
– But: may be exponential in |D| (data complexity)
• Lifted inference:
– Compute a query plan for Q, execute plan on D
– Always polynomial time in |D| (data complexity)
– But: does not work for all queries Q
Tutorial UAI 2014 48 [Olteanu'08, Jha'13, Dalvi'04, Dalvi'12]
![Page 88: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/88.jpg)
The Lifted Inference Rules
• If Q1, Q2 are independent:
AND-rule: P(Q1 ∧ Q2) = P(Q1)P(Q2)
OR-rule: P(Q1 ∨ Q2) =1 – (1 – P(Q1))(1 – P(Q2))
![Page 89: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/89.jpg)
The Lifted Inference Rules
• If Q1, Q2 are independent:
AND-rule: P(Q1 ∧ Q2) = P(Q1)P(Q2)
OR-rule: P(Q1 ∨ Q2) =1 – (1 – P(Q1))(1 – P(Q2))
• If Q[C1/z], Q[C2/z], … are independent:
∀-Rule: P(∀z Q) = ΠC ∈ Domain P(Q[C/z])
∃-Rule: P(∃z Q) = 1 – ΠC ∈ Domain (1 – P(Q[C/z]))
![Page 90: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/90.jpg)
The Lifted Inference Rules
• If Q1, Q2 are independent:
AND-rule: P(Q1 ∧ Q2) = P(Q1)P(Q2)
OR-rule: P(Q1 ∨ Q2) =1 – (1 – P(Q1))(1 – P(Q2))
• If Q[C1/z], Q[C2/z], … are independent:
∀-Rule: P(∀z Q) = ΠC ∈ Domain P(Q[C/z])
∃-Rule: P(∃z Q) = 1 – ΠC ∈ Domain (1 – P(Q[C/z]))
• Inclusion/Exclusion formula:
P(Q1 ∨ Q2) = P(Q1) + P(Q2)- P(Q1 ∧ Q2)
P(Q1 ∧ Q2) = P(Q1) + P(Q2)- P(Q1 ∨ Q2)
![Page 91: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/91.jpg)
The Lifted Inference Rules
• If Q1, Q2 are independent:
AND-rule: P(Q1 ∧ Q2) = P(Q1)P(Q2)
OR-rule: P(Q1 ∨ Q2) =1 – (1 – P(Q1))(1 – P(Q2))
• If Q[C1/z], Q[C2/z], … are independent:
∀-Rule: P(∀z Q) = ΠC ∈ Domain P(Q[C/z])
∃-Rule: P(∃z Q) = 1 – ΠC ∈ Domain (1 – P(Q[C/z]))
• Inclusion/Exclusion formula:
P(Q1 ∨ Q2) = P(Q1) + P(Q2)- P(Q1 ∧ Q2)
P(Q1 ∧ Q2) = P(Q1) + P(Q2)- P(Q1 ∨ Q2)
• Negation: P(¬Q) = 1 – P(Q)
![Page 92: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/92.jpg)
Example
Q = ∀x∀y (Smoker(x) ∨ Friend(x,y)) =∀x (Smoker(x) ∨∀y Friend(x,y))
P(Q) = ΠA ∈ Domain P(Smoker(A) ∨ ∀y Friend(A,y))
∀-Rule
Check independence:
Smoker(Alice) ∨ ∀y Friend(Alice,y)
Smoker(Bob) ∨ ∀y Friend(Bob,y)
![Page 93: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/93.jpg)
Example
Q = ∀x∀y (Smoker(x) ∨ Friend(x,y)) =∀x (Smoker(x) ∨∀y Friend(x,y))
P(Q) = ΠA ∈ Domain P(Smoker(A) ∨ ∀y Friend(A,y))
∀-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – P(∀y Friend(A,y)))] ∨-Rule
Check independence:
Smoker(Alice) ∨ ∀y Friend(Alice,y)
Smoker(Bob) ∨ ∀y Friend(Bob,y)
![Page 94: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/94.jpg)
Example
Q = ∀x∀y (Smoker(x) ∨ Friend(x,y)) =∀x (Smoker(x) ∨∀y Friend(x,y))
P(Q) = ΠA ∈ Domain P(Smoker(A) ∨ ∀y Friend(A,y))
∀-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – P(∀y Friend(A,y)))] ∨-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – ΠB ∈ Domain P(Friend(A,B)))]
∀-Rule
Check independence:
Smoker(Alice) ∨ ∀y Friend(Alice,y)
Smoker(Bob) ∨ ∀y Friend(Bob,y)
![Page 95: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/95.jpg)
Example
Q = ∀x∀y (Smoker(x) ∨ Friend(x,y)) =∀x (Smoker(x) ∨∀y Friend(x,y))
Lookup the probabilities
in the database
P(Q) = ΠA ∈ Domain P(Smoker(A) ∨ ∀y Friend(A,y))
∀-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – P(∀y Friend(A,y)))] ∨-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – ΠB ∈ Domain P(Friend(A,B)))]
∀-Rule
Check independence:
Smoker(Alice) ∨ ∀y Friend(Alice,y)
Smoker(Bob) ∨ ∀y Friend(Bob,y)
![Page 96: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/96.jpg)
Example
Q = ∀x∀y (Smoker(x) ∨ Friend(x,y)) =∀x (Smoker(x) ∨∀y Friend(x,y))
Look up the probabilities
in the database. Runtime = O(n²).
P(Q) = ΠA ∈ Domain P(Smoker(A) ∨ ∀y Friend(A,y))
∀-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – P(∀y Friend(A,y)))] ∨-Rule
P(Q) = ΠA ∈ Domain [1 – (1 – P(Smoker(A))) × (1 – ΠB ∈ Domain P(Friend(A,B)))]
∀-Rule
Check independence:
Smoker(Alice) ∨ ∀y Friend(Alice,y)
Smoker(Bob) ∨ ∀y Friend(Bob,y)
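The three rule applications above translate directly into two nested loops over the domain. A sketch with a hypothetical two-person domain and made-up atom probabilities:

```python
# Q = ∀x∀y (Smoker(x) ∨ Friend(x,y)), via ∀-rule, ∨-rule, ∀-rule
domain = ["Alice", "Bob"]
p_smoker = {"Alice": 0.3, "Bob": 0.4}                  # illustrative marginals
p_friend = {(a, b): 0.5 for a in domain for b in domain}

p_q = 1.0
for a in domain:                       # ∀-rule over x
    all_f = 1.0
    for b in domain:                   # ∀-rule over y
        all_f *= p_friend[(a, b)]
    # ∨-rule: P(S ∨ F) = 1 - (1 - P(S)) * (1 - P(F))
    p_q *= 1.0 - (1.0 - p_smoker[a]) * (1.0 - all_f)
```

The double loop over a size-n domain is exactly the O(n²) runtime the slide states.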
![Page 97: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/97.jpg)
Discussion: CNF vs. DNF
| Databases | KR/AI |
|---|---|
| Conjunctive Queries (CQ): FO(∃, ∧) | Positive Clause: FO(∀, ∨) |
| Unions of Conjunctive Queries (UCQ): FO(∃, ∧, ∨) = ∃ Positive-DNF | Positive FO: FO(∀, ∧, ∨) = ∀ Positive-CNF |
| UCQ with "safe negation" (UCQ¬): ∃ DNF | First-Order CNF: ∀ CNF |
Q = ∃x∃y Smoker(x) ∧ Friend(x,y)   vs.   Q = ∀x∀y (Smoker(x) ∨ Friend(x,y))
By duality we can reduce one problem to the other:
∃x∃y (Smoker(x) ∧ Friend(x,y)) = ¬∀x∀y (¬Smoker(x) ∨ ¬Friend(x,y))
![Page 98: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/98.jpg)
Discussion
H0= ∀x∀y (Smoker(x)∨Friend(x,y)∨Jogger(y))
No rule applies here!
The ∀-rule does not apply, because H0[Alice/x] and H0[Bob/x] are dependent:
H0[Alice/x] = ∀y (Smoker(Alice) ∨ Friend(Alice,y) ∨ Jogger(y))
H0[Bob/x] = ∀y (Smoker(Bob) ∨ Friend(Bob,y) ∨ Jogger(y)) Dependent
Lifted Inference Sometimes Fails
![Page 99: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/99.jpg)
Discussion
H0= ∀x∀y (Smoker(x)∨Friend(x,y)∨Jogger(y))
No rule applies here!
The ∀-rule does not apply, because H0[Alice/x] and H0[Bob/x] are dependent:
H0[Alice/x] = ∀y (Smoker(Alice) ∨ Friend(Alice,y) ∨ Jogger(y))
H0[Bob/x] = ∀y (Smoker(Bob) ∨ Friend(Bob,y) ∨ Jogger(y)) Dependent
Theorem. [Dalvi'04] Computing P(H0 | D) is #P-hard in |D|
Proof: later…
Lifted Inference Sometimes Fails
![Page 100: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/100.jpg)
Discussion
H0= ∀x∀y (Smoker(x)∨Friend(x,y)∨Jogger(y))
No rule applies here!
The ∀-rule does not apply, because H0[Alice/x] and H0[Bob/x] are dependent:
H0[Alice/x] = ∀y (Smoker(Alice) ∨ Friend(Alice,y) ∨ Jogger(y))
H0[Bob/x] = ∀y (Smoker(Bob) ∨ Friend(Bob,y) ∨ Jogger(y)) Dependent
Theorem. [Dalvi'04] Computing P(H0 | D) is #P-hard in |D|
Proof: later…
Consequence: assuming PTIME ≠ #P, H0 is not liftable!
Lifted Inference Sometimes Fails
![Page 101: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/101.jpg)
Summary
• Database D = relations
• Query Q = FO
• Query plans, query optimization
• Data complexity: fix Q, complexity f(D)
• Probabilistic DBs = independent tuples
• Lifted inference: simple, but fails sometimes
Next: Weighted Model Counting = Unified framework for inference
Later: Are rules complete? Yes! (sort of): Power of Lifted Inference
![Page 102: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/102.jpg)
Outline
• Part 1: Motivation
• Part 2: Probabilistic Databases
• Part 3: Weighted Model Counting
• Part 4: Lifted Inference for WFOMC
• Part 5: The Power of Lifted Inference
• Part 6: Conclusion/Open Problems
54
![Page 103: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/103.jpg)
Weighted Model Counting
• Model = solution to a propositional logic formula Δ
• Model counting = #SAT
Rain Cloudy Model?
T T Yes
T F No
F T Yes
F F Yes
#SAT = 3
Δ = (Rain ⇒ Cloudy)
![Page 104: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/104.jpg)
Weighted Model Counting
• Model = solution to a propositional logic formula Δ
• Model counting = #SAT
Rain Cloudy Model?
T T Yes
T F No
F T Yes
F F Yes
#SAT = 3
Weight
1 * 3 = 3
0
2 * 3 = 6
2 * 5 = 10
• Weighted model counting (WMC) – Weights for assignments to variables
– Model weight is product of variable weights w(.)
Δ = (Rain ⇒ Cloudy)
Rain
w(R) w(¬R)
1 2
Cloudy
w(C) w(¬C)
3 5
![Page 105: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/105.jpg)
Weighted Model Counting
• Model = solution to a propositional logic formula Δ
• Model counting = #SAT
Rain Cloudy Model?
T T Yes
T F No
F T Yes
F F Yes
#SAT = 3
Weight
1 * 3 = 3
0
2 * 3 = 6
2 * 5 = 10
WMC = 19
• Weighted model counting (WMC) – Weights for assignments to variables
– Model weight is product of variable weights w(.)
Δ = (Rain ⇒ Cloudy)
Rain
w(R) w(¬R)
1 2
Cloudy
w(C) w(¬C)
3 5
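The table above is small enough to verify by brute force: enumerate all assignments, keep the models of Δ, and sum each model's product of variable weights. A Python sketch using exactly the weights from the slide:

```python
from itertools import product

# Δ = (Rain ⇒ Cloudy), with w(R)=1, w(¬R)=2, w(C)=3, w(¬C)=5 as on the slide
w = {("Rain", True): 1, ("Rain", False): 2,
     ("Cloudy", True): 3, ("Cloudy", False): 5}

wmc = 0
models = 0
for rain, cloudy in product([True, False], repeat=2):
    if rain and not cloudy:        # the only assignment violating Rain ⇒ Cloudy
        continue
    models += 1
    wmc += w[("Rain", rain)] * w[("Cloudy", cloudy)]
# models == 3 (#SAT), wmc == 1*3 + 2*3 + 2*5 == 19
```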
![Page 106: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/106.jpg)
Weighted Model Counting @ UAI
• Assembly language for non-lifted inference
• Reductions to WMC for inference in – Bayesian networks [Chavira'05, Sang'05, Chavira'08]
– Factor graphs [Choi'13]
– Relational Bayesian networks [Chavira'06]
– Probabilistic logic programs [Fierens'11, Fierens'13]
– Probabilistic databases [Olteanu'08, Jha'13]
• State-of-the-art solvers
– Knowledge compilation (WMC → d-DNNF → AC)
Winner of the UAI’08 exact inference competition!
– DPLL search
![Page 107: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/107.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
[V.d.Broeck'11a, Gogate'11]
Days = {Monday}
![Page 108: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/108.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Rain(M) Cloudy(M) Model?
T T Yes
T F No
F T Yes
F F Yes
#SAT = 3
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
[V.d.Broeck'11a, Gogate'11]
Days = {Monday}
![Page 109: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/109.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Rain(M) Cloudy(M) Rain(T) Cloudy(T) Model?
T T T T Yes
T F T T No
F T T T Yes
F F T T Yes
T T T F No
T F T F No
F T T F No
F F T F No
T T F T Yes
T F F T No
F T F T Yes
F F F T Yes
T T F F Yes
T F F F No
F T F F Yes
F F F F Yes
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
Days = {Monday, Tuesday}
[V.d.Broeck'11a, Gogate'11]
![Page 110: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/110.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Rain(M) Cloudy(M) Rain(T) Cloudy(T) Model?
T T T T Yes
T F T T No
F T T T Yes
F F T T Yes
T T T F No
T F T F No
F T T F No
F F T F No
T T F T Yes
T F F T No
F T F T Yes
F F F T Yes
T T F F Yes
T F F F No
F T F F Yes
F F F F Yes
#SAT = 9
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
Days = {Monday, Tuesday}
[V.d.Broeck'11a, Gogate'11]
![Page 111: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/111.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Rain(M) Cloudy(M) Rain(T) Cloudy(T) Model?
T T T T Yes
T F T T No
F T T T Yes
F F T T Yes
T T T F No
T F T F No
F T T F No
F F T F No
T T F T Yes
T F F T No
F T F T Yes
F F F T Yes
T T F F Yes
T F F F No
F T F F Yes
F F F F Yes
#SAT = 9
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
Rain
d w(R(d)) w(¬R(d))
M 1 2
T 4 1
Cloudy
d w(C(d)) w(¬C(d))
M 3 5
T 6 2
Days = {Monday, Tuesday}
[V.d.Broeck'11a, Gogate'11]
![Page 112: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/112.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Weight
1 * 3 * 4 * 6 = 72
0
2 * 3 * 4 * 6 = 144
2 * 5 * 4 * 6 = 240
0
0
0
0
1 * 3 * 1 * 6 = 18
0
2 * 3 * 1 * 6 = 36
2 * 5 * 1 * 6 = 60
1 * 3 * 1 * 2 = 6
0
2 * 3 * 1 * 2 = 12
2 * 5 * 1 * 2 = 20
Rain(M) Cloudy(M) Rain(T) Cloudy(T) Model?
T T T T Yes
T F T T No
F T T T Yes
F F T T Yes
T T T F No
T F T F No
F T T F No
F F T F No
T T F T Yes
T F F T No
F T F T Yes
F F F T Yes
T T F F Yes
T F F F No
F T F F Yes
F F F F Yes
#SAT = 9
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
Rain
d w(R(d)) w(¬R(d))
M 1 2
T 4 1
Cloudy
d w(C(d)) w(¬C(d))
M 3 5
T 6 2
Days = {Monday, Tuesday}
[V.d.Broeck'11a, Gogate'11]
![Page 113: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/113.jpg)
Weighted First-Order Model Counting
Model = solution to a first-order logic formula Δ
Weight
1 * 3 * 4 * 6 = 72
0
2 * 3 * 4 * 6 = 144
2 * 5 * 4 * 6 = 240
0
0
0
0
1 * 3 * 1 * 6 = 18
0
2 * 3 * 1 * 6 = 36
2 * 5 * 1 * 6 = 60
1 * 3 * 1 * 2 = 6
0
2 * 3 * 1 * 2 = 12
2 * 5 * 1 * 2 = 20
WFOMC = 608
Rain(M) Cloudy(M) Rain(T) Cloudy(T) Model?
T T T T Yes
T F T T No
F T T T Yes
F F T T Yes
T T T F No
T F T F No
F T T F No
F F T F No
T T F T Yes
T F F T No
F T F T Yes
F F F T Yes
T T F F Yes
T F F F No
F T F F Yes
F F F F Yes
#SAT = 9
Δ = ∀d (Rain(d) ⇒ Cloudy(d))
Rain
d w(R(d)) w(¬R(d))
M 1 2
T 4 1
Cloudy
d w(C(d)) w(¬C(d))
M 3 5
T 6 2
Days = {Monday, Tuesday}
[V.d.Broeck'11a, Gogate'11]
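Because Δ decomposes into one independent constraint per day, the 16-row table above factorizes: WFOMC is the product of the per-day WMCs. A sketch with exactly the per-day weights from the slide:

```python
# Δ = ∀d (Rain(d) ⇒ Cloudy(d)) over Days = {Monday, Tuesday}
w_rain   = {"M": (1, 2), "T": (4, 1)}   # (w(R(d)), w(¬R(d)))
w_cloudy = {"M": (3, 5), "T": (6, 2)}   # (w(C(d)), w(¬C(d)))

wfomc = 1
for d in ["M", "T"]:
    wr, wnr = w_rain[d]
    wc, wnc = w_cloudy[d]
    # per-day WMC of Rain(d) ⇒ Cloudy(d): models are TT, FT, FF
    wfomc *= wr * wc + wnr * wc + wnr * wnc
# Monday: 1*3 + 2*3 + 2*5 = 19;  Tuesday: 4*6 + 1*6 + 1*2 = 32;  19 * 32 = 608
```

This factorization is the essence of lifting: 2n multiplications instead of 2^(2n) rows.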
![Page 114: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/114.jpg)
Weighted First-Order
Model Counting @ UAI
• Assembly language for lifted inference
• Reduction to WFOMC for lifted inference in
– Markov logic networks [V.d.Broeck‟11a,Gogate‟11]
– Parfactor graphs [V.d.Broeck‟13a]
– Probabilistic logic programs [V.d.Broeck‟14]
– Probabilistic databases [Gribkoff‟14]
![Page 115: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/115.jpg)
x y
A B
A C
B C
From Probabilities to Weights
Friend
x y P
A B p1
A C p2
B C p3
p1p2p3
(1-p1)p2p3
(1-p1)(1-p2)(1-p3)
x y
A C
B C
x y
A B
A C
x y
A B
B C
x y
A B
x y
A C
x y
B C
x y
(the empty world)
![Page 116: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/116.jpg)
x y
A B
A C
B C
From Probabilities to Weights
Friend
x y P
A B p1
A C p2
B C p3
p1p2p3
(1-p1)p2p3
(1-p1)(1-p2)(1-p3)
x y
A C
B C
x y
A B
A C
x y
A B
B C
x y
A B
x y
A C
x y
B C
x y
(the empty world)
![Page 117: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/117.jpg)
x y
A B
A C
B C
From Probabilities to Weights
Friend
x y P
A B p1
A C p2
B C p3
p1p2p3
(1-p1)p2p3
(1-p1)(1-p2)(1-p3)
x y
A C
B C
x y
A B
A C
x y
A B
B C
x y
A B
x y
A C
x y
B C
x y
(the empty world)
x y w(Friend(x,y)) w(¬Friend(x,y))
A B w1 = p1 w̄1 = 1−p1
A C w2 = p2 w̄2 = 1−p2
B C w3 = p3 w̄3 = 1−p3
A A w4 = 0 w̄4 = 1
A C w5 = 0 w̄5 = 1
… …
Also for
missing
tuples!
w1w2w3
w̄1w2w3
w̄1w̄2w̄3
![Page 118: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/118.jpg)
Discussion
• Simple idea: replace p, 1−p by w, w̄
• Query computation becomes WFOMC
• To obtain a probability space, divide the weight of each world by Z = sum of weights of all worlds: Z = (w1+w̄1)(w2+w̄2)(w3+w̄3) …
• Why weights instead of probabilities? They can describe complex correlations (next)
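The normalization step is a one-liner per tuple, since Z itself factorizes as (w1+w̄1)(w2+w̄2)…; a sketch with illustrative weights, not tied to any table above:

```python
# (w_i, wbar_i) per tuple; any nonnegative weights work, not just p and 1-p
weights = [(2.0, 3.0), (1.0, 4.0), (5.0, 5.0)]

# Z = sum of weights of all 2^3 worlds = product of (w_i + wbar_i)
z = 1.0
for w, wbar in weights:
    z *= w + wbar

# Probability of the world containing exactly the first tuple:
world_weight = weights[0][0] * weights[1][1] * weights[2][1]
world_prob = world_weight / z
```

When w_i + w̄_i = 1 (the probabilistic-database case), Z = 1 and weights are already probabilities.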
![Page 119: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/119.jpg)
Markov Logic
Capture knowledge through constraints (a.k.a. “features”):
Soft constraint,
weight = exp(3.75)
Hard constraint
∞ Smoker(x) ⇒ Person(x)
3.75 Smoker(x)∧Friend(x,y) ⇒ Smoker(y)
[Richardson'06]
![Page 120: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/120.jpg)
Markov Logic
Capture knowledge through constraints (a.k.a. “features”):
An MLN is a set of constraints (w, Γ(x)), where w = weight, Γ(x) = FO formula:
∞ Smoker(x) ⇒ Person(x) (hard constraint)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y) (soft constraint, weight = exp(3.75))
[Richardson'06]
![Page 121: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/121.jpg)
Markov Logic
Capture knowledge through constraints (a.k.a. “features”):
An MLN is a set of constraints (w, Γ(x)), where w = weight, Γ(x) = FO formula:
∞ Smoker(x) ⇒ Person(x) (hard constraint)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y) (soft constraint, weight = exp(3.75))
Weight of a world = product of exp(w), over all MLN rules (w, Γ(x))
and groundings Γ(a) that hold in that world
[Richardson'06]
![Page 122: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/122.jpg)
Markov Logic
Capture knowledge through constraints (a.k.a. “features”):
An MLN is a set of constraints (w, Γ(x)), where w = weight, Γ(x) = FO formula:
∞ Smoker(x) ⇒ Person(x) (hard constraint)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y) (soft constraint, weight = exp(3.75))
Weight of a world = product of exp(w), over all MLN rules (w, Γ(x))
and groundings Γ(a) that hold in that world
Probability of a world = Weight / Z
Z = sum of weights of all worlds (no longer a simple expression!)
[Richardson'06]
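The world-weight definition can be sketched in a few lines of Python (an illustrative toy, not the authors' code; the domain, the example world, and the helper names are hypothetical):

```python
import math

# Toy MLN with one soft rule: 3.75 :: Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)
domain = ["A", "B"]
rule_weight = 3.75

def rule_holds(world, x, y):
    """Does the grounding Γ(x,y) of the soft rule hold in this world?"""
    smoker, friend = world
    return not (smoker[x] and friend[(x, y)]) or smoker[y]

def world_weight(world):
    """Weight of a world = product of exp(w) over groundings that hold."""
    holds = sum(rule_holds(world, x, y) for x in domain for y in domain)
    return math.exp(rule_weight) ** holds

# Example world: A smokes, A is a friend of B, B does not smoke.
smoker = {"A": True, "B": False}
friend = {(x, y): (x, y) == ("A", "B") for x in domain for y in domain}
world = (smoker, friend)

# The grounding Γ(A,B) is violated; the other three groundings hold.
assert abs(world_weight(world) - math.exp(3.75) ** 3) < 1e-9
```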
![Page 123: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/123.jpg)
Problem Statement
0.7 Actor(a) ⇒ ¬Director(a)
1.2 Director(a) ⇒ ¬WorkedFor(a,b)
1.4 InMovie(m,a) ∧ WorkedFor(a,b) ⇒ InMovie(m,b)
Actor:
Name w
Brando 2.9
Cruise 3.8
Coppola 1.1
WorkedFor:
Actor Director w
Brando Coppola 2.5
Coppola Brando 0.2
Cruise Coppola 1.7
MLN:
Database tables (if missing, then w = 1)
Given: MLN + database tables
Compute: P(InMovie(GodFather, Brando)) = ?
![Page 124: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/124.jpg)
Discussion
• Probabilistic databases = independence MLN = complex correlations
• To translate weights to probabilities we need to divide by Z, which often is difficult to compute
• However, we can reduce the Z-computation problem to WFOMC (next)
![Page 125: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/125.jpg)
Z → WFOMC(Δ)
1. Formula Δ
2. Weight function w(.)
![Page 126: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/126.jpg)
Z → WFOMC(Δ)
1. Formula Δ
If all MLN constraints are hard: Δ = ⋀(∞,Γ(x))∈MLN (∀x Γ(x))
2. Weight function w(.)
![Page 127: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/127.jpg)
Z → WFOMC(Δ)
1. Formula Δ
If all MLN constraints are hard: Δ = ⋀(∞,Γ(x))∈MLN (∀x Γ(x))
If (wi, Γi (x)) is a soft MLN constraint, then:
a) Remove (wi, Γi (x)) from the MLN
b) Add new probabilistic relation Fi(x)
c) Add hard constraint (∞, ∀x (Fi(x) ⬄ Γi (x)))
2. Weight function w(.)
![Page 128: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/128.jpg)
Z → WFOMC(Δ)
1. Formula Δ
If all MLN constraints are hard: Δ = ⋀(∞,Γ(x))∈MLN (∀x Γ(x))
If (wi, Γi (x)) is a soft MLN constraint, then:
a) Remove (wi, Γi (x)) from the MLN
b) Add new probabilistic relation Fi(x)
c) Add hard constraint (∞, ∀x (Fi(x) ⇔ Γi (x)))
Better rewritings in
[Jha'12], [V.d.Broeck'14]
2. Weight function w(.)
For all constants A, relations Fi,
set w(Fi(A)) = exp(wi), w(¬Fi(A)) = 1
![Page 129: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/129.jpg)
Z → WFOMC(Δ)
1. Formula Δ
If all MLN constraints are hard: Δ = ⋀(∞,Γ(x))∈MLN (∀x Γ(x))
If (wi, Γi (x)) is a soft MLN constraint, then:
a) Remove (wi, Γi (x)) from the MLN
b) Add new probabilistic relation Fi(x)
c) Add hard constraint (∞, ∀x (Fi(x) ⇔ Γi (x)))
Better rewritings in
[Jha'12], [V.d.Broeck'14]
2. Weight function w(.)
For all constants A, relations Fi,
set w(Fi(A)) = exp(wi), w(¬Fi(A)) = 1
Theorem: Z = WFOMC(Δ)
![Page 130: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/130.jpg)
Example
1. Formula Δ
2. Weight function w(.)
![Page 131: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/131.jpg)
Example
1. Formula Δ ∞ Smoker(x) ⇒ Person(x)
2. Weight function w(.)
![Page 132: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/132.jpg)
Example
1. Formula Δ ∞ Smoker(x) ⇒ Person(x)
2. Weight function w(.)
Δ = ∀x (Smoker(x) ⇒ Person(x))
![Page 133: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/133.jpg)
Example
1. Formula Δ ∞ Smoker(x) ⇒ Person(x)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)
2. Weight function w(.)
Δ = ∀x (Smoker(x) ⇒ Person(x))
![Page 134: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/134.jpg)
Example
1. Formula Δ ∞ Smoker(x) ⇒ Person(x)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)
2. Weight function w(.)
Δ = ∀x (Smoker(x) ⇒ Person(x))
∧ ∀x∀y (F(x,y) ⇔ [Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)])
![Page 135: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/135.jpg)
Example
1. Formula Δ ∞ Smoker(x) ⇒ Person(x)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)
2. Weight function w(.)
F
x y w(F(x,y)) w(¬F(x,y))
A A exp(3.75) 1
A B exp(3.75) 1
A C exp(3.75) 1
B A exp(3.75) 1
. . . . . .
Note: if no tables are given
for Smoker, Person, etc.
(i.e. no evidence),
then set their w = w̄ = 1
Δ = ∀x (Smoker(x) ⇒ Person(x))
∧ ∀x∀y (F(x,y) ⇔ [Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)])
![Page 136: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/136.jpg)
Example
1. Formula Δ ∞ Smoker(x) ⇒ Person(x)
3.75 Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)
2. Weight function w(.)
F
x y w(F(x,y)) w(¬F(x,y))
A A exp(3.75) 1
A B exp(3.75) 1
A C exp(3.75) 1
B A exp(3.75) 1
. . . . . .
Note: if no tables are given
for Smoker, Person, etc.
(i.e. no evidence),
then set their w = w̄ = 1
Z = WFOMC(Δ)
Δ = ∀x (Smoker(x) ⇒ Person(x))
∧ ∀x∀y (F(x,y) ⇔ [Smoker(x) ∧ Friend(x,y) ⇒ Smoker(y)])
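The theorem Z = WFOMC(Δ) can be checked by brute force on this example for a tiny domain. The Python sketch below (illustrative, not from the tutorial; the domain {0, 1} is a made-up instance with no evidence, so Smoker, Person, and Friend all have w = w̄ = 1) computes Z directly from the MLN semantics and via the translated Δ with w(F) = exp(3.75):

```python
import itertools
import math

D = [0, 1]            # a two-element domain
W = math.exp(3.75)    # weight of the soft rule

def bits(n):
    return itertools.product([False, True], repeat=n)

def mln_Z():
    """Z directly from the MLN semantics."""
    Z = 0.0
    for sm in bits(2):
        for pe in bits(2):
            if any(sm[x] and not pe[x] for x in D):
                continue  # hard constraint Smoker(x) ⇒ Person(x) violated
            for fr in bits(4):
                friend = {(x, y): fr[2 * x + y] for x in D for y in D}
                holds = sum((not (sm[x] and friend[x, y])) or sm[y]
                            for x in D for y in D)
                Z += W ** holds  # exp(w) per grounding that holds
    return Z

def wfomc_Z():
    """Z as WFOMC of the translated sentence Δ with the new relation F."""
    Z = 0.0
    for sm in bits(2):
        for pe in bits(2):
            if any(sm[x] and not pe[x] for x in D):
                continue
            for fr in bits(4):
                friend = {(x, y): fr[2 * x + y] for x in D for y in D}
                for fv in bits(4):
                    F = {(x, y): fv[2 * x + y] for x in D for y in D}
                    gamma = {(x, y): (not (sm[x] and friend[x, y])) or sm[y]
                             for x in D for y in D}
                    if any(F[k] != gamma[k] for k in F):
                        continue  # hard constraint F(x,y) ⇔ Γ(x,y) violated
                    Z += math.prod(W if F[k] else 1.0 for k in F)
    return Z

assert math.isclose(mln_Z(), wfomc_Z(), rel_tol=1e-9)
```

For each world of the original relations, exactly one assignment to F satisfies the biconditional, so both sums agree.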
![Page 137: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/137.jpg)
Lessons
• Weighted Model Counting:
– Unified framework for probabilistic inference tasks
– Independent variables
• Weighted FO Model Counting:
– Formula described by a concise FO sentence
– Still independent variables
• MLN:
– Formulas plus weights
– Correlations!
– Can be converted to WFOMC
![Page 138: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/138.jpg)
Symmetric vs. Asymmetric
Symmetric WFOMC:
• In every relation R, all tuples have same weight
• Example: converting MLN “without evidence” into WFOMC leads to a symmetric weight function
Asymmetric WFOMC:
• Each relation R is given explicitly
• Example: Probabilistic Databases
• Example: MLNs plus evidence
[Gribkoff'14]
![Page 139: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/139.jpg)
Terminology
MLNs Prob. DBs
Random variable is a Ground atom DB Tuple
Weights w associated with Formulas DB Tuples
Typical query Q is a Single atom FO formula/SQL
Data is encoded into Evidence (Query) Distribution
Correlations induced by Model formulas Query
Model generalizes across domains? Yes No
Query generalizes across domains? No Yes
Sum of weights of worlds is 1 (normalized)? No Yes
![Page 140: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/140.jpg)
Outline
• Part 1: Motivation
• Part 2: Probabilistic Databases
• Part 3: Weighted Model Counting
• Part 4: Lifted Inference for WFOMC
• Part 5: The Power of Lifted Inference
• Part 6: Conclusion/Open Problems
![Page 141: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/141.jpg)
Informal: Exploit symmetries, Reason at first-order level, Reason about groups of objects,
Scalable inference, High-level probabilistic reasoning, etc.
A formal definition: Domain-lifted inference
Defining Lifted Inference
Inference runs in time polynomial
in the number of objects in the domain.
Polynomial in #people, #webpages, #cards
Not polynomial in #predicates, #formulas, #logical variables
Related to data complexity in databases
[V.d.Broeck'11b, Jaeger'12]
![Page 142: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/142.jpg)
Informal: Exploit symmetries, Reason at first-order level, Reason about groups of objects,
Scalable inference, High-level probabilistic reasoning, etc.
A formal definition: Domain-lifted inference
Defining Lifted Inference
[V.d.Broeck'11b, Jaeger'12]
![Page 143: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/143.jpg)
Informal: Exploit symmetries, Reason at first-order level, Reason about groups of objects,
Scalable inference, High-level probabilistic reasoning, etc.
A formal definition: Domain-lifted inference
Defining Lifted Inference
[V.d.Broeck'11b, Jaeger'12]
Alternative in this tutorial:
Lifted inference = ∃Query Plan = ∃FO Compilation
![Page 144: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/144.jpg)
Rules for Asymmetric WFOMC
• If Δ1, Δ2 are independent:
AND-rule: WMC(Δ1 ∧ Δ2) = WMC(Δ1) · WMC(Δ2)
OR-rule: WMC(Δ1 ∨ Δ2) = Z − (Z1 − WMC(Δ1)) · (Z2 − WMC(Δ2))
![Page 145: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/145.jpg)
Rules for Asymmetric WFOMC
• If Δ1, Δ2 are independent:
AND-rule: WMC(Δ1 ∧ Δ2) = WMC(Δ1) · WMC(Δ2)
OR-rule: WMC(Δ1 ∨ Δ2) = Z − (Z1 − WMC(Δ1)) · (Z2 − WMC(Δ2))
Normalization constants
(easy to compute)
![Page 146: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/146.jpg)
Rules for Asymmetric WFOMC
• If Δ1, Δ2 are independent:
AND-rule: WMC(Δ1 ∧ Δ2) = WMC(Δ1) · WMC(Δ2)
OR-rule: WMC(Δ1 ∨ Δ2) = Z − (Z1 − WMC(Δ1)) · (Z2 − WMC(Δ2))
• If Δ[c1/z], Δ[c2/z], … are independent:
∀-rule: WMC(∀z Δ) = Πc∈Domain WMC(Δ[c/z])
∃-rule: WMC(∃z Δ) = Z − Πc∈Domain (Zc − WMC(Δ[c/z]))
Normalization constants
(easy to compute)
![Page 147: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/147.jpg)
Rules for Asymmetric WFOMC
• If Δ1, Δ2 are independent:
AND-rule: WMC(Δ1 ∧ Δ2) = WMC(Δ1) · WMC(Δ2)
OR-rule: WMC(Δ1 ∨ Δ2) = Z − (Z1 − WMC(Δ1)) · (Z2 − WMC(Δ2))
• If Δ[c1/z], Δ[c2/z], … are independent:
∀-rule: WMC(∀z Δ) = Πc∈Domain WMC(Δ[c/z])
∃-rule: WMC(∃z Δ) = Z − Πc∈Domain (Zc − WMC(Δ[c/z]))
• Inclusion/Exclusion formula: WMC(Δ1 ∨ Δ2) = WMC(Δ1) + WMC(Δ2) - WMC(Δ1 ∧ Δ2)
WMC(Δ1 ∧ Δ2) = WMC(Δ1) + WMC(Δ2) - WMC(Δ1 ∨ Δ2)
Normalization constants
(easy to compute)
![Page 148: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/148.jpg)
Rules for Asymmetric WFOMC
• If Δ1, Δ2 are independent:
AND-rule: WMC(Δ1 ∧ Δ2) = WMC(Δ1) · WMC(Δ2)
OR-rule: WMC(Δ1 ∨ Δ2) = Z − (Z1 − WMC(Δ1)) · (Z2 − WMC(Δ2))
• If Δ[c1/z], Δ[c2/z], … are independent:
∀-rule: WMC(∀z Δ) = Πc∈Domain WMC(Δ[c/z])
∃-rule: WMC(∃z Δ) = Z − Πc∈Domain (Zc − WMC(Δ[c/z]))
• Inclusion/Exclusion formula: WMC(Δ1 ∨ Δ2) = WMC(Δ1) + WMC(Δ2) - WMC(Δ1 ∧ Δ2)
WMC(Δ1 ∧ Δ2) = WMC(Δ1) + WMC(Δ2) - WMC(Δ1 ∨ Δ2)
• Negation: WMC(¬Δ) = Z - WMC(Δ)
Normalization constants
(easy to compute)
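These rules can be verified by brute force on tiny formulas. The sketch below (illustrative; the atoms R, S and their weights are invented) checks the AND-rule and the OR-rule on two independent atoms:

```python
import itertools

def wmc(atoms, weights, sat):
    """Brute-force WMC: sum of weights of assignments satisfying `sat`."""
    total = 0.0
    for vals in itertools.product([False, True], repeat=len(atoms)):
        world = dict(zip(atoms, vals))
        if sat(world):
            w = 1.0
            for a, v in world.items():
                w *= weights[a][0] if v else weights[a][1]
            total += w
    return total

weights = {"R": (0.3, 0.7), "S": (2.0, 5.0)}  # (w, w̄) per atom

# Δ1 = R and Δ2 = S are independent (disjoint atoms).
Z1, Z2 = 0.3 + 0.7, 2.0 + 5.0   # per-formula normalization constants
Z = Z1 * Z2
wmc_R = wmc(["R"], weights, lambda m: m["R"])   # = 0.3
wmc_S = wmc(["S"], weights, lambda m: m["S"])   # = 2.0
wmc_and = wmc(["R", "S"], weights, lambda m: m["R"] and m["S"])
wmc_or = wmc(["R", "S"], weights, lambda m: m["R"] or m["S"])

assert abs(wmc_and - wmc_R * wmc_S) < 1e-12                       # AND-rule
assert abs(wmc_or - (Z - (Z1 - wmc_R) * (Z2 - wmc_S))) < 1e-12    # OR-rule
```

The OR-rule is just the AND-rule applied to the negations: WMC(¬Δ1 ∧ ¬Δ2) = (Z1 − WMC(Δ1))(Z2 − WMC(Δ2)), subtracted from Z.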
![Page 149: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/149.jpg)
Symmetric WFOMC Rules
• Simplifications:
If Δ[c1/z], Δ[c2/z], … are independent:
∀-rule: WMC(∀z Δ) = WMC(Δ[c1/z])^|Domain|
∃-rule: WMC(∃z Δ) = Z − (Zc1 − WMC(Δ[c1/z]))^|Domain|
• A powerful new inference rule: atom counting
Only possible with symmetric weights
Intuition: Remove unary relations
[V.d.Broeck'11a]
![Page 150: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/150.jpg)
Symmetric WFOMC Rules
• Simplifications:
If Δ[c1/z], Δ[c2/z], … are independent:
∀-rule: WMC(∀z Δ) = WMC(Δ[c1/z])^|Domain|
∃-rule: WMC(∃z Δ) = Z − (Zc1 − WMC(Δ[c1/z]))^|Domain|
The workhorse of
Symmetric WFOMC
• A powerful new inference rule: atom counting
Only possible with symmetric weights
Intuition: Remove unary relations
[V.d.Broeck'11a]
![Page 151: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/151.jpg)
Symmetric WFOMC Rules: Example
• FO-Model Counting: w(R) = w(¬R) = 1
• Apply inference rules backwards (step 4-3-2-1)
![Page 152: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/152.jpg)
Symmetric WFOMC Rules: Example
4.
• FO-Model Counting: w(R) = w(¬R) = 1
• Apply inference rules backwards (step 4-3-2-1)
Δ = (Stress(Alice) ⇒ Smokes(Alice)) Domain = {Alice}
![Page 153: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/153.jpg)
Symmetric WFOMC Rules: Example
4.
• FO-Model Counting: w(R) = w(¬R) = 1
• Apply inference rules backwards (step 4-3-2-1)
WMC(¬Stress(Alice) ∨ Smokes(Alice)) =
= Z – WMC(Stress(Alice)) × WMC(¬Smokes(Alice))
= 4 – 1 × 1 = 3 models
Δ = (Stress(Alice) ⇒ Smokes(Alice)) Domain = {Alice}
OR-rule
![Page 154: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/154.jpg)
Symmetric WFOMC Rules: Example
4.
• FO-Model Counting: w(R) = w(¬R) = 1
• Apply inference rules backwards (step 4-3-2-1)
WMC(¬Stress(Alice) ∨ Smokes(Alice)) =
= Z – WMC(Stress(Alice)) × WMC(¬Smokes(Alice))
= 4 – 1 × 1 = 3 models
Δ = (Stress(Alice) ⇒ Smokes(Alice)) Domain = {Alice}
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
OR-rule
![Page 155: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/155.jpg)
Symmetric WFOMC Rules: Example
4.
→ 3^n models
• FO-Model Counting: w(R) = w(¬R) = 1
• Apply inference rules backwards (step 4-3-2-1)
WMC(¬Stress(Alice) ∨ Smokes(Alice)) =
= Z – WMC(Stress(Alice)) × WMC(¬Smokes(Alice))
= 4 – 1 × 1 = 3 models
∀-Rule
Δ = (Stress(Alice) ⇒ Smokes(Alice)) Domain = {Alice}
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
OR-rule
![Page 156: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/156.jpg)
Symmetric WFOMC Rules: Example
→ 3^n models
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
![Page 157: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/157.jpg)
Symmetric WFOMC Rules: Example
→ 3^n models
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
2. Δ = ∀y, (ParentOf(y) ∧ Female ⇒ MotherOf(y)) D = {n people}
![Page 158: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/158.jpg)
Symmetric WFOMC Rules: Example
OR-Rule
→ 3^n models
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
2. Δ = ∀y, (ParentOf(y) ∧ Female ⇒ MotherOf(y)) D = {n people}
WMC(Δ) = WMC(¬Female ∨ ∀y, (ParentOf(y) ⇒ MotherOf(y)))
= 2 · 2^n · 2^n − (2 − 1) · (2^n · 2^n − WMC(∀y, (ParentOf(y) ⇒ MotherOf(y))))
= 2 · 4^n − (4^n − 3^n)
![Page 159: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/159.jpg)
→ 3^n + 4^n models
Symmetric WFOMC Rules: Example
OR-Rule
→ 3^n models
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
2. Δ = ∀y, (ParentOf(y) ∧ Female ⇒ MotherOf(y)) D = {n people}
WMC(Δ) = WMC(¬Female ∨ ∀y, (ParentOf(y) ⇒ MotherOf(y)))
= 2 · 2^n · 2^n − (2 − 1) · (2^n · 2^n − WMC(∀y, (ParentOf(y) ⇒ MotherOf(y))))
= 2 · 4^n − (4^n − 3^n)
![Page 160: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/160.jpg)
→ 3^n + 4^n models
Symmetric WFOMC Rules: Example
OR-Rule
→ 3^n models
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
2. Δ = ∀y, (ParentOf(y) ∧ Female ⇒ MotherOf(y))
1. Δ = ∀x,y, (ParentOf(x,y) ∧ Female(x) ⇒ MotherOf(x,y)) D = {n people}
D = {n people}
WMC(Δ) = WMC(¬Female ∨ ∀y, (ParentOf(y) ⇒ MotherOf(y)))
= 2 · 2^n · 2^n − (2 − 1) · (2^n · 2^n − WMC(∀y, (ParentOf(y) ⇒ MotherOf(y))))
= 2 · 4^n − (4^n − 3^n)
![Page 161: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/161.jpg)
→ 3^n + 4^n models
→ (3^n + 4^n)^n models
Symmetric WFOMC Rules: Example
∀-Rule
OR-Rule
→ 3^n models
3. Δ = ∀x, (Stress(x) ⇒ Smokes(x)) Domain = {n people}
2. Δ = ∀y, (ParentOf(y) ∧ Female ⇒ MotherOf(y))
1. Δ = ∀x,y, (ParentOf(x,y) ∧ Female(x) ⇒ MotherOf(x,y)) D = {n people}
D = {n people}
WMC(Δ) = WMC(¬Female ∨ ∀y, (ParentOf(y) ⇒ MotherOf(y)))
= 2 · 2^n · 2^n − (2 − 1) · (2^n · 2^n − WMC(∀y, (ParentOf(y) ⇒ MotherOf(y))))
= 2 · 4^n − (4^n − 3^n)
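The three counts derived above can be confirmed by brute-force model counting (w = w̄ = 1) for a small domain. An illustrative Python sketch, using n = 2 and flat tuples of truth values as worlds:

```python
import itertools

def count(num_atoms, sat):
    """Number of truth assignments to the atoms that satisfy `sat`."""
    return sum(sat(vals) for vals in
               itertools.product([False, True], repeat=num_atoms))

n = 2
D = range(n)

# 3. ∀x (Stress(x) ⇒ Smokes(x)); atoms: Stress(0..n-1), Smokes(0..n-1)
c3 = count(2 * n, lambda v: all((not v[x]) or v[n + x] for x in D))
assert c3 == 3 ** n

# 2. ∀y (ParentOf(y) ∧ Female ⇒ MotherOf(y));
#    atoms: Female, ParentOf(0..n-1), MotherOf(0..n-1)
c2 = count(1 + 2 * n,
           lambda v: all((not (v[1 + y] and v[0])) or v[1 + n + y]
                         for y in D))
assert c2 == 3 ** n + 4 ** n

# 1. ∀x,y (ParentOf(x,y) ∧ Female(x) ⇒ MotherOf(x,y))
def sat1(v):
    female = v[:n]
    parent = v[n:n + n * n]
    mother = v[n + n * n:]
    return all((not (parent[x * n + y] and female[x])) or mother[x * n + y]
               for x in D for y in D)

c1 = count(n + 2 * n * n, sat1)
assert c1 == (3 ** n + 4 ** n) ** n
```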
![Page 162: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/162.jpg)
Atom Counting: Example
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)) Domain = {n people}
![Page 163: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/163.jpg)
Atom Counting: Example
If we know precisely who smokes, and there are k smokers?
[Figure: the domain split into k smokers and n−k non-smokers, with Friends edges within and between the two groups]
Database:
Smokes(Alice) = 1
Smokes(Bob) = 0
Smokes(Charlie) = 0
Smokes(Dave) = 1
Smokes(Eve) = 0
...
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)) Domain = {n people}
![Page 172: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/172.jpg)
Atom Counting: Example
If we know precisely who smokes, and there are k smokers?
→ 2^(k² + (n−k)² + k(n−k)) models
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)) Domain = {n people}
![Page 173: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/173.jpg)
Atom Counting: Example
If we know precisely who smokes, and there are k smokers?
→ 2^(k² + (n−k)² + k(n−k)) models
If we know that there are k smokers?
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)) Domain = {n people}
![Page 174: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/174.jpg)
Atom Counting: Example
If we know precisely who smokes, and there are k smokers?
→ 2^(k² + (n−k)² + k(n−k)) models
If we know that there are k smokers?
→ C(n,k) · 2^(k² + (n−k)² + k(n−k)) models
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)) Domain = {n people}
![Page 175: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/175.jpg)
Atom Counting: Example
If we know precisely who smokes, and there are k smokers?
k
n-k
k
n-k
If we know that there are k smokers?
In total…
→ models
Database:
Smokes(Alice) = 1
Smokes(Bob) = 0
Smokes(Charlie) = 0
Smokes(Dave) = 1
Smokes(Eve) = 0
...
→ models
Smokes Smokes Friends
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)) Domain = {n people}
![Page 176: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/176.jpg)
Atom Counting: Example
Δ = ∀x,y, (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y))   Domain = {n people}
Database:
Smokes(Alice) = 1
Smokes(Bob) = 0
Smokes(Charlie) = 0
Smokes(Dave) = 1
Smokes(Eve) = 0
...
[Figure: the Friends relation drawn as an n×n matrix, partitioned into blocks of k smokers and n−k non-smokers]
If we know precisely who smokes, and there are k smokers? → 2^(n² − k·(n−k)) models
If we know only that there are k smokers? → C(n,k) · 2^(n² − k·(n−k)) models
In total… → Σ_{k=0..n} C(n,k) · 2^(n² − k·(n−k)) models
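The counting argument above can be checked by brute force on a tiny domain. The script below is an illustrative sketch of mine (not part of the tutorial's tooling): `brute_force_models` enumerates every interpretation of Smokes and Friends, while `lifted_count` evaluates the closed form obtained by atom counting.

```python
from itertools import product
from math import comb

def brute_force_models(n):
    """Count models of ∀x,y (Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y))
    by enumerating all assignments of Smokes and Friends."""
    count = 0
    for smokes in product([0, 1], repeat=n):
        for friends in product([0, 1], repeat=n * n):
            count += all(not (smokes[x] and friends[x * n + y]) or smokes[y]
                         for x in range(n) for y in range(n))
    return count

def lifted_count(n):
    """Atom counting: fix the number of smokers k; the k·(n−k) Friends
    edges from a smoker to a non-smoker are forced to 0, the rest free."""
    return sum(comb(n, k) * 2 ** (n * n - k * (n - k)) for k in range(n + 1))

print(brute_force_models(3), lifted_count(3))  # both 1792
```

The brute-force loop is exponential in n² and only feasible for n ≤ 3 or so; the lifted sum runs in time polynomial in n.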
![Page 181: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/181.jpg)
Augment Rules with Logical Rewritings
1. Remove constants (shattering)
Δ = ∀x (Friend(Alice, x) ∨ Friend(x, Bob))
becomes
Δ = ∀x (F1(x) ∨ F2(x)) ∧ (F3 ∨ F4) ∧ (F4 ∨ F5)
where
F1(x) = Friend(Alice,x)
F2(x) = Friend(x,Bob)
F3 = Friend(Alice,Alice)
F4 = Friend(Alice,Bob)
F5 = Friend(Bob,Bob)
2. "Rank" variables (= occur in the same order in each atom)
Δ = (Friend(x,y) ∨ Enemy(x,y)) ∧ (Friend(x,y) ∨ Enemy(y,x))   ← wrong order
F1(u,v) = Friend(u,v), u<v    F2(u) = Friend(u,u)    F3(u,v) = Friend(v,u), v<u
E1(u,v) = Enemy(u,v), u<v     E2(u) = Enemy(u,u)     E3(u,v) = Enemy(v,u), v<u
Δ = (F1(x,y) ∨ E1(x,y)) ∧ (F1(x,y) ∨ E3(x,y))
  ∧ (F2(x) ∨ E2(x))
  ∧ (F3(x,y) ∨ E3(x,y)) ∧ (F3(x,y) ∨ E1(x,y))
![Page 182: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/182.jpg)
Augment Rules with Logical Rewritings
3. Perform Resolution [Gribkoff'14]   (See UAI Poster on Saturday!)
Δ = ∀x∀y (R(x) ∨ ¬S(x,y)) ∧ ∀x∀y (S(x,y) ∨ T(y))   Rules stuck…
Resolution: Δ ∧ ∀x∀y (R(x) ∨ T(y))   Now apply I/E!
4. Skolemization [V.d.Broeck'14]
Δ = ∀p, ∃c, Card(p,c)
Mix of ∀/∃ arises in encodings of MLNs with quantifiers and of probabilistic programs
Input: Mix ∀/∃   Output: Only ∀
![Page 188: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/188.jpg)
Skolemization: Example   [V.d.Broeck'14]
Δ = ∀p, ∃c, Card(p,c)
   ↓ Skolemization (S = Skolem predicate)
Δ' = ∀p, ∀c, Card(p,c) ⇒ S(p)
w(S) = 1 and w(¬S) = -1
Consider one position p:
∃c, Card(p,c) = true:
  S(p) = true    Also a model of Δ, weight × 1
∃c, Card(p,c) = false:
  S(p) = true    Not a model of Δ, weight × 1
  S(p) = false   Not a model of Δ, weight × -1
  Extra models cancel out
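The cancellation argument can be verified numerically. The sketch below (my own naming, tiny domains) compares the model count of Δ = ∀p ∃c Card(p,c) against the weighted model count of its Skolemization Δ' = ∀p ∀c (Card(p,c) ⇒ S(p)) with w(S) = 1 and w(¬S) = -1:

```python
from itertools import product

def wfomc_check(n_p, n_c):
    """Model count of Δ = ∀p ∃c Card(p,c) vs. weighted model count of
    its Skolemization Δ' = ∀p ∀c (Card(p,c) ⇒ S(p)),
    with w(S) = 1, w(¬S) = -1 and all other weights 1."""
    mc_delta = 0     # model count of Δ
    wmc_skolem = 0   # weighted model count of Δ'
    for card in product([0, 1], repeat=n_p * n_c):
        has = [any(card[p * n_c + c] for c in range(n_c)) for p in range(n_p)]
        mc_delta += all(has)
        for s in product([0, 1], repeat=n_p):
            # Δ' holds iff Card(p,c) = 1 implies S(p) = 1
            if all(s[p] or not card[p * n_c + c]
                   for p in range(n_p) for c in range(n_c)):
                weight = 1
                for p in range(n_p):
                    weight *= 1 if s[p] else -1
                wmc_skolem += weight
    return mc_delta, wmc_skolem

print(wfomc_check(2, 3))  # (49, 49): the negative weights cancel the extras
```

For 2 positions and 3 cards, each position needs at least one card, giving (2³ − 1)² = 49 models; the Skolemized count agrees because each position with no card contributes S(p) = true and S(p) = false worlds whose weights 1 and -1 cancel.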
![Page 189: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/189.jpg)
...
Playing Cards Revisited
Let us automate this:
Relational model
Lifted probabilistic inference algorithm
∀p, ∃c, Card(p,c)
∀c, ∃p, Card(p,c)
∀p, ∀c, ∀c', Card(p,c) ∧ Card(p,c') ⇒ c = c'
![Page 198: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/198.jpg)
Playing Cards Revisited
∀p, ∃c, Card(p,c)
∀c, ∃p, Card(p,c)
∀p, ∀c, ∀c', Card(p,c) ∧ Card(p,c') ⇒ c = c'
   ↓ Skolemization
∀p, ∀c, Card(p,c) ⇒ S1(p)      w(S1) = 1 and w(¬S1) = -1
∀c, ∀p, Card(p,c) ⇒ S2(c)      w(S2) = 1 and w(¬S2) = -1
∀p, ∀c, ∀c', Card(p,c) ∧ Card(p,c') ⇒ c = c'
   ↓ Atom counting
∀p, ∀c, ∀c', Card(p,c) ∧ Card(p,c') ⇒ c = c'
   ↓ ∀-Rule
∀c, ∀c', Card(c) ∧ Card(c') ⇒ c = c'
…
Tutorial UAI 2014 82
![Page 199: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/199.jpg)
...
Playing Cards Revisited
Let us automate this:
Lifted probabilistic inference algorithm
Computed in time polynomial in n Tutorial UAI 2014 83
![Page 200: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/200.jpg)
Summary Lifted Inference
• By definition: PTIME data complexity
Also: ∃ FO compilation = ∃ Query Plan
• However: only works for “liftable” queries
• The rules:
– AND/OR-rules, ∀/∃-rules, I/E
(inclusion/exclusion), Atom Counting
– Deceptively simple: the only surprising rules
are I/E and atom counting
Tutorial UAI 2014 84
Next: will show that lifted inference is provably
more powerful than grounded inference
![Page 201: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/201.jpg)
Outline
• Part 1: Motivation
• Part 2: Probabilistic Databases
• Part 3: Weighted Model Counting
• Part 4: Lifted Inference for WFOMC
• Part 5: The Power of Lifted Inference
• Part 6: Conclusion/Open Problems
Tutorial UAI 2014 85
![Page 203: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/203.jpg)
Two Questions
• Q1: Are the lifted rules complete?
– We know that they get stuck on some queries
– Do we need to add more rules?
• Q2: Are lifted rules stronger than grounded?
– Some lifted rules easily correspond to operations
on grounded formulas (e.g. Independent-AND)
– Can we simulate every lifted inference directly on
the grounded formula?
Complete for Positive CNF-FO, for UCQ
Symmetric: yes (grounded inference ignores symmetries)
Asymmetric: Strictly stronger than Decision-DNNF & DPLL-based algorithms
![Page 204: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/204.jpg)
1. Are the Lifted Rules Complete?
We use complexity classes
• Inference rules: PTIME data complexity
• Some queries: #P-hard data complexity
Dichotomy Theorem for Positive CNF-FO:
• If lifted rules succeed, then query in PTIME
• If lifted rules fail, then query is #P-hard
Implies lifted rules are complete for Positive CNF-FO
Will show in two steps: the Small and the Big Dichotomy Theorem
Tutorial UAI 2014 88
![Page 205: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/205.jpg)
NP vs. #P
• SAT = the satisfiability problem
• SAT is NP-complete [Cook'71]
• NP = decision problems solvable in polynomial time by a nondeterministic TM
• #SAT = model counting
• #SAT is #P-complete [Valiant'79]
• #P = numerical functions computable in polynomial time by a nondeterministic TM, where the answer = # of accepting computations
Note: it would be wrong to say "#SAT is NP-complete"
![Page 208: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/208.jpg)
A Simple Propositional Formula
that is Hard
A Positive, Partitioned 2CNF (PP2CNF) formula is a formula of the form:
F = ∧(i,j) ∈ E (xi ∨ yj)
where E = the edge set of a bipartite graph
[Figure: bipartite graph E with left nodes 1, 2 and right nodes 1, 2, 3]
F = (x1 ∨ y1) ∧ (x2 ∨ y1) ∧ (x2 ∨ y3) ∧ (x1 ∨ y3) ∧ (x2 ∨ y2)
Theorem [Provan'83] #SAT for PP2CNF is #P-hard
Tutorial UAI 2014 92
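Model counting for the small example above is easy to do exhaustively (the hardness, of course, only bites as the graph grows). A minimal sketch:

```python
from itertools import product

# Edge set of the bipartite graph from the slide (left nodes 1,2; right 1,2,3)
E = [(1, 1), (2, 1), (2, 3), (1, 3), (2, 2)]

def count_pp2cnf(E):
    """Brute-force #SAT for F = ∧_{(i,j) ∈ E} (x_i ∨ y_j)."""
    xs = sorted({i for i, _ in E})
    ys = sorted({j for _, j in E})
    count = 0
    for bits in product([0, 1], repeat=len(xs) + len(ys)):
        x = dict(zip(xs, bits[:len(xs)]))
        y = dict(zip(ys, bits[len(xs):]))
        count += all(x[i] or y[j] for i, j in E)
    return count

print(count_pp2cnf(E))  # 12 of the 2^5 = 32 assignments satisfy F
```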
![Page 209: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/209.jpg)
A Query That is #P-Hard
H0 = ∀x∀y (Smoker(x) ∨ Friend(x,y) ∨ Jogger(y))
Theorem. Computing P(H0 | D) is #P-hard in |D|   [Dalvi'04]
Proof: Reduction from PP2CNF. Given a PP2CNF F defined by edge relation E, set:
P(Friend(a,b)) = 0 if (a,b) ∈ E
P(Friend(a,b)) = 1 if (a,b) ∉ E
Every ground clause with Friend(a,b) = 1 is satisfied, so the grounding of H0 is: ∧(i,j) ∈ E (Smoker(i) ∨ Jogger(j)) = F. Hence, P(H0 | D) = P(F)
Lesson: no lifted inference rules will ever compute H0
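A quick numeric check of the reduction, on the same example graph E as the PP2CNF slide. The split of the domain into left nodes X and right nodes Y is a simplification of mine; note that the Friend evidence must be absent (0) on edges and present (1) off edges, so that exactly the edge clauses remain non-trivial:

```python
from itertools import product
from fractions import Fraction

E = [(1, 1), (2, 1), (2, 3), (1, 3), (2, 2)]
X, Y = [1, 2], [1, 2, 3]

# Deterministic Friend evidence: the ground clause
# Smoker(i) ∨ Friend(i,j) ∨ Jogger(j) is non-trivial exactly for (i,j) ∈ E.
friend = {(i, j): 0 if (i, j) in E else 1 for i in X for j in Y}

total = good = 0
for bits in product([0, 1], repeat=len(X) + len(Y)):  # Smoker/Jogger fair coins
    smoker = dict(zip(X, bits[:len(X)]))
    jogger = dict(zip(Y, bits[len(X):]))
    total += 1
    good += all(smoker[i] or friend[(i, j)] or jogger[j]
                for i in X for j in Y)

print(Fraction(good, total))  # P(H0 | D) = P(F) = 12/32 = 3/8
```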
![Page 210: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/210.jpg)
Hierarchical Clause
at(x) = the set of atoms containing the variable x
Definition. A clause Q is hierarchical if for all variables x, y:
at(x) ⊆ at(y) or at(x) ⊇ at(y) or at(x) ∩ at(y) = ∅
Hierarchical: Q = (Smoker(x,y) ∨ Friend(x,z)) = ∀x [∀y Smoker(x,y)] ∨ [∀z Friend(x,z)]
Non-hierarchical: H0 = Smoker(x) ∨ Friend(x,y) ∨ Jogger(y)
[Figure: atom-set diagrams — hierarchical: at(y) and at(z) nested inside at(x) over the atoms Smoker, Friend; non-hierarchical: at(x) and at(y) overlap on Friend, with Smoker and Jogger outside the intersection]
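The hierarchical test is a few lines of code. The clause representation (a list of (predicate, variables) pairs) is my own; the check is a direct transcription of the definition:

```python
def is_hierarchical(clause):
    """clause: list of (predicate, vars) atoms.
    Hierarchical iff for all variables x, y the atom sets at(x), at(y)
    are nested or disjoint."""
    variables = {v for _, vs in clause for v in vs}
    at = {x: {i for i, (_, vs) in enumerate(clause) if x in vs}
          for x in variables}
    return all(at[x] <= at[y] or at[x] >= at[y] or not (at[x] & at[y])
               for x in variables for y in variables)

Q = [("Smoker", ("x", "y")), ("Friend", ("x", "z"))]
H0 = [("Smoker", ("x",)), ("Friend", ("x", "y")), ("Jogger", ("y",))]
print(is_hierarchical(Q), is_hierarchical(H0))  # True False
```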
![Page 211: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/211.jpg)
Small Dichotomy Theorem
Let Q be a single clause, w/o repeating relation symbols.
Definition. A clause Q is hierarchical if for all variables x, y:
at(x) ⊆ at(y) or at(x) ⊇ at(y) or at(x) ∩ at(y) = ∅
Theorem [Dalvi'04] Dichotomy:
• If Q is hierarchical, then Q is liftable (PTIME data complexity). And, moreover, the OR-rule and ∀-rule are complete.
• If Q is not hierarchical, then Q is #P-hard.
Note: checking "Q is hierarchical" is in AC0 (expression complexity)
Proof Hierarchical PTIME
![Page 213: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/213.jpg)
Proof Hierarchical PTIME
∀-Rule:
P(∀x Q) = Πa P(Q[a/x])
x
Case 1:
Q=
![Page 214: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/214.jpg)
Proof Hierarchical PTIME
∀-Rule:
P(∀x Q) = Πa P(Q[a/x])
x
Case 1:
Q=
∨-Rule:
P(Q) =1 – (1– P(Q1))(1– P(Q2))
Q1 ∨ Q2 Q=
Case 2:
![Page 215: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/215.jpg)
Proof: Hierarchical ⇒ PTIME
Case 1: Q = ∀x Q' (x a root variable) — apply the ∀-Rule: P(∀x Q') = Πa P(Q'[a/x])
Case 2: Q = Q1 ∨ Q2 — apply the ∨-Rule: P(Q) = 1 – (1 – P(Q1))(1 – P(Q2))
Non-hierarchical ⇒ #P-hard
Reduction from H0: Q = … R(x, …) ∨ S(x,y,…) ∨ T(y,…), … (the variables x, y form the non-hierarchical R–S–T pattern)
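The two-rule evaluation can be illustrated on Q = ∀x∀y (R(x) ∨ S(x,y)): the ∀-rule peels off x, and for each constant a the clause becomes R(a) ∨ ∀y S(a,y), which the ∨-rule splits into independent parts. A sketch of mine, cross-checked against brute-force grounded inference with exact rationals:

```python
from itertools import product
from fractions import Fraction

def lifted(pR, pS):
    """P(∀x∀y (R(x) ∨ S(x,y))) by the ∀-rule then the ∨-rule:
    P = Π_a [ p_R(a) + (1 − p_R(a)) · Π_b p_S(a,b) ]."""
    n = len(pR)
    prob = Fraction(1)
    for a in range(n):
        all_s = Fraction(1)
        for b in range(n):
            all_s *= pS[a][b]
        prob *= pR[a] + (1 - pR[a]) * all_s
    return prob

def brute(pR, pS):
    """Sum the weights of all possible worlds that satisfy the grounding."""
    n = len(pR)
    total = Fraction(0)
    for bits in product([0, 1], repeat=n + n * n):
        r, s = bits[:n], bits[n:]
        w = Fraction(1)
        for a in range(n):
            w *= pR[a] if r[a] else 1 - pR[a]
            for b in range(n):
                w *= pS[a][b] if s[a * n + b] else 1 - pS[a][b]
        if all(r[a] or s[a * n + b] for a in range(n) for b in range(n)):
            total += w
    return total

pR = [Fraction(1, 2), Fraction(1, 3)]
pS = [[Fraction(1, 4), Fraction(1, 5)], [Fraction(2, 3), Fraction(1, 2)]]
print(lifted(pR, pS) == brute(pR, pS))  # True
```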
![Page 216: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/216.jpg)
The Big Dichotomy Theorem
• For Positive CNF-FO the rules are not complete as stated!
• Instead we will revise inclusion/exclusion
• After the revision, the rules are complete
• We start with some non-liftable queries…
Tutorial UAI 2014 97
![Page 220: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/220.jpg)
The Non-liftable Queries Hk
H0 = R(x)∨S(x,y)∨T(y)
H1 = [R(x0)∨S(x0,y0)] ∧ [S(x1,y1)∨T(y1)]
H2 = [R(x0)∨S1(x0,y0)] ∧ [S1(x1,y1)∨S2(x1,y1)] ∧ [S2(x2,y2)∨T(y2)]
H3 = [R(x0)∨S1(x0,y0)]∧[S1(x1,y1)∨S2(x1,y1)]∧[S2(x2,y2)∨S3(x2,y2)]∧[S3(x3,y3)∨T(y3)]
. . .
Theorem. [Dalvi'12] For every k, the query Hk is #P-hard
So far, not very interesting…
![Page 224: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/224.jpg)
The Query QW
QW =
   [∀x0∀y0 (R(x0)∨S1(x0,y0)) ∧ ∀x2∀y2 (S2(x2,y2)∨S3(x2,y2))]   /* Q1 */
∨ [∀x0∀y0 (R(x0)∨S1(x0,y0)) ∧ ∀x3∀y3 (S3(x3,y3)∨T(y3))]       /* Q2 */
∨ [∀x1∀y1 (S1(x1,y1)∨S2(x1,y1)) ∧ ∀x3∀y3 (S3(x3,y3)∨T(y3))]   /* Q3 */
QW is a Boolean combination of clauses in
H3 = [R(x0)∨S1(x0,y0)]∧[S1(x1,y1)∨S2(x1,y1)]∧[S2(x2,y2)∨S3(x2,y2)]∧[S3(x3,y3)∨T(y3)]
QW is liftable, BUT we need to use cancellations!
P(QW) = P(Q1) + P(Q2) + P(Q3)
      - P(Q1 ∧ Q2) - P(Q2 ∧ Q3) - P(Q1 ∧ Q3)    ← Q1 ∧ Q3 = H3 (hard!)
      + P(Q1 ∧ Q2 ∧ Q3)                          ← also = H3
All other terms are liftable. The two hard queries cancel out, and what remains is liftable.
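The key fact behind the cancellation — Q1 ∧ Q3 and Q1 ∧ Q2 ∧ Q3 are both logically equivalent to H3, so their inclusion/exclusion terms cancel — can be confirmed by enumeration on a tiny domain (the encoding below is my own):

```python
from itertools import product

n = 2           # tiny domain {0, 1}
D = range(n)

# The four clauses of H3, over R, T and the binary relations S1, S2, S3
def c1(R, S1):  return all(R[x] or S1[x][y] for x in D for y in D)
def c2(S1, S2): return all(S1[x][y] or S2[x][y] for x in D for y in D)
def c3(S2, S3): return all(S2[x][y] or S3[x][y] for x in D for y in D)
def c4(S3, T):  return all(S3[x][y] or T[y] for x in D for y in D)

def unpack(bits):
    R, T = bits[:n], bits[n:2 * n]
    mats = [[list(bits[2 * n + k * n * n + i * n:
                       2 * n + k * n * n + (i + 1) * n]) for i in range(n)]
            for k in range(3)]
    return R, T, mats

count_13 = count_123 = 0
for bits in product([0, 1], repeat=2 * n + 3 * n * n):
    R, T, (S1, S2, S3) = unpack(bits)
    q1 = c1(R, S1) and c3(S2, S3)
    q2 = c1(R, S1) and c4(S3, T)
    q3 = c2(S1, S2) and c4(S3, T)
    count_13 += q1 and q3
    count_123 += q1 and q2 and q3

print(count_13 == count_123)  # True: the two hard terms have the same models
```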
![Page 225: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/225.jpg)
Cancellations?
• Cancellations in the inclusion/exclusion formula are critical! If we fail to do them, then the rules get stuck
• The mathematical concept that explains which terms cancel out is the Möbius function (next)
Tutorial UAI 2014 100
![Page 226: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/226.jpg)
August Ferdinand Möbius
1790-1868
• Möbius strip
• Möbius function μ in
number theory
• Generalized to lattices [Stanley‟97]
• And to lifted inference!
101
![Page 228: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/228.jpg)
The Lattice of a Query
Definition. The lattice of Q = Q1 ∧ Q2 ∧ … is:
• Elements are the terms of inclusion/exclusion;
• Order is logical implication
QW =
   [ (R(x0)∨S1(x0,y0)) ∧ (S2(x2,y2)∨S3(x2,y2))]   /* Q1 */
∨ [ (R(x0)∨S1(x0,y0)) ∧ (S3(x3,y3)∨T(y3))]       /* Q2 */
∨ [ (S1(x1,y1)∨S2(x1,y1)) ∧ (S3(x3,y3)∨T(y3))]   /* Q3 */
[Figure: the lattice of QW — 1̂ on top; below it Q1, Q2, Q3; below them Q1∧Q2 and Q2∧Q3; at the bottom Q1∧Q2∧Q3 (= Q1∧Q3). Some nodes are liftable, the others are #P-hard.]
![Page 234: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/234.jpg)
The Möbius‟ Function
-1 -1 -1
1
103
1
-1 -1 -1
1 1
0
Def. The Möbius function:
μ( , ) = 1
μ(u, ) = - Σu < v ≤ μ(v, ) 1
^ 1
^
1
^ 1
^ 1
^
1
^
1
^
Möbius’ Inversion Formula:
P(Q) = - Σ Qi < μ(Qi, ) P(Qi) 1
^ 1
^
![Page 235: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/235.jpg)
The Möbius‟ Function
-1 -1 -1
2
1
103
1
-1 -1 -1
1 1
0
Def. The Möbius function:
μ( , ) = 1
μ(u, ) = - Σu < v ≤ μ(v, ) 1
^ 1
^
1
^ 1
^ 1
^
1
^
1
^
Möbius’ Inversion Formula:
P(Q) = - Σ Qi < μ(Qi, ) P(Qi) 1
^ 1
^
![Page 236: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/236.jpg)
The Möbius Function

New Rule: replace Inclusion/Exclusion with Möbius' Inversion Formula.

[Figure: the example lattice annotated with μ(·, 1̂): 1 at the top; −1, −1, −1 at the coatoms; 1, 1 below; 0 at the bottom.]

Def. The Möbius function:
μ(1̂, 1̂) = 1
μ(u, 1̂) = − Σ_{u < v ≤ 1̂} μ(v, 1̂)

Möbius' Inversion Formula:
P(Q) = − Σ_{Qi < 1̂} μ(Qi, 1̂) P(Qi)
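The recursive definition is directly executable. Below is a minimal Python sketch (my own illustration — the lattice encoding and names are not from the tutorial) that computes μ(u, 1̂) on the running example lattice and applies the inversion formula. Note that the bottom element gets μ = 0, so its possibly-#P-hard probability is never actually needed:

```python
from functools import lru_cache

# The example lattice, encoded by its strict order: for each element,
# the set of elements strictly above it. "Q" plays the role of the top 1̂.
above = {
    "Q": set(),
    "Q1": {"Q"}, "Q2": {"Q"}, "Q3": {"Q"},
    "Q2^Q3": {"Q2", "Q3", "Q"},
    "Q1^Q2": {"Q1", "Q2", "Q"},
    "Q1^Q2^Q3": {"Q1", "Q2", "Q3", "Q2^Q3", "Q1^Q2", "Q"},
}

@lru_cache(maxsize=None)
def mobius(u):
    """mu(u, 1̂): 1 at the top, else minus the sum over all v with u < v <= 1̂."""
    if not above[u]:                      # u is the top element 1̂
        return 1
    return -sum(mobius(v) for v in above[u])

def p_query(probs):
    """Möbius inversion: P(Q) = - sum over Qi < 1̂ of mu(Qi, 1̂) * P(Qi)."""
    return -sum(mobius(u) * probs[u] for u in above if above[u])
```

Because μ of the bottom element is 0 here, `p_query` returns the same answer whatever value `probs` assigns to `"Q1^Q2^Q3"` — this is exactly why the Möbius rule is strictly better than plain inclusion/exclusion on this lattice.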
![Page 237: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/237.jpg)
The Dichotomy Theorem
104
Dichotomy Theorem [Dalvi'12]. Fix a positive CNF query Q.
1. If Q is liftable, then P(Q) is in PTIME (obviously).
2. If Q is not liftable, then P(Q) is #P-complete.
Note 1: for the theorem to hold, one must replace the inclusion/exclusion rule with the Möbius rule.
Note 2: the original formulation is for UCQ; it holds for positive CNF-FO by duality. Tutorial UAI 2014
![Page 238: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/238.jpg)
Discussion
• This answers Question 1: lifted inference rules are complete for positive CNF-FO.
• Beyond positive CNF-FO?
– See poster on Saturday.
– Take-away: rules + resolution are conjectured to be complete for CNF-FO; there is strong evidence that no complete set of rules exists for full FO.
Tutorial UAI 2014 105
![Page 239: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/239.jpg)
2. Are lifted rules stronger than grounded?

Alternative to lifting:
1. Ground the FO sentence.
2. Do WMC on the propositional formula.

Symmetric WFOMC: grounded WMC does not use symmetries. Query H0 is:
• liftable on symmetric spaces,
• #P-hard on asymmetric spaces.

Asymmetric WFOMC: query QW is in PTIME, yet:
• DPLL-based search takes exponential time,
• Decision-DNNFs have exponential size.
![Page 240: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/240.jpg)
Symmetric WFOMC
H0 = ∀x∀y (Smoker(x)∨Friend(x,y)∨Jogger(y))
We have seen that H0 is #P-hard (over asymmetric spaces!)
But over symmetric spaces it can be lifted:
Lifted inference is strictly more powerful than grounded inference
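As a concrete sanity check (my own sketch, not the tutorial's algorithm), the symmetric model count of H0 over a domain of size n has a simple lifted closed form: condition on the number a of Smokers and b of Joggers; Friend(x,y) is then forced to true on the (n−a)(n−b) pairs where x is a non-smoker and y is a non-jogger, and is free elsewhere.

```python
from math import comb
from itertools import product

def fomc_H0(n):
    """Lifted model count of ∀x∀y (Smoker(x) ∨ Friend(x,y) ∨ Jogger(y)):
    sum over the number of smokers a and joggers b; the remaining
    n*n - (n-a)*(n-b) Friend atoms are unconstrained."""
    return sum(comb(n, a) * comb(n, b) * 2 ** (n * n - (n - a) * (n - b))
               for a in range(n + 1) for b in range(n + 1))

def brute_force_H0(n):
    """Ground H0 and enumerate all truth assignments (exponential!)."""
    count = 0
    for s in product([0, 1], repeat=n):          # Smoker
        for j in product([0, 1], repeat=n):      # Jogger
            for f in product([0, 1], repeat=n * n):  # Friend
                if all(s[x] or f[x * n + y] or j[y]
                       for x in range(n) for y in range(n)):
                    count += 1
    return count
```

The lifted sum runs in polynomial time in n, while the grounded enumeration blows up immediately — the gap this part of the tutorial is about.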
![Page 241: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/241.jpg)
Symmetric WFOMC
Theorem [V.d.Broeck'14]: every query in FO2 is liftable over symmetric spaces.
FO2 includes H0, and also some quite complex sentences, such as:
Q = ∀x∀y∀z∀u∀v (Friend(x,y) ∨ Enemy(y,z) ∨ Friend(z,u) ∨ Enemy(u,v))
  = ∀x∀y (Friend(x,y) ∨ ∀x (Enemy(y,x) ∨ ∀y (Friend(x,y) ∨ ∀x Enemy(y,x))))
![Page 242: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/242.jpg)
Asymmetric WFOMC
• Lifted inference no longer has a fundamental reason to be stronger than grounded WMC.
• However, we can prove that lifted inference is stronger than the WMC algorithms used in practice today:
– DPLL search (with caching; with components)
– Decision-DNNF
![Page 243: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/243.jpg)
Basic DPLL
Davis, Putnam, Logemann, Loveland [Davis'60, '62]
Assume uniform distribution for simplicity.

// basic DPLL:
Function P(F):
  if F = false then return 0
  if F = true then return 1
  select a variable x, return
    ½ P(F_{x=0}) + ½ P(F_{x=1})
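The pseudocode translates almost line for line into Python. In this sketch (the encoding is my own, not the tutorial's) a CNF is a list of frozensets of signed integer literals, and exact rationals avoid rounding; F below is the slide's running example with polarities reconstructed so that the branch subformulas and probabilities shown later work out:

```python
from fractions import Fraction

def condition(cnf, lit):
    """Set literal `lit` true: drop satisfied clauses, shrink the rest."""
    out = []
    for clause in cnf:
        if lit in clause:
            continue                      # clause satisfied
        out.append(clause - {-lit})       # falsified literal removed
    return out

def p(cnf):
    """Basic DPLL: probability of the CNF under the uniform distribution."""
    if any(not c for c in cnf):           # an empty clause -> false
        return Fraction(0)
    if not cnf:                           # no clauses left -> true
        return Fraction(1)
    x = abs(next(iter(next(iter(cnf)))))  # select a variable
    return Fraction(1, 2) * p(condition(cnf, -x)) + \
           Fraction(1, 2) * p(condition(cnf, x))

# F = (x∨y) ∧ (x∨u∨w) ∧ (¬x∨u∨¬w∨z), with x,y,u,w,z numbered 1..5
F = [frozenset({1, 2}), frozenset({1, 3, 4}), frozenset({-1, 3, -4, 5})]
```

Running `p(F)` reproduces the slide's computation: ½·P(y∧(u∨w)) + ½·P(u∨¬w∨z) = ½·3/8 + ½·7/8 = 5/8.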
![Page 244: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/244.jpg)
Basic DPLL
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
![Page 246: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/246.jpg)
Basic DPLL
Branching on x:
F_{x=0} = y ∧ (u ∨ w)
F_{x=1} = u ∨ ¬w ∨ z
![Page 247: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/247.jpg)
Basic DPLL
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
[Figure: the full DPLL trace — a decision tree branching on x, y, u, w, z with 0/1 leaves; intermediate subformulas include y∧(u∨w), u∨¬w∨z, u∨w, and w.]
![Page 248: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/248.jpg)
Basic DPLL
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
Evaluating the tree bottom-up:
P(u∨w) = ¾,  P(¬w∨z) = ¾
P(F_{x=0}) = P(y ∧ (u∨w)) = ½ · ¾ = 3/8
P(F_{x=1}) = P(u ∨ ¬w ∨ z) = 7/8
P(F) = ½ · 3/8 + ½ · 7/8 = 5/8
![Page 249: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/249.jpg)
Basic DPLL
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
[Figure: the same decision tree.]
The trace is a Decision-Tree for F.
![Page 250: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/250.jpg)
Caching
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
[Figure: the decision tree; subformulas such as u∨w and w recur in several branches.]

// basic DPLL:
Function P(F):
  if F = false then return 0
  if F = true then return 1
  select a variable x, return
    ½ P(F_{x=0}) + ½ P(F_{x=1})

// DPLL with caching:
Cache F and P(F);
look it up before computing.
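The caching rule can be sketched by memoizing on the simplified CNF itself. In my own encoding (not the tutorial's code), a CNF is a hashable frozenset of frozensets of signed literals, so it can serve directly as a cache key:

```python
from fractions import Fraction

def condition(cnf, lit):
    """Set literal `lit` true: drop satisfied clauses, shrink the rest."""
    return frozenset(c - {-lit} for c in cnf if lit not in c)

def p_cached(cnf, cache=None):
    """DPLL with caching: cache F and P(F); look it up before computing."""
    if cache is None:
        cache = {}
    if any(not c for c in cnf):           # empty clause -> false
        return Fraction(0)
    if not cnf:                           # no clauses -> true
        return Fraction(1)
    if cnf in cache:                      # seen this subformula before?
        return cache[cnf]
    x = abs(next(iter(next(iter(cnf)))))  # select a variable
    result = Fraction(1, 2) * p_cached(condition(cnf, -x), cache) + \
             Fraction(1, 2) * p_cached(condition(cnf, x), cache)
    cache[cnf] = result
    return result

# F = (x∨y) ∧ (x∨u∨w) ∧ (¬x∨u∨¬w∨z), with x,y,u,w,z numbered 1..5
F = frozenset({frozenset({1, 2}), frozenset({1, 3, 4}),
               frozenset({-1, 3, -4, 5})})
```

Whenever two branches reach the same residual subformula, the second branch is a cache hit — exactly the sharing that turns the trace from a tree into a DAG.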
![Page 253: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/253.jpg)
Caching
[Figure: with caching, each repeated subformula is computed only once; the trace shrinks from a tree to a DAG.]
![Page 254: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/254.jpg)
Caching & FBDDs
[Figure: the decision-DAG for F.]
The trace is a decision-DAG for F.
![Page 257: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/257.jpg)
Caching & FBDDs
[Figure: the decision-DAG for F.]
The trace is a decision-DAG for F:
an FBDD (Free Binary Decision Diagram), also known as a ROBP (Read-Once Branching Program).
• Every variable is tested at most once on any path.
• All internal nodes are decision nodes.
![Page 258: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/258.jpg)
Component Analysis
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
[Figure: the decision-DAG; the subformula y ∧ (u∨w) splits into variable-disjoint parts.]

// basic DPLL:
Function P(F):
  if F = false then return 0
  if F = true then return 1
  select a variable x, return
    ½ P(F_{x=0}) + ½ P(F_{x=1})

// DPLL with component analysis (and caching):
if F = G ∧ H
where G and H have disjoint sets of variables:
  P(F) = P(G) × P(H)
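The component rule P(G ∧ H) = P(G) × P(H) drops into the same style of sketch (again my own encoding, not the tutorial's code): after the base cases, partition the clauses into variable-disjoint groups and multiply their probabilities.

```python
from fractions import Fraction

def condition(cnf, lit):
    """Set literal `lit` true: drop satisfied clauses, shrink the rest."""
    return frozenset(c - {-lit} for c in cnf if lit not in c)

def components(cnf):
    """Partition clauses into groups with pairwise-disjoint variable sets."""
    groups = []                                # list of (vars, clauses)
    for clause in cnf:
        vs = {abs(l) for l in clause}
        merged_vars, merged_clauses, rest = set(vs), [clause], []
        for gv, gc in groups:
            if gv & vs:                        # shares a variable: merge
                merged_vars |= gv
                merged_clauses += gc
            else:
                rest.append((gv, gc))
        groups = rest + [(merged_vars, merged_clauses)]
    return [frozenset(gc) for _, gc in groups]

def p(cnf):
    """DPLL with component analysis: P(G ∧ H) = P(G) × P(H) for disjoint G, H."""
    if any(not c for c in cnf):
        return Fraction(0)
    if not cnf:
        return Fraction(1)
    comps = components(cnf)
    if len(comps) > 1:                         # independent components
        result = Fraction(1)
        for g in comps:
            result *= p(g)
        return result
    x = abs(next(iter(next(iter(cnf)))))
    return Fraction(1, 2) * p(condition(cnf, -x)) + \
           Fraction(1, 2) * p(condition(cnf, x))

# F = (x∨y) ∧ (x∨u∨w) ∧ (¬x∨u∨¬w∨z), with x,y,u,w,z numbered 1..5
F = frozenset({frozenset({1, 2}), frozenset({1, 3, 4}),
               frozenset({-1, 3, -4, 5})})
```

On the running example, conditioning on x = 0 leaves y ∧ (u∨w), which splits into the two components y and u∨w — the decomposable AND-node in the Decision-DNNF trace.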
![Page 260: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/260.jpg)
Components & Decision-DNNF
F = (x ∨ y) ∧ (x ∨ u ∨ w) ∧ (¬x ∨ u ∨ ¬w ∨ z)
[Figure: the trace with an AND-node splitting F_{x=0} = y ∧ (u∨w) into the independent components y and u∨w.]
![Page 261: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/261.jpg)
Components & Decision-DNNF
[Figure: the same trace, with decision nodes and a decomposable AND-node marked.]
The trace is a Decision-DNNF [Huang'05, '07]:
an FBDD + "decomposable" AND-nodes
(the two sub-DAGs of an AND-node do not share variables).
![Page 262: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/262.jpg)
New Queries From Hk
Consider the k+1 clauses that form Hk:
Hk0 = ∀x0∀y0 (R(x0) ∨ S1(x0,y0))
Hk1 = ∀x1∀y1 (S1(x1,y1) ∨ S2(x1,y1))
Hk2 = ∀x2∀y2 (S2(x2,y2) ∨ S3(x2,y2))
…
Hkk = ∀xk∀yk (Sk(xk,yk) ∨ T(yk))
Tutorial UAI 2014 115
![Page 263: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/263.jpg)
Asymmetric WFOMC
Theorem [Beame'14]. If the query Q is any Boolean combination of the formulas Hk0, …, Hkk, then:
• any DPLL-based algorithm takes Ω(2^√n) time;
• any Decision-DNNF has Ω(2^√n) nodes.
For example, QW is a Boolean combination of H30, H31, H32, H33:
liftable (hence PTIME), yet grounded WMC takes exponential time.
Tutorial UAI 2014 116
![Page 264: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/264.jpg)
Discussion
• This answers Question 2: there exist queries that (a) are liftable, and (b) on which grounded algorithms like DPLL search or Decision-DNNF take exponential time.
• Perhaps there are more powerful grounded algorithms? We don't know. Open problem: do d-DNNFs compute these queries in PTIME?
Tutorial UAI 2014 117
![Page 265: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/265.jpg)
Möbius Über Alles
[Figure [Suciu'11]: a hierarchy of query classes — Read-Once ⊂ poly-size OBDD ⊂ poly-size FBDD/Decision-DNNF ⊂ poly-size d-DNNF ⊂ PTIME, with #P-hard outside; example queries QJ, QV, QW, QU, Q9, H0–H3 placed in the hierarchy; hierarchical vs. non-hierarchical queries distinguished; some separations conjectured or open.]
![Page 266: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/266.jpg)
Outline
• Part 1: Motivation
• Part 2: Probabilistic Databases
• Part 3: Weighted Model Counting
• Part 4: Lifted Inference for WFOMC
• Part 5: The Power of Lifted Inference
• Part 6: Conclusion/Open Problems
Tutorial UAI 2014 119
![Page 267: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/267.jpg)
Summary
• Relational models = the vast majority of data today, plus probabilistic databases
• Weighted Model Counting = Uniform
approach to Probabilistic Inference
• Lifted Inference = really simple rules
• The Power of Lifted Inference = we can
prove that lifted inference is better
Tutorial UAI 2014 120
![Page 268: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/268.jpg)
Lifted Algorithms (in the AI community)
Exact Probabilistic Inference:
• First-Order Variable Elimination [Poole'03, Braz'05, Milch'08, Taghipour'13]
• First-Order Knowledge Compilation [V.d.Broeck'11a, '11b, '12a, '13a]
• Probabilistic Theorem Proving [Gogate'11]
Approximate Probabilistic Inference:
• Lifted Belief Propagation [Jaimovich'07, Singla'08, Kersting'09]
• Lifted Bisimulation/Mini-buckets [Sen'08, '09]
• Lifted Importance Sampling [Gogate'11, '12]
• Lifted Relax, Compensate & Recover [V.d.Broeck'12b]
• Lifted MCMC [Niepert'12, '13, Venugopal'12]
• Lifted Variational Inference [Choi'12, Bui'12]
• Lifted MAP-LP [Mladenov'14, Apsel'14]
Special-Purpose Inference:
• Lifted Kalman Filter [Ahmadi'11, Choi'11]
• Lifted Linear Programming [Mladenov'12]
[Kimmig'14]
![Page 269: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/269.jpg)
"But my application has no symmetries?"
1. Statistical relational models have abundant symmetries.
2. Some tasks do not require symmetries in the data: weight learning, partition functions, single marginals, etc.
3. Symmetries of computation are not symmetries of data: belief propagation and MAP-LP require weaker automorphisms.
4. Over-symmetric approximations: approximate P(Q|DB) by P(Q|DB'), where DB' has more symmetries than DB (is more liftable). Very high speed improvements, low approximation error.
[Kersting'09, Mladenov'14, V.d.Broeck'13b]
![Page 270: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/270.jpg)
Open Problems
Symmetric spaces:
• Prove hardness for ANY lifted inference task. Likely needed: #P1-hardness.
• Are lifted inference rules complete beyond FO2?
Asymmetric spaces:
• Prove completeness for CNF FO formulas.
• Extend lifted inference algorithms beyond liftable formulas (approximations needed).
• Find a measure of complexity as a function of the FO formula AND the database D (e.g., if D has bounded treewidth, then inference is tractable).
![Page 271: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/271.jpg)
Final Thoughts
Long-term outlook: probabilistic inference exploits
• 1988: conditional independence
• 2000: contextual independence (local structure)
• 201?: exchangeability/symmetries — need lifted inference!
![Page 272: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/272.jpg)
Thank You!
Questions?
![Page 273: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/273.jpg)
"Probabilistic Inference Inside!" [Suciu'11]
StarAI Workshop @ AAAI on Sunday
![Page 274: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/274.jpg)
References
[Gartner'06] Gartner (2006). Market Share: Relational Database Management Systems by Operating System, Worldwide.
[Carlson'10] Carlson, A., Betteridge, J., Kisiel, B., Settles, B., Hruschka Jr, E. R., & Mitchell, T. M. (2010). Toward an Architecture for Never-Ending Language Learning. In AAAI (Vol. 5, p. 3).
[Dong'14] Dong, X. L., Murphy, K., Gabrilovich, E., Heitz, G., Horn, W., Lao, N., Strohmann, T., Sun, S., & Zhang, W. (2014). Knowledge Vault: A Web-scale approach to probabilistic knowledge fusion.
[Niu'12] Niu, F., Zhang, C., Ré, C., & Shavlik, J. W. (2012). DeepDive: Web-scale Knowledge-base Construction using Statistical Learning and Inference. In VLDS (pp. 25-28).
[Richardson'06] Richardson, M., & Domingos, P. (2006). Markov logic networks. Machine Learning, 62(1-2), 107-136.
[Niepert'14] Niepert, M., & Van den Broeck, G. (2014). Tractability through exchangeability: A new perspective on efficient probabilistic inference. Proceedings of AAAI.
[Vardi'82] Vardi, M. Y. (1982). The complexity of relational query languages. In Proceedings of the fourteenth annual ACM symposium on Theory of Computing (pp. 137-146). ACM.
![Page 275: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/275.jpg)
References
[Suciu'11] Suciu, D., Olteanu, D., Ré, C., & Koch, C. (2011). Probabilistic Databases. Synthesis Lectures on Data Management, 3(2), 1-180.
[Olteanu'08] Olteanu, D., & Huang, J. (2008). Using OBDDs for efficient query evaluation on probabilistic databases. In Scalable Uncertainty Management (pp. 326-340). Springer Berlin Heidelberg.
[Jha'13] Jha, A., & Suciu, D. (2013). Knowledge compilation meets database theory: compiling queries to decision diagrams. Theory of Computing Systems, 52(3), 403-440.
[Dalvi'04] Dalvi, N., & Suciu, D. (2004). Efficient query evaluation on probabilistic databases. In VLDB.
[V.d.Broeck'11a] Van den Broeck, G., Taghipour, N., Meert, W., Davis, J., & De Raedt, L. (2011). Lifted probabilistic inference by first-order knowledge compilation. In Proceedings of IJCAI (pp. 2178-2185). AAAI Press.
[Dalvi'12] Dalvi, N., & Suciu, D. (2012). The dichotomy of probabilistic inference for unions of conjunctive queries. Journal of the ACM, 59(6), 30.
![Page 276: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/276.jpg)
References
[Chavira'05] Chavira, M., & Darwiche, A. (2005). Compiling Bayesian networks with local structure. In IJCAI (Vol. 5, pp. 1306-1312).
[Sang'05] Sang, T., Beame, P., & Kautz, H. A. (2005). Performing Bayesian inference by weighted model counting. In AAAI (Vol. 5, pp. 475-481).
[Chavira'08] Chavira, M., & Darwiche, A. (2008). On probabilistic inference by weighted model counting. Artificial Intelligence, 172(6), 772-799.
[Choi'13] Choi, A., Kisa, D., & Darwiche, A. (2013). Compiling probabilistic graphical models using sentential decision diagrams. In Symbolic and Quantitative Approaches to Reasoning with Uncertainty (pp. 121-132). Springer Berlin Heidelberg.
[Chavira'06] Chavira, M., Darwiche, A., & Jaeger, M. (2006). Compiling relational Bayesian networks for exact inference. International Journal of Approximate Reasoning, 42(1), 4-20.
[Fierens'11] Fierens, D., Van den Broeck, G., Thon, I., Gutmann, B., & De Raedt, L. (2011). Inference in probabilistic logic programs using weighted CNFs. In UAI.
![Page 277: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/277.jpg)
References
[Fierens'13] Fierens, D., Van den Broeck, G., Renkens, J., Shterionov, D., Gutmann, B., Thon, I., ... & De Raedt, L. (2013). Inference and learning in probabilistic logic programs using weighted Boolean formulas. Theory and Practice of Logic Programming, 1-44.
[Gogate'11] Gogate, V., & Domingos, P. (2011). Probabilistic theorem proving. Proceedings of UAI.
[V.d.Broeck'13a] Van den Broeck, G. (2013). Lifted Inference and Learning in Statistical Relational Models. Doctoral dissertation, KU Leuven.
[V.d.Broeck'14] Van den Broeck, G., Meert, W., & Darwiche, A. (2014). Skolemization for weighted first-order model counting. Proceedings of KR.
[Gribkoff'14] Gribkoff, E., Van den Broeck, G., & Suciu, D. (2014). Understanding the Complexity of Lifted Inference and Asymmetric Weighted Model Counting. Proceedings of UAI.
[Jha'12] Jha, A., & Suciu, D. (2012). Probabilistic databases with MarkoViews. Proceedings of the VLDB Endowment, 5(11), 1160-1171.
![Page 278: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/278.jpg)
References
[V.d.Broeck'11b] Van den Broeck, G. (2011). On the completeness of first-order knowledge compilation for lifted probabilistic inference. In Advances in Neural Information Processing Systems (pp. 1386-1394).
[Jaeger'12] Jaeger, M., & Van den Broeck, G. (2012). Liftability of probabilistic inference: Upper and lower bounds. In Proceedings of the 2nd International Workshop on Statistical Relational AI.
[Cook'71] Cook, S. A. (1971). The complexity of theorem-proving procedures. In Proceedings of the third annual ACM symposium on Theory of Computing (pp. 151-158). ACM.
[Valiant'79] Valiant, L. G. (1979). The complexity of computing the permanent. Theoretical Computer Science, 8(2), 189-201.
[Provan'83] Provan, J. S., & Ball, M. O. (1983). The complexity of counting cuts and of computing the probability that a graph is connected. SIAM Journal on Computing, 12(4), 777-788.
[Stanley'97] Stanley, R. P. (1997). Enumerative Combinatorics. Cambridge University Press.
![Page 279: Lifted Probabilistic Inference in Relational Modelsweb.cs.ucla.edu/~guyvdb/slides/UAI14.pdf–a logical approach to Lifted Inference • The tutorial is NOT an exhaustive overview](https://reader036.vdocuments.us/reader036/viewer/2022071401/60ea64bd69da5a21a8382e9f/html5/thumbnails/279.jpg)
References
[Davis‟60]
Davis, M., & Putnam, H. (1960). A computing procedure for quantification theory. Journal of the ACM (JACM), 7(3), 201-215.
[Davis'62] Davis, M., Logemann, G., & Loveland, D. (1962). A machine program for theorem-proving. Communications of the ACM, 5(7), 394-397.
[Huang'05] Huang, J., & Darwiche, A. (2005, July). DPLL with a trace: From SAT to knowledge compilation. In IJCAI (Vol. 5, pp. 156-162).
[Huang'07] Huang, J., & Darwiche, A. (2007). The Language of Search. Journal of Artificial Intelligence Research (JAIR), 29, 191-219.
[Beame'14] Beame, P., Li, J., Roy, S., & Suciu, D. (2014). Counting of Query Expressions: Limitations of Propositional Methods. In ICDT (pp. 177-188).
[Poole'03] Poole, D. (2003, August). First-order probabilistic inference. In IJCAI (Vol. 3, pp. 985-991).
[Braz'05] Braz, R., Amir, E., & Roth, D. (2005, July). Lifted first-order probabilistic inference. In Proceedings of the 19th international joint conference on Artificial intelligence (pp. 1319-1325).
[Milch'08] Milch, B., Zettlemoyer, L. S., Kersting, K., Haimes, M., & Kaelbling, L. P. (2008, July). Lifted Probabilistic Inference with Counting Formulas. In AAAI (Vol. 8, pp. 1062-1068).
[Taghipour'13] Taghipour, N., Fierens, D., Davis, J., & Blockeel, H. (2013). Lifted variable elimination: Decoupling the operators from the constraint language. Journal of Artificial Intelligence Research (JAIR).
[V.d.Broeck'12a] Van den Broeck, G., & Davis, J. (2012, July). Conditioning in First-Order Knowledge Compilation and Lifted Probabilistic Inference. In AAAI.
[Jaimovich'07] Jaimovich, A., Meshi, O., & Friedman, N. (2007). Template based inference in symmetric relational Markov random fields. Proceedings of Uncertainty in AI.
[Singla'08] Singla, P., & Domingos, P. (2008, July). Lifted First-Order Belief Propagation. In AAAI (Vol. 8, pp. 1094-1099).
[Kersting'09] Kersting, K., Ahmadi, B., & Natarajan, S. (2009, June). Counting belief propagation. In Proceedings of the Twenty-Fifth Conference on Uncertainty in Artificial Intelligence (pp. 277-284). AUAI Press.
[Sen'08] Sen, P., Deshpande, A., & Getoor, L. (2008). Exploiting shared correlations in probabilistic databases. Proceedings of the VLDB Endowment, 1(1), 809-820.
[Sen'09] Sen, P., Deshpande, A., & Getoor, L. (2009). Bisimulation-based approximate lifted inference. In Proceedings of the Twenty-Fifth Conference on Uncertainty in Artificial Intelligence (pp. 496-505). AUAI Press.
[Gogate'12] Gogate, V., Jha, A. K., & Venugopal, D. (2012). Advances in Lifted Importance Sampling. In AAAI.
[V.d.Broeck'12b] Van den Broeck, G., Choi, A., & Darwiche, A. (2012). Lifted relax, compensate and then recover: From approximate to exact lifted probabilistic inference. Proceedings of Uncertainty in AI.
[Niepert'12] Niepert, M. (2012). Markov chains on orbits of permutation groups. Proceedings of Uncertainty in AI.
[Niepert'13] Niepert, M. (2013). Symmetry-Aware Marginal Density Estimation. Proceedings of AAAI.
[Venugopal'12] Venugopal, D., & Gogate, V. (2012). On lifting the Gibbs sampling algorithm. In Advances in Neural Information Processing Systems (pp. 1655-1663).
[Choi'12] Choi, J., & Amir, E. (2012). Lifted relational variational inference. Proceedings of Uncertainty in AI.
[Bui'12] Bui, H. H., Huynh, T. N., & Riedel, S. (2012). Automorphism groups of graphical models and lifted variational inference. StarAI.
[Mladenov'14] Mladenov, M., Kersting, K., & Globerson, A. (2014). Efficient Lifting of MAP LP Relaxations Using k-Locality. In Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics (pp. 623-632).
[Apsel'14] Apsel, U., Kersting, K., & Mladenov, M. (2014). Lifting Relational MAP-LPs using Cluster Signatures. Proceedings of AAAI.
[Ahmadi'11] Ahmadi, B., Kersting, K., & Sanner, S. (2011, July). Multi-evidence lifted message passing, with application to PageRank and the Kalman filter. In IJCAI Proceedings-International Joint Conference on Artificial Intelligence (Vol. 22, No. 1, p. 1152).
[Choi'11] Choi, J., Guzman-Rivera, A., & Amir, E. (2011, June). Lifted Relational Kalman Filtering. In IJCAI (pp. 2092-2099).
[Mladenov'12] Mladenov, M., Ahmadi, B., & Kersting, K. (2012). Lifted linear programming. In International Conference on Artificial Intelligence and Statistics (pp. 788-797).
[Kimmig'14] Kimmig, A., Mihalkova, L., & Getoor, L. (2014). Lifted graphical models: a survey. Machine Learning, 1-45.
[V.d.Broeck'13b] Van den Broeck, G., & Darwiche, A. (2013). On the complexity and approximation of binary evidence in lifted inference. In Advances in Neural Information Processing Systems (pp. 2868-2876).