
Page 1:

Approximating Probabilistic Inference

Kuldeep S. Meel

PhD Student CAVR Group

Joint work with Supratik Chakraborty (IITB), Daniel J. Fremont (UCB), Sanjit A. Seshia (UCB), Moshe Y. Vardi (Rice)

1

Page 2:

IoT: Internet of Things

2

Page 3:

The Era of Data

How do we make inferences from data?

3

Page 4:

Probabilistic Inference

Given that Mary (aged 65) called 911, what is the probability of a burglary in the house?

Pr [event|evidence]

4

Page 5:

Probabilistic Inference

Given that Mary (aged 65) called 911, what is the probability of a burglary in the house?

Pr [event|evidence]

5

Page 6:

Probabilistic Inference

Given that Mary (aged 65) called 911, what is the probability of a burglary in the house?

Pr [event|evidence]

6

Page 7:

Graphical Models

Bayesian Networks

7

Page 8:

[Bayesian network figure: Burglary (B) and Earthquake (E) are parents of Alarm (A); Alarm is the parent of MaryWakes (M); MaryWakes and PhoneWorking (P) are parents of Call (C).]

8

Page 9:

[Same Bayesian network figure: Burglary (B), Earthquake (E), Alarm (A), MaryWakes (M), PhoneWorking (P), Call (C).]

9

Page 10:

[The same Bayesian network, now annotated with conditional probability tables (CPTs):]

Prior CPTs (one per root node B, E, P):
T 0.05 / F 0.95    T 0.01 / F 0.99    T 0.8 / F 0.2

CPT over B, E, A:
B E A Pr
T T T 0.88
T T F 0.12
T F T 0.91
T F F 0.09
F T T 0.97
F T F 0.03
F F T 0.02
F F F 0.98

CPT over A, M:
A M Pr
T T 0.7
T F 0.3
F T 0.1
F F 0.9

CPT over M, P, C:
M P C Pr
T T T 0.99
T T F 0.01
T F T 0
T F F 1
F T T 0
F T F 1
F F T 0.1
F F F 0.9

What is Pr[Burglary | Call]?

10

Page 11:

Bayes’ Rule to the Rescue

Pr[Burglary ∧ Call] = Pr[B, E, A, M, P, C] + Pr[B, ¬E, A, M, P, C] + · · ·

Pr[Burglary | Call] = Pr[Burglary ∧ Call] / Pr[Call]

11

Page 12:

[The Bayesian network of Page 8, with the CPTs of Page 10 attached to the corresponding nodes.]

Pr[B, E, A, M, P, C] = Pr[B] · Pr[E] · Pr[A|B,E] · Pr[M|A] · Pr[C|M,P]

12

Page 13:

[Same figure and CPTs, highlighting one term of the sum:]

Pr[B, ¬E, A, M, P, C]

13

Page 14:

So we are done?

• We can enumerate all paths

• Compute the probability of every path

• Sum up all the probabilities (a minimal enumeration sketch follows below)
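To make the enumeration concrete, here is a minimal brute-force sketch in Python (not from the slides). It uses the CPT values from Page 10 and the factorization from Page 12; which of the three priors belongs to Burglary, Earthquake and PhoneWorking is not recoverable from the transcript, so the assignment below is an assumption made only for illustration.

from itertools import product

# CPT values from Page 10; mapping of the three priors to B, E, P is assumed.
pr_b = {True: 0.05, False: 0.95}   # assumed prior for Burglary
pr_e = {True: 0.01, False: 0.99}   # assumed prior for Earthquake
pr_p = {True: 0.8,  False: 0.2}    # assumed prior for PhoneWorking
pr_a = {(True, True): 0.88, (True, False): 0.91,
        (False, True): 0.97, (False, False): 0.02}   # Pr[A = T | B, E]
pr_m = {True: 0.7, False: 0.1}                       # Pr[M = T | A]
pr_c = {(True, True): 0.99, (True, False): 0.0,
        (False, True): 0.0, (False, False): 0.1}     # Pr[C = T | M, P]

def bern(p_true, value):            # Pr[X = value] given Pr[X = T] = p_true
    return p_true if value else 1.0 - p_true

def joint(b, e, a, m, p, c):
    # Pr[B,E,A,M,P,C] = Pr[B] · Pr[E] · Pr[A|B,E] · Pr[M|A] · Pr[C|M,P]  (Page 12)
    return (pr_b[b] * pr_e[e] * bern(pr_a[(b, e)], a)
            * bern(pr_m[a], m) * bern(pr_c[(m, p)], c))

# Enumerate all paths consistent with the evidence C = True and sum them up.
pr_call = sum(joint(b, e, a, m, p, True)
              for b, e, a, m, p in product([True, False], repeat=5))
pr_burglary_and_call = sum(joint(True, e, a, m, p, True)
                           for e, a, m, p in product([True, False], repeat=4))
print("Pr[Burglary | Call] =", pr_burglary_and_call / pr_call)

This is fine for six variables, but the number of paths grows as 2^n, which is exactly the catch raised on the next slide.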

Page 15:

Where is the catch?

Exponential number of paths

Page 16:

Prior Work

[Chart with axes Quality/Guarantees and Scalability: BP and MCMC (C ? f) scale well but offer weak guarantees; Exact Methods (C = f) give exact answers but scale poorly.]

16

Page 17:

Our Contribution

[Same chart, with WeightMC added between the two extremes: approximation guarantees with better scalability than exact methods. Labels: BP, MCMC (C ? f); Exact Methods (C = f); WeightMC — Approximation Guarantees.]

17

Page 18:

Approximation Guarantees

Input: F, ε, δ

Output: C such that Pr[ W / (1 + ε) ≤ C ≤ W · (1 + ε) ] ≥ 1 − δ, where W is the exact quantity being approximated

Page 19:

An Idea for a New Paradigm?

Partition the space of paths into "small" and "equal weighted" cells

19

"small": the # of paths in a cell is not large (bounded by a constant)

“equal weighted”: All the cells have equal weight

Page 20:

Outline

• Reduction to SAT

• Weighted Model Counting

• Looking forward

20

Page 21:

Boolean Satisfiability

• SAT: Given a Boolean formula F over variables V, determine whether F is true for some assignment to V

• F = (a ⋁ b)

• RF = {(0,1),(1,0),(1,1)}

• SAT is NP-Complete (Cook 1971)

21

Page 22:

Model Counting

Given:
• CNF formula F, solution space: RF

Problem (MC): What is the total number of satisfying assignments (models), i.e. |RF|?

Example
F = (a ⋁ b); RF = {[0,1], [1,0], [1,1]}
|RF| = 3

Page 23:

Weighted Model Counting

Given:
▪ CNF formula F, solution space: RF
▪ Weight function W(.) over assignments

Problem (WMC): What is the sum of weights of satisfying assignments, i.e. W(RF)?

Example
F = (a ⋁ b); RF = {[0,1], [1,0], [1,1]}
W([0,1]) = W([1,0]) = 1/3; W([1,1]) = W([0,0]) = 1/6

W(RF) = 1/3 + 1/3 + 1/6 = 5/6
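A brute-force check of this example (a small sketch, not part of the talk): enumerate the assignments of F = (a ⋁ b), keep the satisfying ones, and add up their weights.

from itertools import product

def F(a, b):                      # the example formula (a ⋁ b)
    return a or b

def W(a, b):                      # weights from the slide: 1/3 if a ≠ b, else 1/6
    return 1/3 if a != b else 1/6

w_RF = sum(W(a, b) for a, b in product([False, True], repeat=2) if F(a, b))
print(w_RF)                       # 1/3 + 1/3 + 1/6 = 5/6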

Page 24:

Weighted SAT

• Boolean formula F

• Weight function over variables (literals)

• Weight of assignment = product of the weights of its literals

• F = (a ⋁ b); W(a=0) = 0.4; W(a=1) = 1 − 0.4 = 0.6; W(b=0) = 0.3; W(b=1) = 0.7

• W[(0,1)] = W(a=0) · W(b=1) = 0.4 · 0.7 = 0.28
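A short sketch of the literal-weight view (brute force, assuming plain Python): the weight of an assignment is the product of its literal weights, and summing over satisfying assignments gives the weighted count of F.

from itertools import product

w_pos = {"a": 0.6, "b": 0.7}              # W(a=1), W(b=1); negative literals get 1 - w

def weight(assignment):                    # product of literal weights
    w = 1.0
    for var, val in assignment.items():
        w *= w_pos[var] if val else 1.0 - w_pos[var]
    return w

print(weight({"a": False, "b": True}))     # W[(0,1)] = 0.4 · 0.7 = 0.28

models = [{"a": a, "b": b} for a, b in product([False, True], repeat=2) if a or b]
print(sum(weight(m) for m in models))      # weighted count of (a ⋁ b): 0.28 + 0.18 + 0.42 = 0.88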

24

Page 25:

Reduction to W-SAT

Bayesian Network → SAT Formula
Nodes → Variables
Rows of CPT → Variables
Probabilities in CPT → Weights
Event and Evidence → Constraints

25

Page 26:

Reduction to W-SAT

• Every satisfying assignment = a valid path in the network

• Satisfies the constraint (evidence)

• Probability of path = weight of satisfying assignment = product of literal weights = product of conditional probabilities

• Sum of probabilities = weighted sum (a toy end-to-end sketch follows below)
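A toy end-to-end sketch of this reduction (an illustration under assumptions: the two-node network B → A, its probabilities, and the clause-level encoding are made up here and need not match the exact encoding used by the authors). One Boolean variable is introduced per node and per CPT row, CPT probabilities become literal weights, evidence becomes an extra constraint, and brute-force weighted counting then recovers the conditional probability.

from itertools import product

# Hypothetical network B -> A with Pr[B] = 0.05, Pr[A|B] = 0.9, Pr[A|not B] = 0.1.
# Variables: b, a (nodes) and pT, pF (one per row of A's CPT). Literal weights:
w = {("b", True): 0.05, ("b", False): 0.95,
     ("pT", True): 0.9, ("pT", False): 0.1,     # CPT row for B = T
     ("pF", True): 0.1, ("pF", False): 0.9,     # CPT row for B = F
     ("a", True): 1.0, ("a", False): 1.0}       # node A carries unit weight

names = ["b", "a", "pT", "pF"]

def constraints_hold(v):
    # The CNF constraints, written as a predicate: a must agree with the CPT row
    # selected by b, i.e. a <-> (pT if b else pF).
    return v["a"] == (v["pT"] if v["b"] else v["pF"])

def weight(v):
    prod = 1.0
    for var, val in v.items():
        prod *= w[(var, val)]
    return prod

def wmc(evidence=lambda v: True):
    total = 0.0
    for vals in product([False, True], repeat=len(names)):
        v = dict(zip(names, vals))
        if constraints_hold(v) and evidence(v):
            total += weight(v)
    return total

print(wmc())                                   # 1.0: weights of all valid paths sum to 1
pr_a = wmc(lambda v: v["a"])                   # evidence A = True  -> Pr[A] = 0.14
pr_ab = wmc(lambda v: v["a"] and v["b"])       # event and evidence -> Pr[A, B] = 0.045
print(pr_ab / pr_a)                            # Pr[B | A] = 0.045 / 0.14 ≈ 0.321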

26

Page 27:

Why SAT?

• SAT stopped being NP-complete in practice!

• zChaff (Malik, 2001) started the SAT revolution

• SAT solvers follow Moore’s law

27

Page 28:

28

Speed-up of 2012 solver over other solvers

[Bar chart: speed-up on a log scale (1–1000×), per solver: Grasp (2000), zChaff (2001), BerkMin (2002-03), zChaff (2003-04), Siege (2004), Minisat + SatElite (2005), Minisat2 (2006), Rsat + SatElite (2007), Precosat (2009), CryptoMiniSat (2010), Glucose 2.0 (2011), Glucose 2.1 (2012).]

Page 29:

Why SAT?

• SAT stopped being NP-complete in practice!

• zChaff (Malik, 2001) started the SAT revolution

• SAT solvers follow Moore’s law

• “Symbolic Model Checking without BDDs”: most influential paper in the first 20 years of TACAS

• A simple input/output interface

29

Page 30:

Riding the SAT revolution

SAT

Probabilistic Inference

Page 31:

Where is the catch?

• Model counting is very hard (#P-hard)

• #P: harder than the whole polynomial hierarchy

• Exact algorithms do not scale to large formulas

• Approximate counting algorithms do not provide theoretical guarantees

Page 32:

Outline

• Reduction to SAT

• Approximate Weighted Model Counting

• Looking forward

32

Page 33:

Counting through Partitioning

33

Page 34:

Counting through Partitioning

34

Page 35:

Counting through Partitioning

35

Pick a random cell

Estimated weight of solutions = weight of the cell × total # of cells

Page 36:

Approximate Counting

36

[Figure: t independent partitionings yield estimates such as 690, 710, 730, 730, 731, 831, 834, …]

Page 37:

Algorithm

37

[The same t estimates (690, 710, 730, 730, 731, 831, 834, …); the algorithm returns their median, and increasing t scales the confidence.]
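For instance, the median of the displayed estimates is median(690, 710, 730, 730, 731, 831, 834) = 730, and that single number is returned as the estimate; running more independent partitionings (larger t) before taking the median is what scales the confidence 1 − δ.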

Page 38:

How to Partition?

How to partition into roughly equal, small cells of solutions without knowing the distribution of solutions?

3-Universal Hashing [Carter-Wegman 1979, Sipser 1983]

38

Page 39:

XOR-Based Hashing

▪ 3-universal hashing

▪ Partition the 2^n space into 2^m cells

▪ Variables: X1, X2, X3, …, Xn

▪ Pick every variable with prob. ½, XOR them, and equate to 0/1 with prob. ½

▪ X1 ⊕ X3 ⊕ X6 ⊕ … ⊕ Xn−1 = 0

▪ m XOR equations → 2^m cells (a self-contained sketch follows below)

39
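Putting Pages 35–39 together, here is a self-contained toy sketch in Python (an illustration, not the authors' implementation): brute-force enumeration stands in for the SAT solver, the small formula and weight function are made up, and for simplicity a cell counts as "small" when it holds at most pivot solutions (the weighted algorithm instead bounds the normalized cell weight, see Pages 42–43).

import random
from itertools import product
from statistics import median

N = 6                                              # number of variables
def formula(x):                                    # toy F: at least two variables are true
    return sum(x) >= 2

def W(x):                                          # toy weight function over assignments
    return 1.0 / (1 + sum(x))

def random_xor(n):
    # Pick every variable with prob. 1/2, equate the XOR to 0/1 with prob. 1/2.
    return [i for i in range(n) if random.random() < 0.5], random.randint(0, 1)

def in_cell(x, xors):
    return all(sum(x[i] for i in vs) % 2 == rhs for vs, rhs in xors)

def estimate_once(pivot):
    xors = []                                      # m = 0: a single cell (the whole space)
    while True:
        cell = [x for x in product([0, 1], repeat=N)
                if formula(x) and in_cell(x, xors)]
        if len(cell) <= pivot:                     # the random cell is "small": stop
            return sum(W(x) for x in cell) * 2 ** len(xors)   # cell weight × # of cells
        xors.append(random_xor(N))                 # one more XOR equation: twice as many cells

def count_by_partitioning(pivot=8, t=9):
    # t independent partitionings; the median scales the confidence.
    return median(estimate_once(pivot) for _ in range(t))

exact = sum(W(x) for x in product([0, 1], repeat=N) if formula(x))
print("exact:", exact, "estimate:", count_by_partitioning())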

Page 40:

Counting through Partitioning

40

Page 41:

Partitioning

How large should the cells be?

41

Page 42:

Size of cell

▪ Too large => hard to enumerate

▪ Too small => variance can be very high

▪ Tighter bounds => larger cells

42

pivot = 5(1 + 1/ε)²
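For example, with ε = 0.8 (the 80% tolerance allowed in the experiments on Page 50), pivot = 5 · (1 + 1/0.8)² = 5 · 2.25² ≈ 25.3, so each cell only ever needs to be enumerated up to a small constant bound.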

Page 43:

Dependence on distribution

▪ Normalized weight of a solution y = W(y)/Wmax

▪ Maximum weight of a cell = pivot

▪ Maximum # of solutions in a cell = pivot · Wmax/Wmin

▪ Tilt = Wmax/Wmin

43
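As a worked example with the weights from Page 23 (W([0,1]) = W([1,0]) = 1/3 and W([1,1]) = 1/6 over the satisfying assignments of (a ⋁ b)): Wmax = 1/3, Wmin = 1/6, so the tilt is (1/3)/(1/6) = 2.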

Page 44:

Strong Theoretical Guarantees

▪ Approximation: WeightMC(B, ε, δ) returns C such that
Pr[ W / (1 + ε) ≤ C ≤ W · (1 + ε) ] ≥ 1 − δ

▪ Complexity: the # of calls to the SAT solver is linear in ρ and log(1/δ), and polynomial in |F| and 1/ε

44

Page 45:

Handling Large Tilt

45

Tilt: 992

[Figure: one solution has weight .992 while the rest have weights .001–.002, so Wmax/Wmin = 992.]

Page 46:

Handling Large Tilt

46

Tilt: 992; tilt within each region: 2

[Figure: the same solutions, now grouped into weight regions such as .001 ≤ wt < .002 and .50 ≤ wt < 1.]

Restricting solutions to a weight region requires a pseudo-Boolean solver, but it remains a SAT (decision) problem, not optimization.
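A small sketch of the divide-and-conquer step (an assumption-laden illustration; the exact region boundaries used by the tool may differ): split the weight range [Wmin, Wmax] into factor-of-2 regions so that the tilt within each region is at most 2, count each region separately with a pseudo-Boolean constraint restricting solution weights to that region, and add the results.

w_min, w_max = 0.001, 0.992            # tilt = w_max / w_min = 992, as in the figure
regions = []
lo = w_min
while lo < w_max:
    regions.append((lo, min(2 * lo, w_max)))   # each region has tilt <= 2
    lo *= 2

print(len(regions))                    # ceil(log2(992)) = 10 regions
print(regions[0], regions[-1])         # roughly (0.001, 0.002) ... (0.512, 0.992)

The number of regions grows with log of the tilt, which is consistent with the log ρ dependence on Page 48.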

Page 47:

Main Contributions

▪ Novel parameter, tilt (ρ), to characterize complexity

▪ ρ = Wmax / Wmin over satisfying assignments

▪ Small tilt (ρ): efficient hashing-based technique requires only a SAT solver

▪ Large tilt (ρ): divide-and-conquer using a pseudo-Boolean solver

47

Page 48:

Strong Theoretical Guarantees

▪ Approximation: WeightMC(B, ε, δ) returns C such that
Pr[ W / (1 + ε) ≤ C ≤ W · (1 + ε) ] ≥ 1 − δ

▪ Complexity: the # of calls to the SAT solver is linear in log ρ and log(1/δ), and polynomial in |F| and 1/ε

48

Page 49:

Significantly Faster than SDD

49

[Bar chart: run time in seconds (log scale, 0.25–65536) for WeightMC vs. SDD, per benchmark, ordered by number of variables; benchmarks include or-50, or-60, ISCAS-style circuits (s526, s953, s1196, s1238 variants), Squaring1, Squaring7, LoginService2, Sort, Karatsuba, EnqueueSeq, TreeMax and LLReverse.]

Page 50:

Mean Error: 4% (Allowed: 80%)

50

[Plot: weighted counts (log scale, 0.001–100000) for about 25 benchmarks, comparing WeightMC against the bounds SDD×1.8 and SDD/1.8.]

Page 51:

Outline

• Reduction to SAT

• Weighted Model Counting

• Looking forward

51

Page 52:

Distribution-Aware Sampling

Given:
▪ CNF formula F, solution space: RF
▪ Weight function W(.) over assignments

Problem (Sampling): Pr(solution y is generated) = W(y) / W(RF)

Example: F = (a ⋁ b); RF = {[0,1], [1,0], [1,1]}
W([0,1]) = W([1,0]) = 1/3; W([1,1]) = W([0,0]) = 1/6
Pr([0,1] is generated) = (1/3) / (5/6) = 2/5
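A brute-force sketch of the defining distribution (an illustration only; the talk's sampler, WtGen on Pages 55–56, uses hashing rather than enumeration):

import random
from itertools import product

models = [(a, b) for a, b in product([0, 1], repeat=2) if a or b]   # R_F of (a ⋁ b)
wt = {(0, 1): 1/3, (1, 0): 1/3, (1, 1): 1/6}

def ideal_sample():
    # Pr[y is generated] = W(y) / W(R_F), e.g. Pr[(0,1)] = (1/3)/(5/6) = 2/5
    return random.choices(models, weights=[wt[m] for m in models], k=1)[0]

counts = {m: 0 for m in models}
for _ in range(10_000):
    counts[ideal_sample()] += 1
print(counts)                        # roughly in the ratio 2/5 : 2/5 : 1/5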

52

Page 53:

53

Partitioning into equal (weighted) “small” cells

Page 54:

54

Pick a random cell

Pick a solution according to its weight

Partitioning into equal (weighted) “small” cells

Page 55:

Sampling Distribution

55

• Benchmark: case110.cnf; #vars: 287; #clauses: 1263
• Total runs: 4×10^6; total solutions: 16384

[Histogram: frequency vs. number of solutions (roughly 247–512), comparing WtGen with IdealGen.]

Page 56:

Sampling Distribution

56

[Same benchmark and histogram as the previous slide, comparing WtGen with IdealGen.]

Page 57:

Classification

• What kind of problems have small tilt?

• How to predict tilt?

Page 58:

Tackling Tilt

• What kind of problems have low tilt?

• How to handle CNF + PBO + XOR?

• Current PBO solvers can’t handle XOR

• SAT solver can’t handle PBO queries

Page 59:

Extension to More Expressive Domains (SMT, CSP)

▪ Efficient 3-independent hashing schemes
  ▪ Extending bit-wise XOR to SMT provides guarantees but takes no advantage of SMT progress

▪ Solvers to handle F + Hash efficiently
  ▪ CryptoMiniSAT has fueled progress for the SAT domain
  ▪ Similar solvers for other domains?

59

Page 60:

Conclusion

▪ Inference is key to the Internet of Things (IoT)

▪ Current inference methods either do not scale or do not provide any approximation guarantees

▪ A novel scalable approach that provides theoretical guarantees of approximation

▪ Significantly better than state-of-the-art tools

▪ Exciting opportunities ahead!

60

Page 61:

To sum up ….

61

Page 62:

Collaborators

62

Page 63:

63

EXTRA SLIDES

Page 64:

Complexity

• Tilt captures how easily a single large-weight solution can hide among many low-weight ones.

• Is it possible to remove tilt from complexity?

64

Page 65:

Exploring CNF+XOR

▪ Very little understanding as of now

▪ Can we observe phase transition?

▪ Eager/Lazy approach for XORs?

▪ How to reduce size of XORs further?

65

Page 66:

Outline

• Reduction to SAT

• Partition-based techniques via (unweighted) model counting

• Extension to Weighted Model Counting

• Discussion on hashing

• Looking forward

66

Page 67:

XOR-Based Hashing

▪ 3-universal hashing

▪ Partition the 2^n space into 2^m cells

▪ Variables: X1, X2, X3, …, Xn

▪ Pick every variable with prob. ½, XOR them, and equate to 0/1 with prob. ½

▪ X1 ⊕ X3 ⊕ X6 ⊕ … ⊕ Xn−1 = 0 (cell ID: 0/1)

▪ m XOR equations → 2^m cells

▪ The cell: F && XOR (CNF+XOR)

67

Page 68:

XOR-Based Hashing

▪ CryptoMiniSAT: efficient for CNF+XOR

▪ Average XOR length: n/2

▪ The smaller the XORs, the better the performance

How to shorten XOR clauses?

68

Page 69:

Independent Variables

▪ A set of variables such that an assignment to them uniquely determines the assignment to the remaining variables in any satisfying assignment

▪ (a ⋁ b) = c ➔ Independent Support: {a, b}

▪ # of auxiliary variables introduced: 2-3 orders of magnitude

▪ Hash only on the independent variables (huge speedup)
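As a worked example: every satisfying assignment of (a ⋁ b) = c is fixed once a and b are chosen, so the random XORs of Page 39 need only range over {a, b}; with the ½-probability construction their expected length drops from 3/2 to 1, which is exactly the shortening asked for on Page 68.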

69