
TRANSCRIPT

Page 1: Focused Belief Propagation  for Query-Specific Inference

Carnegie Mellon

Focused Belief Propagation for Query-Specific Inference

Anton Chechetka, Carlos Guestrin

14 May 2010

Page 2: Focused Belief Propagation  for Query-Specific Inference

Query-specific inference problem: a few query variables we care about, some known (evidence) variables, and many variables that are not interesting.

Goal: use information about the query to speed up convergence of belief propagation for the query marginals.

$$P(\mathbf{X}) \;\propto\; \prod_{(i,j)\in E} f_{ij}(X_i, X_j)$$

Page 3: Focused Belief Propagation  for Query-Specific Inference

Graphical models
Talk focus: pairwise Markov random fields
Paper: arbitrary factor graphs (more general)
Probability is a product of local potentials:

Graphical structure

Compact representation, but exact inference is intractable

Approximate inference often works well in practice

$$P(\mathbf{X}) \;\propto\; \prod_{(i,j)\in E} f_{ij}(X_i, X_j)$$

[Figure: pairwise MRF over variables r, s, j, k, i, h, u; each node is a variable and each edge carries a potential]
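To ground the formulas that follow, here is a minimal sketch (not from the talk or paper) of one possible code representation of a pairwise MRF: each variable has a small discrete domain and each edge carries a potential table. All variable names and numbers below are made up for illustration.

```python
import numpy as np

# A pairwise MRF: variable -> domain size, and edge (i, j) -> potential table f_ij.
# Hypothetical toy model; any node/edge names would do.
domains = {"i": 2, "j": 2, "r": 3}
potentials = {
    ("i", "j"): np.array([[2.0, 0.5],
                          [0.5, 2.0]]),        # f_ij(x_i, x_j), favors agreement
    ("j", "r"): np.random.rand(2, 3) + 0.1,    # f_jr(x_j, x_r), arbitrary positive table
}

def unnormalized_prob(assignment):
    """P(X) up to the normalization constant: product of all edge potentials."""
    p = 1.0
    for (i, j), f in potentials.items():
        p *= f[assignment[i], assignment[j]]
    return p

print(unnormalized_prob({"i": 0, "j": 1, "r": 2}))
```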

Page 4: Focused Belief Propagation  for Query-Specific Inference

(Loopy) Belief Propagation: passing messages along edges

Update rule:
$$m_{ij}^{(t+1)}(x_j) \;=\; \sum_{x_i} f_{ij}(x_i, x_j) \prod_{k:\,(k,i)\in E,\; k\neq j} m_{ki}^{(t)}(x_i)$$

Variable belief:
$$\tilde{P}^{(t)}(x_i) \;\propto\; \prod_{j:\,(i,j)\in E} m_{ji}^{(t)}(x_i)$$

Result: all single-variable beliefs.

[Figure: the same MRF, with the message m_ki passed along edge (k, i) highlighted]
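Below is a sketch of a single BP message update and a variable belief under the hypothetical representation from the earlier snippet (`potentials`, `domains`); `messages` maps each directed edge (k, i) to a vector over the values of variable i. This is an illustration of the update rule above, not the authors' implementation.

```python
import numpy as np

def update_message(i, j, potentials, domains, messages):
    """Recompute m_ij(x_j) = sum_{x_i} f_ij(x_i, x_j) * prod_{k != j} m_ki(x_i)."""
    # The potential table may be stored under either orientation of the edge.
    f = potentials[(i, j)] if (i, j) in potentials else potentials[(j, i)].T
    incoming = np.ones(domains[i])
    for (k, t), m in messages.items():
        if t == i and k != j:                      # all messages into i except from j
            incoming *= m
    new_m = (f * incoming[:, None]).sum(axis=0)    # sum out x_i
    return new_m / new_m.sum()                     # normalize for numerical stability

def belief(i, domains, messages):
    """Single-variable belief: normalized product of all incoming messages."""
    b = np.ones(domains[i])
    for (k, t), m in messages.items():
        if t == i:
            b *= m
    return b / b.sum()
```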

Page 5: Focused Belief Propagation  for Query-Specific Inference

(Loopy) Belief Propagation

Update rule:
$$m_{ij}^{(t+1)}(x_j) \;=\; \sum_{x_i} f_{ij}(x_i, x_j) \prod_{k:\,(k,i)\in E,\; k\neq j} m_{ki}^{(t)}(x_i)$$

Message dependencies are local: m_ij depends only on the messages m_ki coming into i from neighbors k other than j.

Round-robin schedule: fix a message order and apply updates in that order until convergence.

[Figure: the MRF with the edges whose messages m_ij depends on marked "dependence"]

Page 6: Focused Belief Propagation  for Query-Specific Inference

Dynamic update prioritization

A fixed update sequence is not the best option; dynamic update scheduling can speed up convergence.

Tree-Reweighted BP [Wainwright et al., AISTATS 2003]
Residual BP [Elidan et al., UAI 2006]

Residual BP: apply the largest change first.

[Figure: an update with a large change is informative; spending updates on edges with small changes is wasted computation]

Page 7: Focused Belief Propagation  for Query-Specific Inference

Residual BP [Elidan et al., UAI 2006]

Pick the edge with the largest residual:

$$\max_{(i,j)} \left\| m_{ij}^{(\mathrm{NEW})} - m_{ij}^{(\mathrm{OLD})} \right\|$$

where

$$m_{ij}^{(\mathrm{NEW})}(x_j) \;=\; \sum_{x_i} f_{ij}(x_i, x_j) \prod_{k:\,(k,i)\in E,\; k\neq j} m_{ki}^{(\mathrm{OLD})}(x_i)$$

Update: set $m_{ij}^{(\mathrm{OLD})} \leftarrow m_{ij}^{(\mathrm{NEW})}$.

More effort goes to the difficult parts of the model.

But it has no notion of the query.
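One possible way to realize residual-style scheduling on top of the `update_message` sketch above is a max-priority queue keyed by residuals; this is an illustrative sketch only (real Residual BP implementations handle residual bookkeeping more carefully), and `edges` is assumed to list every directed edge of the model.

```python
import heapq
import numpy as np

def residual_bp(edges, potentials, domains, max_updates=10_000, tol=1e-6):
    """Residual-style scheduling: always apply the message update with the largest residual."""
    messages = {(i, j): np.ones(domains[j]) / domains[j] for (i, j) in edges}
    # Max-priority queue via negated residuals (heapq is a min-heap).
    heap = [(-1.0, e) for e in edges]        # force every message to be computed once
    heapq.heapify(heap)
    updates = 0
    while heap and updates < max_updates:
        neg_res, (i, j) = heapq.heappop(heap)
        if -neg_res < tol:
            break                            # largest residual is tiny: converged
        messages[(i, j)] = update_message(i, j, potentials, domains, messages)
        # Only messages out of j (other than back to i) depend on m_ij: refresh their residuals.
        for (a, b) in edges:
            if a == j and b != i:
                cand = update_message(a, b, potentials, domains, messages)
                res = np.abs(cand - messages[(a, b)]).max()
                heapq.heappush(heap, (-res, (a, b)))   # stale entries are harmless in a sketch
        updates += 1
    return messages
```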

Page 8: Focused Belief Propagation  for Query-Specific Inference

Our contributions

Using weighted residuals to prioritize updates

Define message weights reflecting the importance of the message to the query

Computing importance weights efficiently

Experiments: faster convergence on large relational models

Page 9: Focused Belief Propagation  for Query-Specific Inference

Why edge importance weights?

Two edges, one with a smaller residual than the other: which should we update?

Residual BP updates the larger-residual edge even when it has no influence on the query: wasted computation.
We want to update the edge that will influence the query in the future, even if its residual is smaller.

Residual BP: maximize the immediate residual reduction.
Our work: maximize an approximation of the eventual effect on P(query).

Page 10: Focused Belief Propagation  for Query-Specific Inference


Query-Specific BP

Pick the edge with the largest weighted residual:

$$\max_{(i,j)} \left\| m_{ij}^{(\mathrm{NEW})} - m_{ij}^{(\mathrm{OLD})} \right\| \cdot A_{ij}$$

where $A_{ij}$ is the edge importance: the only change from Residual BP!

$$m_{ij}^{(\mathrm{NEW})}(x_j) \;=\; \sum_{x_i} f_{ij}(x_i, x_j) \prod_{k:\,(k,i)\in E,\; k\neq j} m_{ki}^{(\mathrm{OLD})}(x_i)$$

Update: set $m_{ij}^{(\mathrm{OLD})} \leftarrow m_{ij}^{(\mathrm{NEW})}$.

Rest of the talk: defining and computing the edge importance.

Page 11: Focused Belief Propagation  for Query-Specific Inference

Edge importance: base case

Pick the edge with $\max_{(i,j)} \| m_{ij}^{(\mathrm{NEW})} - m_{ij}^{(\mathrm{OLD})} \| \cdot A_{ij}$, where $A_{ij}$ approximates the eventual effect of the update on P(Q).

[Figure: the MRF with the query variable marked; edge (j, i) is directly connected to it]

Base case: an edge (j, i) directly connected to the query. What should $A_{ji}$ be?

$$\left\| P^{(\mathrm{NEW})}(Q) - P^{(\mathrm{OLD})}(Q) \right\| \;\le\; \left\| m_{ji}^{(\mathrm{NEW})} - m_{ji}^{(\mathrm{OLD})} \right\|$$

The change in the query belief is at most the change in the message, and the bound is tight. So $A_{ji} = 1$.

Page 12: Focused Belief Propagation  for Query-Specific Inference

Edge importance: one step away

An edge (r, j) one step away from the query: what should $A_{rj}$ be?

$$\left\| \Delta P(Q) \right\| \;\le\; \sup \left\| \frac{\partial m_{ji}}{\partial m_{rj}} \right\| \cdot \left\| \Delta m_{rj} \right\|$$

The change in the query belief is bounded by the message importance times the change in the message, where the sup is taken over the values of all other messages. The sup can be computed in closed form looking only at $f_{ji}$ [Mooij, Kappen 2007]. So $A_{rj} = \sup \| \partial m_{ji} / \partial m_{rj} \|$.

[Figure: the MRF with the query; edge (r, j) influences the query only through edge (j, i)]

Page 13: Focused Belief Propagation  for Query-Specific Inference

Edge importance: general case

One step away: $A_{rj} = \sup \| \partial m_{ji} / \partial m_{rj} \|$. Base case: $A_{ji} = 1$.

The direct generalization for an edge (s, h) further from the query,

$$\left\| \Delta P(Q) \right\| \;\le\; \sup \left\| \frac{\partial P(Q)}{\partial m_{sh}} \right\| \cdot \left\| \Delta m_{sh} \right\|,$$

is expensive to compute and the bound may be infinite. Instead, bound the maximal impact along a path:

$$\mathrm{sensitivity}(s \to h \to r \to j \to i) \;=\; \sup\left\|\frac{\partial m_{hr}}{\partial m_{sh}}\right\| \cdot \sup\left\|\frac{\partial m_{rj}}{\partial m_{hr}}\right\| \cdot \sup\left\|\frac{\partial m_{ji}}{\partial m_{rj}}\right\|$$

[Figure: a path of edges from (s, h) through (h, r), (r, j) and (j, i) to the query]

Page 14: Focused Belief Propagation  for Query-Specific Inference

Edge importance: general case

$\mathrm{sensitivity}(\pi)$ is the maximal impact along the path $\pi$: the product of the per-edge sups above. Define

$$A_{sh} \;=\; \max_{\text{paths } \pi \text{ from } (s,h) \text{ to the query}} \mathrm{sensitivity}(\pi)$$

But there are a lot of paths in a graph; trying out every one of them is intractable.

[Figure: many possible paths from edge (s, h) to the query]

Page 15: Focused Belief Propagation  for Query-Specific Inference

Efficient edge importance computation

$$A \;=\; \max_{\text{paths } \pi \text{ from the edge to the query}} \mathrm{sensitivity}(\pi)$$

There are a lot of paths in a graph; trying out every one of them is intractable. However:

$$\mathrm{sensitivity}(s \to h \to r \to j \to i) \;=\; \sup\left\|\frac{\partial m_{hr}}{\partial m_{sh}}\right\| \cdot \sup\left\|\frac{\partial m_{rj}}{\partial m_{hr}}\right\| \cdot \sup\left\|\frac{\partial m_{ji}}{\partial m_{rj}}\right\|$$

Each factor is always ≤ 1, so the sensitivity always decreases as the path grows, and it decomposes into individual edge contributions.

Therefore Dijkstra's (shortest-paths) algorithm will efficiently find the max-sensitivity path, and hence the importance weight, for every edge.
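Because every per-edge term is at most 1 and a path's sensitivity is the product of such terms, maximizing the product is a standard shortest-path problem over -log sensitivities. Below is a sketch of that computation; the per-edge sensitivities `edge_sens` are assumed to be precomputed from the potentials (in the paper they come from the closed-form bound of Mooij and Kappen mentioned above, which is not reproduced here), and all other names are hypothetical.

```python
import heapq
import math

def importance_weights(query_edges, neighbors, edge_sens):
    """Max-sensitivity path weight A for every directed edge.

    query_edges: edges (j, i) directly connected to the query (A = 1 for these).
    neighbors[(i, j)]: the directed edges (h, i) whose messages feed into m_ij.
    edge_sens[((h, i), (i, j))]: sup ||d m_ij / d m_hi||, assumed to lie in (0, 1].
    """
    dist = {e: 0.0 for e in query_edges}           # dist = -log A; 0 means A = 1
    heap = [(0.0, e) for e in query_edges]
    heapq.heapify(heap)
    while heap:
        d, e = heapq.heappop(heap)
        if d > dist.get(e, math.inf):
            continue                                # stale queue entry
        for prev in neighbors.get(e, ()):           # edges one step further from the query
            nd = d - math.log(edge_sens[(prev, e)])   # -log of a product = sum of -logs
            if nd < dist.get(prev, math.inf):
                dist[prev] = nd
                heapq.heappush(heap, (nd, prev))
    return {e: math.exp(-d) for e, d in dist.items()}   # back to A = product of sensitivities
```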

Page 16: Focused Belief Propagation  for Query-Specific Inference

Query-Specific BP

$$A_{ij} \;=\; \max_{\text{paths } \pi \text{ from edge } (i,j) \text{ to the query}} \mathrm{sensitivity}(\pi)$$

Run Dijkstra's algorithm starting at the query to get the edge weights. Then pick the edge with the largest weighted residual,

$$\max_{(i,j)} \left\| m_{ij}^{(\mathrm{NEW})} - m_{ij}^{(\mathrm{OLD})} \right\| \cdot A_{ij},$$

update $m_{ij}^{(\mathrm{OLD})} \leftarrow m_{ij}^{(\mathrm{NEW})}$, and repeat.

More effort goes to the difficult and relevant parts of the model. This takes into account not only the graphical structure, but also the strength of the dependencies.
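Putting the pieces together, here is a sketch of the query-specific scheduling loop, reusing the hypothetical `update_message` from the earlier snippet and importance weights such as those returned by the `importance_weights` sketch above. The only difference from the residual BP sketch is that each edge's priority is its residual multiplied by its weight A.

```python
import heapq
import numpy as np

def query_specific_bp(edges, potentials, domains, weights, max_updates=10_000, tol=1e-6):
    """Residual-style BP where each edge's priority is residual * importance weight.

    weights: dict edge -> A_ij; edges with weight 0 (no path to the query) never get priority.
    """
    messages = {(i, j): np.ones(domains[j]) / domains[j] for (i, j) in edges}
    heap = [(-weights.get(e, 0.0), e) for e in edges]   # initial residuals taken as 1
    heapq.heapify(heap)
    updates = 0
    while heap and updates < max_updates:
        neg_p, (i, j) = heapq.heappop(heap)
        if -neg_p < tol:
            break
        messages[(i, j)] = update_message(i, j, potentials, domains, messages)
        for (a, b) in edges:                             # edges whose residual may have changed
            if a == j and b != i:
                cand = update_message(a, b, potentials, domains, messages)
                res = np.abs(cand - messages[(a, b)]).max()
                heapq.heappush(heap, (-res * weights.get((a, b), 0.0), (a, b)))
        updates += 1
    return messages
```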

Page 17: Focused Belief Propagation  for Query-Specific Inference

Outline

Using weighted residuals to prioritize updates

Define message weights reflecting the importance of the message to the query

Computing importance weights efficiently, as an initialization step before residual BP

Restore anytime behavior of residual BP

Experiments: faster convergence on large relational models

Page 18: Focused Belief Propagation  for Query-Specific Inference

Big picture

Before: residual BP is anytime. Stop it whenever you like and it returns (approximate) beliefs for all marginals.

After: query-specific BP first runs Dijkstra's algorithm over the whole model (a long initialization) and only then starts BP for the query marginals. The anytime property is broken!

[Figure: "Before" pipeline, BP producing all marginals at any point; "After" pipeline, Dijkstra's followed by BP producing the query marginals]

Page 19: Focused Belief Propagation  for Query-Specific Inference

Interleaving BP and Dijkstra's

Dijkstra's expands the highest-weight edges first.

Can pause it at any time and get the most relevant submodel

[Figure: starting from the query, short runs of Dijkstra's algorithm (growing the relevant submodel toward the full model) alternate with runs of BP on the edges expanded so far]

When to stop BP and resume Dijkstra’s?


Page 20: Focused Belief Propagation  for Query-Specific Inference

Interleaving

Dijkstra's expands the highest-weight edges first, so any not-yet-expanded edge has importance at most $\min_{\text{expanded}} A$.

[Figure legend: query; edges expanded on previous iterations; just expanded; not yet expanded]

Suppose $M$ is an upper bound on the residual of any message:

$$M \;\ge\; \max_{(i,j)\,\in\,\text{all edges}} \left\| m_{ij}^{(\mathrm{NEW})} - m_{ij}^{(\mathrm{OLD})} \right\|$$

Then $M \cdot \min_{\text{expanded}} A$ is an upper bound on the priority of any not-yet-expanded edge. If it does not exceed the actual priority of the best expanded edge,

$$\max_{(i,j)\,\in\,\text{expanded}} \left\| m_{ij}^{(\mathrm{NEW})} - m_{ij}^{(\mathrm{OLD})} \right\| \cdot A_{ij},$$

there is no need to expand further at this point.
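A small sketch of that stopping test, with hypothetical names: `expanded` holds the edges Dijkstra's has produced so far, `weights` their importance values A, `residuals` their current residuals, and `residual_upper_bound` the constant M. If the test passes, BP can keep running on the expanded submodel; otherwise Dijkstra's is resumed.

```python
def can_keep_updating(expanded, weights, residuals, residual_upper_bound):
    """True if no unexpanded edge could outrank the best already-expanded edge."""
    if not expanded:
        return False
    min_expanded_weight = min(weights[e] for e in expanded)
    best_expanded_priority = max(residuals[e] * weights[e] for e in expanded)
    # Any unexpanded edge has weight <= min_expanded_weight and residual <= M.
    return residual_upper_bound * min_expanded_weight <= best_expanded_priority
```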

Page 21: Focused Belief Propagation  for Query-Specific Inference

Any-time query-specific BP

Query-specific BP: run all of Dijkstra's algorithm first, then the BP updates.
Anytime QSBP: interleave Dijkstra's algorithm with BP, yielding the same BP update sequence!

[Figure: the two schedules drawn over the same sequence of Dijkstra's steps and BP updates]

Page 22: Focused Belief Propagation  for Query-Specific Inference

Experiments – the setup

Relational model: semi-supervised people recognition in a collection of images [with Denver Dash and Matthai Philipose @ Intel].

One variable per person
Agreement potentials for people who look alike
Disagreement potentials for people in the same image
2K variables, 900K factors

Error measure: KL divergence from the fixed point of BP.

Page 23: Focused Belief Propagation  for Query-Specific Inference


Experiments – convergence speed

[Plot: convergence speed comparison ("better" direction marked) for Residual BP, Query-Specific BP, and Anytime Query-Specific BP (with Dijkstra's interleaving)]

Page 24: Focused Belief Propagation  for Query-Specific Inference

Conclusions

Prioritize updates by weighted residuals

Takes query info into account

Importance weights depend on both the graphical structure and strength of local dependencies

Efficient computation of importance weights

Much faster convergence on large relational models

Thank you!