Preparing for the Worst but Hoping for the Best: Robust (Bayesian) Persuasion

Piotr Dworczak, Alessandro Pavan

February 2020

Page 1

Preparing for the Worst but Hoping for the Best: Robust (Bayesian) Persuasion

Piotr Dworczak Alessandro Pavan

February 2020

Page 2

Motivation

Bayesian persuasion / information design

designer knows agents’ sources of information

trusts her ability to coordinate Receivers on actions most favorable to her

optimal information structure sensitive to fine details of agents’ beliefs

In many problems of interest,

agents’ sources of information (both before and after receiving Sender’s information) unknown

Sender may not trust her ability to coordinate Receivers

Quest for robustness

Page 3

This Paper

Novel solution concept that accounts for such uncertainty/ambiguity

Lexicographic approach to the problem

Step 1: “Preparing for the worst”

designer seeks to protect herself against the possibility that Nature provides information and coordinates agents on actions most adversarial to the designer

Step 2: “Hoping for the best”

designer maximizes over all worst-case optimal policies assuming Nature and Receivers play favorably

Robust solutions

best-case optimal among worst-case optimal ones

max-max over max-min

(also: maximize λ inf_{π∈Π} v̲(q, π) + (1 − λ) v̄(q, ∅), for λ large)
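The lexicographic criterion can be emulated with the weighted objective above for λ close to 1. A minimal numeric sketch, where the three candidate signals and their (worst-case, best-case) payoff pairs are invented numbers, not values from the paper:

```python
# Toy illustration of the lexicographic criterion via the weighted objective
# lam * worst_case + (1 - lam) * best_case, for lam close to 1.
# All payoff pairs below are assumed for illustration.
signals = {
    'full_disclosure': (0.3, 0.3),  # worst-case optimal, modest best case
    'partial':         (0.3, 0.5),  # worst-case optimal AND better best case
    'bayesian':        (0.0, 0.6),  # highest best case, but no guarantee
}

lam = 0.99
score = lambda name: lam * signals[name][0] + (1 - lam) * signals[name][1]
best = max(signals, key=score)
print(best)  # 'partial': first max-min, then max-max among max-min winners
```

For λ large enough, any maximizer of the weighted objective is worst-case optimal, and among the worst-case optimal candidates it selects the best-case optimal one, matching the two-step definition.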

Page 4

Results

Separation theorem

Properties of robust solutions

Implications for various persuasion models

Conditionally-independent signals (online supplement)

Page 5

Literature

Bayesian persuasion

...Calzolari and Pavan (2006), Brocas and Carrillo (2007), Rayo and Segal (2010), Kamenica and Gentzkow (2011), Ely (2017), Dworczak and Martini (2019)...

Surveys: Bergemann and Morris (2019), Kamenica (2019)

Information design with adversarial coordination

Inostroza and Pavan (2018), Mathevet, Perego and Taneva (2019), Morris et al. (2019), Ziegler (2019)

Persuasion with unknown beliefs

Kolotilin et al. (2017), Laclau and Renou (2017), Guo and Shmaya (2018), Hu and Weng (2019), Kosterina (2019)

Max-max over max-min design

Borgers (2017)

Page 6

Plan

1 Introduction

2 Model

3 Robust Solutions

4 Separation Theorem

5 Corollaries

6 Applications

7 Conditionally-independent Robust Solutions (another day)

Page 7

Model

Page 8

Model: Environment

Payoff-relevant state: ω ∈ Ω (finite)

Prior: µ0 ∈ ∆Ω

Sender’s “signal”

q : Ω → ∆S, where S is the set of signal realizations

(Reduced-form description of) Sender’s payoff, given induced posterior µ ∈ ∆Ω

V̄(µ): highest payoff

V̲(µ): lowest payoff

Difference between V̄ and V̲:

strategy selection (multiple Receivers)

tie-breaking (single Receiver)

Page 9

Model: Sender’s uncertainty

Nature designs information structure

π : Ω × S → ∆R, where R is the set of Nature’s signal realizations

Multiple Receivers

discriminatory disclosures embedded into derivation of V̲(µ): given common posterior µ, Nature provides (possibly private) signals to the agents and coordinates them on the course of action most adversarial to Sender (among those consistent with the assumed solution concept), e.g., Bayes-correlated equilibrium given µ

Conditioning on Sender’s signal

information acquisition (after hearing from Sender)

correlated noise

maximal concern for robustness

Online Appendix: conditionally independent signals

Page 10

Plan

1 Introduction

2 Model

3 Robust Solutions

4 Separation Theorem

5 Corollaries

6 Applications

7 Conditionally-independent Robust Solutions

Page 11

Robust Solutions

Page 12

Robust Solutions

Sender’s expected payoffs when

Sender selects signal q

Nature selects signal π

v̄(q, π) ≡ ∑_{ω∈Ω} ∫_S ∫_R V̄(µ₀^{s,r}) dπ(r|ω, s) dq(s|ω) µ₀(ω)

v̲(q, π) ≡ ∑_{ω∈Ω} ∫_S ∫_R V̲(µ₀^{s,r}) dπ(r|ω, s) dq(s|ω) µ₀(ω)

where µ₀^{s,r} is the common posterior obtained from (q, π) after observing (s, r)
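When Nature stays silent (π = ∅), the functional reduces to the familiar Bayesian-persuasion payoff: the average of V̄ over the posteriors induced by q alone. A minimal numeric sketch under assumed primitives (binary state, a threshold payoff for V̄; all numbers are illustrative):

```python
# Sketch of v(q, null) for a binary state: average V_high over the posteriors
# that the Sender's signal q induces by Bayes' rule.  All numbers are assumed.
mu0 = {'L': 0.7, 'H': 0.3}                               # prior
q = {'L': {'l': 0.8, 'h': 0.2},                          # q(s | w): a partially
     'H': {'l': 0.0, 'h': 1.0}}                          # revealing signal

def V_high(mu_H):           # Sender's payoff at posterior Pr(H) = mu_H
    return 1.0 if mu_H >= 0.5 else 0.0

v = 0.0
for s in ('l', 'h'):
    ps = sum(mu0[w] * q[w][s] for w in mu0)              # marginal prob of s
    post_H = mu0['H'] * q['H'][s] / ps                   # Bayes posterior on H
    v += ps * V_high(post_H)
print(v)  # 0.44: realization 'h' (prob 0.44) pushes the posterior above 1/2
```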

Page 13

Worst-case optimality

Definition 1

Signal q is worst-case optimal if, for all signals q′,

inf_π v̲(q, π) ≥ inf_π v̲(q′, π).

maximal payoff guarantee

Page 14

Worst-case optimality

Given any posterior µ ∈ ∆Ω, Sender’s (lowest) payoff if, starting from µ, the state is fully revealed:

V^full(µ) ≡ ∑_{ω∈Ω} V̲(δ_ω) µ(ω)

where δ_ω is the Dirac measure assigning probability 1 to ω.

Remark 1. Since both Nature and Sender can reveal the state, signal q is worst-case optimal iff

inf_π v̲(q, π) = V^full(µ₀)

W : set of worst-case optimal signals

non-empty (full disclosure is worst-case optimal)

Page 15

Robust Solutions

Definition 2. Signal qRS is a robust solution if it maximizes v̄(q, ∅) over W.

Lexicographic preferences

max-max over max-min policies

step 1: max-min (worst-case optimal policies)

step 2: max-max (highest payoff if Nature and Receivers play favorably)

Clearly, qRS also maximizes sup_π v̄(q, π) over W

However, Sender prefers to provide information herself rather than counting on Nature to do it

Page 16

Robust Solutions

Lemma 1. Signal qRS is a robust solution iff the distribution over posterior beliefs ρRS ∈ ∆∆Ω that qRS induces maximizes

∫ V̄(µ) dρ(µ)

over the set W ⊂ ∆∆Ω of distributions over posterior beliefs satisfying (a) Bayes plausibility,

∫ µ dρ(µ) = µ₀,

and (b) “worst-case optimality” (WCO),

∫ lco(V̲)(µ) dρ(µ) = V^full(µ₀)

Page 17

Robust vs Bayesian Solutions

Bayesian solutions:

qBP maximizes v̄(q, ∅) over Q (feasible signals)

induced distribution over posterior beliefs ρBP ∈ ∆∆Ω maximizes ∫ V̄(µ) dρ(µ) over all distributions ρ ∈ ∆∆Ω satisfying Bayes plausibility, ∫ µ dρ(µ) = µ₀

Robust solutions:

qRS maximizes v̄(q, ∅) over W ⊂ Q (worst-case optimal signals)

induced distribution over posterior beliefs ρRS ∈ ∆∆Ω maximizes ∫ V̄(µ) dρ(µ) over all distributions ρ ∈ ∆∆Ω satisfying, in addition to Bayes plausibility ∫ µ dρ(µ) = µ₀, the WCO constraint ∫ lco(V̲)(µ) dρ(µ) = V^full(µ₀)

Page 18

Plan

1 Introduction

2 Model

3 Robust Solutions

4 Separation Theorem

5 Corollaries

6 Applications

7 Conditionally-independent Robust Solutions

Page 19

Separation Theorem

Page 20

Separation Theorem

Theorem 1. Let

F ≡ {B ⊆ Ω : V̲(µ) ≥ V^full(µ) for ALL µ ∈ ∆B}.

Then,

W = {ρ ∈ ∆∆Ω : ρ satisfies BP and supp(µ) ∈ F for ALL µ ∈ supp(ρ)}.

Therefore, ρRS ∈ ∆∆Ω is a robust solution iff ρRS maximizes

∫ V̄(µ) dρ(µ)

over all distributions over posterior beliefs ρ ∈ ∆∆Ω satisfying BP and s.t., for any µ ∈ supp(ρ), supp(µ) ∈ F.

Page 21

Separation Theorem

Idea:

suppose Sender induces posterior µ with supp(µ) = B for which there exists η ∈ ∆B s.t. V̲(η) < V^full(η)

starting from µ, Nature can induce η with strictly positive probability

starting from µ, Nature can thus bring Sender’s payoff strictly below V^full(µ)

because Nature can respond to any other posterior µ′ ∈ supp(ρ) by fully disclosing the state,

∫ lco(V̲)(µ) dρ(µ) < V^full(µ₀)

policy ρ not worst-case optimal

Page 22

KG’s prosecutor example

10

1

Expected payoff

after Nature's disclosure

Expected payoff

before Nature's disclosure

Figure: Prosecutor example
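The figure’s logic can be checked numerically. The sketch below uses the standard numbers of Kamenica and Gentzkow’s prosecutor example (prior Pr(guilty) = 0.3, the judge convicts iff her posterior is at least 1/2, Sender gets 1 upon conviction); the two tie-breaking conventions play the roles of V̄ and V̲:

```python
# Prosecutor example: Bayesian vs. robust payoffs (standard KG numbers).
def V_high(mu): return 1.0 if mu >= 0.5 else 0.0   # favorable tie-breaking
def V_low(mu):  return 1.0 if mu > 0.5 else 0.0    # adversarial tie-breaking

mu0 = 0.3   # prior probability of 'guilty'

# Bayesian solution: split mu0 into posteriors {0, 1/2}; Bayes plausibility
# pins the weight on 1/2 at mu0 / 0.5.
w_half = mu0 / 0.5
bayesian_payoff = w_half * V_high(0.5) + (1 - w_half) * V_high(0.0)

# Worst-case check at mu = 1/2: Nature can induce a posterior eta just below
# 1/2 with positive probability, where V_low(eta) = 0 < V_full(eta) = eta.
# So the Bayesian solution is not worst-case optimal; with a binary state the
# robust solution here is full disclosure.
full_disclosure_payoff = mu0 * V_high(1.0) + (1 - mu0) * V_high(0.0)

print(bayesian_payoff)         # 0.6: value before Nature's disclosure
print(full_disclosure_payoff)  # 0.3: the worst-case (and robust) guarantee
```

Starting from the pooled posterior 1/2, Nature can push the Sender’s guarantee all the way to 0, strictly below V^full(µ₀) = 0.3; full disclosure leaves Nature nothing to exploit.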

Page 23

Plan

1 Introduction

2 Model

3 Robust solutions

4 Separation theorem

5 Corollaries

6 Applications

7 Conditionally-independent Robust Solutions

Page 24

Corollaries

Page 25

Existence

Corollary 1. A robust solution always exists.

existence guaranteed by the possibility for Nature to condition on the realization of Sender’s signal

Page 26

State separation

Corollary 2

Suppose there exist ω, ω′ ∈ Ω and λ ∈ (0, 1) s.t.

V̲(λδ_ω + (1 − λ)δ_{ω′}) < λ V̲(δ_ω) + (1 − λ) V̲(δ_{ω′}).

Then any robust solution must separate ω and ω′.

Assumption: there exists some belief supported on {ω, ω′} under which Sender’s payoff is below the full-disclosure payoff

Conclusion: ALL posterior beliefs in the support must separate ω and ω′
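The corollary’s condition is easy to check numerically. A sketch in a prosecutor-style binary setting (primitives assumed: adversarial tie-breaking at the threshold belief 1/2):

```python
# Scan lambda in (0, 1) for mixtures lambda*delta_g + (1-lambda)*delta_i at
# which the Sender's lowest payoff falls below the full-disclosure benchmark
# lambda * V_low(delta_g) + (1 - lambda) * V_low(delta_i) = lambda.
# Primitives are assumed for illustration.
def V_low(mu):  # mu = Pr(guilty); conviction only if strictly above 1/2
    return 1.0 if mu > 0.5 else 0.0

violations = [l / 100 for l in range(1, 100) if V_low(l / 100) < l / 100]
assert violations  # condition of Corollary 2 holds
print(len(violations))  # 50: every mixture with Pr(guilty) <= 1/2 violates it
```

Since V̲ dips below the full-disclosure line on the whole lower half of the simplex, the two states must be separated, which with a binary state forces full disclosure.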

Page 27

Robustness of Bayesian Solutions

Corollary 3

A Bayesian solution ρBP is robust iff for any µ ∈ supp(ρBP) and any η ∈ ∆Ω s.t. supp(η) ⊂ supp(µ),

V̲(η) ≥ V^full(η)

Binary state: any robust solution is either

full disclosure, or

a Bayesian solution

Page 28

Worst-case optimality preserved under more disclosure

Corollary 4

W is closed under Blackwell dominance: if ρ′ ∈ W and ρ Blackwell dominates ρ′, then ρ ∈ W.

Result not true in the case of conditionally independent signals

Page 29

Informativeness of Robust vs Bayesian solutions

Corollary 5. Given any Bayesian solution ρBP, there exists a robust solution ρRS s.t. either ρRS and ρBP are not comparable in the Blackwell order, or ρRS Blackwell dominates ρBP.

If a Bayesian solution ρBP is Blackwell more informative than a robust solution ρRS, then ρBP is also robust

Reason why robustness calls for more disclosure

little to do with indifference on Sender’s part

concealing information gives Nature more room for adversarial design

If a Bayesian solution ρBP is not robust and is strictly Blackwell dominated by a robust solution ρRS, then ρRS separates states that are not separated under ρBP

robustness never calls for mean-preserving spreads (MPS) over the same supports

Page 30

Concavification

Let v_low := min_{ω∈Ω} V̲(δ_ω) − 1

Auxiliary function:

V_F(µ) = V̄(µ) if supp(µ) ∈ F and V̄(µ) ≥ v_low; V_F(µ) = v_low otherwise

Corollary 6. A feasible distribution ρ ∈ ∆∆Ω is robust iff

∫ V_F(µ) dρ(µ) = co(V_F)(µ₀),

where co(V_F) denotes the concavification of V_F. Furthermore, there always exists a robust solution ρ with |supp(ρ)| ≤ |Ω|.

V_F upper semi-continuous: results follow from arguments similar to those in the BP literature
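Corollary 6 suggests a direct computational recipe: replace V̄ by V_F and concavify. A minimal grid sketch for a binary state, with all primitives assumed for illustration (prosecutor-style payoffs, prior 0.3):

```python
# Grid concavification of V_F at mu0 for a binary state (assumed primitives).
def V_high(mu): return 1.0 if mu >= 0.5 else 0.0
def V_low(mu):  return 1.0 if mu > 0.5 else 0.0

# F contains the singleton supports; it would contain the full support only
# if V_low(mu) >= V_full(mu) = mu for all interior mu, which fails here, so
# only degenerate beliefs have supports in F.
def in_F(mu):
    return mu in (0.0, 1.0)

v_low_const = min(V_high(0.0), V_high(1.0)) - 1.0   # punishes non-F supports

def V_F(mu):
    return V_high(mu) if in_F(mu) else v_low_const

grid = [i / 100 for i in range(101)]
mu0 = 0.3

# Concavification at mu0: the best chord through two grid beliefs straddling mu0.
best = V_F(mu0)
for a in grid:
    for b in grid:
        if a <= mu0 <= b and a < b:
            t = (mu0 - a) / (b - a)
            best = max(best, (1 - t) * V_F(a) + t * V_F(b))
print(best)  # 0.3: the chord between the two degenerate beliefs, i.e. full disclosure
```

The grid chord search is a crude stand-in for a proper concave-envelope routine, but it reproduces the robust value in this two-state example.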

Page 31

Plan

1 Introduction

2 Model

3 Robust Solutions

4 Separation Theorem

5 Corollaries

6 Applications

7 Conditionally-independent Robust Solutions

Page 32

Applications

Page 33

Limits to Third-degree Price Discrimination

Bergemann, Brooks, Morris (2015)

designer segments market to maximize combination of CS and PS

Proposition 1. Suppose the Pareto weight on the seller’s surplus is strictly positive. Then full disclosure is the unique robust solution. When, instead, the designer cares only about consumer surplus, the optimal signal is the BBM solution.

For any posterior µ assigning positive measure to more than one buyer value, Nature can construct a posterior η inducing the seller to ask the highest price in supp(µ), thus shutting down trade with all lower-value buyers.

Total surplus under η strictly below the full-information level

(if weight on PS strictly positive)
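The surplus-destruction argument can be seen in a two-value toy market (all numbers assumed; this is not BBM’s general model):

```python
# Toy market: buyer values in {1, 3}, a monopolist prices optimally given the
# posterior Pr(value = 3).  At a pooled posterior Nature can tilt beliefs so
# the seller asks the high price and low-value buyers are excluded.
def seller_price(mu_hi):
    # price 1 sells to everyone (revenue 1); price 3 sells w.p. mu_hi
    return 3.0 if 3.0 * mu_hi > 1.0 else 1.0

def total_surplus(mu_hi):
    p = seller_price(mu_hi)
    # trade occurs only for buyers with value >= p
    return mu_hi * 3.0 + (1 - mu_hi) * 1.0 if p == 1.0 else mu_hi * 3.0

full_info_surplus = 0.5 * 1.0 + 0.5 * 3.0   # every type trades at its value
eta = 0.4                                    # a pooled posterior Nature induces
assert seller_price(eta) == 3.0              # high price: low types excluded
assert total_surplus(eta) < full_info_surplus
print(total_surplus(eta), full_info_surplus)
```

Whenever the designer puts any weight on total surplus, such pooled posteriors cost her relative to full disclosure, which is why only full disclosure survives worst-case optimality.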

Page 34

Privately Informed Receiver

Guo and Shmaya (2019)

Exogenous price p ∈ (0, 1)

Seller’s payoff is 1 if trade, 0 otherwise

Buyer’s exogenous private information f (t|ω) – MLRP

Proposition 2

Any robust solution separates states ω ≤ p from states ω′ > p

Starting from any µ s.t. supp(µ) ⊇ {ω, ω′} with ω < p < ω′, Nature can induce a posterior η s.t. E_η[ω] < p, thus inducing no trade

Sender’s payoff given η is below the full-information payoff

Given any such ρ ∈ ∆∆Ω, Nature can bring Sender’s payoff below V^full(µ₀)

Policy ρ not worst-case optimal

Page 35

Lemons problem

Seller’s value: ω (known to seller)

Buyer’s value: ω + ∆, with ∆ > 0 (ω unknown to buyer)

Exogenous price p drawn from U[0, 1]

Trade occurs iff both:

p ≥ ω (seller willing to sell)

E_µ[ω | ω ≤ p] + ∆ > p (buyer willing to buy)

Seller designs info structure

Proposition 3

Under any robust solution ρRS, for any distinct µ, µ′ ∈ supp(ρRS): diam(supp(µ)) ≤ ∆ and diam(supp(µ′)) ≤ ∆, but diam(supp(µ) ∪ supp(µ′)) > ∆.

Robust solutions are minimally informative among those that eliminate adverseselection!
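The interval structure in Proposition 3 can be sketched on a discretized state space (grid and ∆ are assumed for illustration): pool states into cells of diameter at most ∆, such that no two adjacent cells can be merged without exceeding ∆:

```python
# Greedy partition of a grid of seller values in [0, 1] into maximal cells of
# diameter <= delta (illustrative primitives, not the paper's construction).
delta = 0.3
states = [i / 10 for i in range(11)]

cells, current = [], [states[0]]
for w in states[1:]:
    if w - current[0] <= delta:
        current.append(w)        # still fits in the current cell
    else:
        cells.append(current)    # close the cell, start a new one
        current = [w]
cells.append(current)

diam = lambda c: max(c) - min(c)
assert all(diam(c) <= delta for c in cells)                 # kills adverse selection
assert all(diam(cells[i] + cells[i + 1]) > delta            # cells cannot be merged:
           for i in range(len(cells) - 1))                  # minimal informativeness
print([(min(c), max(c)) for c in cells])
```

Each cell is coarse enough to hide ω from the buyer but fine enough that ∆ covers the worst-case overpayment, so trade always occurs; merging any two cells would restore adverse selection.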

Page 36

Supermodular Games

Continuum of Receivers

a_i ∈ {0, 1}; A ∈ [0, 1]: aggregate “attack”

Payoff from not attacking: 0

Payoff from attacking: g > 0 if A ≥ ω; b < 0 if A < ω

Designer’s payoff: 1− A

Bayesian solution: Upper censorship

Reveals each ω < 0 w.p. γBP ∈ (0, 1) (w.p. 1 − γBP, reveals nothing); conceals all ω > 0

Proposition 4

Robust solution reveals ω < 0 w.p. γ∗ > γBP , conceals all ω ∈ [0, 1], reveals all ω > 1with certainty.

Revelation of upper dominance region:

else Nature constructs η inducing A = 1 also for ω > 1
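Why the upper dominance region must be revealed, in a toy check with assumed attack payoffs g = 1 and b = -1:

```python
# If states omega > 1 are pooled with omega <= 1, Nature can induce a belief
# under which attacking is a best reply even when everyone attacks (A = 1):
# the attack then succeeds exactly when omega <= 1.  Payoffs g, b are assumed.
g, b = 1.0, -1.0

def attack_payoff(p_le1):
    # p_le1 = belief Pr(omega <= 1), all other agents attacking (A = 1)
    return p_le1 * g + (1 - p_le1) * b

eta = 0.8                        # a pooled belief Nature can engineer
assert attack_payoff(eta) > 0    # attacking is a best response, so A = 1
# Designer's payoff 1 - A = 0 even in states omega > 1: pooling the upper
# dominance region with lower states is punished in the worst case.
print(attack_payoff(eta))  # 0.6
```

Revealing ω > 1 removes this belief from Nature’s reach: under the degenerate belief, not attacking is strictly dominant and the designer keeps payoff 1.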

Page 37

Plan

1 Introduction

2 Model

3 Robust Solutions

4 Separation Theorem

5 Corollaries

6 Applications

7 Conditionally-independent Robust Solutions

Page 38

Conditionally Independent Signals

Page 39

Conditionally-independent Robust Solutions

Nature cannot condition on realization of Sender’s signal

π : Ω → ∆R

(so far: π : Ω × S → ∆R)

Page 40

Existence of CI-Robust Solutions

A CI-robust solution may fail to exist

A CI-robust solution exists if V̄ is continuous

Definition. A feasible distribution ρ ∈ ∆∆Ω is a weak CI-robust solution if it maximizes

∫ V̄(µ) dρ(µ)

over cl(W_CI), where cl(W_CI) is the closure (in the weak* topology) of the set of CI-worst-case optimal solutions

Theorem. A weak CI-robust solution exists, no matter V̄.

Page 41

Separation under CI-Robust Solutions

Sufficient conditions for state separation under CI-robust solutions

weaker than those for robust solutions

whenever ω and ω′ must be separated under CI-robust solutions, they mustbe separated under robust solutions

Page 42

Full Disclosure and CI-robustness of Bayesian solutions

Sufficient conditions for

full disclosure to be the unique CI-robust solution

all distributions to be CI-worst-case optimal

Page 43

CI-robust solutions: Binary state

Unlike robust solutions, CI-robust solutions for binary states need not coincide with

Bayesian solutions

full disclosure

Page 44

Blackwell Informativeness of CI-robust solutions

Unlike robust solutions, CI-robust solutions need not be Blackwell more informativethan Bayesian solution

Example in which unique Bayesian solution is Blackwell strictly more informativethan all CI-robust solutions

Nature cannot engineer MPS conditional on s

ρRS may be Blackwell less informative than ρBP and yet Nature may not be able to inflict on the Sender the same payoff as under ρBP

Page 45

Conclusions

Bayesian persuasion when Sender uncertain about

Receivers’ information

strategy selection

Robust solutions

best-case optimal among worst-case optimal ones

max-max over max-min criterion

Separation theorem

any pair of states over which Nature can construct a belief yielding the Sender less than the full-information payoff must be separated

Robustness

more disclosure

but only through more separation (not MPSs over same supp)

Relatively simple two-step design procedure

step 1: Nature designs info to minimize Sender’s payoff

step 2: designer solves standard persuasion problem on the restricted set of worst-case optimal policies

Implications for applications

Page 46

Most Important Slide

THANKS!