
Page 1:

Communication Amid Uncertainty

Madhu Sudan, Harvard University

Based on many joint works …

Page 2:

Theories of Communication & Computation

Computing theory (Turing ’36): Fundamental principle = Universality. You can program your computer to do whatever you want.

Communication principle (Shannon ’48): Centralized design (Encoder, Decoder, Compression, IPv4, TCP/IP). You can NOT program your device!

Page 3:

Behavior of “intelligent” systems

Players: Humans/Computers
Aspects: Acquisition of knowledge; Analysis/Processing; Communication/Dissemination
Mathematical modelling: Explains limits; Highlights non-trivial phenomena/mechanisms

Limits apply also to human behavior!

Page 4:

Contribution of Shannon theory: Entropy!

Thermodynamics (Clausius/Boltzmann): H = ln Ω

Quantum mechanics (von Neumann): S(ρ) = −Tr(ρ ln ρ)

Random variables (Shannon): H(P) = −∑_x P(x) log P(x)

Profound impact:
On technology of communication/data.
On linguistics, philosophy, sociology, neuroscience.
See “The Information” by James Gleick.
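As a quick illustration of the Shannon formula above, here is a minimal Python sketch; the four-message distribution is invented for the example, not taken from the talk.

```python
import math

def shannon_entropy(P):
    """Shannon entropy H(P) = -sum_x P(x) * log2(P(x)), in bits."""
    return -sum(p * math.log2(p) for p in P.values() if p > 0)

# Illustrative distribution over four messages (not from the slides).
P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
print(shannon_entropy(P))  # 1.75 bits
```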

Page 5:

Entropy

Operational view: For a random variable m, Alice and Bob both know the distribution P of m. Alice observes m ∼ P and is tasked to communicate m to Bob. How many bits (in expectation) does she need to send?

Theorem [Shannon/Huffman]: Entropy! H(P) ≤ Communication ≤ H(P) + 1
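A small sketch of what the theorem says in practice: build a Huffman code for an example distribution and check that its expected codeword length lands between H(P) and H(P) + 1. The distribution and the helper `huffman_lengths` are illustrative, not from the talk.

```python
import heapq
import math

def huffman_lengths(P):
    """Build a Huffman code for distribution P; return {symbol: codeword length}."""
    # Heap entries: (probability, tie-break id, {symbol: depth so far}).
    heap = [(p, i, {x: 0}) for i, (x, p) in enumerate(P.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, i, d2 = heapq.heappop(heap)
        merged = {x: d + 1 for x, d in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (p1 + p2, i, merged))
    return heap[0][2]

P = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # illustrative example
H = -sum(p * math.log2(p) for p in P.values())
lengths = huffman_lengths(P)
expected = sum(P[x] * lengths[x] for x in P)
print(H, expected)   # H(P) <= expected length <= H(P) + 1
```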

Page 6:

“We can also approximate to a natural language by means of a series of simple artificial languages.”

k-th order approx.: Given k − 1 symbols, choose the k-th according to the empirical distribution of the language conditioned on the (k − 1)-length prefix.

3rd-order (letter) approximation “IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTURES OF THE REPTAGIN IS REGOACTIONA OF CRE.”

Second-order word approximation “THE HEAD AND IN FRONTAL ATTACK ON AN ENGLISH WRITER THAT THE CHARACTER OF THIS POINT IS THEREFORE ANOTHER METHOD FOR THE LETTERS THAT THE TIME OF WHO EVER TOLD THE PROBLEM FOR AN UNEXPECTED.”

“k-th order approx. produces plausible sequences of length 2k”

E.g. “Series of approx. to English”
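A rough sketch of how such approximations can be generated: estimate from a corpus the empirical distribution of the next symbol given the previous k − 1 symbols, then sample repeatedly. The toy corpus and the name `kth_order_sample` are invented for illustration; Shannon built his examples from far larger bodies of text.

```python
import random
from collections import Counter, defaultdict

def kth_order_sample(corpus, k, length, seed=0):
    """Draw each new symbol from the corpus's empirical distribution,
    conditioned on the previous k - 1 symbols (a k-th order approximation)."""
    rng = random.Random(seed)
    counts = defaultdict(Counter)
    for i in range(len(corpus) - k + 1):
        counts[corpus[i:i + k - 1]][corpus[i + k - 1]] += 1
    out = corpus[:k - 1]                       # start from a prefix seen in the corpus
    for _ in range(length):
        prefix = out[-(k - 1):] if k > 1 else ""
        dist = counts[prefix]
        if not dist:                           # prefix never seen: fall back to the start
            dist = counts[corpus[:k - 1]]
        symbols, weights = zip(*dist.items())
        out += rng.choices(symbols, weights=weights)[0]
    return out

# Toy corpus (invented); a larger text gives more convincing output.
corpus = "the head and in frontal attack on an english writer that the character of this point "
print(kth_order_sample(corpus, k=3, length=80))
```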

Page 7:

Entropy applies to human communication?

Ideal world: Language = collection of messages we send each other + a probability distribution over messages. Dictionary = message → word. An optimal dictionary would achieve the entropy of the distribution.

Real world: Context! Language = a distribution for every context. Dictionary = (message, context) → word. Challenge: Context is not perfectly shared!

Page 8:

Uncertainty in communication

Repeating theme in human communication (and increasingly in devices): the communication task comes with context.
Ignore context: task achievable, but inefficiently.
Use perfectly shared context (designed setting): task achievable efficiently.
Imperfectly shared context (humans): task achievable moderately efficiently? Non-trivial; room for creative (robust) solutions.

Page 9:

Uncertain Compression

Design encoding/decoding schemes (E/D) so that:
Sender has distribution P on [N]; receiver has distribution Q on [N].
Sender gets m ∈ [N] and sends E(P, m) to the receiver.
Receiver receives y = E(P, m) and decodes it to m̂ = D(Q, y).

Want: m = m̂ (provided P, Q are close), while minimizing E_{m∼P} [|E(P, m)|].


where ∆(P, Q) = max_{m ∈ [N]} max { log (P(m)/Q(m)), log (Q(m)/P(m)) }.
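Concretely, ∆(P, Q) is just the largest log-ratio between the two priors over any message; a tiny sketch, with made-up example distributions:

```python
import math

def delta(P, Q):
    """Max over m of |log2(P(m)/Q(m))|: the closeness measure defined above.
    Assumes both distributions give every message nonzero probability."""
    return max(abs(math.log2(P[m] / Q[m])) for m in P)

# Illustrative sender/receiver priors on three messages (not from the slides).
P = {1: 0.5, 2: 0.25, 3: 0.25}
Q = {1: 0.25, 2: 0.5, 3: 0.25}
print(delta(P, Q))   # 1.0: the priors disagree by at most a factor of 2
```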

Page 10:

Natural Compression

Dictionary: words {w_{m,j} | m ∈ [N], j ∈ ℕ}, with w_{m,j} of length j; one word of each length j for each message m.

Encoding/Expression: Given m, P: pick a “large enough” j and send w_{m,j}.

Decoding/Understanding: Given w, Q: output the m with w_{m,j} = w (where j = |w|) that maximizes Q(m).

Theorem [JKKS]: If the dictionary is random, then expected length = H(P) + 2∆(P, Q).
Deterministic dictionary? Open! [Haramaty+S]
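Below is a heavily simplified simulation of the random-dictionary idea: every message gets one random word of each length, the sender picks a word long enough that no message within a 2∆ probability factor of m shares it, and the receiver outputs the Q-likeliest matching message. This is an illustrative sketch of the mechanism, not the exact [JKKS] construction; the function names, the margin rule, and the numbers are invented for the example.

```python
import math
import random

def make_dictionary(N, max_len, seed=0):
    """Shared random dictionary: one random j-bit word per message m and length j."""
    rng = random.Random(seed)
    return {(m, j): tuple(rng.randint(0, 1) for _ in range(j))
            for m in range(N) for j in range(1, max_len + 1)}

def encode(words, P, m, Delta, max_len):
    """Sender: send the shortest word of m that no 'competitor' m'
    (i.e. P(m') >= P(m) / 2**(2*Delta)) shares at that length."""
    competitors = [x for x in P if x != m and P[x] >= P[m] / 2 ** (2 * Delta)]
    for j in range(1, max_len + 1):
        if all(words[(x, j)] != words[(m, j)] for x in competitors):
            return words[(m, j)]
    raise ValueError("dictionary too short for this message")

def decode(words, Q, w):
    """Receiver: among messages whose length-|w| word equals w, output the Q-likeliest."""
    j = len(w)
    return max((m for m in Q if words[(m, j)] == w), key=lambda m: Q[m])

# Toy example: 8 messages, sender prior P, receiver prior Q (all numbers illustrative).
N, MAX_LEN = 8, 20
P = {m: (m + 1) / 36 for m in range(N)}                 # P(m) proportional to m + 1
Q = {m: (N - m) / 36 for m in range(N)}                 # Q(m) proportional to N - m
Delta = max(abs(math.log2(P[m] / Q[m])) for m in range(N))
words = make_dictionary(N, MAX_LEN)
msg = 5
w = encode(words, P, msg, Delta, MAX_LEN)
print(msg, decode(words, Q, w), len(w))                 # decoded message equals msg
```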

Page 11:

Other Contexts in Communication

Example 1: Common randomness. Often, shared randomness between sender and receiver makes communication efficient. Context = randomness; imperfect sharing = shared correlations. Thm [CGKS]: Communication with imperfect sharing is bounded by communication with perfect sharing!

Example 2: Uncertain functionality. Often, conversations are short if the goal of the communication is known and incorporated into the conversation (formalized by [Yao ’80]). What if the goal is not perfectly understood by sender and receiver? Thm [GKKS]: One-way communication is roughly preserved.

Page 12:

Conclusions

Pressing need to understand human communication

Context in communication: HUGE + plays a huge role.
Uncertainty in context is a consequence of “intelligence” (universality). It injects ambiguity, misunderstanding, vulnerabilities … Needs new exploration to resolve.

Page 13:

Thank You!
