TRANSCRIPT
Towards a Theory of Onion Routing
Aaron Johnson
Yale University
5/27/2008
Overview
1. Anonymous communication and onion routing
2. Formally model and analyze onion routing (Financial Cryptography 2007)
3. Probabilistic analysis of onion routing (Workshop on Privacy in the Electronic Society 2007)
1
Anonymous Communication: What?
• Setting
– Communication network
– Adversary
• Anonymity
– Sender anonymity (w.r.t. a message)
– Receiver anonymity (w.r.t. a message)
– Unlinkability (w.r.t. all communication)
2
Anonymous Communication: Why?
• Useful
– Individual privacy online
– Corporate privacy
– Government and foreign intelligence
– Whistleblowers
• Interesting
– How to define?
– Possible in communication networks?
– Cryptography from anonymity
3
Anonymous Communication Protocols
• Mix networks (1981)
• Dining cryptographers (1988)
• Onion routing (1999)
• Anonymous buses (2002)
• Crowds (1998)
• PipeNet (1998)
• Xor-trees (2000)
• Tarzan (2002)
• Hordes (2002)
• Salsa (2006)
• ISDN, pool, Stop-and-Go, timed, and cascade mixes
• etc.
4
Deployed Anonymity Systems
• anon.penet.fi
• Freedom
• Mixminion
• Mixmaster
• Tor
• JAP
• FreeNet
• anonymizer.com and other single-hop proxies
• I2P
• MUTE
• Nodezilla
• etc.
5
Onion Routing
• Practical design with low latency and overhead
• Open source implementation (http://tor.eff.org)
• Over 1000 volunteer routers
• Estimated 200,000 users
• Sophisticated design
6
Anonymous Communication
Mix networks
Dining cryptographers
Onion routing
Anonymous buses
[Table: which of these protocols have been deployed and which have been analyzed]
7
A Model of Onion Routing with Provable Anonymity
Johnson, Feigenbaum, and Syverson (Financial Cryptography 2007)
• Formally model onion routing using input/output automata
• Characterize the situations that provide possibilistic anonymity
8
How Onion Routing Works
User u runs a client; Internet destination d; routers 1–5 run servers.
[Diagram: u, routers 1–5, d]
9
How Onion Routing Works
1. u creates an l-hop circuit through the routers (here 1, 4, 3).
2. u opens a stream in the circuit to d.
3. Data are exchanged, with one layer of encryption added or removed at each hop:
u → 1: {{{m}3}4}1
1 → 4: {{m}3}4
4 → 3: {m}3
3 → d: m
d → 3: m’
3 → 4: {m’}3
4 → 1: {{m’}3}4
1 → u: {{{m’}3}4}1
4. Stream is closed.
5. Circuit is changed every few minutes.
9
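The layered exchange in step 3 can be sketched in code. This is a toy illustration only: XOR with a hash-derived keystream stands in for the real symmetric encryption {·}_k, and all key material and message contents are invented for the sketch.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from key (toy stand-in for a cipher)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def layer(key: bytes, msg: bytes) -> bytes:
    """{msg}_key: XOR the message with the key's keystream."""
    return bytes(a ^ b for a, b in zip(msg, keystream(key, len(msg))))

unlayer = layer  # XOR encryption is its own inverse

# Invented per-router keys for the circuit u -> 1 -> 4 -> 3 -> d.
keys = {r: f"key-{r}".encode() for r in (1, 4, 3)}
m = b"GET /index.html"

# u builds {{{m}_3}_4}_1: innermost layer for the last router.
onion = m
for r in (3, 4, 1):
    onion = layer(keys[r], onion)

# Routers 1, 4, 3 each peel one layer; d receives the plaintext.
for r in (1, 4, 3):
    onion = unlayer(keys[r], onion)
assert onion == m
```

Replies from d are layered back on in the reverse direction, so u removes all three layers to recover m’.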
How Onion Routing Works
[Diagram: u, routers 1–5, d]
10
How Onion Routing Works
Theorem 1: The adversary can only determine the parts of a circuit it controls or is next to.
[Diagram: the adversary observes the prefix u → 1 → 2 but not the rest of the circuit]
11
Model
• Constructed with I/O automata (Lynch & Tuttle, 1989)
– Models asynchrony
– Relies on abstract properties of the cryptosystem
• Simplified onion-routing protocol
– Each user constructs a circuit to one destination
– No separate destinations
– No circuit teardowns
• Circuit identifiers
12
Automata Protocol
[Diagram: protocol automata for machines u, v, w exchanging messages]
13
Creating a Circuit
1. CREATE/CREATED
u → 1: [0, {CREATE}1]
1 → u: [0, CREATED]
2. EXTEND/EXTENDED
u → 1: [0, {[EXTEND, 2, {CREATE}2]}1]
1 → 2: [l1, {CREATE}2]
2 → 1: [l1, CREATED]
1 → u: [0, {EXTENDED}1]
3. Repeat with a layer of encryption:
u → 1: [0, {{[EXTEND, 3, {CREATE}3]}2}1]
1 → 2: [l1, {[EXTEND, 3, {CREATE}3]}2]
2 → 3: [l2, {CREATE}3]
3 → 2: [l2, CREATED]
2 → 1: [l1, {EXTENDED}2]
1 → u: [0, {{EXTENDED}2}1]
15
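The telescoping EXTEND construction above can be sketched symbolically: each EXTEND carries a CREATE cell for the next router, wrapped once in each key u already shares with the circuit prefix. Encryption is represented as nested tuples rather than real cryptography, and the helper name `build_extend` and the key strings are invented for the illustration.

```python
def enc(key, payload):
    """Symbolic encryption {payload}_key as a nested tuple."""
    return ("enc", key, payload)

def build_extend(circuit_keys, next_router, next_key):
    """Build the cell u sends to extend a circuit by one hop.

    circuit_keys: keys u shares with the existing hops, closest hop first.
    """
    # Innermost: the CREATE cell for the new router.
    cell = ["EXTEND", next_router, enc(next_key, "CREATE")]
    # Wrap one layer per existing hop, closest hop outermost.
    for key in reversed(circuit_keys):
        cell = enc(key, cell)
    return cell

# u shares k1, k2 with routers 1 and 2 and extends to router 3,
# producing {{[EXTEND, 3, {CREATE}_3]}_2}_1 as in step 3 above:
msg = build_extend(["k1", "k2"], 3, "k3")
assert msg == ("enc", "k1",
               ("enc", "k2", ["EXTEND", 3, ("enc", "k3", "CREATE")]))
```

Each router on the path strips exactly one layer and forwards the rest, which is why router 1 sees only [l1, …] and never learns that the circuit ends at 3.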
Input/Output Automata
• States
• Actions transition between states
• An alternating state/action sequence is an execution
• In fair executions, actions enabled infinitely often occur infinitely often
• In cryptographic executions, no encrypted protocol message is sent before it is received unless the sender possesses the key
14
I/O Automata Model
• Automata
– User
– Server
– Complete network of FIFO channels
– Adversary replaces some servers with arbitrary automata
• Notation
– U is the set of users
– R is the set of routers
– N = U ∪ R is the set of all agents
– A ⊆ N is the adversary
– K is the keyspace
– l is the (fixed) circuit length
– k(u,c,i) denotes the ith key used by user u on circuit c
16
User automaton
17
Server automaton
18
Anonymity
Definition (configuration): A configuration is a function C : U → R^l mapping each user to his circuit.

Definition (indistinguishable executions): Executions α and β are indistinguishable to adversary A when his actions in α are the same as in β after possibly applying the following:
– a permutation on the keys not held by A
– a permutation on the messages encrypted by a key not held by A
19
Anonymity
Definition (indistinguishable configurations): Configurations C and D are indistinguishable to adversary A when, for every fair, cryptographic execution of C, there exists a fair, cryptographic execution of D that is indistinguishable to A.

Definition (unlinkability): User u is unlinkable to d in configuration C with respect to adversary A if there exists an indistinguishable configuration D in which u does not talk to d.
20
Main Theorems
[Diagram: example configurations C and D over users u, v and routers 1–5, with circuits permuted between them]

Theorem 1: Let C and D be configurations for which there exists a permutation π : U → U such that Ci(u) = Di(π(u)) if Ci(u) or Di(π(u)) is compromised or is adjacent to a compromised router. Then C and D are indistinguishable.

Theorem 2: Given configuration C, let (ri−1, ri, ri+1) be three consecutive routers in a circuit such that {ri−1, ri, ri+1} ∩ A = ∅. Let D be identical to configuration C except that ri has been replaced with some ri′ ∉ A. Then C and D are indistinguishable.

Theorem 3: If configurations C and D are indistinguishable, then D can be reached from C by applying a sequence of transformations of the types described in Theorems 1 and 2.
21
Lemma: Let u, v be two distinct users such that neither they nor the first routers in their circuits are compromised in configuration C. Let D be identical to C except that the circuits of users u and v are switched. Then C and D are indistinguishable to A.

Proof: Given an execution α of C, construct β:
1. Replace any message sent or received between u (v) and C1(u) (C1(v)) in α with a message sent or received between v (u) and C1(u) (C1(v)).
2. Let the permutation π send u to v, v to u, and every other user to itself. Apply π to the encryption keys.
Then show:
i. β is an execution of D.
ii. β is fair.
iii. β is cryptographic.
iv. β is indistinguishable from α.
22
Unlinkability
Corollary: A user is unlinkable to its destination when:
• the last router is unknown, OR
• the user is unknown and another unknown user has an unknown destination, OR
• the user is unknown and another unknown user has a different destination.
23
Model Robustness
• Still works with only a single layer of encryption
• Can include data transfer
• Can allow users to create multiple circuits
24
A Probabilistic Analysis of Onion Routing in a Black-box Model
Johnson, Feigenbaum, and Syverson (Workshop on Privacy in the Electronic Society 2007)
• Use a black-box abstraction to create a probabilistic model of onion routing
• Analyze unlinkability
• Provide upper and lower bounds on anonymity
• Examine a typical case
25
Anonymity
[Diagram: users u, v, w connecting through routers 1–5 to destinations d, e, f]
Four cases:
1. First router compromised
2. Last router compromised
3. First and last compromised
4. Neither first nor last compromised
26
Black-box Abstraction
[Diagram: users u, v, w connecting through a black box to destinations d, e, f]
1. Users choose a destination.
2. Some inputs are observed.
3. Some outputs are observed.
27
Black-box Anonymity
• The adversary can link observed inputs and outputs of the same user.
• Any configuration consistent with these observations is indistinguishable to the adversary.
28
Probabilistic Black-box
• Each user v selects a destination from distribution p_v.
• Inputs and outputs are observed independently with probability b.
29
Black Box Model
Let U be the set of users.
Let Δ be the set of destinations.
A configuration C consists of:
• user destinations C_D : U → Δ
• observed inputs C_I : U → {0,1}
• observed outputs C_O : U → {0,1}
Let X be a random configuration such that:
Pr[X=C] = ∏_u [p_u^{C_D(u)}] [b^{C_I(u)} (1−b)^{1−C_I(u)}] [b^{C_O(u)} (1−b)^{1−C_O(u)}]
30
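The configuration distribution transcribes directly into code. The instance below (two users, two destinations, and the particular values of b and the p_v) is invented for the illustration.

```python
# Pr[X=C] = prod over users u of
#   p_u^{C_D(u)} * b^{C_I(u)}(1-b)^{1-C_I(u)} * b^{C_O(u)}(1-b)^{1-C_O(u)}
b = 0.25
p = {"u": {"d": 0.6, "e": 0.4},   # invented destination distributions
     "v": {"d": 0.1, "e": 0.9}}

def pr_config(dest, obs_in, obs_out):
    """Probability of one configuration (C_D, C_I, C_O)."""
    pr = 1.0
    for user in dest:
        pr *= p[user][dest[user]]            # destination choice
        pr *= b if obs_in[user] else 1 - b   # input observed?
        pr *= b if obs_out[user] else 1 - b  # output observed?
    return pr

# u -> d with both ends observed; v -> e entirely unobserved:
pr = pr_config({"u": "d", "v": "e"},
               {"u": 1, "v": 0}, {"u": 1, "v": 0})
```

Because destination choices and the two observation coins are independent per user, the product factors exactly as in the formula.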
Probabilistic Anonymity
[Diagram: four indistinguishable configurations over users u, v, w and destinations d, e, f]
Conditional distribution: Pr[u→d] = 1
31
Probabilistic Anonymity
The metric Y for the unlinkability of u and d in C is:
Y(C) = Pr[X_D(u)=d | X ≈ C]
where ≈ denotes indistinguishability to the adversary. This is exact Bayesian inference:
• the adversary after a long-term intersection attack
• a worst-case adversary
Unlinkability given that u visits d:
E[Y | X_D(u)=d]
32
Anonymity Bounds
1. Lower bound:
E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d
2. Upper bounds:
a. p_v^{e*} = 1 for all v≠u, where e* maximizes p_u^e over e ≠ d:
E[Y | X_D(u)=d] ≤ b + (1−b) p_u^d + O(log n / n)
b. p_v^d = 1 for all v≠u:
E[Y | X_D(u)=d] ≤ b² + (1−b²) p_u^d + O(log n / n)
33
Lower Bound
Theorem 2: E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d

Proof:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) E[Y | X_D(u)=d ∧ X_I(u)=0]

Let C_i be the configuration equivalence classes, and let D_i be the event C_i ∧ X_D(u)=d. Then:
E[Y | X_D(u)=d ∧ X_I(u)=0]
= Σ_i (Pr[D_i])² / (Pr[C_i] Pr[X_D(u)=d])
≥ (Σ_i Pr[D_i])² / (Σ_i Pr[C_i] · Pr[X_D(u)=d])   (by Cauchy–Schwarz)
= p_u^d

Therefore:
E[Y | X_D(u)=d] ≥ b² + b(1−b) p_u^d + (1−b) p_u^d = b² + (1−b²) p_u^d
34
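The lower bound can be checked by exhaustive enumeration on a tiny instance. The observation signature below is my reading of the black-box model (the adversary sees which inputs it observed, the linked (user, destination) pairs when both ends of a circuit are observed, and an unlinked multiset of destinations for outputs observed alone); the three users, their distributions, and b are invented for the sketch.

```python
from itertools import product
from collections import Counter

users = ["u", "v", "w"]
dests = ["d", "e"]
p = {"u": {"d": 0.3, "e": 0.7},      # p_u^d = 0.3
     "v": {"d": 0.5, "e": 0.5},
     "w": {"d": 0.9, "e": 0.1}}
b = 0.4                              # observation probability

def signature(dest, inp, out):
    """What the adversary sees: linked pairs, bare inputs, bare outputs."""
    linked = tuple(sorted((x, dest[x]) for x in users if inp[x] and out[x]))
    inputs_only = tuple(sorted(x for x in users if inp[x] and not out[x]))
    outputs_only = tuple(sorted(dest[x] for x in users if out[x] and not inp[x]))
    return (linked, inputs_only, outputs_only)

# Enumerate all configurations with their probabilities Pr[X=C].
configs = []
for ds in product(dests, repeat=3):
    for ins in product([0, 1], repeat=3):
        for outs in product([0, 1], repeat=3):
            dest = dict(zip(users, ds))
            inp = dict(zip(users, ins))
            out = dict(zip(users, outs))
            pr = 1.0
            for x in users:
                pr *= p[x][dest[x]]
                pr *= b if inp[x] else 1 - b
                pr *= b if out[x] else 1 - b
            configs.append((signature(dest, inp, out), dest["u"], pr))

# Y(C) = Pr[X_D(u)=d | same signature]; average over configs with u -> d.
mass, mass_d = Counter(), Counter()
for sig, du, pr in configs:
    mass[sig] += pr
    if du == "d":
        mass_d[sig] += pr

expected_Y = sum(pr * mass_d[sig] / mass[sig]
                 for sig, du, pr in configs if du == "d") / p["u"]["d"]
bound = b ** 2 + (1 - b ** 2) * p["u"]["d"]
print(f"E[Y | u->d] = {expected_Y:.4f} >= bound = {bound:.4f}")
assert expected_Y >= bound - 1e-9
```

The Cauchy–Schwarz step in the proof holds for any partition into observation classes, so the computed expectation dominates b² + (1−b²) p_u^d for any choice of the invented parameters.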
Upper Bound
Theorem 3: The maximum of E[Y | X_D(u)=d] over (p_v)_{v≠u} occurs when
1. p_v^{e*} = 1 for all v≠u, OR
2. p_v^d = 1 for all v≠u.
Order u's other destinations so that p_u^1 ≥ p_u^2 ≥ … ≥ p_u^{d−1} ≥ p_u^{d+1} ≥ …
Proof outline:
• Show the maximum occurs when, for all v≠u, p_v^{e_v} = 1 for some destination e_v.
• Show the maximum occurs when, for all v≠u, e_v = d or e_v = e*.
• Show the maximum occurs when e_v = d for all v≠u, or e_v = e* for all v≠u.
35
Upper-bound Estimates
Let n be the number of users.

Theorem 4: When p_v^{e*} = 1 for all v≠u:
E[Y | X_D(u)=d] = b + b(1−b) p_u^d + (1−b)² p_u^d (1−b)/(1−(1−p_u^{e*})b) + O(log n / n)
≈ b + (1−b) p_u^d   (for p_u^{e*} small)

Theorem 5: When p_v^d = 1 for all v≠u:
E[Y | X_D(u)=d] = b² + b(1−b) p_u^d + (1−b) p_u^d/(1−(1−p_u^d)b) + O(log n / n)

Compare with the lower bound E[Y | X_D(u)=d] ≥ b² + (1−b²) p_u^d: in the worst case the chance of total compromise increases from b² to b.
36
Typical Case
Let each user select destinations from a Zipfian distribution: p^{d_i} ∝ 1/i^s.

Theorem 6:
E[Y | X_D(u)=d] = b² + (1−b²) p_u^d + O(1/n)
≈ b² + (1−b²) p_u^d
37
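A Zipfian destination distribution and the resulting Theorem 6 estimate can be computed directly. The exponent s, the number of destinations, and b below are invented for the illustration.

```python
# p^{d_i} proportional to 1/i^s, normalized over the destinations,
# then the asymptotic unlinkability b^2 + (1-b^2) p_u^d.
s, n_dests, b = 1.0, 100, 0.2
weights = [1 / (i ** s) for i in range(1, n_dests + 1)]
total = sum(weights)
p = [w / total for w in weights]

# If u's destination d is the most popular one (i = 1):
metric = b ** 2 + (1 - b ** 2) * p[0]
```

So in this typical case the unlinkability essentially matches the lower bound: anonymity is dominated by the b² chance of observing both ends plus the prior p_u^d.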
Future Work
• Investigate improved protocols to defeat timing attacks.
• Examine how quickly users' distributions are learned.
• Formally analyze scalable, P2P designs.
38
Related Work
• A Formal Treatment of Onion Routing. Jan Camenisch and Anna Lysyanskaya. CRYPTO 2005.
• A Formalization of Anonymity and Onion Routing. S. Mauw, J. Verschuren, and E.P. de Vink. ESORICS 2004.
• I/O Automaton Models and Proofs for Shared-Key Communication Systems. Nancy Lynch. CSFW 1999.
5
Overview
• Formally model onion routing using input/output automata
– Simplified onion-routing protocol
– Non-cryptographic analysis
• Characterize the situations that provide anonymity
– Send a message, receive a message, communicate with a destination
– Possibilistic anonymity
6
Future Work
• Construct better models of time
• Exhibit a cryptosystem with the desired properties
• Incorporate probabilistic behavior by users
26
Related Work
• A Model of Onion Routing with Provable Anonymity. J. Feigenbaum, A. Johnson, and P. Syverson. FC 2007.
• Towards an Analysis of Onion Routing Security. P. Syverson, G. Tsudik, M. Reed, and C. Landwehr. PET 2000.
• An Analysis of the Degradation of Anonymous Protocols. M. Wright, M. Adler, B. Levine, and C. Shields. NDSS 2002.