Modeling and Analysis of Anonymous-Communication Systems - PowerPoint Presentation Transcript
1
Modeling and Analysis of Anonymous-Communication
Systems
  • Joan Feigenbaum
  • http://www.cs.yale.edu/homes/jf
  • WITS'08, Princeton, NJ, June 18, 2008
  • Acknowledgement: Aaron Johnson

2
Outline
  • Anonymity: What and why
  • Examples of anonymity systems
  • Theory: Definition and proof
  • Practice: Onion Routing
  • Theory meets practice

3
Anonymity: What and Why
  • The adversary cannot tell who is communicating
    with whom. Not the same as confidentiality (and
    hence not solved by encryption).
  • Pro: Facilitates communication by whistle-blowers,
    political dissidents, members of 12-step
    programs, etc.
  • Con: Inhibits accountability.

4
Outline
  • Anonymity: What and why
  • Examples of anonymity systems
  • Theory: Definition and proof
  • Practice: Onion Routing
  • Theory meets practice

5
Anonymity Systems
  • Remailers / Mix Networks
    • anon.penet.fi
    • MixMaster
    • Mixminion
  • Low-latency communication
    • Anonymous proxies, anonymizer.net
    • Freedom
    • Tor
    • JAP
  • Data Publishing
    • FreeNet

6
Mix Networks
  • First outlined by Chaum in 1981
  • Provide anonymous communication
  • High latency
  • Message-based (message-oriented)
  • One-way or two-way

7
Mix Networks
[Diagram: users send messages through a network of mixes to destinations.]
8
Mix Networks
[Diagram: users, mixes, and destinations, with an adversary observing part of the network.]
9
Mix Networks
[Diagram: users, mixes, destinations, and the adversary.]
Protocol
10
Mix Networks
[Diagram: user u selects the path M1, M2, M3 to destination d.]
Protocol
  1. User selects a sequence of mixes and a
    destination.

11
Mix Networks
[Diagram: user u, mixes M1, M2, M3, destination d.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message.

12
Mix Networks
[Diagram: user u, mixes M1, M2, M3, destination d.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message:
    1. Proceed in reverse order of the user's path.
    2. Encrypt (message, next hop) with the public
      key of the mix.

13
Mix Networks
[Diagram: u holds the onion {{{m,d}M3, M3}M2, M2}M1, where {x}M denotes x encrypted under M's public key.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message:
    1. Proceed in reverse order of the user's path.
    2. Encrypt (message, next hop) with the public
      key of the mix.

14
Mix Networks
[Diagram: u sends the onion {{{m,d}M3, M3}M2, M2}M1 to M1.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message:
    1. Proceed in reverse order of the user's path.
    2. Encrypt (message, next hop) with the public
      key of the mix.
  3. Send the message, removing a layer of encryption
    at each mix.

15
Mix Networks
[Diagram: M1 removes its layer and forwards {{m,d}M3, M3}M2 to M2.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message:
    1. Proceed in reverse order of the user's path.
    2. Encrypt (message, next hop) with the public
      key of the mix.
  3. Send the message, removing a layer of encryption
    at each mix.

16
Mix Networks
[Diagram: M2 removes its layer and forwards {m,d}M3 to M3.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message:
    1. Proceed in reverse order of the user's path.
    2. Encrypt (message, next hop) with the public
      key of the mix.
  3. Send the message, removing a layer of encryption
    at each mix.

17
Mix Networks
[Diagram: M3 removes the last layer and delivers m to d.]
Protocol
  1. User selects a sequence of mixes and a
    destination.
  2. Onion-encrypt the message:
    1. Proceed in reverse order of the user's path.
    2. Encrypt (message, next hop) with the public
      key of the mix.
  3. Send the message, removing a layer of encryption
    at each mix.
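The layered wrapping and unwrapping above can be sketched in a few lines of Python. This is a toy illustration, not a real mix implementation: tuple-wrapping tagged with a mix's name stands in for public-key encryption, and the names `onion_encrypt` and `mix_process` are invented for this sketch.

```python
def onion_encrypt(message, destination, path):
    """Build the onion: proceed in reverse order of the user's path,
    wrapping (payload, next hop) for each mix (tuple-wrapping stands
    in for encryption under the mix's public key)."""
    payload, next_hop = message, destination
    for mix in reversed(path):
        payload, next_hop = ("enc", mix, payload, next_hop), mix
    return payload, next_hop  # outer onion and the first mix to send it to

def mix_process(mix, onion):
    """A mix removes its layer and learns only the next hop."""
    tag, owner, payload, next_hop = onion
    assert tag == "enc" and owner == mix, "layer not addressed to this mix"
    return payload, next_hop

# u sends a message through M1, M2, M3 to destination d.
onion, hop = onion_encrypt("hello", "d", ["M1", "M2", "M3"])
while isinstance(onion, tuple) and onion[0] == "enc":
    onion, hop = mix_process(hop, onion)
print(onion, "delivered to", hop)  # -> hello delivered to d
```

Note that each call to `mix_process` sees only one layer: the mix learns its predecessor and successor, never both endpoints, which is the anonymity property the next slides discuss.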

18
Mix Networks
[Diagram: u's message has reached d through the mixes.]
  • Anonymity?
  • No single mix knows both source and destination.

19
Mix Networks
[Diagram: users u and v send messages to destinations d and f.]
  • Anonymity?
  • No single mix knows both source and destination.
  • Adversary cannot follow multiple messages through
    the same mix.

20
Mix Networks
[Diagram: users u, v, w send messages to destinations d, e, f.]
  • Anonymity?
  • No single mix knows both source and destination.
  • Adversary cannot follow multiple messages through
    the same mix.
  • More users provide more anonymity.

21
Outline
  • Anonymity: What and why
  • Examples of anonymity systems
  • Theory: Definition and proof
  • Practice: Onion Routing
  • Theory meets practice

22
Provable Anonymity in Mix Networks
Setting
  • N users
  • Passive, local adversary
  • Adversary observes some of the mixes and the
    links.
  • A fraction f of the links is not observed by the
    adversary.
  • Users and mixes are roughly synchronized.
  • Users choose mixes uniformly at random.

23
Provable Anonymity in Mix Networks
Definition
  • Users should be unlinkable to their
    destinations.
  • Let π be a random permutation that maps users to
    destinations.
  • Let C be the traffic matrix observed by the
    adversary during the protocol:
    C[e][i] = # of messages on link e in round i

    Round:  1   2   3   4   5
    e1      1   0   0   1   1
    e2      0   1   1   0   0
24
Provable Anonymity in Mix Networks
Information-theory background
  • Use information theory to quantify information
    gain from observing C.
  • H(X) = Σx -Pr[X=x] log(Pr[X=x]) is the entropy of
    r.v. X.
  • I(X; Y) is the mutual information between X and
    Y.
  • I(X; Y) = H(X) - H(X|Y), where
    H(X|Y) = Σx,y -Pr[X=x ∧ Y=y] log(Pr[X=x | Y=y])
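These quantities are easy to compute for small distributions. A minimal sketch (the helper names `H` and `mutual_information` are ours), using the equivalent identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math
from collections import Counter

def H(dist):
    """Shannon entropy H(X) = sum_x -Pr[X=x] * log2(Pr[X=x])."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), equivalent to H(X) - H(X|Y),
    for a joint distribution given as {(x, y): probability}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return H(px) + H(py) - H(joint)

# Independent fair bits reveal nothing about each other:
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mutual_information(indep))   # -> 0.0
# A bit fully determines its copy: I(X;X) = H(X) = 1 bit.
copy = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(copy))    # -> 1.0
```

In the slides' setting, X is the permutation π and Y is the observed traffic matrix C; I(C; π) = 0 would mean the adversary's observations say nothing about who talks to whom.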

25
Provable Anonymity in Synchronous Protocols
Definition: The protocol is α(N)-unlinkable if
I(C; π) ≤ α(N).
Definition: An α(N)-unlinkable protocol is
efficient if 1. it takes T(N) =
O(polylog(N/α(N))) rounds and 2. it uses O(N·T(N))
messages.
Theorem (Berman, Fiat, and Ta-Shma, 2004): The
basic mixnet protocol is α(N)-unlinkable and
efficient when T(N) = Ω(log(N) log²(N/α(N))).
26
Outline
  • Anonymity: What and why
  • Examples of anonymity systems
  • Theory: Definition and proof
  • Practice: Onion Routing
  • Theory meets practice

27
Onion Routing [GRS96]
  • Practical design with low latency and overhead
  • Connection-oriented, two-way communication
  • Open-source implementation (http://tor.eff.org)
  • Over 1000 volunteer routers
  • Estimated 200,000 users

28
How Onion Routing Works
[Diagram: user u running a client, routers 1-5 running servers, Internet destination d.]
29
How Onion Routing Works
[Diagram: u begins building a circuit through the routers.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).

30
How Onion Routing Works
[Diagram: u extends the circuit to a second router.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).

31
How Onion Routing Works
[Diagram: u extends the circuit to a third router.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).

32
How Onion Routing Works
[Diagram: u's 3-hop circuit through the routers is complete.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.

33
How Onion Routing Works
[Diagram: u sends data under three layers of encryption, one per router on the circuit.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

34
How Onion Routing Works
[Diagram: the first router removes its layer; two layers remain as the data travels to the second router.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

35
How Onion Routing Works
[Diagram: the second router removes its layer; one layer remains as the data travels to the third router.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

36
How Onion Routing Works
[Diagram: the third router removes the last layer and forwards the data to d.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

37
How Onion Routing Works
[Diagram: d sends reply data back to the last router in the circuit.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

38
How Onion Routing Works
[Diagram: the third router adds its encryption layer and forwards the reply toward u.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

39
How Onion Routing Works
[Diagram: the second router adds its layer; the reply now carries two layers.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

40
How Onion Routing Works
[Diagram: the first router adds its layer and delivers the triply-encrypted reply to u, who removes all three layers.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.

41
How Onion Routing Works
[Diagram: u, routers 1-5, destination d.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.
  4. Stream is closed.

42
How Onion Routing Works
[Diagram: u, routers 1-5, destination d.]
  1. u creates 3-hop circuit through routers
    (u.a.r.).
  2. u opens a stream in the circuit to d.
  3. Data are exchanged.
  4. Stream is closed.
  5. Circuit is changed every few minutes.

43
Adversary
[Diagram: an active, local adversary controls some of the routers between u and d.]
44
Outline
  • Anonymity: What and why
  • Examples of anonymity systems
  • Theory: Definition and proof
  • Practice: Onion Routing
  • Theory meets practice

45
Formal Analysis (F., Johnson, and Syverson, 2007)
[Diagram: users u, v, w connect through routers 1-5 to destinations d, e, f.]
Timing attacks result in four cases:

46
Formal Analysis (F., Johnson, and Syverson, 2007)
[Diagram: users u, v, w; routers 1-5; destinations d, e, f.]
Timing attacks result in four cases:
  1. First router compromised

47
Formal Analysis (F., Johnson, and Syverson, 2007)
[Diagram: users u, v, w; routers 1-5; destinations d, e, f.]
Timing attacks result in four cases:
  1. First router compromised
  2. Last router compromised

48
Formal Analysis (F., Johnson, and Syverson, 2007)
[Diagram: users u, v, w; routers 1-5; destinations d, e, f.]
Timing attacks result in four cases:
  1. First router compromised
  2. Last router compromised
  3. First and last compromised

49
Formal Analysis (F., Johnson, and Syverson, 2007)
[Diagram: users u, v, w; routers 1-5; destinations d, e, f.]
Timing attacks result in four cases:
  1. First router compromised
  2. Last router compromised
  3. First and last compromised
  4. Neither first nor last compromised

50
Black-Box, Onion-Routing Model
  • Let U be the set of users.
  • Let Δ be the set of destinations.
  • Let the adversary control a fraction b of the
    routers.
  • Configuration C:
    • User destinations CD : U → Δ
    • Observed inputs CI : U → {0,1}
    • Observed outputs CO : U → {0,1}

Let X be a random configuration such that
Pr[X=C] = Πu pu,CD(u) b^CI(u) (1-b)^(1-CI(u)) b^CO(u) (1-b)^(1-CO(u))
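The configuration distribution can be written out directly for a tiny instance. This sketch is illustrative, not code from the paper: the two-user destination probabilities `p` and the function name `config_prob` are invented. Summing Pr[X=C] over all configurations gives 1, a quick sanity check on the product formula above.

```python
from itertools import product

def config_prob(dests, ins, outs, p, b):
    """Pr[X=C] in the black-box model: each user u independently picks
    destination CD(u) with probability p[u][dest], and each of u's
    input/output links is observed (bit 1) with probability b."""
    pr = 1.0
    for u in p:
        pr *= p[u][dests[u]]
        pr *= b if ins[u] else (1 - b)
        pr *= b if outs[u] else (1 - b)
    return pr

# Two users, two destinations, adversary controls b = 0.3 of routers.
p = {"u": {"d": 0.7, "e": 0.3}, "v": {"d": 0.4, "e": 0.6}}
b = 0.3
users, dests = list(p), ["d", "e"]
total = sum(
    config_prob(dict(zip(users, D)), dict(zip(users, I)), dict(zip(users, O)), p, b)
    for D in product(dests, repeat=2)
    for I in product((0, 1), repeat=2)
    for O in product((0, 1), repeat=2)
)
print(round(total, 10))  # -> 1.0: probabilities over all configurations sum to 1
```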
51
Indistinguishability
[Diagram: four configurations pairing users u, v, w with destinations d, e, f, all of which look the same to the adversary.]
Indistinguishable configurations
52
Indistinguishability
[Diagram: the same four configurations of users u, v, w and destinations d, e, f.]
Indistinguishable configurations
Note: Indistinguishable configurations form an
equivalence relation.
53
Probabilistic Anonymity
  • The metric Y for the linkability of u and d in C
    is
  • Y(C) = Pr[XD(u)=d | X ≈ C]

54
Probabilistic Anonymity
  • The metric Y for the linkability of u and d in C
    is
  • Y(C) = Pr[XD(u)=d | X ≈ C]

Note: This is different from the metric of mutual
information used to analyze mix nets.
55
Probabilistic Anonymity
  • The metric Y for the linkability of u and d in C
    is
  • Y(C) = Pr[XD(u)=d | X ≈ C]
  • Exact Bayesian inference
  • Adversary after long-term intersection attack
  • Worst-case adversary

56
Probabilistic Anonymity
  • The metric Y for the linkability of u and d in C
    is
  • Y(C) = Pr[XD(u)=d | X ≈ C]
  • Exact Bayesian inference
  • Adversary after long-term intersection attack
  • Worst-case adversary

Linkability given that u visits d: E[Y | XD(u)=d]
57
Anonymity Bounds
  1. Lower bound: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

58
Anonymity Bounds
  • Lower bound: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Upper bounds:
  • pvδ = 1 for all v ≠ u, where pvδ ≥ pve for e ≠ d
  • pvd = 1 for all v ≠ u

59
Anonymity Bounds
  • Lower bound: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Upper bounds:
  • pvδ = 1 for all v ≠ u, where pvδ ≥ pve for e ≠ d:
    E[Y | XD(u)=d] ≤ b + (1-b) pud + O(√(log n/n))
  • pvd = 1 for all v ≠ u:
    E[Y | XD(u)=d] ≤ b² + (1-b²) pud + O(√(log n/n))
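Plugging sample numbers into the leading terms (dropping the O(√(log n/n)) corrections) shows the gap between the two bounds; the parameter values below are arbitrary, chosen only for illustration.

```python
def lower_bound(b, p_ud):
    # b^2 + (1 - b^2) * p_ud: total compromise needs both end routers.
    return b**2 + (1 - b**2) * p_ud

def upper_bound(b, p_ud):
    # b + (1 - b) * p_ud: leading term of the worst-case upper bound.
    return b + (1 - b) * p_ud

for b in (0.05, 0.1, 0.2):
    lo, hi = lower_bound(b, 0.01), upper_bound(b, 0.01)
    print(f"b={b}: {lo:.4f} <= E[Y | XD(u)=d] <= {hi:.4f}")
```

For small b and p_ud, the lower bound is roughly b² + p_ud while the upper bound is roughly b + p_ud: in the worst case the chance of full compromise grows from b² to b, the point made at the end of this section.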

60
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

61
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Proof:

62
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Proof:

E[Y | XD(u)=d] = b² + b(1-b) pud
  + (1-b) E[Y | XD(u)=d ∧ XI(u)=0]
63
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Proof:

E[Y | XD(u)=d] = b² + b(1-b) pud
  + (1-b) E[Y | XD(u)=d ∧ XI(u)=0]
64
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

Let Ci be the configuration equivalence
classes. Let Di be the event Ci ∧ XD(u)=d.
65
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

Let Ci be the configuration equivalence
classes. Let Di be the event Ci ∧ XD(u)=d.
E[Y | XD(u)=d ∧ XI(u)=0]
  ≥ Σi (Pr[Di])² / (Pr[Ci] Pr[XD(u)=d])
66
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

Let Ci be the configuration equivalence
classes. Let Di be the event Ci ∧ XD(u)=d.
E[Y | XD(u)=d ∧ XI(u)=0]
  ≥ Σi (Pr[Di])² / (Pr[Ci] Pr[XD(u)=d])
  ≥ (Σi Pr[Di] √Pr[Ci] / √Pr[Ci])² / Pr[XD(u)=d]
    (by Cauchy-Schwarz)
67
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

Let Ci be the configuration equivalence
classes. Let Di be the event Ci ∧ XD(u)=d.
E[Y | XD(u)=d ∧ XI(u)=0]
  ≥ Σi (Pr[Di])² / (Pr[Ci] Pr[XD(u)=d])
  ≥ (Σi Pr[Di] √Pr[Ci] / √Pr[Ci])² / Pr[XD(u)=d]
    (by Cauchy-Schwarz)
  = pud
68
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Proof:

E[Y | XD(u)=d] = b² + b(1-b) pud
  + (1-b) E[Y | XD(u)=d ∧ XI(u)=0]
69
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Proof:

E[Y | XD(u)=d] = b² + b(1-b) pud
  + (1-b) E[Y | XD(u)=d ∧ XI(u)=0]
  ≥ b² + b(1-b) pud + (1-b) pud
70
Lower Bound
  • Theorem 2: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
  • Proof:

E[Y | XD(u)=d] = b² + b(1-b) pud
  + (1-b) E[Y | XD(u)=d ∧ XI(u)=0]
  ≥ b² + b(1-b) pud + (1-b) pud
  = b² + (1-b²) pud
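Theorem 2 can be checked exactly on a tiny instance of the black-box model by enumerating all configurations and the adversary's views. This is our own illustrative reconstruction, not code from the paper: the `view` function encodes one plausible reading of what the timing adversary sees (linked pairs when both ends are observed, user only or destination only otherwise), and the probabilities are arbitrary sample values.

```python
from itertools import product
from collections import Counter

# Each configuration assigns every user a destination plus two bits:
# input link observed, output link observed (each with probability b).
users, dests = ["u", "v"], ["d", "e"]
p = {"u": {"d": 0.3, "e": 0.7}, "v": {"d": 0.5, "e": 0.5}}
b = 0.4

def view(D, I, O):
    # Both ends observed: the timing attack links user to destination.
    linked = frozenset((x, D[x]) for x in users if I[x] and O[x])
    # Input only: the user is seen sending, destination unknown.
    entry_only = frozenset(x for x in users if I[x] and not O[x])
    # Output only: a destination is seen receiving from an unknown user.
    exit_only = frozenset(Counter(D[x] for x in users if O[x] and not I[x]).items())
    return linked, entry_only, exit_only

configs = []
for Dv, Iv, Ov in product(product(dests, repeat=2),
                          product((0, 1), repeat=2),
                          product((0, 1), repeat=2)):
    D, I, O = dict(zip(users, Dv)), dict(zip(users, Iv)), dict(zip(users, Ov))
    pr = 1.0
    for x in users:
        pr *= p[x][D[x]] * (b if I[x] else 1 - b) * (b if O[x] else 1 - b)
    configs.append((D, view(D, I, O), pr))

def Y(w):
    # Y(C) = Pr[XD(u)=d | X ~ C], the posterior given the adversary's view.
    num = sum(pr for D, w2, pr in configs if w2 == w and D["u"] == "d")
    return num / sum(pr for D, w2, pr in configs if w2 == w)

pr_d = sum(pr for D, w, pr in configs if D["u"] == "d")
ey = sum(pr * Y(w) for D, w, pr in configs if D["u"] == "d") / pr_d
lower = b**2 + (1 - b**2) * p["u"]["d"]
print(f"E[Y | XD(u)=d] = {ey:.4f} >= b^2 + (1-b^2) pud = {lower:.4f}")
```

The Cauchy-Schwarz step in the proof is exactly why this holds for any view: averaged over the truth XD(u)=d, a posterior can never fall below the prior pud, and with probability b² both end routers are compromised and the posterior is 1.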
71
Upper Bound
72
Upper Bound
Let pu1 ≥ pu2 ≥ … ≥ pu,d-1 ≥ pu,d+1 ≥ … ≥ puδ
  • Theorem 3: The maximum of E[Y | XD(u)=d] over
    (pv)v≠u occurs when
  • 1. pvδ = 1 for all v ≠ u, OR
  • 2. pvd = 1 for all v ≠ u

73
Upper Bound
Let pu1 ≥ pu2 ≥ … ≥ pu,d-1 ≥ pu,d+1 ≥ … ≥ puδ
  • Theorem 3: The maximum of E[Y | XD(u)=d] over
    (pv)v≠u occurs when
  • 1. pvδ = 1 for all v ≠ u, OR
  • 2. pvd = 1 for all v ≠ u

Show max. occurs when, for all v ≠ u, pv,ev = 1 for
some ev.
74
Upper Bound
Let pu1 ≥ pu2 ≥ … ≥ pu,d-1 ≥ pu,d+1 ≥ … ≥ puδ
  • Theorem 3: The maximum of E[Y | XD(u)=d] over
    (pv)v≠u occurs when
  • 1. pvδ = 1 for all v ≠ u, OR
  • 2. pvd = 1 for all v ≠ u

Show max. occurs when, for all v ≠ u, ev = d or
ev = δ.
Show max. occurs when, for all v ≠ u, pv,ev = 1 for
some ev.
75
Upper Bound
Let pu1 ≥ pu2 ≥ … ≥ pu,d-1 ≥ pu,d+1 ≥ … ≥ puδ
  • Theorem 3: The maximum of E[Y | XD(u)=d] over
    (pv)v≠u occurs when
  • 1. pvδ = 1 for all v ≠ u, OR
  • 2. pvd = 1 for all v ≠ u

Show max. occurs when ev = d for all v ≠ u, or
when ev = δ for all v ≠ u.
Show max. occurs when, for all v ≠ u, ev = d or
ev = δ.
Show max. occurs when, for all v ≠ u, pv,ev = 1 for
some ev.
76
Upper-bound Estimates
Let n be the number of users.
77
Upper-bound Estimates
Let n be the number of users.
  • Theorem 4: When pvδ = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b + b(1-b) pud
    + (1-b)² pud (1-b)/(1-(1-puδ)b) + O(√(log n/n))

78
Upper-bound Estimates
Let n be the number of users.
  • Theorem 4: When pvδ = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b + b(1-b) pud
    + (1-b)² pud (1-b)/(1-(1-puδ)b) + O(√(log n/n))
  • Theorem 5: When pvd = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b² + b(1-b) pud
    + (1-b) pud/(1-(1-pud)b) + O(√(log n/n))

79
Upper-bound Estimates
Let n be the number of users.
  • Theorem 4: When pvδ = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b + b(1-b) pud
    + (1-b)² pud (1-b)/(1-(1-puδ)b) + O(√(log n/n))

80
Upper-bound Estimates
Let n be the number of users.
  • Theorem 4: When pvδ = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b + b(1-b) pud
    + (1-b)² pud (1-b)/(1-(1-puδ)b) + O(√(log n/n))
  • ≈ b + (1-b) pud for puδ small
81
Upper-bound Estimates
Let n be the number of users.
  • Theorem 4: When pvδ = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b + b(1-b) pud
    + (1-b)² pud (1-b)/(1-(1-puδ)b) + O(√(log n/n))
  • ≈ b + (1-b) pud for puδ small
  • Lower bound: E[Y | XD(u)=d] ≥ b² + (1-b²) pud
82
Upper-bound Estimates
Let n be the number of users.
  • Theorem 4: When pvδ = 1 for all v ≠ u:
    E[Y | XD(u)=d] = b + b(1-b) pud
    + (1-b)² pud (1-b)/(1-(1-puδ)b) + O(√(log n/n))
  • ≈ b + (1-b) pud for puδ small
  • Lower bound: E[Y | XD(u)=d] ≥ b² + (1-b²) pud

Increased chance of total compromise from b² to b.
83
Conclusions
  • Many challenges remain in the design,
    implementation, and analysis of
    anonymous-communication systems.
  • It is hard to prove theorems about real systems
    or even to figure out what to prove.
  • "Nothing is more practical than a good theory!"
    (Tanya Berger-Wolfe, UIC)