Title: Point-Based Trust: Define How Much Privacy is Worth
1. Point-Based Trust: Define How Much Privacy is Worth
- Danfeng Yao (Brown University), Keith B. Frikken (Miami University)
- Mikhail J. Atallah (Purdue University), Roberto Tamassia (Brown University)
- Funded by NSF IIS-0325345, IIS-0219560, IIS-0312357, and IIS-0242421, ONR N00014-02-1-0364, CERIAS, and Purdue Discovery Park
- ICICS, December 2006, Raleigh, NC
2. Outline of the talk
1. Introduction to privacy protection in authorization
2. Point-based authorization and optimal credential selection
  2.1 New York State Department of Motor Vehicles (DMV) 6-point authentication system
  2.2 Knapsack problem
3. Secure 2-party protocol for the knapsack problem
4. Applications
3. Protecting private information
- Example: Alice negotiates with a service provider
  - Provider's policy: the discount requires a UID; provider's credential: BBB (Better Business Bureau) certificate
  - Alice's policy: releasing the UID requires a BBB certificate; Alice's credential: UID (student ID)
- Trust negotiation protocols: Winsborough, Seamons, Jones 00; Yu, Ma, Winslett 00; Winsborough, Li 02; Li, Du, Boneh 03
4Our goals
- Prevent pre-mature information leaking by both
parties - Credentials should be exchanged only if services
can be established - Support some kind of cumulative privacy
quantitatively - Disclosing more credentials should incur higher
privacy loss - Support flexible service model
- Allow customized (or personalized) access
policies - Adjustable services based on qualifications
Our ultimate goal is to encourage users to
participate in e-commerce
5. What can we learn from the New York State DMV?
6-point proof of identity for getting a NY driver's license:
Credential Points
Passport 5
Utility bill 1
Birth certificate 4
Social security card 3
6. Another motivation: adjustable services
Membership/Credential Discount
Mastercard 2
Airline frequent flier 1
AAA 0.5
Veteran 0.5
Adjustable services based on the private information revealed
7. Point-based authorization model
- Credential types C1, C2, ..., Cn
- The service provider privately defines:
  - Point values p1, p2, ..., pn of the credentials
  - A threshold T for accessing a resource
- The user privately defines sensitivity scores a1, a2, ..., an of the credentials
- Credential selection problem: the user (or client) wants to satisfy the threshold T with the minimum disclosure of privacy
- This can be converted to a knapsack problem (formulated below)
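Written out in the slide's notation (the binary variable $x_i$ indicates whether credential $C_i$ is disclosed; its complement $y_i = 1 - x_i$ appears in the conversion slide below), the credential selection problem is:

    \min \sum_{i=1}^{n} a_i x_i \quad \text{subject to} \quad \sum_{i=1}^{n} p_i x_i \ge T, \qquad x_i \in \{0, 1\}.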
8. Example
- Threshold for accessing the resource: T = 10
- Provider's point values: College ID 3, Driver's license 6, Credit card 8, SSN 10
- Alice's sensitivity scores: College ID 10, Driver's license 30, Credit card 50, SSN 100
- Alice's options (point value, sensitivity score):
  - SSN: 10, 100
  - College ID + Credit card: 11, 60
  - Driver's license + Credit card: 14, 80
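As a quick sanity check of this example, here is a minimal brute-force sketch (credential names and numbers are taken from the tables above; the code is illustrative only):

    # Exhaustive check of the example: choose the subset of credentials whose
    # points meet the threshold with the smallest total sensitivity score.
    from itertools import combinations

    threshold = 10
    creds = {"College ID": (3, 10), "Driver's license": (6, 30),
             "Credit card": (8, 50), "SSN": (10, 100)}   # name: (points, sensitivity)

    best = None
    for r in range(1, len(creds) + 1):
        for subset in combinations(creds, r):
            points = sum(creds[c][0] for c in subset)
            sensitivity = sum(creds[c][1] for c in subset)
            if points >= threshold and (best is None or sensitivity < best[1]):
                best = (subset, sensitivity)

    print(best)   # (('College ID', 'Credit card'), 60)

It confirms that College ID + Credit card is the cheapest option that meets the threshold, with sensitivity 60.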
9. Where do points come from?
- Reputation systems: Beth, Borcherding, Klein 94; Tran, Hitchens, Varadharajan, Watters 05; Zouridaki, Mark, Hejmo, Thomas 05
- This is future work, but one idea is to let the parties evaluate one another to derive point values
10. Converting the CSP into a knapsack problem
- Define the binary vector y1, y2, ..., yn, where $y_i = 1 - x_i$
- $a_i$: private to the user; $p_i$: private to the provider
- Let $T' = \sum_{i=1}^{n} p_i - T$
- Maximize $\sum_{i=1}^{n} a_i y_i$ subject to $\sum_{i=1}^{n} p_i y_i \le T'$
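A one-line check that the conversion is equivalent, using only the definitions above: since $y_i = 1 - x_i$,

    \sum_{i=1}^{n} a_i y_i = \sum_{i=1}^{n} a_i - \sum_{i=1}^{n} a_i x_i ,

so maximizing the total sensitivity of the withheld credentials is the same as minimizing the total sensitivity of the disclosed ones, and the constraint transforms as

    \sum_{i=1}^{n} p_i x_i \ge T \iff \sum_{i=1}^{n} p_i y_i \le \sum_{i=1}^{n} p_i - T = T'.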
11. Dynamic programming for the knapsack problem
- Dynamic programming for the 0/1 knapsack problem
- Construct an n-by-T' table M, where $T' = \sum_{i=1}^{n} p_i - T$
- Recurrence:
  M[i, j] = M[i-1, j]                                 if j < p_i
  M[i, j] = max( M[i-1, j], M[i-1, j - p_i] + a_i )   if j >= p_i
- Entry M[i, j] therefore depends only on M[i-1, j] and M[i-1, j - p_i]
- a_i: private to the user; p_i: private to the provider
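A direct, non-private Python sketch of this table fill, using the numbers of the running example (the function and variable names are mine):

    # Direct implementation of the recurrence above, filling the n-by-T' table M
    # over the withheld-credential variables y_i (values a_i, weights p_i).
    def fill_knapsack_table(p, a, T_prime):
        n = len(p)
        # M[i][j]: best total sensitivity of withheld credentials among the first i,
        # using at most j withheld points.  Row 0 is all zeros.
        M = [[0] * (T_prime + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            for j in range(T_prime + 1):
                if j < p[i - 1]:
                    M[i][j] = M[i - 1][j]
                else:
                    M[i][j] = max(M[i - 1][j], M[i - 1][j - p[i - 1]] + a[i - 1])
        return M

    # Running example: p = (3, 6, 8, 10), a = (10, 30, 50, 100), T = 10,
    # so T' = 3 + 6 + 8 + 10 - 10 = 17.
    M = fill_knapsack_table([3, 6, 8, 10], [10, 30, 50, 100], 17)
    print(M[4][17])   # 130 = maximum sensitivity that can be withheld

M[4][17] = 130 is the maximum sensitivity that can be withheld; since the total sensitivity is 190, the corresponding disclosure costs 190 - 130 = 60, matching the College ID + Credit card option.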
12. Overview of the privacy-preserving knapsack computation
- Uses a 2-party maximization protocol [Frikken, Atallah 04]
- Uses a homomorphic encryption scheme:
  - E(x)E(y) = E(x + y)
  - E(x)^c = E(xc)
- Preserves privacy for both parties
- Two phases: table filling and traceback
- When j < p_i, the entry is computed as max(M[i-1, j], -∞ + a_i), i.e., a maximization and an addition of a_i are performed in both cases to make the two computation procedures indistinguishable
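The slide states the two homomorphic properties without naming a scheme; as an assumption, the sketch below uses a toy Paillier-style cryptosystem (tiny fixed primes, no security, Python 3.9+) purely to demonstrate E(x)E(y) = E(x+y) and E(x)^c = E(xc).

    # Toy Paillier-style additively homomorphic encryption (illustrative only:
    # tiny primes, no padding, not secure).
    import math
    import random

    p, q = 1789, 1867                      # toy primes (assumption)
    n = p * q
    n2 = n * n
    g = n + 1                              # standard choice g = n + 1
    lam = math.lcm(p - 1, q - 1)           # lambda = lcm(p-1, q-1)
    mu = pow(lam, -1, n)                   # mu = lambda^{-1} mod n (valid for g = n+1)

    def encrypt(m: int) -> int:
        r = random.randrange(1, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c: int) -> int:
        u = pow(c, lam, n2)
        return ((u - 1) // n) * mu % n

    x, y, c = 37, 55, 4
    Ex, Ey = encrypt(x), encrypt(y)
    assert decrypt(Ex * Ey % n2) == x + y      # E(x)E(y) decrypts to x + y
    assert decrypt(pow(Ex, c, n2)) == x * c    # E(x)^c decrypts to x * c
    print("homomorphic addition and scalar multiplication verified")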
13. Preliminary: 2-party maximization protocol in a split format
- Alice's input: Alice1, Alice2; output: Alice's share of the max; privacy: she does not learn which value is the max
- Amazon's input: Amazon1, Amazon2; output: Amazon's share of the max; privacy: it does not learn which value is the max
- Correctness: Alice's share + Amazon's share = max(Alice1 + Amazon1, Alice2 + Amazon2)
- Comparison can be done similarly [Frikken, Atallah 04]
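As a plain illustration of the functionality this protocol provides (not the cryptographic protocol itself), the sketch below shows the split input/output relationship using the slide's Alice/Amazon naming: each party holds one additive share of each candidate value, and the maximum is returned again as additive shares. The reference function reconstructs the values in the clear, which the real Frikken-Atallah protocol of course avoids; the modulus choice is an assumption.

    import random

    MOD = 2**32   # additive shares modulo a public modulus (illustrative choice)

    def share(value):
        """Split `value` into two additive shares mod MOD."""
        r = random.randrange(MOD)
        return r, (value - r) % MOD

    def reference_split_max(alice_shares, amazon_shares):
        """Reference (non-private) computation of the split-max functionality:
        reconstruct both candidates, take the max, and re-share the result.
        The secure protocol yields the same outputs without reconstruction."""
        candidates = [(a + b) % MOD for a, b in zip(alice_shares, amazon_shares)]
        return share(max(candidates))

    # Two candidate values, shared between Alice and Amazon.
    v1, v2 = 17, 42
    alice1, amazon1 = share(v1)
    alice2, amazon2 = share(v2)
    alice_out, amazon_out = reference_split_max((alice1, alice2), (amazon1, amazon2))
    assert (alice_out + amazon_out) % MOD == max(v1, v2)   # = max(alice1+amazon1, alice2+amazon2)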
14. Our protocol for dynamic programming of the 0/1 knapsack problem
- Same recurrence as before: M[i, j] = M[i-1, j] if j < p_i, and max(M[i-1, j], M[i-1, j - p_i] + a_i) if j >= p_i; both cases are evaluated as a maximization plus an addition of a_i (with -∞ as the second candidate when j < p_i)
- Computed entries are encrypted and stored by the provider
- The provider splits the two candidates of M[i, j]
- The client and the provider engage in the 2-party private maximization protocol to compute the maximum
- The client encrypts her share of the maximum and sends it to the provider
- The provider computes and stores the encrypted M[i, j]
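Below is a plaintext mock of the per-entry message flow described above, just to make the share bookkeeping concrete. Real ciphertexts and the secure split-max protocol are replaced by integer blinding and a reference max; the detail that the client folds her private a_i into her share of the second candidate is my assumption about where the addition happens.

    import random

    MOD = 2**32

    def split(value):
        """Provider splits a candidate into additive shares (client's, provider's)."""
        r = random.randrange(MOD)
        return (value + r) % MOD, (-r) % MOD

    # Provider-side state for one cell: the two previous-row entries
    # (held encrypted in the real protocol) and the client's private a_i.
    M_prev_j, M_prev_j_minus_pi = 60, 30     # M[i-1][j], M[i-1][j - p_i]
    a_i = 50                                  # private to the client

    # 1. The provider splits the two candidates of M[i, j].
    c1_client, c1_provider = split(M_prev_j)
    c2_client, c2_provider = split(M_prev_j_minus_pi)
    c2_client = (c2_client + a_i) % MOD       # client folds a_i into her share (assumption)

    # 2. The parties run the 2-party private maximization; a reference
    #    computation stands in for it here.
    v1 = (c1_client + c1_provider) % MOD
    v2 = (c2_client + c2_provider) % MOD
    max_client, max_provider = split(max(v1, v2))

    # 3. The client would encrypt max_client and send it; the provider combines
    #    the shares (homomorphically in the real protocol) to get E(M[i][j]).
    M_i_j = (max_client + max_provider) % MOD
    assert M_i_j == max(M_prev_j, M_prev_j_minus_pi + a_i)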
15. Our protocol for the knapsack problem (cont'd)
- At the end of the 2-party dynamic programming, the provider has an n-by-T' table of encrypted entries, where $T' = \sum_{i=1}^{n} p_i - T$
- Example with n = 4 credentials and T' = 5:
  E(M[1,1]) E(M[1,2]) E(M[1,3]) E(M[1,4]) E(M[1,5])
  E(M[2,1]) E(M[2,2]) E(M[2,3]) E(M[2,4]) E(M[2,5])
  E(M[3,1]) E(M[3,2]) E(M[3,3]) E(M[3,4]) E(M[3,5])
  E(M[4,1]) E(M[4,2]) E(M[4,3]) E(M[4,4]) E(M[4,5])
- How does the client find out the optimal selection of credentials?
16. Traceback protocol: get the optimal credential selection
- Each table entry is extended to a pair E(M[i, j]), E(F[i, j]) for i = 1..4 and j = 1..5 (plus an all-zero item-0 row), where each flag F[i, j] is 0 or 1 and is used for the traceback
- Security holds in the semi-honest (honest-but-curious) model
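For reference, the sketch below runs the standard (non-private) keep-flag traceback on the running example's withheld-credential knapsack; the interpretation of F[i, j] as "item i was taken when computing M[i, j]" is the usual 0/1-knapsack convention and is my assumption about what the encrypted flags encode.

    # Non-private reference version of the traceback: the DP table stores, for
    # each entry, the value M[i][j] and a keep-flag F[i][j] = 1 iff item i was
    # taken to obtain M[i][j].
    def knapsack_with_flags(weights, values, capacity):
        n = len(weights)
        M = [[0] * (capacity + 1) for _ in range(n + 1)]
        F = [[0] * (capacity + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            w, v = weights[i - 1], values[i - 1]
            for j in range(capacity + 1):
                M[i][j] = M[i - 1][j]
                if j >= w and M[i - 1][j - w] + v > M[i][j]:
                    M[i][j] = M[i - 1][j - w] + v
                    F[i][j] = 1
        # Traceback: walk from (n, capacity) using the flags.
        chosen, j = [], capacity
        for i in range(n, 0, -1):
            if F[i][j]:
                chosen.append(i)          # item i (1-indexed) is in the optimal set
                j -= weights[i - 1]
        return M[n][capacity], sorted(chosen)

    # Withheld-credential knapsack from the running example (p_i as weights,
    # a_i as values, capacity T' = 27 - 10 = 17).
    best, withheld = knapsack_with_flags([3, 6, 8, 10], [10, 30, 50, 100], 17)
    disclosed = [i for i in range(1, 5) if i not in withheld]
    print(best, withheld, disclosed)      # 130 [2, 4] [1, 3]

Here the withheld credentials come out as items 2 and 4 (Driver's license and SSN), so the disclosed set is again College ID + Credit card.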
17. Security and efficiency of our privacy-preserving knapsack computation
- Informally, security means that private information is not leaked
- Security definitions:
  - Semi-honest adversarial model
  - A protocol securely implements a function f if the views of the participants are simulatable with an ideal implementation of the protocol
- Theorem: The basic protocol of the private two-party dynamic programming computation in the point-based trust management model is secure in the semi-honest adversarial model.
- Theorem: The communication complexity between the provider and the client of our basic secure dynamic programming protocol is O(nT'), where n is the total number of credentials and T' is the marginal threshold.
18. Fingerprint protocol: an improved traceback protocol
- We want to exclude the provider from the traceback
  - To prevent tampering and to reduce costs
- Steps:
  1. Fill the knapsack table
  2. The (encrypted) last entry is sent to the client
  3. The client decrypts it and identifies the optimal credential selection
- The fingerprint protocol is a general solution for traceback in dynamic programming
19. Fingerprint protocol (cont'd)
- Transformed scores:
  Item No.  Privacy score (decimal)  Privacy score (binary)  Transformed score
  1         2                        010                     010 0001
  2         3                        011                     011 0010
  3         5                        101                     101 0100
  4         8                        1000                    1000 1000
- Reading the result:
  Knapsack result (decimal)  Knapsack result (binary)  Item numbers in the knapsack
  3                          0010                      2
  20                         1111                      1, 2, 3, 4
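My reading of the transformation in the first table: each privacy score is shifted left by n bits and a one-hot indicator bit for the item is appended, so the low n bits of the optimal knapsack value directly identify the chosen items with no row-by-row traceback. The sketch below demonstrates this; the item weights are an assumption, since the slide lists only the scores.

    # Sketch of the fingerprint idea: append a one-hot indicator bit per item to
    # each privacy score, run the usual knapsack DP on the transformed scores,
    # and read the chosen items off the low bits of the optimal value.
    def knapsack_max(weights, values, capacity):
        best = [0] * (capacity + 1)
        for w, v in zip(weights, values):
            for j in range(capacity, w - 1, -1):
                best[j] = max(best[j], best[j - w] + v)
        return best[capacity]

    def fingerprint_select(weights, scores, capacity):
        n = len(scores)
        transformed = [(s << n) | (1 << i) for i, s in enumerate(scores)]
        result = knapsack_max(weights, transformed, capacity)
        chosen_mask = result & ((1 << n) - 1)      # low n bits fingerprint the items
        chosen = [i + 1 for i in range(n) if chosen_mask & (1 << i)]
        return result >> n, chosen                 # (optimal score, item numbers)

    scores = [2, 3, 5, 8]
    weights = [2, 3, 5, 8]   # illustrative weights (assumption)
    print(fingerprint_select(weights, scores, capacity=3))    # -> (3, [2])
    print(fingerprint_select(weights, scores, capacity=18))   # -> (18, [1, 2, 3, 4])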
20. Application of point-based authorization: fuzzy location queries in presence systems
- [Figure] Alice's mom, Alice's boss, and Alice's ex each ask "Where is Alice?" and receive location answers at different granularity (fuzzy location), depending on the requester
21. Related work
- Hidden credentials: Bradshaw, Holt, Seamons 04; Frikken, Li, Atallah 06
- Private policy negotiation: Kursawe, Neven, Tuyls 06
- Optimizing trust negotiation: Chen, Clarke, Kurose, Towsley 05
- Trust negotiation protocols/frameworks: Winsborough, Seamons, Jones 00; Yu, Ma, Winslett 00; Winsborough, Li 02; Li, Du, Boneh 03; Li, Li, Winsborough 05
- Anonymous credential approaches: Chaum 85; Camenisch, Lysyanskaya 01
- Secure multiparty computation: Atallah, Li 04; Atallah, Du 01
- OCBE: Li, Li 06
- MANET trust: Zouridaki, Mark, Hejmo, Thomas 05
- Platform for Privacy Preferences (P3P), W3C
22. Conclusions and future work
- Our point-based model allows a client to choose the optimal selection of credentials
- We presented a private 2-party protocol for the knapsack problem
- Our fingerprint protocol is a general solution for traceback in dynamic programming
- Future work:
  - Add typing to credentials
  - Reputation systems and points