CSCI283 Fall 2003 Lecture 10


1
Trusted Operating Systems contd.
Assurance, Palladium, Privacy, Anonymity
  • CSCI283 Fall 2003 Lecture 10
  • GWU
  • Draws extensively from Memon notes and text,
    Chapter 5
  • YOU ARE EXPECTED TO READ CHAPTER 5 FROM THE TEXT
    IN ADDITION TO THIS

2
Announcements
  • No class/office hours day before Thanksgiving
  • Graduate Events (undergrads very welcome too)
    http://www.cs.gwu.edu/studentcorner/graduateEvents/
  • Fri., 21st Nov., 11 am to noon, CS Colloquium,
    David Chaum
  • Exam 6:30-8:30, Wed., 17th Dec., this classroom

3
Need
  • Policy: description of requirements
  • Model: representation of the policy; check
    whether the policy can be enforced
  • Design: implementation of the policy
  • Trust: based on features and assurance
  • Suppose an OS provider claims to have a secure
    system. How can they demonstrate to someone else
    that the system is indeed secure?

4
Typical OS Flaws
  • I/O
  • Talks to independent, intelligent
    systems/devices
  • Code is complex and dependent on the h/w
    interface
  • Might bypass OS functions, and so might end up
    bypassing security
  • Some OSs eliminate the security features
    associated with transferring a single character
  • Not enough isolation (shared libraries, etc.)
  • Incomplete mediation (e.g., access policy checked
    only at every I/O operation or process execution,
    not continuously)
  • OS hooks for external s/w provide access powers
    identical to those of the OS

5
I/O Exploits
  • Time-of-check to time-of-use mismatch: access
    permission is checked for a user to access a
    particular object X. Between the time the access
    is approved and the time the access occurs, the
    user changes the designation of the object, so
    she now accesses an unapproved object (sketched
    below).
  • The I/O source/destination address (which often
    resides in user memory) can be changed after
    checking, while the I/O is in progress.
  • Common system buffers can contain data accessible
    to multiple users.
  • Better to use simple security mechanisms with
    clearly defined access control policies.
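
A minimal sketch of the time-of-check to time-of-use
window in Python (the path is hypothetical; os.access
and open are standard library calls):

    import os

    path = "/tmp/report.txt"   # hypothetical user-supplied path

    # Time of check: permission is verified against whatever
    # the path points to at this instant.
    if os.access(path, os.R_OK):
        # Window of vulnerability: before the open() below runs,
        # the user can re-point the path (e.g., replace it with a
        # symlink to an unapproved object), so the approved check
        # no longer applies to the object actually opened.
        with open(path) as f:   # time of use
            data = f.read()

The defense is to avoid separate check-then-use steps:
open the object once, then perform the access check on
the already-opened handle.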

6
Assurance Methods
  • Testing
  • Problems
  • Cannot check for all possible problems
  • Difficult to characterize exactly what is going
    on
  • Testing involves modification, which might change
    the system: observation changes the observed
  • Budget and time limitations
  • Ethical hacking/Penetration testing
  • Formal verification

7
Formal Verification
  • minimum(A, n):
      min := A[1]
      for i := 1 to n
        if A[i] < min
          min := A[i]
  • Assertion P (initial conditions):
    n > 0
  • Assertion Q (true for all loops):
    n > 0 ∧ 1 ≤ i ≤ n ∧ min ≤ A[1]
  • Assertion R (for a particular loop i):
    n > 0 ∧ 1 ≤ i ≤ n ∧
    ∀j, 1 ≤ j ≤ i-1: min ≤ A[j]
  • Assertion S (on loop exit):
    n > 0 ∧ i = n+1 ∧
    ∀j, 1 ≤ j ≤ n: min ≤ A[j]
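
The same loop and assertions rendered as runnable
Python (a sketch only; Python lists are 0-indexed, so
the slide's A[1..n] becomes A[0..n-1]):

    def minimum(A):
        n = len(A)
        assert n > 0                                      # P: initial condition
        mn = A[0]                                         # min := A[1]
        for i in range(n):
            assert mn <= A[0]                             # part of Q, each iteration
            if A[i] < mn:
                mn = A[i]
            assert all(mn <= A[j] for j in range(i + 1))  # R after iteration i
        assert all(mn <= A[j] for j in range(n))          # S: on loop exit
        return mn

    assert minimum([3, 1, 4, 1, 5]) == 1

Formal verification replaces these runtime checks with
a proof: P implies Q, the loop body preserves Q, and Q
plus loop exit implies S.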

8
Odds and Ends
  • Validation of Implementation
  • Requirements checking, code review, system
    testing
  • Open Source

9
Make sure to study from text
  • Virtualization
  • Layered Trust
  • Evaluation

10
Palladium
  • http://download.microsoft.com/download/c/8/0/c80ea683-9900-46ff-9c67-d7f14b0d3787/trusted_open_platform_ieee.pdf

11
Market Need
  • Problem: current systems are insecure because of
    a vast code base, many vulnerabilities, and lay
    users
  • Examples: financial systems are vulnerable to
    attacks; you do not know who you are talking to
  • Need: a secure, interoperable, open system
  • Examples of secure closed systems: games, set-top
    boxes, smart cards, cell phones. But the PC will
    never be replaced by these?

12
Commercial Need
  • Open architecture
  • Allows addition of arbitrary s/w and h/w without
    requiring central authority
  • Needs to operate in legacy environment
  • Low cost for modifications

13
NGSCB: Next Generation Secure Computing Base
  • Can be used in both trusted and untrusted form.
  • Isolation among OSs and processes, particularly
    from I/O, using machine monitors. The normal and
    the trusted OS are each protected from
    surveillance by the other.
  • h/w and s/w security primitives; no tamperproof
    h/w, because attacks are mostly s/w attacks
    (also, s/w attacks are the ones that are Break
    Once Run Everywhere (BORE))
  • Authenticated operation: all entities state
    their identity and prove it using cryptographic
    techniques; a program's executable code hash is
    its ID. If anything changes, the hash changes
    (sketched below).
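
A sketch of the hash-as-identity idea (the helper name
is hypothetical; hashlib is the standard library):

    import hashlib

    def code_id(path):
        # A program's ID is a cryptographic hash of its executable
        # image; flipping a single bit of the binary changes the ID.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

An entity proves its identity by having trusted h/w
measure (hash) its code before it runs, rather than by
asserting a name.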

14
Sealed storage/attestation
  • Sealed Storage
  • Long-lived secrets are sealed by the owner, with
    a list of those allowed to unseal. The owner's ID
    is associated with the secret.
  • Example of sealing: sketched below
  • Why is the owner's ID associated with the secret?
  • Attestation
  • Use the public key of the generating platform to
    authenticate code
  • Equivalent to the code bearing a certificate
    issued by the platform
  • The platform itself bears a certificate provided
    by a CA
  • Other, anonymous methods can also be used
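
A minimal sketch of sealing, with a symmetric platform
key standing in for NGSCB's hardware-held key (the
seal/unseal helpers and ID scheme are illustrative,
not the real API; requires the third-party
cryptography package):

    import json
    from cryptography.fernet import Fernet

    platform = Fernet(Fernet.generate_key())  # stands in for the h/w-held key

    def seal(secret, owner_id, allowed_ids):
        # The owner's ID and the unseal whitelist travel inside the blob.
        blob = json.dumps({"owner": owner_id, "allowed": allowed_ids,
                           "secret": secret.hex()}).encode()
        return platform.encrypt(blob)

    def unseal(sealed, caller_id):
        # Release the secret only to code whose ID is on the whitelist.
        blob = json.loads(platform.decrypt(sealed))
        if caller_id not in blob["allowed"]:
            raise PermissionError("caller not allowed to unseal")
        return bytes.fromhex(blob["secret"])

Here caller_id would be the caller's code hash (its ID
from the previous slide), so the secret is bound to
particular programs rather than to a user.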

15
Authenticated Boot
  • The OS also has to identify itself in this manner
  • h/w must authenticate the boot kernel
  • Cryptographic keys are stored securely
  • A cryptographic co-processor does authentication
    operations in h/w for the OS
  • The OS does similar operations for other
    applications
  • The crypto co-processor also boots the virtual
    machine monitor and has a PRNG; TCPA's TPM is an
    example
  • (What is TCPA?)

16
Upgrades and Openness
  • Upgrades
  • What happens when the OS is upgraded?
  • A single sealed secret needs to allow more than
    one kernel access. That kernel can then reseal.
  • Openness
  • Manferdelli has announced that the kernel code
    will be available for review
  • Will his management continue to support this?

17
Provides security and authenticity of data
  • Provides privacy only insofar as privacy is the
    security of personal information
  • Protects against malware because?
  • Applications
  • secure shopping, banking, taxes
  • rights management of enterprise data (email
    rules, document rules)
  • entertainment media distribution is too
    widespread (a single media asset is in too many
    places at a given time) to benefit from this, but
    technically it can be done

18
Trusted by whom?
  • Trusted by the authenticator, but
  • the public key provides a means of tracking
  • Suggested fixes:
  • Pseudonyms issued by a trusted third party
    (trusted parties might collude; there are usually
    few of them)
  • Secret sharing
  • Anonymous credentials (e-cash-like)
  • Privacy community? Conflicts?
  • What else is required for a Trusted O/S?

19
Potential Conflicts with Goals of Trustedness
20
Anonymity and Privacy
21
Cookies
  • Post-it notes for the web (typically 4KB)
  • Small files maintained on the user's hard disk,
    readable only by the site that created them (up
    to 20 per site); see the sketch after this list
  • Used to:
  • preserve state information about a transaction
  • identify you when you return to a web site so you
    don't have to remember a password
  • help web sites understand how people use them
  • Cookies can be harmful:
  • used to profile users and track their activities
    without their knowledge, especially across web
    sites
  • Can be disabled
  • To learn more about cookies, see Cookie Central
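
A sketch of how a site sets a cookie, using Python's
standard http.cookies module (names and values are
hypothetical):

    from http.cookies import SimpleCookie

    cookie = SimpleCookie()
    cookie["session_id"] = "abc123"                 # state kept across requests
    cookie["session_id"]["domain"] = "example.com"  # only this site reads it back
    cookie["session_id"]["max-age"] = 3600          # lifetime in seconds

    # The server emits this header; the browser stores the cookie
    # and returns it on later requests to the same site, e.g.:
    # Set-Cookie: session_id=abc123; Domain=example.com; Max-Age=3600
    print(cookie.output())

Tracking across sites works because a third-party ad
server (e.g., DoubleClick) is "the site that created"
its cookie on every page that embeds its ads.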

22
How DoubleClick Works
23
Privacy
  • We take privacy in our daily lives for granted
  • On the Internet, that is not the case
  • Examples:
  • Pentium III chip serial numbers
  • Read via software (ActiveX or applets)
  • Help track a user over the web
  • After pressure from privacy activists, Intel
    decided to turn the serial number off by default
  • It could be turned back on by software
  • Not in later chips

24
Privacy
  • Cookies
  • Used to keep track of the sites you visit
  • DoubleClick and other advertising agencies are
    the main users of cookies
  • Carnivore sniffer
  • Employed by the FBI
  • Almost all email can be scanned in real time
  • You could encrypt your messages

25
Platform for Privacy Preferences (P3P)
  • P3P
  • Developed by the World Wide Web Consortium
  • Protocol allowing users to interrogate websites
    about privacy
  • A P3P-enabled site posts a machine-readable
    privacy policy summary (e.g., made with the IBM
    P3P editor)
  • The user sets up his privacy preferences in his
    browser
  • The user's browser examines the summary and does
    not allow access to non-compliant sites
  • Compliance is voluntary. A validator is
    available.
  • For more info see http://www.w3.org/P3P/

26
Using P3P on your Web site
  • Formulate privacy policy
  • Translate privacy policy into P3P format
  • Use a policy generator tool
  • Place P3P policy on web site
  • One policy for entire site or multiple policies
    for different parts of the site
  • Associate policy with web resources
  • Place P3P policy reference file (which identifies
    location of relevant policy file) at well-known
    location on server
  • Configure the server to insert a P3P header with
    a link to the P3P policy reference file (sketched
    below), or
  • Insert link to P3P policy reference file in HTML
    content
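
A sketch of the server-configuration step in Python
(/w3c/p3p.xml is the conventional well-known location
for the policy reference file; the handler details are
illustrative):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class P3PHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # P3P header pointing the user agent at the reference file
            self.send_header("P3P", 'policyref="/w3c/p3p.xml"')
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>policy-enabled page</body></html>")

    # HTTPServer(("", 8080), P3PHandler).serve_forever()

A P3P-aware browser fetches the reference file,
follows it to the policy, and compares the policy
against the user's preferences before releasing data.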

27
A Simple HTTP transaction
(diagram: browser/web server exchange; source: w3.org)
28
Transaction with P3P 1.0
(diagram: the same exchange with the P3P policy
fetched alongside; source: w3.org)
29
The P3P vocabulary
  • Who is collecting data?
  • What data is collected?
  • For what purpose will data be used?
  • Is there an ability to opt-in or opt-out of some
    data uses?
  • Who are the data recipients (anyone beyond the
    data collector)?
  • To what information does the data collector
    provide access?
  • What is the data retention policy?
  • How will disputes about the policy be resolved?
  • Where is the human-readable privacy policy?

30
Transparency
http://www.att.com/accessatt/
  • P3P clients can check a privacy policy each time
    it changes
  • P3P clients can check privacy policies on all
    objects in a web page, including ads and
    invisible images

http://adforce.imgis.com/?adlink2685231146ADFORCE
31
Ways to Achieve Privacy
  • Encryption
  • Privacy of content
  • Compromised end nodes could expose everything
  • CPO (Chief Privacy Officer) post in companies
  • Anonymity
  • Privacy of connection
  • Privacy of identifier

32
Why Anonymity?
  • A report by the American Association for the
    Advancement of Science (AAAS) found that:
  • Anonymous communication online is a morally
    neutral technology.
  • Anonymous communication should be regarded as a
    strong human right; in the U.S. it is a
    constitutional right (1st amend.).

33
Why Anonymity?
  • The Internet provides previously inconceivable
    opportunities for gathering info about YOU!
  • Anonymous communication would provide the ability
    for spamming, deception, and fraud.
  • In reality, most anonymous protocols require the
    cooperation of the recipient.
  • For good people it provides privacy over the net,
    and allows anonymous tips for police and
    journalists, whistle-blowing, and discussion
    groups.

34
What is Anonymity?
  • Anonymous:
  • of unknown authorship or origin; lacking
    individuality, distinction, or recognizability
    <the anonymous faces in the crowd>
  • Merriam-Webster's Collegiate Dictionary
  • Anonymity does not mean that you cannot be
    identified.
  • Anonymity means that you are indistinguishable
    within some particular group: the likelihood that
    you are the originator of a message is reduced.

35
Terminology
  • Terminology proposed by Pfitzmann and Köhntopp
  • Last modification: June 17, 2001
  • Anonymity is the state of not being identifiable
    within a set of subjects, the anonymity set.
  • i.e., a sender is anonymous among the set of
    possible senders; the same argument goes for the
    recipient.
  • The attacker never forgets anything, so the
    anonymity set never increases for the attacker;
    it only decreases or stays the same.
  • But if misinformation is used, one can introduce
    uncertainty into the system and thus increase the
    anonymity set.

36
Unlinkability
  • Unlinkability of two or more items (e.g.,
    subjects, messages, events, actions, ...) means
    that within this system, these items are no more
    and no less related than they are with respect to
    the a-priori knowledge.
  • e.g.:
  • Sender/Receiver (anonymous delivery)
  • Merchant/Buyer (anonymous authentication or
    electronic cash)

37
Anonymous Delivery
38
Types of Anonymity
  • Pfitzmann and Waidner discuss 3 types of
    anonymity:
  • Sender Anonymity
  • Receiver Anonymity
  • Unlinkability of Sender and Receiver

39
Levels of Anonymity
  • The probability of x being the initiator
  • The degree of anonymity

Ref.: Shields, C. and Levine, B.N. 2000. A Protocol
for Anonymous Communication Over the Internet.
40
Informal Definition
  • Absolute privacy means that the attacker has no
    way to distinguish the situations in which a
    potential sender actually sent a communication
    from those in which it did not.
  • Beyond suspicion means that the attacker cannot
    distinguish among the set of possible senders.

41
Informal Definition
  • Probable innocence: from the attacker's point of
    view, the sender appears no more likely to be the
    originator than not to be.
  • Possible innocence: from the attacker's point of
    view, there is a nontrivial probability that the
    real sender is someone else.

42
Informal Definition
  • Exposed: from the attacker's point of view, there
    is a high probability of who the sender is.
  • Provably exposed: the attacker can determine the
    identity of the sender and prove it to everyone
    else.

43
Single Proxy Approach (Trusted Third Party)
  • Anonymizer.com, the Lucent Personalized Web
    Assistant.
  • Connections between initiator and responder go
    through a proxy (a minimal forwarder is sketched
    below).
  • Must trust the proxy!

(diagram: initiator I, proxy P, responder R)
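
A minimal forwarding-proxy sketch (hosts and ports are
hypothetical; a real anonymizing proxy would also
rewrite identifying headers):

    import socket, threading

    def pipe(src, dst):
        # Copy bytes one way; the responder only ever
        # sees the proxy's address, not the initiator's.
        while (chunk := src.recv(4096)):
            dst.sendall(chunk)

    def run_proxy(listen_port, responder_host, responder_port):
        srv = socket.socket()
        srv.bind(("", listen_port))
        srv.listen()
        while True:
            initiator, _ = srv.accept()
            responder = socket.create_connection((responder_host, responder_port))
            threading.Thread(target=pipe, args=(initiator, responder)).start()
            threading.Thread(target=pipe, args=(responder, initiator)).start()

The weakness is structural: the proxy sees both
endpoints of every connection, so it must be trusted
not to log or reveal the linkage.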
44
Pseudonymity tools
45
Mix-net
  • "Untraceable electronic mail, return addresses,
    and digital pseudonyms"
  • Chaum, 1981
  • Two assumptions:
  • No correlation between a set of sealed and
    unsealed items
  • Anyone may learn the origin, destination, and
    representation of all messages, and may inject,
    remove, or modify messages

46
Mix-net
  • Mix: a computer node that processes each mail
    message before it is delivered
  • Simple case (reconstructed below):
  • R is a nonce, M the message, and K a public key
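
The sealed form from Chaum's 1981 paper (a
reconstruction of the lost slide formula; K1 is the
mix's public key, Ka the addressee's, A the
destination address):

    Sender to mix:   K1(R1, Ka(R0, M), A)
    Mix to A:        Ka(R0, M)

The mix decrypts with its private key, discards the
nonce R1, and forwards the inner sealed item; the
nonces ensure that sealed and unsealed versions of the
same message cannot be correlated.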

47
Mix-net
  • What happens if B needs to reply?
  • We need an untraceable return address
  • So A includes the following with the message
    sent:
  • its return address K1(R1, A)
  • also a public key Ka generated for this occasion
  • R is used for sealing by each mix
  • A has all the R's, since he generated them
  • Ex.: reconstructed below
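
Reconstructing the reply path from the 1981 paper (the
slide's example figure is lost): A gives B the pair
K1(R1, A), Ka, and B replies via the mix:

    B to mix:   K1(R1, A), Ka(R0, M)
    Mix to A:   R1(Ka(R0, M))

The mix decrypts the address part with its private key
and uses the nonce R1 as a key to re-seal the reply,
so B cannot recognize its own message leaving the mix.
A, who generated both R1 and Ka, strips both layers.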

48
Chaum Mixes (1981)
(diagram: the sender's message passes through Mix A,
Mix B, and Mix C, one layer of encryption peeled at
each hop)
Sender routes message randomly through network
of Mixes, using layered public-key encryption.
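
A minimal sketch of the layering, with symmetric
Fernet keys standing in for each mix's public key
(requires the third-party cryptography package; a real
mix-net uses public-key sealing as in Chaum's paper):

    from cryptography.fernet import Fernet

    mix_keys = [Fernet.generate_key() for _ in range(3)]  # Mix A, B, C

    def wrap(message, keys):
        # Seal for the last mix first, so each mix peels one layer.
        for key in reversed(keys):
            message = Fernet(key).encrypt(message)
        return message

    onion = wrap(b"hello responder", mix_keys)
    for key in mix_keys:            # each mix on the path removes its layer
        onion = Fernet(key).decrypt(onion)
    assert onion == b"hello responder"

No single mix sees both the sender and the plaintext
destination, so anonymity holds as long as at least
one mix on the path is honest.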
49
Crowds
50
Dining Cryptographer (DC) MIX - Chaum
  • n cryptographers at dinner
  • The waiter says the bill has been paid, either by
    one of them or by the NSA
  • How to determine whether one of them paid, while
    not knowing which one? (simulated below)
  • Each flips a coin and shows it to the
    cryptographer on the left; thus each
    cryptographer sees two coins
  • Each then announces whether the two he saw were
    the same or different, unless he paid, in which
    case he lies
  • If they are all following the rules, the number
    of "different"s will be even if none of them
    paid, odd if one paid
  • Vulnerable to cheating cryptographers
  • Used for anonymous broadcast
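
A simulation of one round (a sketch; coin i is the one
shared between cryptographer i and a neighbor):

    import random

    def dc_dinner(n, payer=None):
        # payer is the index of the paying cryptographer,
        # or None if the NSA paid.
        coins = [random.randint(0, 1) for _ in range(n)]
        announcements = []
        for i in range(n):
            differ = coins[i] ^ coins[(i - 1) % n]  # 1 = "different", 0 = "same"
            if i == payer:
                differ ^= 1                         # the payer lies
            announcements.append(differ)
        # Every coin is seen by exactly two diners, so the honest XOR
        # sum is 0; an odd count of "different" means a diner paid.
        return sum(announcements) % 2 == 1

    assert dc_dinner(5, payer=2) is True   # one of them paid
    assert dc_dinner(5) is False           # the NSA paid

The announcements reveal only the one-bit result, not
which cryptographer paid.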

51
Anonymous Authentication
  • Unlinkable:
  • electronic cash
  • electronic voting
  • Solution: homomorphic encryption,
    E(a · b) = E(a) · E(b)
  • Is homomorphic encryption insecure? (toy demo
    below)
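
A toy demonstration that textbook RSA has the
multiplicative property written above (tiny primes, no
padding; illustration only):

    p, q, e = 61, 53, 17
    n = p * q                          # public modulus
    d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

    E = lambda m: pow(m, e, n)         # encrypt
    D = lambda c: pow(c, d, n)         # decrypt

    a, b = 7, 11
    assert D(E(a) * E(b) % n) == (a * b) % n   # E(a)·E(b) decrypts to a·b

This malleability is the point of the closing
question: anyone can combine or transform valid
ciphertexts without the key, which is useful for
anonymous tallying but dangerous wherever integrity of
a single ciphertext is assumed.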

52
CS Colloquium
  • David Chaum on (Paper Receipts in?) Electronic
    Voting
  • 21st Nov. 2003
  • 11 am noon
  • Computer Science Dept. Conference Room
  • Academic Center, Philips Hall 736
  • David is the inventor of electronic cash and
    electronic voting, and one of the very first to
    address electronic privacy. His lecture has been
    designed to be accessible to the lay person