Transcript and Presenter's Notes

Title: 2. Noninterference


1
  • 2. Noninterference
  • 4 Sept 2002

2
Information-Flow
  • Security properties based on information flow describe the end-to-end behavior of a system
  • Access control: "This file is readable only by processes I have granted authority to."
  • Information-flow control: "The information in this file may be released only to appropriate output channels, no matter how much intervening computation manipulates it."

3
Noninterference
  • Goguen & Meseguer 1982, 1984
  • Low output of a system is unaffected by high input

[Diagram: the same low input L is paired with different high inputs H1 and H2; the resulting low outputs L′ are related by ≈L (unchanged by the high inputs), while the high outputs H1′, H2′ may differ.]
4
Security properties
  • Confidentiality: is information secret? L = public, H = confidential
  • Integrity: is information trustworthy? L = trusted, H = untrusted
  • Partial order L ⊑ H, H ⋢ L: information can only flow upward in the order
  • Channels: ways for inputs to influence outputs

[Diagram: inputs L and H1 flow through the system to outputs L′ and H1′.]
5
Formalization
  • No agreement on how to formalize it in general
  • GM84 (simplified): a system is defined by a transition function do : S × E → S and a low output function out : S → O (what the low user can see)
  • S is the set of system states
  • E is the set of events (inputs), either high or low
  • A trace t is a sequence of state-event pairs ((s0, e0), (s1, e1), …) where si+1 = do(si, ei)
  • Noninterference: for all event histories (e0, …, en) that differ only in high events, out(sn) is the same, where sn is the final state of the corresponding trace
  • Alternatively, out(sn) is defined by the result of running a purged event history (high events removed); a code sketch follows
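A minimal sketch in Python of the simplified GM84 model (the helper names are illustrative, not from the slides): a deterministic transition function do, a low output function out, and the purge-based check that the low user's view of a history equals the view of the same history with high events deleted.

    HIGH, LOW = "high", "low"

    def run(do, s0, events):
        # fold the event history through the transition function do : S x E -> S
        s = s0
        for e in events:
            s = do(s, e)
        return s

    def purge(events, level_of):
        # delete high events from the history
        return tuple(e for e in events if level_of(e) == LOW)

    def noninterfering(do, out, s0, events, level_of):
        # out(sn) for the full history must equal out(sn) for the purged history
        return out(run(do, s0, events)) == out(run(do, s0, purge(events, level_of)))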

6
Example
[State diagram: a state with low output 2 has self-loops on the high events h1 and h2 and a transition on the low event l to a state with low output 3, which has self-loops on l and on high events hx.]
  • Visible output from input sequences (l), (h1,l),
    (h2,l) is 3
  • Visible output from input sequences (), (h1),
    (h2) is 2
  • The low part of the input determines the visible results (a code sketch follows)
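A hypothetical encoding of this example (the state is just the low-visible value; names are made up for illustration): the low event l moves the value from 2 to 3, while the high events leave the state unchanged, so purging them cannot change the low output.

    def do(state, event):
        # l : 2 -> 3; high events h1, h2, hx are self-loops
        return 3 if event == "l" else state

    def out(state):
        return state              # the low user sees the state value directly

    def run(history, s0=2):
        for e in history:
            s0 = do(s0, e)
        return out(s0)

    # the visible outputs listed on this slide
    assert run(("l",)) == run(("h1", "l")) == run(("h2", "l")) == 3
    assert run(()) == run(("h1",)) == run(("h2",)) == 2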

7
Limitations
  • Doesn't deal with all transition functions:
  • partial (e.g., nontermination)
  • nondeterministic (e.g., concurrency)
  • Assumes sequential input and output

8
A generalization
  • Key idea: the behaviors of the system C should not reveal more information than the low inputs
  • Consider applying C to inputs s
  • Define:
  • ⟦C⟧s is the result of C applied to s (do)
  • s1 ∼L s2 means inputs s1 and s2 are indistinguishable to the low user (same purge)
  • ⟦C⟧s1 ≈L ⟦C⟧s2 means the results are indistinguishable under the low view relation (same out)
  • Noninterference for C: s1 ∼L s2 ⇒ ⟦C⟧s1 ≈L ⟦C⟧s2. The low observer doesn't learn anything new (a code sketch follows)
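A small sketch of this definition as an executable check over a finite set of inputs (the toy system and relation names are assumptions for illustration): every pair of low-indistinguishable inputs must yield low-indistinguishable results.

    from itertools import product

    def noninterfering(C, inputs, low_equiv_in, low_equiv_out):
        # s1 ~L s2  implies  [[C]]s1 ~=L [[C]]s2, checked pairwise
        return all(low_equiv_out(C(s1), C(s2))
                   for s1, s2 in product(inputs, repeat=2)
                   if low_equiv_in(s1, s2))

    # toy system: inputs and results are (low, high) pairs; the low user
    # sees only the low component, and C's low result ignores the high input
    C = lambda s: (s[0] + 1, s[1] * 2)
    low_equiv_in  = lambda s1, s2: s1[0] == s2[0]
    low_equiv_out = lambda r1, r2: r1[0] == r2[0]
    assert noninterfering(C, [(0, 0), (0, 5), (1, 7)], low_equiv_in, low_equiv_out)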

9
Unwinding condition
  • Induction hypothesis for proving noninterference
  • Assume ⟦C⟧ and ≈L are defined using traces

[Diagram: a high step h from s1 to s1′ preserves low equivalence (s1 ∼L s1′); matching low steps l from ∼L-related states s1 and s2 lead to states s1′ and s2′ that are again ∼L-related.]
  • By induction: traces differing only in high steps, starting from equivalent states, preserve equivalence
  • ∼L must be an equivalence relation (we need transitivity); a code sketch of the unwinding check follows
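A sketch of an unwinding-style check for a finite system with a deterministic step function (this uses the standard "local respect + step consistency" formulation as an assumed simplification of the slide's trace-based picture):

    from itertools import product

    def unwinding_holds(do, states, events, level_of, low_eq):
        # (1) local respect: a high step is invisible to the low user
        local_respect = all(low_eq(do(s, e), s)
                            for s in states for e in events
                            if level_of(e) == "high")
        # (2) step consistency: a low step preserves ~L between related states
        step_consistent = all(low_eq(do(s1, e), do(s2, e))
                              for s1, s2 in product(states, repeat=2)
                              for e in events
                              if low_eq(s1, s2) and level_of(e) == "low")
        return local_respect and step_consistent

    # e.g., the earlier two-state example system passes the check
    do = lambda s, e: 3 if e == "l" else s
    level_of = lambda e: "low" if e == "l" else "high"
    assert unwinding_holds(do, [2, 3], ["l", "h1", "h2"], level_of, lambda a, b: a == b)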

10
Example
  • System is a program with a memory
  • if h1 then h2 := 0 else h2 := 1; l := 1
  • s = ⟨c, m⟩ (a command and a memory)
  • ⟨c1, m1⟩ ∼L ⟨c2, m2⟩ if they are identical after
  • erasing high terms from ci
  • erasing high memory locations from mi
  • The choice of ∼L controls what the low observer can see at a moment in time
  • The current command c is included in the state to allow proof by induction (a code sketch follows)
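A small illustrative sketch of low equivalence by erasure (the variable labeling and the crude statement-level erasure are assumptions, not the slide's formal definition): two ⟨command, memory⟩ states are related if they are identical after deleting everything high.

    HIGH_VARS = {"h1", "h2"}          # assumed security labeling

    def erase_memory(mem):
        # keep only the low-visible part of the memory
        return {x: v for x, v in mem.items() if x not in HIGH_VARS}

    def erase_command(cmd):
        # crude erasure: drop any statement that mentions a high variable
        return tuple(stmt for stmt in cmd if not (set(stmt.split()) & HIGH_VARS))

    def low_equiv(state1, state2):
        (c1, m1), (c2, m2) = state1, state2
        return erase_command(c1) == erase_command(c2) and erase_memory(m1) == erase_memory(m2)

    # the two initial states on the next slide differ only in h1
    prog = ("if h1 then h2 := 0 else h2 := 1", "l := 1")
    s1 = (prog, {"h1": 0, "h2": 1, "l": 0})
    s2 = (prog, {"h1": 1, "h2": 1, "l": 0})
    assert low_equiv(s1, s2)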

11
Example
⟨if h1 then h2 := 0 else h2 := 1; l := 1, {h1↦0, h2↦1, l↦0}⟩
  ∼L  ⟨if h1 then h2 := 0 else h2 := 1; l := 1, {h1↦1, h2↦1, l↦0}⟩

⟨h2 := 1; l := 1, {h1↦0, h2↦1, l↦0}⟩
  ∼L  ⟨h2 := 0; l := 1, {h1↦1, h2↦1, l↦0}⟩

⟨l := 1, {h1↦0, h2↦1, l↦0}⟩
  ∼L  ⟨l := 1, {h1↦1, h2↦0, l↦0}⟩

⟨{h1↦0, h2↦1, l↦1}⟩
  ∼L  ⟨{h1↦1, h2↦0, l↦1}⟩
12
Nontermination
  • Is this program secure?
  • while h > 0 do h := h + 1; l := 1
  • {h↦0, l↦0} ⟶* {h↦0, l↦1}
  • {h↦1, l↦0} ⟶* {h↦i, l↦0} (∀ i > 0), never terminating
  • The low observer learns the value of h by observing nontermination vs. the change to l
  • But we might want to ignore this channel to make analysis feasible (a sketch follows)
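A sketch of why this is a channel (the step budget is an artificial device to make divergence visible in finite code; the program itself is the one above): with h = 0 the loop body never runs and l becomes 1; with h = 1 the loop never exits, so l stays 0.

    def run(h, l=0, budget=1000):
        # while h > 0 do h := h + 1; l := 1
        steps = 0
        while h > 0:
            h = h + 1
            steps += 1
            if steps >= budget:
                return ("diverges", l)     # l := 1 is never reached
        l = 1
        return ("terminates", l)

    assert run(0) == ("terminates", 1)
    assert run(1) == ("diverges", 0)       # observing nontermination reveals h > 0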

13
Equivalence classes
  • The equivalence relation ∼L generates equivalence classes of states indistinguishable to the attacker
  • [s]L = { s′ | s′ ∼L s }
  • Noninterference ⇒ transitions act uniformly on each equivalence class
  • Given a trace t = (s1, s2, …), the low observer sees at most ([s1]L, [s2]L, …)

14
Low views
  • The low view relation ≈L on traces (modulo ∼L on states) determines the attacker's ability to observe system execution
  • Termination-sensitive, but no ability to see intermediate states: (s1, s2, …, sm) ≈L (s′1, s′2, …, s′n) if sm ∼L s′n; all infinite traces are related by ≈L
  • Termination-insensitive: (s1, s2, …, sm) ≈L (s′1, s′2, …, s′n) if sm ∼L s′n; infinite traces are related by ≈L to all traces
  • Timing-sensitive: (s1, s2, …, sn) ≈L (s′1, s′2, …, s′n) if sn ∼L s′n (equal length); all infinite traces are related by ≈L
  • Not always an equivalence relation! (a code sketch follows)
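A sketch of the three trace relations (assumptions: a finite trace is a tuple of states, an infinite trace is represented by the token INF, and low_eq is the state-level relation ∼L):

    INF = "diverges"    # stand-in for an infinite trace

    def termination_sensitive(t1, t2, low_eq):
        if t1 is INF or t2 is INF:
            return t1 is INF and t2 is INF        # infinite traces related only to each other
        return low_eq(t1[-1], t2[-1])             # otherwise compare final states

    def termination_insensitive(t1, t2, low_eq):
        if t1 is INF or t2 is INF:
            return True                            # an infinite trace is related to every trace
        return low_eq(t1[-1], t2[-1])

    def timing_sensitive(t1, t2, low_eq):
        if t1 is INF or t2 is INF:
            return t1 is INF and t2 is INF
        return len(t1) == len(t2) and low_eq(t1[-1], t2[-1])   # same length, same final low view

Note that termination_insensitive is not transitive: an infinite trace relates two finite traces whose final states differ at low, which is one way the "not always an equivalence relation" warning shows up.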

15
Nondeterminism
  • Two sources of nondeterminism
  • Input nondeterminism
  • Internal nondeterminism
  • GM assume no internal nondeterminism
  • Concurrent systems are nondeterministic

s2 ⟶ s2′  implies  s1 ∥ s2 ⟶ s1 ∥ s2′
s1 ⟶ s1′  implies  s1 ∥ s2 ⟶ s1′ ∥ s2
  • Noninterference for nondeterministic systems? ∀ s1, s2 . s1 ∼L s2 ⇒ ⟦C⟧s1 ≈L ⟦C⟧s2

16
Possibilistic security
  • Sutherland 1986, McCullough 1987
  • The result of a system, ⟦C⟧s, is the set of possible outcomes (traces)
  • The low view relation on traces is lifted to sets of traces
  • ⟦C⟧s1 ≈L ⟦C⟧s2 if
  • ∀ t1 ∈ ⟦C⟧s1 . ∃ t2 ∈ ⟦C⟧s2 . t1 ≈L t2
  • ∀ t2 ∈ ⟦C⟧s2 . ∃ t1 ∈ ⟦C⟧s1 . t2 ≈L t1
  • For any trace produced by C on s1 there is an indistinguishable one produced by C on s2, and vice versa (a code sketch follows)
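A sketch of the lifted relation over finite sets of finite traces (the trace representation and trace_eq are illustrative assumptions):

    def possibilistic_equiv(traces1, traces2, trace_eq):
        # every trace on either side has a matching (low-indistinguishable)
        # trace on the other side
        return (all(any(trace_eq(t1, t2) for t2 in traces2) for t1 in traces1) and
                all(any(trace_eq(t2, t1) for t1 in traces1) for t2 in traces2))

    # toy usage: traces are tuples of (low, high) states, matched on the
    # low component of their final state
    trace_eq = lambda t1, t2: t1[-1][0] == t2[-1][0]
    assert possibilistic_equiv([((0, 7), (1, 7))], [((0, 3), (1, 3))], trace_eq)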

17
Proving possibilistic security
  • Almost the same induction hypothesis

[Diagram: the same unwinding picture as before; from ∼L-related states s1 ∼L s2, for each step taken in one trace there exists some matching step in the other that re-establishes s1′ ∼L s2′.]
  • Show that there is a transition that preserves
    state equivalence (for termination-insensitive
    security)

18
Example
  • l := true | l := false | l := h
  • h = true: possible results are {h↦true, l↦false}, {h↦true, l↦true}
  • h = false: possible results are {h↦false, l↦false}, {h↦false, l↦true}
  • The program is possibilistically secure

19
What is wrong?
l := true | l := false | l := h
  • Round-robin scheduler: the program is equivalent to l := h
  • Random scheduler: h is the most probable value of l
  • The system has a refinement with an information leak

[Diagram: the nondeterministic program refines to l := h, l := true, or l := false.]
20
Refinement attacks
  • Implementations of an abstraction generally refine (at least probabilistically) the transitions allowed by the abstraction
  • Attacker may exploit knowledge of implementation
    to learn confidential info.
  • l := true | l := false
  • Is this program secure?

21
Determinism-based security
  • Require that the system is deterministic from the low viewpoint [Roscoe95]
  • High information cannot affect low output: there is no nondeterminism to refine
  • Another way to generalize noninterference to nondeterministic systems: don't change the definition!
  • ∀ s1, s2 . s1 ∼L s2 ⇒ ⟦C⟧s1 ≈L ⟦C⟧s2
  • Nondeterminism may be present, but not observable
  • More restrictive than possibilistic security (a code sketch follows)
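A sketch of a low-determinism check in the same finite-set style as before (the function names and the abstraction of results to final (h, l) pairs are assumptions): for low-equivalent inputs, all possible results must be pairwise low-indistinguishable, so the low view is a function of the low input alone.

    from itertools import product

    def low_deterministic(C, inputs, low_equiv_in, result_eq):
        # C(s) is the set of possible results; low-equivalent inputs must
        # yield results that are ALL pairwise low-indistinguishable
        return all(result_eq(r1, r2)
                   for s1, s2 in product(inputs, repeat=2)
                   if low_equiv_in(s1, s2)
                   for r1 in C(s1) for r2 in C(s2))

    # l := true | l := false | l := h, abstracted to its final (h, l) states:
    # possibilistically secure, but not deterministic from the low viewpoint,
    # because different schedules end with different values of l
    C = lambda h: {(h, True), (h, False), (h, h)}
    low_equiv_in = lambda s1, s2: True              # inputs differ only in the high h
    result_eq = lambda r1, r2: r1[1] == r2[1]       # the low user sees only l
    assert not low_deterministic(C, [True, False], low_equiv_in, result_eq)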