1
ISA 662 Information System Security
  • Integrity Policies
  • Chapter 5

2
Overview
  • Background
  • Biba's models
  • Strict Integrity Policy
  • Low-Water-Mark Policy
  • Combining Biba and BLP
  • Lipner's model
  • Clark-Wilson model

3
Background
  • Commercial world is very different from military
    world
  • Focus on integrity (is the information
    trustworthy?) rather than confidentiality (is the
    information secret?)
  • Subjects and objects are labeled with integrity
    levels from a set I
  • i1 ≤ i2 means the second level dominates the
    first (recall the definition of dominates from
    last lecture)
  • The higher the level, the more trustworthy
  • Trustworthy = higher integrity
  • Subject: a program on the Windows CD (trusted) vs. a
    downloaded Java applet (untrusted)
  • Object: system logs (trusted) vs. an email
    attachment from nobody (untrusted)

4
Background (Contd)
  • Integrity policy vs. confidentiality policy
  • Integrity levels ≠ security levels (they may
    overlap)
  • A general with secret clearance, or secret orders, may
    also be trusted
  • A system binary is trusted but not secret (any
    user can read it)
  • An untrusted Java applet must be secret (only
    admin can read it)
  • Here read means read and execute
  • Information flows differently
  • Information is disclosed (flows down) when:
  • Read-up: a visitor (unclassified) reads personnel
    files (secret)
  • Write-down: a general (secret) writes activity
    logs (unclassified)
  • Information is corrupted (flows up) when:
  • Read-down: IE (trusted) opens a JPEG file carrying a
    virus (untrusted)
  • Write-up: a downloaded Java applet (untrusted)
    writes something into the Windows registry (trusted)

(Diagram: security levels (secret, unclassified) alongside integrity
levels (trusted, untrusted))
5
Overview
  • Background
  • Biba's models
  • Strict Integrity policy
  • Low-Water-Mark policy
  • Combining Biba and BLP
  • Lipner's model
  • Clark-Wilson model

6
Strict Integrity policy
  • a.k.a. Biba model
  • If BLP prevents information from flowing down
    (being disclosed)
  • Then BLP-upside-down will prevent information
    from flowing up (being corrupted)

(Diagram: the Biba and BLP lattices side by side, each ordered by
≤/dominates; in Biba, information flows down from High Integrity through
Some integrity and Suspicious to Garbage, the reverse of the BLP flow.)
7
BLP Upside-down
  • BLP says read-down and write-up, so BLP
    upside-down, a.k.a. the Biba model, says read-up and
    write-down

(Diagram: in Biba, a subject at each level (High Integrity, Some
integrity, Suspicious, Garbage) reads up and writes down, so information
flows down the integrity lattice; the BLP flow is the reverse.)
8
Notations
  • Set of subjects S, objects O, integrity levels I
  • i1 ≤ i2 says the latter dominates the former
  • recall the definition of dominates from last
    lecture
  • min(i1, i2) denotes the lesser of i1 and i2
  • i(s), i(o) denote the integrity level of s ∈ S and
    o ∈ O, respectively
  • s r o, s w o, s1 x s2 say s can read o, s can
    write o, and s1 can execute s2, respectively

9
Strict Integrity Policy (formal)
  • Biba's Model: the dual of BLP
  • For any s ∈ S and o ∈ O
  • s r o iff i(s) ≤ i(o) (read-up)
  • s w o iff i(o) ≤ i(s) (write-down)
  • s1 x s2 iff i(s2) ≤ i(s1) (execute-up)
  • Regard execute as a special read
  • Can add compartments and discretionary controls
    to get the full dual of the BLP model
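
The three rules above can be sketched in a few lines of Python. The entity names and integer levels below are illustrative assumptions (higher integer = more trustworthy), not from the slides.

```python
# A minimal sketch of Biba's strict integrity policy. Integrity levels
# are modeled as integers; higher means more trustworthy.

levels = {"applet": 0, "browser": 2, "registry": 3}  # hypothetical entities

def can_read(subject, obj):
    # read-up: s r o iff i(s) <= i(o)
    return levels[subject] <= levels[obj]

def can_write(subject, obj):
    # write-down: s w o iff i(o) <= i(s)
    return levels[obj] <= levels[subject]

print(can_read("browser", "registry"))   # True: reading up is allowed
print(can_write("applet", "registry"))   # False: an untrusted applet cannot write up
```

Note how the two rules together forbid exactly the corrupting flows from slide 4: the untrusted applet can neither write the registry directly nor launder data upward through the browser.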

10
Information Flows
  • An information transfer path is a sequence of
    objects o_1, ..., o_{n+1} and a corresponding sequence
    of subjects s_1, ..., s_n such that s_i r o_i and s_i
    w o_{i+1} for all i, 1 ≤ i ≤ n
  • When s_i r o_i, information flows from o_i to s_i
  • When s_i w o_{i+1}, information flows from s_i to o_{i+1}
  • Idea: information can flow from o_1 to o_{n+1} along
    this path by successive reads and writes

(Diagram: an information transfer path o_1 → s_1 → o_2 → s_2 → ... →
s_n → o_{n+1}, alternating reads and writes.)
11
Information Flow Result
  • If there is any information transfer path from
    o_1 ∈ O to o_{n+1} ∈ O, then the strict integrity policy
    implies that i(o_{n+1}) ≤ i(o_1) holds for all n ≥ 1
  • No object can be corrupted, either directly
    (write up) or indirectly (first read down, then
    write equal)

(Diagram: a transfer path from o_1 (high integrity) down through s_1,
o_2, s_2, ..., s_n to o_{n+1} (low integrity).)
12
Theorem Information Flow
  • (Theorem 6.1 from the textbook) If there is an
    information transfer path from o_1 ∈ O to o_{n+1} ∈
    O, then the strict integrity policy implies that
    i(o_{n+1}) ≤ i(o_1) holds for all n ≥ 1
  • Proof: by induction
  • For n = 1:
  • Case 1: s_1 r o_1 and s_1 w o_2; then by definition,
    i(o_2) ≤ i(o_1)
  • Case 2: s_1 w o_2 and s_2 r o_2. Is this possible? No

13
Proof continued.
  • The inductive case
  • Suppose the result is true for n
  • Want to prove it for n + 1
  • By the inductive hypothesis, i(o_n) ≤ i(o_1)
  • Need to show i(o_{n+1}) ≤ i(o_1)
  • Do so by proving i(o_{n+1}) ≤ i(o_n)
  • Case 1: s_n r o_n and s_n w o_{n+1}; then by
    definition, i(o_{n+1}) ≤ i(o_n) - we are done!
  • Case 2: s_1 w o_2 and s_2 r o_2. Is this possible? No
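
The induction can also be sanity-checked by brute force. The sketch below (not from the slides; integer levels and random sampling are my assumptions) samples random paths and asserts that whenever the strict policy permits every read/write step, the last object's integrity never exceeds the first's.

```python
import random

random.seed(0)

def strict_path_ok(subj_levels, obj_levels):
    # valid transfer path: s_i r o_i (i(s_i) <= i(o_i)) and
    # s_i w o_{i+1} (i(o_{i+1}) <= i(s_i)) for all i
    return all(s <= obj_levels[i] and obj_levels[i + 1] <= s
               for i, s in enumerate(subj_levels))

# Randomly generate candidate paths; whenever the strict policy permits
# the whole path, the theorem says i(o_{n+1}) <= i(o_1).
for _ in range(10000):
    n = random.randint(1, 5)
    subs = [random.randint(0, 4) for _ in range(n)]
    objs = [random.randint(0, 4) for _ in range(n + 1)]
    if strict_path_ok(subs, objs):
        assert objs[-1] <= objs[0]
print("theorem holds on all sampled paths")
```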

14
Overview
  • Background
  • Biba's models
  • Strict Integrity policy
  • Low-Water-Mark policy
  • Combining Biba and BLP
  • Lipner's model
  • Clark-Wilson model

15
Low Water Mark Policy
  • The motivation is to relax the strict integrity
    policy while keeping the information flow claim
    valid
  • Two versions
  • Subject low-water-mark policy relaxes the read by
    allowing subjects to read down
  • Object low-water-mark policy relaxes the write by
    allowing subjects to write up

16
Subject Low Water Mark Policy
  • Idea: s can read down, but once it does, its
    integrity level drops (so it cannot corrupt other
    objects)
  • Example: after a machine reads emails infected
    with a worm, the machine is no longer trusted and is
    isolated
  • Rules: for any s ∈ S and o ∈ O
  • s r o is always allowed; after s reads o, i′(s) =
    min(i(s), i(o))
  • s w o iff i(o) ≤ i(s)
    (write-down)
  • s1 x s2 iff i(s2) ≤ i(s1)
    (execute-up)
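
A sketch of the subject low-water-mark rule in Python (levels and the worm-mail scenario are illustrative): a read is always permitted, but it drags the subject's integrity down to min(i(s), i(o)).

```python
def lwm_read(subject_level, object_level):
    # subject low-water-mark: new i(s) = min(i(s), i(o)) after a read
    return min(subject_level, object_level)

def can_write(subject_level, object_level):
    # write-down, unchanged from strict Biba: i(o) <= i(s)
    return object_level <= subject_level

s = 3                     # a trusted machine
s = lwm_read(s, 0)        # it reads an infected email (integrity 0)
print(s)                  # 0 -> the machine is now untrusted
print(can_write(s, 3))    # False: it can no longer corrupt trusted objects
```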

17
Object Low-Water-Mark Policy
  • Idea: s can write up, but the integrity level of
    any object s writes will drop
  • Example: after a virus is detected, whatever
    files were written by the virus are no longer trusted
    and are deleted
  • Rules: for any s ∈ S and o ∈ O
  • s r o iff i(s) ≤ i(o)
    (read-up)
  • s w o is always allowed; after s writes o, i′(o) =
    min(i(s), i(o))
  • s1 x s2 iff i(s2) ≤ i(s1)
    (execute-up)
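
The dual sketch for the object low-water-mark rule (again with illustrative names and integer levels): a write is always permitted, but it drags the object's integrity down, leaving a visible record of possible corruption.

```python
def lwm_write(subject_level, object_level):
    # object low-water-mark: new i(o) = min(i(s), i(o)) after a write
    return min(subject_level, object_level)

registry = 3                        # a trusted object
registry = lwm_write(0, registry)   # an untrusted process (level 0) writes it
print(registry)                     # 0 -> the object is now marked untrusted
```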

18
Information Flow Result
  • Theorem: with the subject/object low-water-mark
    policy, the information flow result also holds.
    That is, i(o_{n+1}) ≤ i(o_1) holds in the following
    cases

(Diagram: the subject low-water-mark policy prevents s_1 from corrupting
o_2 after it reads down; the object low-water-mark policy detects the
corruption of o_2.)
19
Problems
  • With the subject low-water-mark policy, subjects'
    integrity levels never increase
  • Soon no subject will be able to access objects at
    high integrity levels
  • With the object low-water-mark policy, objects can be
    easily corrupted
  • Soon all objects will be at the lowest integrity
    level
  • An implementation needs mechanisms to warn subjects
    about corruption (of the subject itself or of the
    object being written by it)

20
Overview
  • Background
  • Biba's models
  • Strict Integrity policy
  • Low-Water-Mark policy
  • Combining Biba and BLP
  • Lipner's model
  • Clark-Wilson model

21
Combining Biba and BLP
  • Important: security levels (BLP) and integrity
    levels (Biba) are two different things
  • BLP: MLS access control
  • Biba: integrity
  • Whether they overlap with each other depends purely
    on the application
  • When they do overlap, the enforcement of BLP and
    Biba may conflict
  • What if they are exactly the same? A homework
    problem in the textbook!

22
Combining Biba and BLP (Contd)
  • What if they are exactly reversed?
  • Secret and untrusted: downloaded software is
    untrusted and should not be read/executed by
    everyone
  • Unclassified and trusted: system binaries are
    trusted and can be executed by anyone
  • Now both the rules and the levels are dual, so
    BLP and Biba will work in the same way
  • Read-down in BLP becomes read-up in Biba
  • Write-up in BLP becomes write-down in Biba

23
Overview
  • Background
  • Biba's models
  • Strict Integrity policy
  • Low-Water-Mark policy
  • Combining Biba and BLP
  • Lipner's model
  • Clark-Wilson model

24
Typical Commercial Requirements
  1. Users will not write their own programs, but will
     use existing production programs and databases.
  2. Programmers will develop and test programs on a
     non-production system; if they need access to
     production data, they will be given such data via
     a special process and may only use the data on the
     development system.
  3. A special process must be followed to transfer a
     program from the development system onto the
     production system.
  4. The special process in requirement 3 must be
     controlled and audited.
  5. The managers and auditors must have access to
     both the system state and the system logs that
     are generated.

25
Lipner's Lattice (BLP + Biba)
  • A realistic example showing that BLP and Biba can
    be combined to meet commercial requirements
  • How does it combine BLP and Biba?
  • Use disjoint sets of security levels and
    integrity levels
  • BLP goes first; add in Biba only when
    necessary

26
The BLP Part
  • 2 security clearances/classifications
  • AM (Audit Manager): system audit and management
    functions
  • SL (System Low): any process can read at this
    level
  • 3 security categories
  • SP (Production): production code and data
  • SD (Development): same as D
  • SSD (System Development): same as the old SD
  • Security level = (classification, category)

27
The Biba Part
  • 3 integrity classifications
  • ISP (System Program): for system programs
  • IO (Operational): production programs and
    development software
  • ISL (System Low): users get this on login
  • 2 integrity categories
  • ID (Development): development entities
  • IP (Production): production entities
  • Integrity level = (classification, category)
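
The combination can be sketched as labels carrying both a security and an integrity part, where an access is allowed only when the BLP rule and the Biba rule both permit it. The level orderings below follow the slides; the pairing of a user with production code, and all helper names, are my illustrative assumptions.

```python
# Security and integrity classifications as integer orders.
SEC = {"SL": 0, "AM": 1}              # System Low < Audit Manager
INT = {"ISL": 0, "IO": 1, "ISP": 2}   # System Low < Operational < System Program

def dominates(level_a, level_b, order):
    # (classification, category set) a dominates b iff its classification is
    # at least b's and its categories are a superset of b's
    (cls_a, cats_a), (cls_b, cats_b) = level_a, level_b
    return order[cls_a] >= order[cls_b] and cats_a >= cats_b

def can_read(subject, obj):
    sec_s, int_s = subject
    sec_o, int_o = obj
    # BLP read-down: subject's security level dominates object's.
    # Biba read-up: object's integrity level dominates subject's.
    return dominates(sec_s, sec_o, SEC) and dominates(int_o, int_s, INT)

# An ordinary production user reading production code:
user = (("SL", {"SP"}), ("ISL", {"IP"}))
prod_code = (("SL", {"SP"}), ("IO", {"IP"}))
print(can_read(user, prod_code))   # True
```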

28
Subjects' Levels at a Glance
29
Objects' Levels at a Glance
30
The Lattice (Lipner's Lattice)
  • Only 9 out of 192 labels are used
  • Legend: S = subjects, O = objects
  • S: System Managers; O: Audit Trail
  • S: System Control
  • S: Application Programmers; O: Development Code and Data
  • S: System Programmers; O: System Code in Development
  • S: Repair; S: Production Users; O: Production Data; O: Tools
  • O: Repair Code
  • O: Production Code
  • O: System Programs
31
What It Achieves
  • Ordinary users can execute (read) production code
    but cannot alter it
  • Ordinary users can alter and read production data
  • System managers need access to all logs but
    cannot change the levels of objects
  • System controllers need to install code (hence a
    downgrade capability)
  • Logs are append-only, so they must dominate the
    subjects writing them
  • These meet the aforementioned requirements
    (verify if you want)

32
Overview
  • Background
  • Biba's models
  • Strict Integrity policy
  • Low-Water-Mark policy
  • Combining Biba and BLP
  • Lipner's model
  • Clark-Wilson model

33
Clark-Wilson Integrity Model
  • Time-proven accounting practices are extrapolated
    to the computer world
  • Integrity policies are given as high-level rules
  • Remember these are policies; no need to ask how
  • Example: a bank
  • Objective: today's deposits - today's withdrawals
    + yesterday's balance = today's balance
  • Policy level 1: transactions must meet this
    objective
  • Policy level 2: users execute only those
    transactions
  • Policy level 3: certifiers must ensure users do
    so
  • Policy level 4: logs will monitor certifiers in
    doing so
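
The bank objective can be written as an IVP-style consistency check. The figures below are made up for illustration.

```python
def books_balance(deposits, withdrawals, yesterday, today):
    # the level-1 objective: deposits - withdrawals + yesterday's
    # balance must equal today's balance
    return deposits - withdrawals + yesterday == today

print(books_balance(500, 200, 1000, 1300))  # True: the books are consistent
print(books_balance(500, 200, 1000, 1400))  # False: something is wrong
```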

34
Clark-Wilson Integrity Model (Contd)
  • The key contribution is that this hierarchical
    structure reduces the dependency on special
    trusted subjects
  • Certifiers force users to run only good
    transactions, and logs in turn monitor the
    certifiers
  • But who will then monitor the log auditors?
  • Trust is always needed

35
Elements of the model
  • Users: active agents
  • CDIs: Constrained Data Items (data that need
    integrity)
  • UDIs: Unconstrained Data Items (data that don't
    need integrity)
  • TPs: Transformation Procedures (like commands in
    access control matrices, but for operations such as
    debit and credit)
  • IVPs: Integrity Verification Procedures (run
    periodically to check the integrity of CDIs)

36
How The Elements Interact
(Diagram: USERS invoke TPs, which transform CDIs from valid state to
valid state and take in UDIs; IVPs verify the integrity of the CDIs.)
37
Enforcement Rules at a Glance
  • Certification Rules
  • CR1: IVPs verify CDI integrity
  • CR2: TPs preserve CDI integrity
  • CR3: separation of duties for ER2
  • CR4: TPs write to a log
  • CR5: TPs upgrade UDIs to CDIs
  • Enforcement Rules
  • ER1: CDIs changed only by authorized TPs
  • ER2: TPs run only by authorized users
  • ER3: users are authenticated
  • ER4: authorizations changed only by certifiers
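
A minimal sketch of how a system might enforce ER1 and ER2 with explicit certified and allowed relations. All names (TPs, users, CDIs, the return strings) are illustrative, not Clark-Wilson's notation.

```python
# certified relation (ER1): which TPs may manipulate which CDIs
certified = {("withdraw", "accounts"), ("deposit", "accounts")}
# allowed relation (ER2): which users may run which TP on which CDIs
allowed = {("alice", "withdraw", "accounts")}

def run_tp(user, tp, cdi):
    if (tp, cdi) not in certified:          # ER1: TP must be certified for this CDI
        return "denied: TP not certified for this CDI"
    if (user, tp, cdi) not in allowed:      # ER2: user must be bound to this TP/CDI
        return "denied: user not authorized"
    return f"{user} runs {tp} on {cdi}"

print(run_tp("alice", "withdraw", "accounts"))  # permitted
print(run_tp("bob", "withdraw", "accounts"))    # blocked by the allowed relation
print(run_tp("alice", "audit", "accounts"))     # blocked by the certified relation
```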

38
Certification Rules 1,2,3
  • CR1: when any IVP is run, it must ensure all CDIs
    are in a valid state
  • CR2: for some associated set of CDIs, a TP must
    transform those CDIs from a valid state into a
    (possibly different) valid state
  • A relation "certified" associates a set of CDIs
    with a particular TP
  • Say (before_1, after_1), (before_2, after_2), ...,
    (before_n, after_n)
  • Example: TP = withdraw money, CDIs = accounts, in the
    bank example
  • CR3: the allowed relations must meet the
    requirements imposed by the principle of
    separation of duty

39
Certification Rules 4 and 5
  • CR4: all TPs must append enough information to
    reconstruct the operation to an append-only CDI
  • Because the auditor needs to be able to determine
    what happened during reviews of transactions
  • CR5: any TP that takes a UDI as input either
    rejects the UDI or transforms it into a CDI
  • In the bank, deposit amounts entered at the keyboard
    are UDIs. TPs must validate the numbers (making them
    CDIs) before using them; if validation fails, the TP
    rejects the UDI

40
Enforcement Rules 1 and 2
  • ER1: the system must maintain the certified
    relations and must ensure that only TPs certified
    to run on a CDI manipulate that CDI
  • ER2: the system must associate a user with each TP
    and set of CDIs. The TP may access those CDIs on
    behalf of the associated user. The TP cannot
    access a CDI on behalf of a user not
    associated with that TP and CDI
  • The system must maintain and enforce the certified
    relation
  • The system must also maintain the allowed relation,
    which restricts access based on user ID

41
Enforcement Rules 3 and 4
  • ER3: the system must authenticate each user
    attempting to execute a TP
  • Authentication is not required before use of the
    system, but is required before manipulation of
    CDIs
  • ER4: only the certifier of a TP may change the
    list of entities associated with that TP. No
    certifier of a TP, or of a CDI associated with
    that TP, may ever have execute permission on that
    TP/CDI
  • Enforces separation of duty with respect to the
    certified and allowed relations

42
Key Points
  • The commercial world needs integrity
  • Biba model
  • The dual of BLP (or BLP-upside-down)
  • Integrity levels are distinct from security levels
  • Information flows differently
  • Can be combined with BLP
  • Lipner's lattice combines the two to meet
    commercial requirements
  • Clark-Wilson model
  • Accounting approaches ported to the computer world
  • The enforcement hierarchy reduces dependency on
    trusted subjects

43
Review for the Mid-term
  • First five chapters plus role-based access control
  • The nature of the exam
  • 5-6 questions
  • Similar to the homework
  • May include some modeling, some policy, and some
    descriptions

44
Review
  • Chapter 1
  • CIA of Information Security
  • What they are
  • Given a set of requirements, can we categorize
    them?
  • Access control matrix
  • Safe state
  • A safe state written as a (pre-condition,
    post-condition) pair over read, write and access
    operations
  • Add/delete rights
  • Add/delete subjects, objects and operations

45
Review Chapter 1 Continued
  • Mono-operational commands
  • Single operations, like "make p the owner of
    file q"
  • Written formally as make.owner(p,q)
  • Conditional commands
  • If p owns f, then let p give r rights to q
  • How to write them formally
  • Multiple conditions
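
The conditional command above can be sketched over a dict-of-sets access control matrix. The matrix contents and helper name are illustrative assumptions.

```python
# Access control matrix as a mapping from (subject, object) to a right set.
acm = {("p", "f"): {"own"}, ("q", "f"): set()}

def grant_read(giver, receiver, obj):
    # conditional command: the grant fires only if the giver owns obj
    if "own" in acm.get((giver, obj), set()):
        acm.setdefault((receiver, obj), set()).add("r")

grant_read("p", "q", "f")
print(acm[("q", "f")])   # {'r'}: p owned f, so q gained the read right
```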

46
Review of Chapter 2 Foundations
  • ACM, ACL and capabilities
  • Turing machines
  • Un-decidability
  • HRU Result
  • Is there an algorithm that, given an initially
    safe state, halts and says yes/no to safety
    after granting a generic right r?
  • Method: encode safety, granting rights, etc. as
    Turing machine instructions
  • Special cases are decidable
  • Take-grant model

47
Review of Chapter 2 Foundations
  • Capability-based systems
  • Lock and key model
  • Lock = object, key = subject
  • The object carries permissions; the subject presents
    a key to unlock the object

48
Review of Chapter 3 Policies
  • Formalization of security policy using precise
    policy languages
  • DAC, MAC and RBAC
  • Specification of DAC using subjects, objects, and
    access rights

49
Review MAC
  • Review and background
  • Lattices
  • Military systems and Denning's Axioms
  • Bell-LaPadula (BLP) Policy
  • Step 1 clearance/classification
  • Step 2 categories
  • Example System DG/UX
  • Tranquility
  • Controversy at a glance

50
Review Role-Based Access Control
  • Overview