1
CSCE 548 Secure System Standards: Risk Management
2
Reading
  • This lecture
  • McGraw Chapter 2
  • Recommended
  • Rainbow Series Library, http://www.radium.ncsc.mil/tpep/library/rainbow/
  • Common Criteria, http://csrc.nist.gov/cc/
  • Next lecture
  • Software Development Lifecycle (Dr. J. Vidal)
  • Handout on SDLC and UML

3
Incident Handling
4
How to Respond?
  • Actions to avoid further loss from intrusion
  • Terminate intrusion and protect against
    recurrence
  • Law enforcement: prosecute
  • Enhance defensive security
  • Reconstructive methods based on
  • Time period of intrusion
  • Changes made by legitimate users during the
    affected period
  • Regular backups, audit-trail-based detection of
    affected components, semantic-based recovery,
    minimal roll-back for recovery (sketched below)
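A minimal sketch of the audit-trail-based, minimal roll-back idea above. The audit-record layout, the intruder account list, and the taint rule are assumptions made only for illustration; a real scheme would also handle transaction dependencies and write ordering.

# Hypothetical sketch: roll back only the data items tainted by the intrusion.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AuditRecord:
    timestamp: datetime
    user: str
    items_read: set       # identifiers of data items this action read
    items_written: set    # identifiers of data items this action modified

def items_to_roll_back(audit_trail, start, end, intruder_accounts):
    """Return only the data items tainted by the intrusion.

    An action is tainted if it was performed by an intruder account inside the
    intrusion window, or if it read an already-tainted item; everything a
    tainted action wrote becomes tainted. Untainted changes made by legitimate
    users are kept, which is what makes the roll-back minimal.
    """
    tainted = set()
    for rec in sorted(audit_trail, key=lambda r: r.timestamp):
        intruder_action = start <= rec.timestamp <= end and rec.user in intruder_accounts
        if intruder_action or rec.items_read & tainted:
            tainted |= rec.items_written
    return tainted  # restore these items from the last clean backup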

5
Roles and Responsibilities
  • User
  • Vigilant for unusual behavior
  • Report incidents
  • Manager
  • Awareness training
  • Policies and procedures
  • System administration
  • Install safeguards
  • Monitor system
  • Respond to incidents, including preservation of
    evidence

6
Computer Incident Response Team
  • Assist in handling security incidents
  • Formal
  • Informal
  • Incident reporting and dissemination of incident
    information
  • Computer Security Officer
  • Coordinate computer security efforts
  • Others: law enforcement coordinator,
    investigative support, media relations, etc.

7
Incident Response Process 1.
  • Preparation
  • Baseline Protection
  • Planning and guidance
  • Roles and Responsibilities Training
  • Incident response team

8
Incident Response Process 2.
  • Identification and assessment
  • Symptoms
  • Nature of incident
  • Identify perpetrator, origin and extent of attack
  • Can be done during attack or after the attack
  • Gather evidence
  • Keystroke monitoring, honeynets, system logs,
    network traffic, etc.
  • Legislation on monitoring!
  • Report on preliminary findings

9
Incident Response Process 3.
  • Containment
  • Reduce the chance of the incident spreading
  • Determine sensitive data
  • Terminate suspicious connections, personnel,
    applications, etc.
  • Move critical computing services
  • Handle human aspects, e.g., perception
    management, panic, etc.

10
Incident Response Process 4.
  • Eradication
  • Determine and remove cause of incident if
    economically feasible
  • Improve defenses, software, hardware, middleware,
    physical security, etc.
  • Increase awareness and training
  • Perform vulnerability analysis

11
Incident Response Process 5.
  • Recovery
  • Determine course of action
  • Reestablish system functionality
  • Reporting and notifications
  • Documentation of incident handling and evidence
    preservation

12
Follow Up Procedures
  • Incident evaluation
  • Quality of incident handling (preparation, time
    to respond, tools used, evaluation of response,
    etc.)
  • Cost of incident (monetary cost, disruption, lost
    data, hardware damage, etc.)
  • Preparing report
  • Revise policies and procedures

13
Security Awareness and Training
  • Major weakness: user unawareness
  • Organizational effort
  • Educational effort
  • Customer training
  • Federal Trade Commission program to educate
    customers about web scams

14
Risk Management
15
Risk Assessment
16
Financial Loss
(Chart: dollar amount losses by type; total loss in 2006: $53,494,290. Source: CSI/FBI Computer Crime and Security Survey, Computer Security Institute.)
17
Security Protection
18
Real Cost of Cyber Attack
  • Damage to the target may not reflect the real
    amount of damage
  • Services may rely on the attacked service,
    causing cascading and escalating damage (see the
    sketch below)
  • Need support for decision makers to
  • Evaluate risk and consequences of cyber attacks
  • Support methods to prevent, deter, and mitigate
    consequences of attacks
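An illustrative sketch of the cascading-damage point above: the total loss of an attack is estimated over every service that transitively depends on the attacked one. The dependency map and the loss figures are invented examples, not data from the survey.

# Hypothetical sketch: estimating cascading loss across a service dependency graph.

def cascading_loss(attacked, depends_on, loss):
    """Total estimated loss: the attacked service plus every service that
    (transitively) depends on it."""
    # Invert the dependency map: for each service, who relies on it?
    dependents = {}
    for svc, deps in depends_on.items():
        for d in deps:
            dependents.setdefault(d, set()).add(svc)

    affected, frontier = {attacked}, [attacked]
    while frontier:
        svc = frontier.pop()
        for dep in dependents.get(svc, ()):   # services relying on svc
            if dep not in affected:
                affected.add(dep)
                frontier.append(dep)
    return sum(loss[s] for s in affected)

# Example with made-up numbers: attacking the database also takes down the web
# portal and billing, so the real cost far exceeds the direct damage.
depends_on = {"web_portal": {"database"}, "billing": {"database"}, "database": set()}
loss = {"web_portal": 40_000, "billing": 25_000, "database": 10_000}
print(cascading_loss("database", depends_on, loss))  # 75000, not just 10000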

19
System Security Engineering (Traditional View)
(Flow diagram: specify system architecture; identify threats, vulnerabilities, and attacks; prioritize vulnerabilities; estimate risk; identify and install safeguards, repeating until the risk is acceptably low.)
20
Risk Management Framework (Business Context)
(Diagram: the risk management framework loop, beginning with understanding the business context; its stages are detailed on the following slides.)
21
Understand the Business Context
  • Who cares?
  • Identify business goals, priorities and
    circumstances, e.g.,
  • Increasing revenue
  • Meeting service-level agreements
  • Reducing development cost
  • Generating a high return on investment
  • Identify software risk to consider

22
Identify Business and Technical Risks
  • Why should business care?
  • Business risk
  • Direct threat
  • Indirect threat
  • Consequences
  • Financial loss
  • Loss of reputation
  • Violation of customer or regulatory constraints
  • Liability
  • Tying technical risks to the business context in
    a meaningful way

23
Synthesize and Rank the Risks
  • What should be done first?
  • Prioritization of identified risks based on
    business goals (see the ranking sketch below)
  • Allocating resources
  • Risk metrics
  • Risk likelihood
  • Risk impact
  • Risk severity
  • Number of emerging risks
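A small sketch of the ranking step: each identified risk gets a severity score, taken here as likelihood times business impact (one common choice), and mitigation resources are then allocated in descending order of severity. The risk names and numbers are invented for illustration.

# Hypothetical sketch: rank risks so that resources go to the worst ones first.
risks = [
    {"name": "SQL injection in order form", "likelihood": 0.6, "impact": 500_000},
    {"name": "Missed service-level agreement", "likelihood": 0.3, "impact": 200_000},
    {"name": "Stale TLS configuration", "likelihood": 0.1, "impact": 50_000},
]

for r in risks:
    r["severity"] = r["likelihood"] * r["impact"]   # likelihood x impact

# Highest severity first: the order in which mitigation resources are allocated.
for r in sorted(risks, key=lambda item: item["severity"], reverse=True):
    print(f"{r['name']}: severity {r['severity']:,.0f}")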

24
Define the Risk Mitigation Strategy
  • How to mitigate risks?
  • Available technology and resources
  • Constrained by the business context: what can the
    organization afford, integrate, and understand?
  • Need validation techniques

25
Carry Out Fixes and Validate
  • Perform actions defined in the previous stage
  • Measure completeness against the risk
    mitigation strategy
  • Progress against risk
  • Remaining risks
  • Assurance of mechanisms
  • Testing

26
Measuring and Reporting
  • Continuous and consistent identification and
    storage of risk information over time
  • Maintain risk information at all stages of risk
    management
  • Establish measurements, e.g.,
  • Number of risks, severity of risks, cost of
    mitigation, etc.

27
Assets-Threat Model (1)
  • Threats compromise assets
  • Threats have a probability of occurrence and
    severity of effect
  • Assets have values
  • Assets are vulnerable to threats

(Diagram: threats acting on assets.)
28
Assets-Threat Model (2)
  • Risk: expected loss from the threat against an
    asset
  • R = V × P × S (a worked example follows)
  • R: risk
  • V: value of asset
  • P: probability of occurrence of threat
  • S: vulnerability of the asset to the threat
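A one-line worked example of the formula above. The fourth factor is written S here for the asset's susceptibility (vulnerability) to the threat; the dollar value and probabilities are illustrative only.

# Minimal sketch of R = V x P x S with illustrative numbers.
def expected_loss(value, probability, susceptibility):
    """R = V * P * S: asset value x probability the threat occurs x
    vulnerability (susceptibility) of the asset to that threat."""
    return value * probability * susceptibility

# Example: a $200,000 asset, a threat with a 5% yearly probability, and an
# estimated 40% exposure of the asset to that threat.
print(expected_loss(200_000, 0.05, 0.4))  # 4000.0: expected annual loss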

29
System-Failure Model
  • Estimate probability of highly undesirable events
  • Risk: likelihood of undesirable outcome

(Diagram: a threat acting on the system produces an undesirable outcome.)
30
Risk Acceptance
  • Certification
  • How well the system meets the security
    requirements (technical)
  • Accreditation
  • Management's approval of the automated system
    (administrative)

31
Building It Secure
  • 1960s: US Department of Defense (DoD) concern
    over the risk of unsecured information systems
  • 1970s
  • 1977: DoD Computer Security Initiative
  • US Government and private concerns
  • National Bureau of Standards (NBS, now NIST)
  • Responsible for standards for acquisition and use
    of federal computing systems
  • Federal Information Processing Standards (FIPS
    PUBs)

32
NBS
  • Two initiatives for security
  • Cryptography standards
  • 1973: invitation for technical proposals for
    ciphers
  • 1977: Data Encryption Standard (DES)
  • 2001: Advanced Encryption Standard (AES, NIST)
  • Development and evaluation processes for secure
    systems
  • Conferences and workshops
  • Involving researchers, constructors, vendors,
    software developers, and users
  • 1979: MITRE Corporation entrusted to produce an
    initial set of criteria to evaluate the security
    of a system handling classified data

33
National Computer Security Center
  • 1981: National Computer Security Center (NCSC)
    established within the NSA
  • To provide technical support and reference for
    government agencies
  • To define a set of criteria for the evaluation
    and assessment of security
  • To encourage and perform research in the field of
    security
  • To develop verification and testing tools
  • To increase security awareness in both the
    federal and private sectors
  • 1985: Trusted Computer System Evaluation Criteria
    (TCSEC), the Orange Book

34
Orange Book
  • Orange Book objectives
  • Guidance of what security features to build into
    new products
  • Provide measurement to evaluate security of
    systems
  • Basis for specifying security requirements
  • Security features and assurances
  • Trusted Computing Base (TCB): the security
    components of the system (hardware, software, and
    firmware), including the reference monitor

35
Orange Book
  • Supplies
  • Users: evaluation metrics to assess the
    reliability of the security system for protection
    of classified or sensitive information, whether a
  • Commercial product
  • Internally developed system
  • Developers/vendors: a design guide showing the
    security features to be included in commercial
    systems
  • Designers: a guide for the specification of
    security requirements

36
Orange Book
  • Set of criteria and requirements
  • Three main categories
  • Security policy: the protection level offered by
    the system
  • Accountability of the users and user operations
  • Assurance of the reliability of the system

37
Security Policy
  • Concerns the definition of the policy regulating
    the access of users to information
  • Discretionary Access Control
  • Mandatory Access Control
  • Labels for objects and subjects (a label-check
    sketch follows)
  • Object reuse: basic storage elements must be
    cleaned before being released to a new user
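A sketch of a label-based mandatory access control check in the Bell-LaPadula style that underlies the Orange Book's MAC requirement. The level names, categories, and rules shown are a simplified illustration, not the full model.

# Simplified illustration of label-based mandatory access control.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def dominates(label_a, label_b):
    """A label (level, categories) dominates another if its level is at least
    as high and its category set is a superset."""
    level_a, cats_a = label_a
    level_b, cats_b = label_b
    return LEVELS[level_a] >= LEVELS[level_b] and cats_a >= cats_b

def may_read(subject, obj):
    return dominates(subject, obj)    # "no read up"

def may_write(subject, obj):
    return dominates(obj, subject)    # "no write down"

# Example: a secret-cleared subject may read a confidential object but may not
# write to it (writing down could leak higher-level information).
analyst = ("secret", {"crypto"})
report = ("confidential", set())
print(may_read(analyst, report), may_write(analyst, report))  # True False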

38
Accountability
  • Identification/authentication
  • Audit
  • Trusted path: ensures no user is attempting to
    access the system fraudulently

39
Assurance
  • Reliable hardware/software/firmware components
    that can be evaluated separately
  • Operation reliability
  • Development reliability

40
Operation reliability
  • During system operation
  • System architecture: TCB isolated from user
    processes; security kernel isolated from
    non-security-critical portions of the TCB
  • System integrity: correct operation (use of
    diagnostic software)
  • Covert channel analysis
  • Trusted facility management: separation of duties
  • Trusted recovery: recover security features after
    TCB failures

41
Development reliability
  • The system must be reliable during the
    development process; formal methods are used
  • System testing: security features tested and
    verified
  • Design specification and verification: correct
    design and implementation with respect to the
    security policy; TCB formal specifications are
    proved
  • Configuration management: control of the
    configuration of the system components and their
    documentation
  • Trusted distribution: no unauthorized
    modifications

42
Documentation
  • Defined set of documents
  • Minimal set
  • Trusted facility manual
  • Security features user's guide
  • Test documentation
  • Design documentation
  • Personnel information: operators, users,
    developers, maintainers

43
Orange Book Levels
  • (Highest security)
  • A1: Verified Protection
  • B3: Security Domains
  • B2: Structured Protection
  • B1: Labeled Security Protection
  • C2: Controlled Access Protection
  • C1: Discretionary Security Protection
  • D: Minimal Protection
  • (No security)

44
NCSC Rainbow Series
  • Orange: Trusted Computer System Evaluation
    Criteria
  • Yellow: Guidance for applying the Orange Book
  • Red: Trusted Network Interpretation
  • Lavender: Trusted Database Interpretation

45
Evaluation Process
  • Preliminary technical review (PTR)
  • Preliminary technical report: architecture,
    potential for the target rating
  • Vendor assistance phase (VAP)
  • Review of the documentation needed for the
    evaluation process, e.g., security features
    user's guide, trusted facility manual, design
    documentation, test plan. For B or higher,
    additional documentation is needed, e.g., covert
    channel analysis, formal model, etc.
  • Design analysis phase (DAP)
  • Initial product assessment report (IPAR): 100-200
    pages; detailed information about the hardware and
    software architecture, security-relevant features,
    team assessments, etc.
  • Technical Review Board
  • Recommendation to the NCSC

46
Evaluation Process
  • Formal evaluation phase (FEP)
  • Product Bulletin: formal and public announcement
  • Final Evaluation Report: information from the
    IPAR and testing results, additional tests, code
    review (B2 and up), formal policy model, proofs
  • Recommends a rating for the system
  • NCSC decides the final rating
  • Rating maintenance phase (RAMP)
  • Minor changes and revisions
  • Reevaluation
  • Rating maintenance plan

47
European Criteria
  • German Information Security Agency: German Green
    Book (1988)
  • British Department of Trade and Industry and
    Ministry of Defense: several volumes of criteria
  • Canada, Australia, France: work on evaluation
    criteria
  • 1991: Information Technology Security Evaluation
    Criteria (ITSEC)
  • For European community
  • Decoupled features from assurance
  • Introduced new functionality requirement classes
  • Accommodated commercial security requirements

48
Common Criteria
  • January 1996: Common Criteria
  • Joint work with Canada and Europe
  • Separates functionality from assurance
  • Nine classes of functionality: audit,
    communications, user data protection,
    identification and authentication, privacy,
    protection of trusted functions, resource
    utilization, establishing user sessions, and
    trusted path.
  • Seven classes of assurance: configuration
    management, delivery and operation, development,
    guidance documents, life cycle support, tests,
    and vulnerability assessment.

49
Common Criteria
  • Evaluation Assurance Levels (EAL)
  • EAL1: functionally tested
  • EAL2: structurally tested
  • EAL3: methodically tested and checked
  • EAL4: methodically designed, tested, and reviewed
  • EAL5: semi-formally designed and tested
  • EAL6: semi-formally verified design and tested
  • EAL7: formally verified design and tested

50
National Information Assurance Partnership (NIAP)
  • 1997: National Institute of Standards and
    Technology (NIST), National Security Agency
    (NSA), and industry
  • Aims to improve the efficiency of evaluation
  • Transfers methodologies and techniques to private
    sector laboratories
  • Functions: developing tests, test methods, and
    tools for evaluating and improving security
    products; developing protection profiles and
    associated tests; establishing a formal and
    international scheme for the CC

51
Next Class
  • Software Development Lifecycle