The Systems Security Engineering Capability Maturity Model (PowerPoint presentation transcript, 45 slides)
1
The Systems Security Engineering Capability
Maturity Model
  • Karen Ferraiolo
  • Arca Systems, Inc.
  • October 24, 1996

2
What you will learn
  • Why a CMM for Security Engineering?
  • What is a CMM?
  • What is the SSE-CMM?
  • How will the SSE-CMM be used?
  • How can I get involved?

10/24/96
2
3
Agenda
  • History and the Need 10:30 - 10:50
  • Process Improvement and CMMs 10:50 - 11:10
  • SSE-CMM Overview 11:10 - 11:30
  • Using the SSE-CMM 11:30 - 11:55
  • Project Status 11:55 - 12:00

4
History and the Need
5
What is security engineering?
  • Security engineering, or aspects thereof,
    attempts to
  • establish a balanced set of security needs
  • transform security needs into security guidance
  • establish confidence in the correctness and
    effectiveness of security mechanisms
  • judge that operational impacts due to residual
    security vulnerabilities are tolerable
  • integrate all aspects into a combined
    understanding of the trustworthiness of a system

6
Where are we now?
  • Security products come to market through
  • lengthy and expensive evaluation
  • no evaluation
  • Results
  • technology growth more rapid than its
    assimilation
  • unsubstantiated security claims
  • Causes?

7
What is needed?
  • continuity
  • repeatability
  • efficiency
  • assurance

8
One Potential Solution
  • Can knowing something about the organization or
    individual provide a solution?
  • Examples
  • ISO 9000
  • Certification of Information System Security
    Professionals (CISSP)
  • Capability Maturity Model (CMM)
  • Malcolm Baldrige National Quality Award
  • Past Performance

9
Process Improvement and CMMs
10
Process Capability
  • Process Capability
  • the range of expected results that can be
    achieved by following a process
  • a predictor of future project outcomes
  • Process Performance
  • a measure of the actual results achieved from
    following a process (on a particular project)
11
Statistical Process Control
  • A process in statistical control
  • has definable, measurable, communicable
  • identity
  • capability
  • limits of variation are predictable
  • however,
  • it does not imply the absence of defective items

Once statistical control has been established,
work can begin to improve quality and economy of
production
12
Process Maturity
  • extent to which process is explicitly
  • defined, managed, measured, controlled, and effective
  • implies a potential for growth in capability
  • indicates richness of process and consistency of
    its application

13
Why are Maturity Levels Important?
  • Maturity Levels (in Capability Maturity Models)
  • define ordinal scale for measuring / evaluating
    process capability
  • define incremental steps for improving process
    capability
  • Maturity Levels Discriminate Process Capability

14
How do CMMs define Maturity?
  • Two aspects
  • the domain
  • process areas
  • base practices
  • the organization
  • institutionalization of process areas
  • implementation of process areas

15
How do CMMs define Maturity? Staged Capability
Maturity Model
  • Process Areas (PAs) define Process Maturity for a
    specific domain
  • Capability Maturity within a specific domain is
    achieved by implementation of specific PAs
  • Institutionalization / Implementation aspects are
    addressed within PAs
  • Domain Process Maturity is defined in the Model
    Structure

[Diagram: staged model structure; e.g., Level 4 "Managed" (quality management) with key process areas Software Quality Management, Process Measurement and Analysis, and Quantitative Process Management]
16
How do CMMs define Maturity? Continuous
Capability Maturity Model
  • Process Areas (PAs) organize practices of a
    specific domain
  • Institutionalization / implementation of PAs
    define the Process Maturity for any domain
  • Capability Maturity needs to be interpreted for a
    specific domain
  • Domain Process Maturity must be defined by the
    Model and Appraisal Structure

[Diagram: continuous model structure; Capability Levels (1 Performed, 2 Planned & Tracked, 3 Well Defined, 4 Quantitatively Controlled, 5 Continuously Improving), built from Common Features and Generic Practices, are applied to the domain's Process Areas and their Base Practices]
17
Vocabulary Summary
18
Vocabulary
  • ORGANIZATION - a company or entity within a
    company within which many projects are managed as
    a whole
  • PROJECT - the aggregate of effort and resources
    focused on developing and/or maintaining a
    specific product or providing a service
  • SYSTEM - the sum of products being delivered to a
    customer or user; denoting a product as a system
    acknowledges the need to treat all elements of a
    product and their interfaces in a disciplined and
    systematic way
  • WORK PRODUCT - all documents, reports, files,
    data, etc., generated in the course of performing
    any process
  • CUSTOMER - the individual(s) or entity for whom a
    product is developed or service is rendered,
    and/or who uses the product or service
  • PROCESS - a set of activities performed to
    achieve a given purpose
  • PROCESS AREA (PA) - a defined set of related
    process characteristics, which when performed
    collectively, can achieve a defined purpose

19
Vocabulary
  • PROCESS CAPABILITY - the quantifiable range of
    expected results that can be achieved by
    following a process; helps to predict a project's
    ability to meet its goals
  • INSTITUTIONALIZATION - the building of
    infrastructure and corporate culture that support
    methods, practices, and procedures so that they
    are the ongoing way of doing business, even after
    those who originally defined them are gone
  • PROCESS MANAGEMENT - the set of activities and
    infrastructures used to predict, evaluate, and
    control the performance of a process
  • CAPABILITY MATURITY MODEL (CMM) - describes the
    stages through which processes progress as they
    are defined, implemented, and improved
  • CAPABILITY LEVEL - a set of implementation and
    institutionalization practices that work together
    to provide a major enhancement in the ability to
    perform a process area

20
Vocabulary
  • ASSURANCE - the degree of confidence that
    security needs are satisfied
  • GROUP - the collection of individuals that has
    responsibility for a set of tasks or activities
  • ENGINEERING GROUP - the collection of individuals
    (both managers and technical staff) that is
    responsible for project or organizational
    activities related to a particular engineering
    discipline
  • SECURITY ENGINEERING GROUP - the collection of
    individuals (both managers and technical staff)
    which is responsible for project or
    organizational security engineering activities
  • SYSTEMS ENGINEERING CMM (SE-CMM) - developed for
    the discipline of systems engineering; its
    structure is the basis for the SSE-CMM

21
SSE-CMM Overview
22
SSE-CMM Model Architecture (based on SE-CMM
Architecture)

[Diagram: the Domain aspect (Process Areas and Base Practices for Security Engineering, Project, and Organization) is paired with the Capability aspect (Capability Levels 0 Initial, 1 Performed, 2 Planned & Tracked, 3 Well Defined, 4 Quantitatively Controlled, 5 Continuously Improving, built from Common Features and Generic Practices)]
23
SSE-CMM Architecture (Capability Aspect)
  • Generic Practices - implementation or
    institutionalization practices that enhance the
    capability to perform any process
  • Common Features - a set of practices that address
    the same aspect of process management or
    institutionalization
  • Capability Level - a set of common features that
    work together to provide a major enhancement in
    the capability to perform a process
24
SSE-CMM Capability Levels and Common Features
  • 0 INITIAL
  • 1 PERFORMED INFORMALLY
  • Base practices performed
  • 2 PLANNED & TRACKED
  • Planning performance
  • Disciplined performance
  • Verifying performance
  • Tracking performance
  • 3 WELL-DEFINED
  • Defining a standard process
  • Perform the defined process
  • Coordinate practices
  • 4 QUANTITATIVELY CONTROLLED
  • Establishing measurable quality goals
  • Objectively managing performance
  • 5 CONTINUOUSLY IMPROVING
  • Improving organizational capability
  • Improving process effectiveness

Note: Capability Levels and Common Features are
taken from the SE-CMM. Italics indicate an
additional SSE-CMM Common Feature.
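To make the level structure above concrete, here is a minimal sketch (hypothetical, not part of the SSE-CMM itself) that encodes the capability levels and their common features, assuming a level is attained only when its common features and those of every lower level are satisfied:

```python
# Capability levels and common features as listed on the slide above.
CAPABILITY_LEVELS = {
    0: ("Initial", []),
    1: ("Performed Informally", ["Base practices performed"]),
    2: ("Planned & Tracked", ["Planning performance", "Disciplined performance",
                              "Verifying performance", "Tracking performance"]),
    3: ("Well-Defined", ["Defining a standard process", "Perform the defined process",
                         "Coordinate practices"]),
    4: ("Quantitatively Controlled", ["Establishing measurable quality goals",
                                      "Objectively managing performance"]),
    5: ("Continuously Improving", ["Improving organizational capability",
                                   "Improving process effectiveness"]),
}

def capability_level(satisfied):
    """Highest level whose common features, and those of all lower levels,
    are all present in the set of satisfied features."""
    level = 0
    for lvl in range(1, 6):
        _, features = CAPABILITY_LEVELS[lvl]
        if all(f in satisfied for f in features):
            level = lvl
        else:
            break  # levels are cumulative; a gap caps the rating
    return level
```

The cumulative check mirrors the ordinal scale of maturity levels: skipping a common feature at level 2 caps the rating even if level 3 practices are in place.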
25
SSE-CMM Architecture (Domain Aspect)
  • Base Practices - engineering or management
    practices that address the purpose of a
    particular process area and thus belong to it
  • Process Areas - sets of related practices, which
    when performed collectively, can achieve the
    purpose of the process area
  • Process Category - a set of process areas
    addressing the same general area of activity
26
Security Engineering Process Areas
  • PA01 Specify Security Needs
  • PA02 Provide Security Input
  • PA03 Verify and Validate Security
  • PA04 Attack Security
  • PA05 Assess Security Risk
  • PA06 Build Assurance Argument
  • PA07 Monitor System Security Posture
  • PA08 Manage System Security Controls
  • PA09 Coordinate Security
  • PA10 Determine Security Vulnerabilities

27
Basis for Engineering Process Areas (Security
Engineering Providers)

[Diagram from the SSE-CMM Model and Application Report, October 2, 1995]
28
Project/Organization PAs (based on SE-CMM with
Security Considerations)
  • Project
  • PA11 Ensure Quality
  • PA12 Manage Configurations
  • PA13 Manage Program Risk
  • PA14 Monitor and Control Technical Effort
  • PA15 Plan Technical Effort
  • Organization
  • PA16 Define Organization's Security Engineering
    Process
  • PA17 Improve Organization's Security
    Engineering Process
  • PA18 Manage Security Product Line Evolution
  • PA19 Manage Security Engineering Support
    Environment
  • PA20 Provide Ongoing Skills and Knowledge
  • PA21 Coordinate with Suppliers

29
Using the SSE-CMM
30
The Appraisal Process (based on the SE-CMM
Appraisal Method)

[Diagram: appraisal phases (Preparation, On-Site, Post-Appraisal) with activities including Obtain Sponsor Commitment, Scope Appraisal, Plan Appraisal, Orient/Train Participants, Interview Leads/Practitioners, Collect Data, Establish Findings, Refine Findings, Develop Rating Profile, Report Results, Wrap up, Report Appraisal Outcomes, Develop Findings and Recommendations Report, Report Lessons Learned, and Manage Appraisal Artifacts]
31
Appraisal Results: a Rating Profile

[Diagram: a rating profile relates the Domain Aspect (Process Categories, Process Areas, Base Practices) to the Capability Aspect (Capability Levels, Common Features, Generic Practices)]
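As an illustration of what a rating profile captures, the sketch below (the process area names come from the earlier slide; the ratings are invented) maps process areas to observed capability levels. The lowest-rated area bounds overall capability and points at improvement candidates:

```python
# Hypothetical rating profile: process area -> observed capability level (0-5).
RATING_PROFILE = {
    "PA01 Specify Security Needs": 3,
    "PA04 Attack Security": 2,
    "PA05 Assess Security Risk": 2,
    "PA09 Coordinate Security": 1,
}

def profile_floor(profile):
    """The lowest-rated PA bounds the organization's overall capability."""
    return min(profile.values())

def weakest_areas(profile):
    """PAs rated at the floor level, i.e. candidates for improvement effort."""
    floor = profile_floor(profile)
    return sorted(pa for pa, lvl in profile.items() if lvl == floor)
```

The point is only that an appraisal yields a profile (a PA-by-level mapping) rather than a single maturity number, which is what lets it discriminate process capability per area.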
32
Appraisal Scenarios
  • Use with the SE-CMM
  • with SE-CMM Appraisal
  • after SE-CMM Appraisal
  • no SE-CMM Appraisal
  • Use in different contexts
  • integrator vs. product vendor vs. service
    provider
  • development vs. operation

33
Who, When, How
[Diagram: maps Engineering, Acquisition, and Evaluation uses to activities: Process Improvement, RFP, Proposal Preparation, Source Selection, Product Evaluation/Re-evaluation, Protest, System Development, System Certification/Recertification, Risk Management, and Product Selection]
34
Use by Engineering Organizations
  • Define processes / practices
  • Use for competitive edge (in source selections)
  • Focus improvement efforts
  • Issues

35
Use by Acquirers
  • Standard RFP language and bidder evaluation
  • Understanding programmatic risks
  • Avoid protests (uniform assessments)
  • Greater level of confidence in end results
  • Issues

36
Use by Security Evaluation Organizations
  • Alternative to extensive evaluation/re-evaluation
  • confidence in integration of security engineering
    with other disciplines
  • confidence in end results
  • Issues

37
Where to get more information
38
Process Improvement / CMMs
  • Deming, W.E., Out of the Crisis, Cambridge, MA:
    Massachusetts Institute of Technology Center for
    Advanced Engineering Study, 1986.
  • Humphrey, W.S., Characterizing the Software
    Process: A Maturity Framework, IEEE Software,
    Vol. 5, No. 2, Mar 1988, pp. 73-79.
  • Office of the Under Secretary of Defense for
    Acquisition, Washington, D.C., Report of the
    Defense Science Board Task Force on Military
    Software, Sept 1987.
  • Paulk, M.C., Curtis, B., Chrissis, M.B., Weber,
    C.V., Capability Maturity Model for Software,
    Version 1.1, Software Engineering Institute,
    CMU/SEI-93-TR-24, Feb 1993.
  • Paulk, M.C., Weber, C.V., Garcia, S., Chrissis,
    M.B., Bush, M., Key Practices of the Capability
    Maturity Model, Version 1.1, Software Engineering
    Institute, CMU/SEI-93-TR-25, Feb 1993.
  • Software Engineering Institute, Benefits of
    CMM-Based Software Process Improvement: Initial
    Results, Software Engineering Institute,
    SEI-94-TR-013, 1994.

39
CMM for Security Engineering
  • Ferraiolo, K., Sachs, J., Determining Assurance
    Levels by Security Engineering Process Maturity,
    Proceedings of the Fifth Annual Canadian Computer
    Security Symposium, May 1993.
  • Ferraiolo, K., Williams, J., Landoll, D., A
    Capability Maturity Model for Security
    Engineering, Proceedings of the Sixth Annual
    Canadian Computer Security Symposium, May 1994.
  • Ferraiolo, K., Sachs, J., Distinguishing
    Security Engineering Process Areas by Maturity
    Levels, Proceedings of the Eighth Annual
    Canadian Computer Security Symposium, May 1996.
  • Gallagher, L., Thompson, V., An Update on the
    Security Engineering Capability Maturity Model
    Project, Proceedings of the Seventh Annual
    Canadian Computer Security Symposium, May 1995.
  • Hefner, R., Hsiao, D., Monroe, W., Experience
    with the Systems Security Engineering Capability
    Maturity Model, Proceedings of the
    International Council on Systems Engineering
    Symposium, July 1996.
  • Hosy, H., Roussely, B., Industrial Maturity and
    Information Technology Security, Proceedings of
    the Seventh Annual Canadian Computer Security
    Symposium, May 1995.
  • Menk, C.G. III, The SSE-CMM and Evaluations:
    Partners within the Assurance Framework,
    Proceedings of the 1996 National Information
    Systems Security Conference, Oct 1996.
  • Zior, M., Community Response to CMM-Based
    Security Engineering Process Improvement,
    Proceedings of the 1995 National Information
    Systems Security Conference, Oct 1995.

40
The SSE-CMM Project
41
Project Background
  • Project Goals
  • Develop a maturity model and appraisal methods
    for
  • Security engineering process improvement
  • Security engineering capability evaluation
  • Capability-based assurance
  • Encourage and maintain consistency with other
    CMMs
  • Promote commonality and economy of scale
  • Project Participants
  • Original work and project infrastructure
    sponsored by NSA; additional funding provided by
    OSD
  • Collaborative effort by industry and government
    on their own funding

42
Project Structure

[Diagram: a Steering Group (with Chair) directs a Project Leader; the Author Group and Applications Group each have a Chair, Technical Support, and Committees; Community Reviewers and Key Reviewers feed back on project materials]

  • Steering Group
  • Provides project direction and strategy
  • Reviews and approves release of work products
  • Applications Group
  • Defines and develops appraisal methods
  • Plans and provides for training
  • Plans and provides support for pilot trials
  • Author Group
  • Develops model
  • Recommends solutions to issues
  • Community Reviewers
  • Provide comments on project materials after
    public release
  • Key Reviewers
  • Provide expert review of project materials before
    public release

43
Project Participants
  • Arca Systems, Inc.
  • BDM International Inc.
  • Booz-Allen and Hamilton, Inc.
  • Canadian Communications Security Establishment
  • Computer Sciences Corporation
  • Data Systems Analysts, Inc.
  • Defense Information Systems Agency
  • E-Systems
  • Electronic Warfare Associates - Canada, Ltd.
  • Fuentez Systems Concepts
  • GRC International, Inc.
  • Harris Corp.
  • Hughes Aircraft
  • Institute for Computer Information Sciences
  • Institute for Defense Analyses
  • Internal Revenue Service
  • ITT
  • Lockheed Martin
  • Merdan Group, Inc.
  • Motorola
  • National Center for Supercomputing Applications
  • National Security Agency
  • Naval Research Laboratory
  • National Institute for Standards and Technology
  • Northrop Grumman
  • NRaD
  • Office of the Secretary of Defense
  • Oracle Corporation
  • San Antonio Air Logistics Center
  • Science Applications International Corp.
  • SPARTA, Inc.
  • Stanford Telecom
  • Systems Research Applications Corp.
  • Tax Modernization Institute
  • The Sachs Groups
  • tOmega Engineering
  • Trusted Information Systems
  • TRW

44
Project Schedule
  • January 95 1st Public Workshop
  • Working Groups Formed
  • April 96 Project Critical Design Review
  • Summer/Fall 96 Security Engineering PA Model
    Pilots
  • NISSC 96 SSE-CMM v1.0
  • Early SSE-CMM Pilot Results
  • November 96 Appraisal Method v1.0
  • Spring 97 SSE-CMM v1.1
  • Appraisal Method v1.1
  • Pilot Results
  • 2nd Public Workshop

45
Points of Contact
  • Steering Group
  • Leader Dr. Rick Hefner
  • TRW
  • R5/1030
  • Redondo Beach, CA 90278
  • 310-812-7290
  • rick.hefner_at_trw.com
  • Author Group
  • Leader Karen Ferraiolo
  • Arca Systems, Inc.
  • 8229 Boone Blvd., Suite 750
  • Vienna, VA 22182
  • 703-734-5611
  • ferraiolo_at_arca.com

  • Project Sponsor
  • Mary Schanken
  • NSA
  • 410-859-6091
  • schanken_at_romulus.ncsc.mil
  • Applications Group
  • Leader Warren Monroe
  • Hughes Aircraft
  • Bldg. 618
  • Fullerton, CA 92634
  • warren_at_mls1.hac.com