1
Capability Maturity Models: Can They Be Applied
to Security Engineering?
Presented to LA SPIN, 27 Aug 97
Rick Hefner, TRW
One Space Park - R2/1104, Redondo Beach, CA 90278
310/812-7290, rick.hefner@trw.com
2
Agenda
  • Standards and Maturity Models
  • SSE-CMM Project Background
  • SSE-CMM Structure and Content
  • Development Lessons Learned
  • Application Lessons Learned
  • Future Directions

3
The Frameworks Quagmire
http://www.software.org/Quagmire/index.html

4
Standards
  • Standards establish rules for behavior
  • Contractual requirements (e.g., DoD-STD-2167A)
  • Corporate guidelines (e.g., EIA/IEEE J-STD-016)
  • The DoD contracting environment is changing
  • DoD desire for lower cost systems is leading to
    less prescriptive contracting
  • Elimination of military standards in favor of
    commercial practices
  • Single Process Initiative
  • Contractors must specify process and product
    standards themselves
  • Promotes more emphasis on corporate-wide
    processes
  • Requires new ways to distinguish between
    contractors

5
MIL-STD-498
  • A life-cycle standard, specifies a set of
    software activities
  • Replaced by EIA/IEEE J-STD-016(to be replaced
    with US 12207, a blending of -498 and ISO 12207
    to reflect US contracting customs)
  • Development Processes
  • Project planning and oversight
  • Establishing a software development environment
  • System requirements analysis
  • System design
  • Software requirements analysis
  • Software design
  • Software implementation and unit testing
  • Unit integration and testing
  • CSCI qualification testing
  • CSCI/HWCI integration and testing
  • System qualification testing
  • Preparing for software use
  • Preparing for software transition
  • Supporting Processes
  • Software configuration management
  • Software product evaluation
  • Software quality assurance
  • Corrective action
  • Joint technical and management reviews
  • Other activities
  • Risk management
  • Software management indicators
  • Security and privacy
  • Subcontractor management
  • Interface with software IV&V agents
  • Coordination with associate developers
  • Improvement of project processes

6
Maturity Models
  • A maturity model identifies the characteristics
    of a mature process
  • Specified as a process model
  • Used in the context of how an organization
    performs the process (How mature is their
    software process?)
  • Can be used to estimate capability; e.g., a
    more mature organization has more capability to
  • Meet tight budgets and schedules
  • Build technically complex products
  • Recover from problems.
  • Inadequate capability implies risk to a
    successful program
  • Maturity is measured, usually on a numeric scale
  • Customers define the level of maturity needed to
    do the work; measure potential bidders for
    adequacy; and estimate risks
  • Bidders seek adequate levels of maturity by
    improving their processes (to gain more maturity)

7
Measuring Maturity
  • There are different ways to define maturity
  • Maturity must be measured
  • Requires evidence of practices, use of
    organizational assets
  • Internal (contractor) vs. external (customer)
  • Methodology, accuracy, repeatability, independence

(Maturity ladder, from most to least mature:)
  • Innovate/optimize the process continuously
  • Measure the process and associated products quantitatively
  • Follow an organization standard process
  • Provide organizational support (policies, procedures, training)
  • Plan and manage the process
  • Perform all key parts of the process (even if ad hoc)
  • Perform some parts of the process
8
Implementation vs. Institutionalization
  • Implementation is the performance of an activity
  • Institutionalization is the building of a
    corporate culture to support the activities
  • What elements of the culture support (encourage,
    enforce) people in performing the activities?
  • Commitments - policies, sponsorship
  • Abilities - training, resources, organizational
    structures
  • Measurement - measurements, analysis
  • Verification - audits, management reviews
  • An organization supports its people by providing
  • Needed resources (time, funding) and tools
  • Policies, to guide performers in making decisions
  • Procedures and guidance, based on past successes
  • Training, for the needed skills and knowledge
  • Management review, to ensure the work is done
    properly
  • Quality audits, to ensure the work and work
    products are of sufficient quality

9
Discrete Maturity Models
  • Premise: Advanced practices rely on the
    successful performance of more fundamental
    practices (see the rating sketch after the
    figure below)
  • Lower-level practices are implemented and
    institutionalized first

Software Capability Maturity Model (SW-CMM)
  • LEVEL 5 OPTIMIZING: Defect prevention; Technology change
    management; Process change management
  • LEVEL 4 MANAGED: Quantitative process management; Software
    quality management
  • LEVEL 3 DEFINED: Organization process focus; Organization
    process definition; Training program; Integrated software
    management; Software product engineering; Intergroup
    coordination; Peer reviews
  • LEVEL 2 REPEATABLE: Requirements management; Software project
    planning; Software project tracking and oversight; Software
    subcontract management; Software quality assurance; Software
    configuration management
  • LEVEL 1 INITIAL
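
The staged logic can be made concrete with a small sketch. The following Python
fragment is illustrative only (it is not from the presentation); the KPA lists
follow the figure above, and the example data is hypothetical. It rates an
organization at the highest level whose key process areas, and those of all
lower levels, are all satisfied.

# Illustrative sketch (not from the presentation) of staged (discrete) rating
# logic: an organization's maturity level is the highest level whose KPAs,
# and all lower levels' KPAs, are satisfied.

SW_CMM_KPAS = {
    2: ["Requirements management", "Software project planning",
        "Software project tracking and oversight", "Software subcontract management",
        "Software quality assurance", "Software configuration management"],
    3: ["Organization process focus", "Organization process definition",
        "Training program", "Integrated software management",
        "Software product engineering", "Intergroup coordination", "Peer reviews"],
    4: ["Quantitative process management", "Software quality management"],
    5: ["Defect prevention", "Technology change management",
        "Process change management"],
}

def maturity_level(satisfied_kpas):
    """Return the highest level whose KPAs (and all lower levels' KPAs) are satisfied."""
    level = 1  # Level 1 (Initial) requires no KPAs
    for lvl in sorted(SW_CMM_KPAS):
        if all(kpa in satisfied_kpas for kpa in SW_CMM_KPAS[lvl]):
            level = lvl
        else:
            break  # advanced practices rely on the more fundamental ones
    return level

# Hypothetical example: all Level 2 KPAs plus a few Level 3 KPAs -> Level 2
example = set(SW_CMM_KPAS[2]) | {"Peer reviews", "Training program"}
print(maturity_level(example))  # 2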
10
Continuous Maturity Models
  • Premise: Maturity advances continuously in all
    process areas

(Figure: the model has two dimensions. The capability dimension consists of the
levels Initial, Performed, Planned & Tracked, Managed, Quantitatively Controlled,
and Continuously Improving; each level is made up of Common Features, which
contain Generic Practices. The domain dimension consists of the categories
Organization, Project, and Engineering; each category is made up of Process
Areas, which contain Base Practices.)

11
Discrete vs. Continuous
  • Discrete models provide more definitive direction
    for improvement
  • Requires that the authors know what direction to
    give (i.e., understand the hierarchy among components)
  • May apply to a particular business domain, even
    though the model is general
  • May apply only to one dimension of maturity
  • Continuous models provide more flexibility in
    reaching maturity
  • Organizations (or customers) must chart their own
    direction for improvement

12
Current Maturity Models
  Model                 D/C  Domain           Program  Org
  SEI SW-CMM            D    software         X        X
  ISO SPICE             C    software         X        X
  EPIC SE-CMM           C    systems engr     X        X
  INCOSE SECAM          C    systems engr     X        X
  EIA SECM              C    systems engr     X        X
  EPIC IPD-CMM          C    IPD              X        X
  People CMM            D    management       X        X
  SW Acq CMM            D    SW acquisition   X        X
  PSP                   ?    software
  Trusted CMM           D    secure software  X        X
  System Security CMM   C    security engr    X        X
  ...over 20 more...

CMM and Capability Maturity Model are service marks
of Carnegie Mellon University.
13
Process is One Dimension of Success
  • Corporate Management
  • Business Environment
  • People
  • Product
  • Technology
  • Process
14
SSE-CMM Project Background
  • Government acquisition agencies are driven to
  • Lower the cost of developing and maintaining
    systems
  • Improve consistency in meeting schedule and
    budget
  • Select capable contractors
  • Few standards exist to judge corporate security
    engineering capabilities
  • Wide variations in secure products, systems, and
    services
  • Wide variations in provider structures and
    capabilities
  • Long timelines for product/system certification
  • In March 1995, a joint government/industry effort
    was initiated to develop a commonly accepted
    security engineering standard
  • Sponsored by the Department of Defense, National
    Security Agency, and the Office of the Secretary
    of Defense

15
Project Background (continued)
  • Project Goals
  • Develop a maturity model and appraisal methods
    for
  • Security engineering process improvement
  • Security engineering capability evaluation
  • Process-based assurance
  • Encourage and maintain consistency with other
    CMMs
  • Promote commonality and economy of scale
  • Project Participants
  • Original work sponsored by NSA; additional
    funding provided by OSD
  • Collaborative effort by industry and government
    agencies on their own funding

16
SSE-CMM Project Participants (as of August 1997)
  • Arca Systems, Inc.
  • BDM International, Inc.
  • Booz-Allen-Hamilton, Inc.
  • Communications Security Establishment (Canada)
  • Computer Sciences Corporation
  • Data Systems Analysts, Inc.
  • Department of Defense Information Systems Agency
  • E-Systems
  • Electronic Warfare Associates - Canada
  • Fuentez Systems Concepts, Inc.
  • G-J Consulting
  • GRC International, Inc.
  • Harris Corporation
  • Hughes Aircraft
  • Institute for Computer Information Sciences
  • Institute for Defense Analyses
  • Internal Revenue Service
  • ITT Aerospace
  • Lockheed Martin
  • Motorola
  • National Center for Supercomputing Applications,
    Univ. of Illinois
  • National Security Agency
  • Naval Research Laboratory
  • National Institute for Standards and Technology
  • Northrop Grumman
  • Navy Command, Control, Operations Support Center
    Research, Development, Testing Evaluation
    Division
  • Office of the Secretary of Defense
  • Oracle Corporation
  • pragma Systems Corporation
  • San Antonio Air Logistics Center
  • Science Applications International Corporation
  • SPARTA, Inc.
  • Stanford Telecom
  • Systems Research Applications
  • Tax Modernization Institute
  • The Sachs Groups
  • tOmega Engineering
  • Trusted Information Systems

17
SSE-CMM Project Structure
(Organization chart: a Steering Group, with a Chair, the Project Leader,
Committees, and Technical Support, oversees an Author Group and an Applications
Group. Each of these groups has its own Chair, Committees, Technical Support,
and Key Reviewers, and both draw on Community Reviewers.)
18
Project Background (continued): Responsibilities
of Working Groups
  • Steering Group
  • Provides project direction and strategy
  • Reviews and approves release of work products
  • Author Group
  • Generates model description
  • Recommends solutions to issues
  • Application Working Group
  • Defines and develops appraisal methods
  • Plans and provides for training
  • Plans and provides support for pilot trials

19
SSE-CMM Foundations
  • Software CMM
  • Common Criteria
  • Generally Accepted Security Principles (GSSP)
  • System Engineering CMM
  • Assurance Frameworks
  • Trusted CMM
  • Trusted Software Development Methodology
  • Certification of Info Systems Security Professionals
  • INCOSE Systems Engineering Capability Assessment Method
20
SSE-CMM Model Structure
  • The SSE-CMM structure is based on maturity model
    concepts and the Systems Engineering-CMM (SE-CMM)
  • Shows interrelationship between system
    engineering and security engineering
  • Addresses unique features and activities in
    security engineering
  • A model structure was adopted that would permit
    appraisal of an organization's security
    engineering practices
  • As part of a SE-CMM appraisal
  • As an addition to a previously completed SE-CMM
    appraisal
  • By itself, without an SE-CMM appraisal
  • The model scope covers
  • Providers of secure systems, components, and
    services
  • Development, operation, maintenance, and
    decommissioning

21
SSE-CMM Model Structure
(Figure: the same two-dimensional structure shown on slide 10. Capability levels
(Initial, Performed, Planned & Tracked, Managed, Quantitatively Controlled,
Continuously Improving) are made up of Common Features containing Generic
Practices; domain categories (Organization, Project, Engineering) are made up of
Process Areas containing Base Practices.)
  • An SSE-CMM appraisal rates the capability of each
    Process Area
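
As a rough illustration of this structure, the following Python sketch is an
assumption for clarity, not part of the SSE-CMM documents; the class names,
fields, and example data are hypothetical. It represents Process Areas on the
domain dimension and records a capability-level rating for each, as an
appraisal would.

# Illustrative sketch (assumed): representing the two-dimensional model
# structure and recording per-Process-Area capability ratings.
from dataclasses import dataclass, field
from typing import Dict, List

CAPABILITY_LEVELS = ["Initial", "Performed", "Planned & Tracked", "Managed",
                     "Quantitatively Controlled", "Continuously Improving"]

@dataclass
class ProcessArea:
    name: str                                   # e.g. "Specify Security Needs"
    category: str                               # "Engineering", "Project", or "Organization"
    base_practices: List[str] = field(default_factory=list)

@dataclass
class AppraisalResult:
    ratings: Dict[str, int] = field(default_factory=dict)  # PA name -> capability level 0..5

    def rate(self, pa: ProcessArea, level: int) -> None:
        self.ratings[pa.name] = level

# Hypothetical example: rate one engineering process area at level 2 (Planned & Tracked)
ssn = ProcessArea("Specify Security Needs", "Engineering",
                  ["Gain an understanding of the customer's security needs"])
result = AppraisalResult()
result.rate(ssn, CAPABILITY_LEVELS.index("Planned & Tracked"))
print(result.ratings)  # {'Specify Security Needs': 2}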

22
SSE-CMM Process Areas
Engineering PAs (developed for the SSE-CMM):
  • Administer Security Controls
  • Assess Operational Security Risk
  • Attack Security
  • Build Assurance Argument
  • Coordinate Security
  • Determine Security Vulnerabilities
  • Monitor System Security Posture
  • Provide Security Input
  • Specify Security Needs
  • Verify and Validate Security
Project PAs (adapted from the SE-CMM):
  • Ensure Quality
  • Manage Configurations
  • Manage Program Risk
  • Monitor and Control Technical Effort
  • Plan Technical Effort
Organizational PAs (adapted from the SE-CMM):
  • Coordinate with Suppliers
  • Define Organization's Security Engineering Process
  • Improve Organization's Security Engineering Processes
  • Manage Security Product Line Evolution
  • Manage Security Engineering Support Environment
  • Provide Ongoing Knowledge and Skills
23
Key Practices
  • Specify Security Needs Key Process Area (see the
    checklist sketch below)
  • Key Practices
  • BP.01 Gain an understanding of the customer's
    security needs.
  • BP.02 Identify which laws, policies, standards,
    external influences and constraints govern the
    system.
  • BP.03 Identify the purpose of the system in order
    to determine the security context.
  • BP.04 Capture a high-level security-oriented view
    of the system operation.
  • BP.05 Capture high-level goals that define the
    security of the system.
  • BP.06 Define a consistent set of statements which
    define the protection to be implemented in the
    system.
  • BP.07 Obtain agreement that the specified
    security meets the customer's needs.
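
To make the appraisal use of these key practices concrete, the following Python
sketch is illustrative only; the evidence mapping is hypothetical example data.
It treats the BP.01 through BP.07 list as a checklist and reports the base
practices for which no evidence was found.

# Illustrative sketch: the Specify Security Needs base practices as an
# appraisal checklist. The evidence mapping is hypothetical example data.
SPECIFY_SECURITY_NEEDS_BPS = {
    "BP.01": "Gain an understanding of the customer's security needs",
    "BP.02": "Identify the laws, policies, standards, external influences, and constraints",
    "BP.03": "Identify the purpose of the system to determine the security context",
    "BP.04": "Capture a high-level security-oriented view of the system operation",
    "BP.05": "Capture high-level goals that define the security of the system",
    "BP.06": "Define a consistent set of statements defining the protection to be implemented",
    "BP.07": "Obtain agreement that the specified security meets the customer's needs",
}

def unperformed(evidence):
    """Return the base practices for which the appraisal found no evidence."""
    return [bp for bp in SPECIFY_SECURITY_NEEDS_BPS if not evidence.get(bp)]

# Hypothetical evidence gathered from questionnaires, interviews, and artifacts
evidence = {"BP.01": ["customer needs workshop minutes"],
            "BP.06": ["system protection requirements document"]}
print(unperformed(evidence))  # ['BP.02', 'BP.03', 'BP.04', 'BP.05', 'BP.07']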

24
Capability Dimension
  • The capability dimension has six maturity levels
    (0 to 5)
  • Each level represents increasing maturity as
    measured by support for the practices

25
Appraisal Method
  • Select an Appraisal Team Leader
  • Define the appraisal scope (which projects, KPAs,
    CLs)
  • Select/train the internal/external team members
  • Administer questionnaires to key
    project/organization performers
  • Analyze questionnaires; identify areas of focus
  • Interview key performers; review artifacts;
    develop ratings
  • Consolidate and present findings
  • (Generate/execute improvement plan)

(Example rating profile: each Process Area, e.g., Understand Customer Needs,
Derive and Allocate Requirements, Develop Physical Architecture, Analyze
Candidate Solutions, Verify & Validate System, etc., is rated at a capability
level from L0 to L5; a sketch of such a profile follows.)
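
As a rough illustration of the consolidated findings, the following Python
sketch is an assumption rather than the project's actual appraisal tooling; the
ratings shown are hypothetical example data. It prints a capability-level
profile like the one in the figure above.

# Illustrative sketch (assumed): presenting consolidated findings as a
# capability profile, one level (L0..L5) per process area.
ratings = {
    "Understand Customer Needs": 3,
    "Derive and Allocate Requirements": 2,
    "Develop Physical Architecture": 2,
    "Analyze Candidate Solutions": 1,
    "Verify & Validate System": 2,
}

def print_profile(ratings, max_level=5):
    """Print a simple text bar chart of capability levels per process area."""
    for pa, level in ratings.items():
        bar = "#" * level + "." * (max_level - level)
        print(f"{pa:35} L{level} {bar}")

print_profile(ratings)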
26
Development Lessons Learned
  • Community-wide consensus approach was critical to
    success
  • Project adopted an open, consensus approach
  • All security customers and providers participated
    as equals
  • Developing the model and appraisal method
    required a background in both security
    engineering and maturity models
  • Each model architecture has positives and
    negatives
  • Continuous architecture was selected to
  • Ease model integration with SE-CMM
  • Ease application of security engineering
    expertise
  • Determining the proper scope of the model was
    difficult
  • Wide variety of providers' products, systems, and
    services
  • Focused on most commonly performed activities

27
Pilot Appraisal Results
  • Pilots were held to validate the model and
    appraisal method
  • TRW, June 1996
  • CSC, July 1996
  • Hughes, November 1996
  • GITS-Canada, December 1996
  • Data General, June 1997
  • Model results
  • The model's scope addressed all security
    engineering processes of the organizations
    appraised
  • The model's practices allowed for different
    organizational and project structures
  • The model's structure supported tailoring
  • Appraisal method results
  • Appraisers gained valuable insight in a short
    time period
  • The findings were valid and useful to the
    organizations

28
Future Directions
  • The adoption of industry-wide commercial
    standards will level the playing field
  • Some contractors will fall behind; many will
    adopt a common set of standards
  • Process maturity will continue to be a factor in
    deciding between bidders
  • Maturity models will continue to change
  • Models will consolidate
  • SECAM and SE-CMM merging into SECM
  • SEI model integration efforts
  • New models will be created
  • New domains, new players

29
For Further Information / To Get Involved
  • SSE-CMM Sponsor
  • Mary Schanken, DoD
  • 410-859-6091, schanken@romulus.ncsc.mil
  • Project Web Site (model, appraisal method, news)
  • http://www.sse-cmm.org

30
References
  • CMM: M. C. Paulk, B. Curtis, M. B. Chrissis, and
    C. V. Weber, Capability Maturity Model for
    Software, Version 1.1, Software Engineering
    Institute, CMU/SEI-93-TR-24, February 1993;
    available at www.sei.cmu.edu.
  • IPD-CMM: Software Engineering Institute and EPIC,
    An Integrated Product Development Capability
    Maturity Model, Carnegie Mellon University,
    Version 0.9, 28 October 1996.
  • P-CMM: Curtis, Bill, William E. Hefley, and Sally
    Miller, People Capability Maturity Model,
    Software Engineering Institute, CMU/SEI-95-MM-02,
    September 1995.
  • SA-CMM: Software Acquisition Capability Maturity
    Model; available at
    www.sei.cmu.edu/technology/risk/Risk_SW_Acq/SA-CMM.html.
  • SECAM: INCOSE Capability Assessment Working Group,
    Systems Engineering Capability Assessment Model,
    Version 1.50, June 1996.
  • SE-CMM: EPIC, A Systems Engineering Capability
    Maturity Model, Version 1.1; available at
    www.software.org/secmminfo.html.