1
Software Process
CS 3302 Lecture 1
  • David Smith
  • dmsmith@cc.gatech.edu

2
Course Overview
  • Week 2 Concept
  • Weeks 3-4 Requirements
  • Weeks 5-6 Design
  • Weeks 7-8 Code
  • Week 9 Test
  • Week 10 Slack Time
  • Topics covered along the way: Process Management, Analysis,
    Planning, Scheduling, Tracking, Risk Management, SW Design,
    OO Concepts, OO Design, Measurement, Testing, Design for
    Real-Time, Requirements Review, Software Quality, Software
    Re-Use, Design Review, Modeling, Summary, Demo
3
Software Process
  • Background
  • The Software Life Cycle
  • Waterfall
  • Spiral Model
  • Metrics
  • Peer Reviews
  • Assessment

4
Environmental Impact
Changes in software engineering:
  • Problems with waterfall
  • Object technology
  • Time to market
  • Desktop computing
  • Shifts in economics
  • Networking
  • User interfaces
5
Who's Who
6
Overview
(Diagram: the customer brings a problem; computer science supplies computer functions; software engineering supplies the tools and techniques to solve the problem)
7
Problem Decomposition
(Diagram: a problem decomposed into subproblems 1-4)
8
Solution Synthesis
(Diagram: solutions 1-4 to the subproblems combined into an overall solution)
9
Paycheck Generation System
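The paycheck example illustrates the decompose-then-synthesize idea of the previous two slides. Below is a minimal Python sketch, assuming the problem splits into gross pay, deductions, net pay, and pay-stub formatting; the split and every name in it are illustrative, not taken from the lecture.

# Hypothetical decomposition of a paycheck generation problem into subproblems.
# Each subproblem is solved by a small function; the solutions are then
# synthesized into one overall solution (compute_paycheck).

def gross_pay(hours: float, rate: float) -> float:
    """Subproblem 1: pay before deductions, with 1.5x overtime past 40 hours."""
    regular = min(hours, 40.0) * rate
    overtime = max(hours - 40.0, 0.0) * rate * 1.5
    return regular + overtime

def deductions(gross: float, tax_rate: float = 0.2) -> float:
    """Subproblem 2: flat-rate withholding (rate assumed for illustration)."""
    return gross * tax_rate

def net_pay(gross: float, deducted: float) -> float:
    """Subproblem 3: what the employee actually receives."""
    return gross - deducted

def format_stub(name: str, gross: float, deducted: float, net: float) -> str:
    """Subproblem 4: human-readable pay stub."""
    return f"{name}: gross {gross:.2f}, deductions {deducted:.2f}, net {net:.2f}"

def compute_paycheck(name: str, hours: float, rate: float) -> str:
    """Solution synthesis: combine the subproblem solutions."""
    gross = gross_pay(hours, rate)
    deducted = deductions(gross)
    return format_stub(name, gross, deducted, net_pay(gross, deducted))

print(compute_paycheck("Ada", 45, 30.0))
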
10
Process Background
The biggest organizational mistake is failing to
separate process, methods and tools
  • Process - the tasks necessary to produce the product,
    their relationships, and the measurement points
  • Methods - the techniques used to perform each task
  • Tools - the physical equipment and software used to
    record and analyze the work products
  • More advanced tools implement specific methods
  • TOOLS NEVER HELP TO SPECIFY THE PROCESS

11
Process Framework
  • Generic descriptions of the activities to be
    performed and the flow of work products
  • Usually hierarchical
  • A full framework will have hundreds of activities
  • Ours won't
  • Top-Level Framework

Acquisition
Supply
Development
Maintenance
Management
Retirement
Measurement
Process Improvement
12
The Software Life Cycle
  • Software Engineering is a series of steps for
    producing software.
  • Methods, procedures, and tools are defined.
  • Engineering discipline
  • Metrics and measures - prediction

13
Classic Life Cycle Models
  • Linear sequential (Waterfall)
  • Prototyping
  • Rapid Application Development (RAD)
  • Incremental (Block Release)
  • Spiral
  • Component Assembly
  • Concurrent Development
  • Fourth Generation techniques

14
Waterfall Model
15
Interactive (Chaotic) Model
16
Waterfall with Prototyping
17
The V Model
18
Prototyping Cycles
19
Multiple Builds
(Diagram: developers produce Build Releases 1, 2, and 3 over time on development systems; users take each release into production)
20
Spiral Model
(Diagram: spiral with sectors for customer communication, planning, risk analysis, development, integration, and customer evaluation, beginning at the start axis)
21
Process Metrics
  • Process Assessment
  • Document review
  • Peer reviews
  • completed vs scheduled (see the sketch below)
  • results reported
  • Problem report status
  • Product Quality
  • Defect characteristics
  • Difficulty of accurate measurement
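A minimal sketch of the "completed vs scheduled" peer review metric, assuming reviews are tracked as simple records; the data and field layout are hypothetical.

# Hypothetical peer review tracking: compute completed-vs-scheduled ratio
# and count reviews whose results were reported.

reviews = [  # (name, scheduled_week, completed, results_reported) - made-up data
    ("requirements review", 4, True, True),
    ("design review", 6, True, False),
    ("code review", 8, False, False),
]

scheduled = len(reviews)
completed = sum(1 for _, _, done, _ in reviews if done)
reported = sum(1 for _, _, _, rep in reviews if rep)

print(f"completed vs scheduled: {completed}/{scheduled} ({completed / scheduled:.0%})")
print(f"results reported: {reported}/{completed}")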

22
Error Distribution
  • How would these data be gathered? (see the tally sketch below)
  • Which errors are most costly?
  • Where should a metric effort be focused?
  • Error detection
  • Requirements
  • Design
  • Code
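One way such data might be gathered: tally each problem report by the phase in which the error was introduced and the phase in which it was detected, then weight by an assumed relative repair cost. The records and cost weights below are illustrative only.

# Hypothetical problem reports: (phase introduced, phase detected).
from collections import Counter

reports = [
    ("requirements", "test"), ("requirements", "design"),
    ("design", "code"), ("code", "test"), ("code", "code"),
]

# Assumed relative cost of fixing an error, growing the later it is detected.
relative_cost = {"requirements": 1, "design": 3, "code": 10, "test": 30}

introduced = Counter(phase for phase, _ in reports)
detected = Counter(phase for _, phase in reports)
weighted_cost = sum(relative_cost[found] for _, found in reports)

print("errors introduced by phase:", dict(introduced))
print("errors detected by phase:  ", dict(detected))
print("total weighted repair cost:", weighted_cost)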

23
Real Program Examples
24
Peer Reviews
  • The most useful single source of metrics data
  • All contributors involved
  • Non-threatening environment
  • Applicable at key program milestones
  • Process Planning
  • Requirements
  • Design
  • Code
  • Test
  • Measure different parameters at each phase
    transition
  • Ref ISO/IEC 12207

25
Necessary Features of Peer Reviews
  • Formal announcement
  • Who, what, when, where, why, how
  • Moderated by the team manager of the program under review
  • Results documented by the team's process person (see the
    sample record below)
  • When it really happened
  • Who was there
  • What observations were made
  • Categorize any errors noted
  • What actions were agreed
  • Actions tracked to closure
  • While I will not formally require peer reviews
    in this course, both the reviewers and reviewees
    in properly documented peer reviews will receive
    some form of credit
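A minimal sketch of a record that captures the documentation items listed above (when the review really happened, who was there, what observations were made, categorized errors, and actions tracked to closure); the dataclass and its field names are assumptions, not a prescribed format.

# Hypothetical peer review record covering the documentation items on this slide.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Action:
    description: str
    owner: str
    closed: bool = False          # actions are tracked to closure

@dataclass
class PeerReviewRecord:
    work_product: str             # what was reviewed (requirements, design, code, ...)
    held_on: date                 # when the review really happened
    attendees: list[str]          # who was there
    observations: list[str]       # what observations were made
    errors_by_category: dict[str, int] = field(default_factory=dict)
    actions: list[Action] = field(default_factory=list)

    def open_actions(self) -> list[Action]:
        return [a for a in self.actions if not a.closed]

record = PeerReviewRecord(
    work_product="design",
    held_on=date(2024, 3, 1),
    attendees=["moderator", "author", "reviewer"],
    observations=["interface X undocumented"],
    errors_by_category={"interface": 1},
    actions=[Action("document interface X", owner="author")],
)
print("open actions:", [a.description for a in record.open_actions()])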

26
Process Review
  • These could be an integral part of the technical
    reviews
  • Project plans are adequate and timely
  • Processes are adequate, implemented, and executed as
    planned.
  • Standards, procedures and environments are
    appropriate.
  • Staffing and training are consistent with project
    goals.

27
Requirements Review
  • System requirements are consistent, feasible and
    testable
  • System requirements are appropriately allocated
    to hardware and software
  • Software requirements are consistent, feasible,
    testable, and reflect the system requirements
  • Pre-meeting use of a requirements quality matrix
  • Safety, security and criticality are addressed

28
Design Review
  • Design is correct, consistent with, and traceable to,
    the requirements
  • Design implements the proper sequence of events,
    inputs, outputs, interfaces, logic flows
  • Design can be derived from the requirements
  • Design implements safety, security and
    criticality requirements

29
Code Review
  • Code is traceable to design and requirements
  • Code is testable, correct and compliant with
    coding standards
  • Code implements the proper sequence of events,
    inputs, outputs, interfaces, logic flows
  • Code can be derived from design and requirements
  • Code implements safety, security and criticality
    requirements

30
Test Review
  • Tests completely cover the requirements (see the
    traceability sketch below)
  • Test coverage includes extreme values as well as
    nominal
  • Test progress is tracked and reported
  • Test completion is documented
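A minimal sketch of checking that tests completely cover the requirements, assuming each test case is tagged with the requirement IDs it exercises; the IDs and the tagging scheme are hypothetical.

# Hypothetical requirements-to-test traceability check.
requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}

test_cases = {  # test name -> requirement IDs it exercises (made-up data)
    "test_nominal_input": {"REQ-1", "REQ-2"},
    "test_extreme_values": {"REQ-2"},
}

covered = set().union(*test_cases.values())
uncovered = requirements - covered

print(f"coverage: {len(covered)}/{len(requirements)} requirements")
print("not yet covered:", sorted(uncovered))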

31
Assessment
  • Software Engineering Institute (SEI) Capability
    Maturity Model (CMM)
  • Designed to assess an organization's software
    development process
  • existence of a documented standard process
  • company commitment to using that process
  • institutionalization of that process
  • Measured by assessing 18 Key Process Areas (KPAs)
  • Reported as a maturity level 1 - 5
  • Cynics have defined levels 0 through -2
    (Crosstalk)
  • See Pressman pp 27-28 for details