Title: EEE/GEF 492.17 Inspections
1. EEE/GEF 492.17 Inspections
Royal Military College of Canada, Electrical and Computer Engineering
- Dr. Diane Kelly, kelly-d_at_rmc.ca, 613-541-6031
- Dr. Terry Shepard, shepard_at_rmc.ca, 613-541-6031
2. This set of slides is based on a tutorial presented by Dr. Terry Shepard and Dr. Diane Kelly at the International Conference on Software Engineering in Toronto in May 2001 under the heading "How to do Inspections When There is No Time". A summary appears on pp. 718-719 of the Conference Proceedings.
3. Outline
- Current Status of Inspection in Industry
- Fagan Inspections
- The 3 Ms of Inspection
- Management
- Mechanics
- Metrics
- Eight Maxims of Inspection
4. Current Status of Inspection in Industry (1)
- Informal survey in 1998: 80% of developers practice inspection irregularly or not at all (Johnson [1])
- A more recent survey (2005) suggests 53% practice inspection irregularly or not at all (El Emam [48])
- not to be regarded as definitive, or even as an improving situation!
- the tech bubble peak and bust both happened in that 7-year period
- In companies that do inspections, code inspections predominate (Laitenberger et al. [2])
- 50% or more of development time is spent testing and debugging (Harrold [47])
5. Current Status of Inspection in Industry (2) [25]
- Recommendation to Microsoft to shift focus from testing to inspection (Cusumano et al. 1995 [23])
- Microsoft practices are changing, but details are not known.
- At best, the state of practice is "we inspect our key components"
- At worst, it's "we don't have the time to do inspections at all"
- The value of inspection is probably the topic on which computing research agrees more consistently than any other
6. Rationale for Inspections
- The original argument: payback on the cost of fixes (e.g. debugging), leading to improved productivity
- according to work done by Glen Russell at BNR (now Nortel Technologies) in 1991, 65% to 90% of operational (execution-time) defects are detected by inspection at 1/4 to 2/3 the cost of testing [13]
- Other arguments for inspection
- better control of process
- higher quality
- reduced defect rates
- reduced cost of finding defects
- payback on fixing soft issues (Votta [3])
7. What makes inspections hard?
- Mentally demanding
- Specific skills needed may be in short supply
- Role of moderator requires special people skills
- meeting dynamics
- possible tension between inspectors and product author
- Time pressures
- squeezed by tight schedules
- inspection interval problem
- Attitudes
- perceived as uninteresting work
- not glamorous, support role only
- Lack of ownership
- who owns the inspection?
- Unclear goals, multiple goals
8. Industry Standard Inspection
- First described by Michael Fagan in 1976 [30]
- subsequent paper in 1986 [31]
- Fagan defines inspection
- defines a process
- defines roles for the participants
- suggests feedback to improve the development process
- The Fagan approach to inspections has become an industry standard
- Several variations
- e.g. Gilb and Graham [4] - next slide
9. Inspection process from Gilb and Graham, a variation on Fagan (diagram)
10. Fagan-style inspections
- Entry and exit criteria
- Preparation
- Moderator
- Meeting: 3-6 people
- Controlled rate: typically 100 to 200 LOC/hr
- rates that are too fast reduce effectiveness, while going too slow increases cost with little benefit
- Controlled meeting length: max 2 hours
- Controlled number of meetings per day: max 2
- Track and report defects found and time spent
- rate of finding defects per hour, rate per KLOC (see the sketch below)
- Classify defects
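As a minimal sketch (with invented numbers, not figures from the slides), the quantities tracked above reduce to a few simple rates:

```python
# Hypothetical inspection-metrics sketch: the rates a Fagan-style
# inspection tracks. All input numbers below are made up.

loc_inspected = 450      # lines of code covered
meeting_hours = 3.0      # e.g. two 1.5-hour sessions
defects_found = 27       # findings logged during the inspection

inspection_rate = loc_inspected / meeting_hours          # LOC per hour
defects_per_hour = defects_found / meeting_hours
defects_per_kloc = defects_found / (loc_inspected / 1000.0)

print(f"rate: {inspection_rate:.0f} LOC/hr")    # 150 LOC/hr: inside the
                                                # recommended 100-200 band
print(f"defects/hr: {defects_per_hour:.1f}")    # 9.0
print(f"defects/KLOC: {defects_per_kloc:.0f}")  # 60
```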
11. New Approaches
- Research and industry practice suggest variations on
- goals
- meetings
- techniques
- size of teams
- entry criteria
- exit criteria
- when to inspect
- activities to include/exclude
- ...
- There are better approaches than the commonly used Fagan method. [25]
12. The 3 Ms of Inspection
(diagram relating Inspection to Mechanics, Metrics, and Management)
13. Three Ms of Inspection
- Management
- Mechanics
- Metrics
14. Management
- Goals
- Culture
- Resources
- Process
15. Management goals (1)
- Management sets goals
- goals can be related to operations, stock price, profits, marketing, product lines, safety, ...
- In order for an inspection program to be successful:
- management and development teams must agree on the goals
- goals must be clear, relevant and meaningful
16. Management goals (2)
- Goals have to be operationalized and interpreted
- there are usually difficult choices among alternative approaches to operationalization
- management guides the definition of expected deliverables
- inspection criteria should be based on realistic goals
- scope of inspection must be under control
- Circumstances, and therefore goals, may change
- flexibility and consistency must be balanced
17. Setting and evaluating goals
- One good framework is the Goal/Question/Metric paradigm (GQM) developed by Vic Basili
- Goals can be set at many levels
- Questions are formulated to determine when particular goals are met
- Metrics are chosen to supply answers (or parts of answers) to one or more questions
- see The Goal/Question/Metric Method: A Practical Guide for Quality Improvement of Software Development, by Rini van Solingen and Egon Berghout, McGraw Hill, 1999
18. GQM example
- Goal: Analyze the final product to characterize it with respect to the various defect classes, from the point of view of the software development organization
- format of the goal follows the GQM template
- Sample question: What is the distribution of the introduction of errors across all the development phases?
- Metrics: number of requirements errors, number of design errors, ... (see the sketch below)
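A minimal sketch of how this goal/question/metric tree could be written down; the class structure and the error counts are ours, purely for illustration:

```python
# Hypothetical GQM tree for the example above: one goal, one question,
# and the metrics that (partially) answer it.

from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    text: str
    questions: list = field(default_factory=list)

goal = Goal(
    "Characterize the final product with respect to defect classes",
    [Question(
        "What is the distribution of the introduction of errors "
        "across all the development phases?",
        [Metric("requirements errors", 12),   # made-up counts
         Metric("design errors", 31),
         Metric("coding errors", 57)],
    )],
)

# Answer the question from the metrics: the phase distribution.
for q in goal.questions:
    total = sum(m.value for m in q.metrics)
    for m in q.metrics:
        print(f"{m.name}: {m.value / total:.0%}")
```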
19. Management culture (1)
- Culture is made up of qualities like attitudes and perceptions, degree of openness, visible and hidden rewards and priorities, ...
- these determine the likelihood of success of inspections
- e.g.
- easier if the culture is open
- harder if management focus is on visible operational goals, so that software is relatively invisible
- is being a good inspector considered a core competency in the organization?
- is there a standing V&V group?
- is the organization willing to invest in long-term diversion of resources to inspections?
20. Management culture (2)
- Making the case depends on the culture
- need a champion for all but personal inspections
- The culture of keeping out of the way
- managers can't participate as managers, especially if they use inspections for evaluation of people
- on the other hand, the president of the company can be an inspector
- there are many other ways to evaluate people
21. Management resources (1)
- Kinds of resources
- personnel
- skill set issues
- availability
- inspectors must be perceived as impartial, objective, respected
- inspectors must understand the criteria for the inspection
- space
- mental: overloaded people are poor inspectors
- physical: quiet, comfortable, private, free of interruptions
- inspection interval can increase if meeting rooms are scarce
- time
- time for inspections must be focused and dedicated
- schedule pressures may dominate, especially late in a project
22. Management resources (2)
- Quantifying soft factors [21]
- human-oriented factors influence productivity more than process does (list from the COCOMO model)
- analyst experience
- communications factors
- personnel continuity
- programmer capability
- capability of requirements analyst
- language and tool expertise
- programmer experience
- Soft factors collectively exert an influence on productivity ranging up to 25.8 times (see the sketch below)
- one source suggests that moving from CMM level 1 to level 5 only improves productivity by 1.43 times
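A back-of-the-envelope sketch of where a combined figure like 25.8x can come from: COCOMO-style cost drivers multiply, so the best-to-worst ratios of individual soft factors compound. The per-factor ratios below are illustrative inventions, not the actual COCOMO values:

```python
# Illustrative only: hypothetical best-case/worst-case productivity
# ratios for the soft factors listed above. Multiplying per-factor
# ratios shows how modest individual effects compound.

soft_factor_ratios = {
    "analyst experience":      1.5,
    "communications factors":  1.5,
    "personnel continuity":    1.5,
    "programmer capability":   2.0,
    "requirements analyst":    2.0,
    "language/tool expertise": 1.4,
    "programmer experience":   1.4,
}

combined = 1.0
for factor, ratio in soft_factor_ratios.items():
    combined *= ratio

# ~26.5x with these invented numbers: the same order of magnitude
# as the 25.8x swing quoted on the slide.
print(f"combined swing: {combined:.1f}x")
```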
23. Management process (1)
- Process definition is a choice
- a chaotic process is one possible choice
- Inspections help bring order out of chaos
- inspections are a form of discipline
- e.g. time estimation depends on time tracking
- weekly timesheets aren't precise enough
- learning to track time is necessary
- granularity must be better than hourly
- making inspections an integral part of the development process gives structure to that process
- inspections lead to predictability
- collecting data from inspections should lead to further process improvement
24. Management process (2)
- Data-Based Process Improvement
- try inspection at different stages
- collect data to see which is most effective
- compare different process approaches
- collect data on testing and debugging effort levels with and without inspections
- the care needed to track times for inspections spills over into estimation and tracking of time for other activities
- Process improvement approaches that are not based on actual data are more likely to fail
25. Mechanics
- Inspect for specific goals
- Types of inspectors
- Desk checking
- What to inspect
- When to inspect
- Process variations
- Techniques
- Tools
26. Mechanics: Inspect for Specific Goals - Examples
- Set an inspection goal to identify usability issues
- Ask questions such as
- What are these identifiers used for?
- What is the default choice in these switch statements?
- Have a numerical techniques expert check one module that does a matrix inversion
- Have a junior programmer check the entire product for meaningful identifiers
- Have a meeting focused solely on consistency issues of the user interface
27. Mechanics: Inspect for Specific Goals - Particular Approaches [6, 7]
- Set the objectives of the inspection ahead of time
- Inspector skills are matched to specific aspects of the inspection
- Each inspector focuses on a specific purpose
- inspectors actively work with the document
- questionnaires don't have passive questions requiring simple yes/no answers
- Inspectors may either
- examine specific modules, or
- examine the entire product for one property
- Small meetings focused on specific aspects
- meetings may not raise new findings
28. Mechanics: Types of Inspectors [6]
- Suggestions for finding good inspectors
- specialists
- application area
- hardware
- operating system
- ...
- potential users of the system
- those familiar with the design methodology used
- those who enjoy finding logical inconsistencies and are skilled at doing so
29. Mechanics: Desk Checking
- Part of your personal tool kit
- some studies show that desk reviews can be extremely effective (e.g. [32])
- track the time it takes
- track defects found
- take ideas from Pair Programming [15] and Testing Buddies [23]
- team up to do inspections with your buddy
30. Mechanics: what to inspect
- All software artifacts can be inspected
- In some cases, code inspections can be replaced by design, architecture or requirements inspections
- In some cases, code inspections can replace unit testing
- Architecture and design inspections have high payback (XP challenges this - a current debate)
- Start small: even one page can contain 50 or more defects (e.g. early requirements inspection)
- Inspection and testing are complementary activities
31. Mechanics: when to inspect (1)
- Inspection is possible at any time
- Based on your goals
- Fit inspections into your existing process
- even an immature organization can implement an effective inspection program (e.g. [24])
- Don't allow the inspection to become a bottleneck
- do what you can in parallel with other activities
- Look at what resources are available
32. Mechanics: when to inspect (2)
- Determine the criteria
- entry criteria
- code: after clean compile, before testing?
- documents: after spell checking, after syntax checking?
- anything: can I comprehend what you've given me?
- re-inspection criteria
- more than 5 errors per page?
- exit criteria
- all findings addressed?
- inspection tasks completed?
- metrics at acceptable levels?
33. Mechanics: Process variations
Process goal is to combine structure and rigour
with freedom to act effectively and so encourage
participants to add value
- Should process include sampling?
- Does every inspection need a Fagan-style meeting?
- alternatives
- Is there a case for more than one meeting?
34. Should process include sampling?
- "Sampling selected portions of a document is an excellent practical alternative to complete checking." [4]
- estimate issue density per page in unchecked pages (see the sketch below)
- can determine value of checking other pages
- can suggest systematic corrections to the document
- can determine value of further inspection
- based on the assumption that the same process that produced the entire document produced the sample
- mistakes on one page will often recur on many other pages
- will not identify specific issues on unchecked pages
- satisfies the purposes of quality measurement, teaching the product author, and gaining insights about defects that could help in process improvement
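A minimal sketch of the estimation step; the page counts, finding counts, and the decision threshold are invented for illustration:

```python
# Hypothetical sampling estimate: check a few pages of a document,
# compute the issue density, and extrapolate to the unchecked pages
# to decide whether further checking (or a rewrite) pays off.

total_pages = 60
sampled_pages = 6            # a 10% sample
issues_in_sample = 21        # findings logged on the sampled pages

density = issues_in_sample / sampled_pages     # issues per page
estimated_remaining = density * (total_pages - sampled_pages)

print(f"density: {density:.1f} issues/page")                              # 3.5
print(f"estimated issues on unchecked pages: {estimated_remaining:.0f}")  # 189

# Policy choice (ours, echoing the re-inspection criterion of
# "more than 5 errors per page" from an earlier slide):
if density > 5:
    print("exceeds re-inspection criterion: return document to author")
```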
35. Does Every Inspection Need a Fagan-style Meeting?
36. Does Every Inspection Need a Fagan-style Meeting? (1)
- Report on an experiment: Votta [5]
- investigates the need for large inspection meetings
- two-person meetings are enough?
- Report on an experiment: Porter and Votta [26]
- collection meetings produce no net improvement in terms of meeting losses and meeting gains
- Report on an experiment: Johnson and Tjahjono [17]
- shows that the number of defects found does not increase as a result of meetings, but that meetings help to detect false positives
37. Does Every Inspection Need a Fagan-style Meeting? (2)
- Report on an experiment: Laitenberger et al. [28]
- non-traditional structure of meeting (DaimlerChrysler, Germany)
- each inspector prepares a model of the requirements document under inspection
- each inspector presents their prepared model
- in the presentation, each inspector explains the defects found in the context of their model
- ensures
- each inspector is prepared
- technical justification for each defect
- discussions with the author are related to technical content
- helps to avoid personal conflicts
- meetings all consist of two inspectors plus the author
- experiment concludes that meeting results are better than individual results
38. Is there a case for more than one meeting?
- "An Experiment to Assess the Cost-Benefits of Code Inspections in Large Scale Software Development" (Porter, Siy, Toman, Votta [8])
- varied the number of reviewers, the number of teams inspecting the code unit, and the requirement of fixing defects between the first and second teams' inspections
- two teams per inspection with fix was dropped part way through the experiment as infeasible
- results on two teams in parallel are inconclusive, where the total number of people is the same (e.g. 2 teams of 1 vs. 1 team of 2)
- one of the conclusions
- structural changes to the inspection process do not always have the intended effect; significant improvements to the inspection process will depend on the development of defect detection techniques
39. Mechanics: techniques
- Unstructured inspection techniques
- ad hoc and paraphrasing
- checklists
- Structured individual techniques
- scenario based reading
- defect based reading
- test based perspective
- program comprehension case study
- modeling the inspection artifact
- task-directed inspections
40. Unstructured Techniques (1)
- Ad hoc is common
- often based on paraphrasing
- no focus on particular issues
- no guidance on how to consider the product
- no explicit use of inspectors' expertise
- not repeatable
- no record of inspectors' thought processes and actions
- does the beginning of the product get more attention than the end?
- are conceptually difficult parts ignored?
41. Unstructured Techniques (2)
- Checklists are popular but have several shortcomings [2, 9]
- checklists are based on past experience
- some categories will not have been found yet
- depends on inspectors using individual expertise to find issues in missing categories
- inspectors may not look for anything beyond what is in the checklist
- checklists may be lengthy and hard to use
- checklist categories may not be effective or productive
- categories need to be prioritized
- some categories may not provide enough guidance
- a checklist for one project may not work on another
- some checklists should be personalized
- checklist effectiveness needs to be evaluated
- constant improvement needed
42. Unstructured Techniques (3)
- Checklists are effective when [12, 33]
- a well established history of the product exists
- the product is predictable
- inspection goals are accompanied by a well documented method of operationalization
- e.g. style manual, standards
- providing a source of accumulated wisdom to a junior member of the team
- an inspector is working independently and
- the inspector is not experienced and
- the checklist provides a relatively complete tour of the product under inspection
43. Overview of Structured Inspection Techniques (University of Maryland [10])
- Systematic
- inspector knows how to inspect the document
- Specific roles
- inspector is responsible for a specific role with a specific focus
- Distinct
- each role is distinct in the sense of minimizing overlap
- coverage of different defect classes is achieved by having multiple roles
44. Scenario Based Reading
- "A technique that is used individually in order to analyze a product or a set of products. Some concrete instructions are given to the reader on how to read or what to look for in a document." (University of Maryland, Notes on Perspective-based Scenarios [10])
- defect-based and perspective-based scenarios (Porter, Votta, Basili [11])
- focus on defect classes and use a questionnaire to guide inspectors
- focus on a role (perspective) and use a questionnaire to guide inspectors (may also produce a product associated with that role)
- can prepare a scenario for any inspection
45. Scenario example: Defect-based reading [11]
- Requirements inspection - Ambiguities or Missing Functionality Scenario
- Identify the required precision, response time, etc. for each functional requirement.
- Are all required precisions indicated?
- For each requirement, identify all monitored events.
- Does a sequence of events exist for which multiple output values can be computed?
- Does a sequence of events exist for which no output value will be computed?
- For each system mode, identify all monitored events.
- Does a sequence of events exist for which transitions into two or more system modes are allowed?
46. Scenario example: Perspective-based reading (1) [27]
- Requirements inspection - Test-based Perspective
- For each requirement, make up a test or set of tests that will allow you to ensure that the implementation satisfies the requirement. Use your standard test approach and test criteria to make up the test suite. While making up your test suite for each requirement, ask yourself the following questions
- next slide ...
47. Scenario example: Perspective-based reading (2) [27]
- Requirements inspection - Test-based Perspective
- Do you have all the information necessary to identify the item being tested and to identify your test criteria? Can you make up reasonable test cases for each item based upon the criteria?
- Is there another requirement for which you would generate a similar test case but would get a contradictory result?
- Can you be sure the test you generate will yield the correct value in the correct units?
- Are there other interpretations of this requirement that the implementor might make based upon the way the requirement is defined? Will this affect the test you made up?
- Does the requirement make sense from what you know about the application and from what is specified in the general description?
48. Program Comprehension Case Study [20, 25]
- Most developers have weak program comprehension skills
- A group of inspectors in a particular company was trained in program comprehension
- focus shifted from process to product
- built mental models and understood the translations from one model to the next
- inspectors developed their own checklists from these models
- there was a 90% reduction in post-release errors when inspectors used the program comprehension approach, compared to two other groups of inspectors in the same company
49. Modeling the Inspection Artifact
- Example: DaimlerChrysler requirements inspections
- Create a model of the requirements specifications
- e.g. using UML, usually sequence diagrams
- Two inspectors read the specifications
- focus on qualities such as correctness, consistency, testability, maintainability
- prepare slides about the created model for the subsequent inspection meeting
- document any defect found on a prescribed form
- inspectors don't need inspection-specific training
- technique is effective at operationalization of given qualities?
50. Task Directed Inspection (TDI)
- What is TDI?
- inspectors produce a usable product for future evolution of the software system
- inspectors' expertise is matched to the task
- used with a light-weight inspection process
- focus on the work of the individual rather than that of the team
- based on the observation that creating careful documentation often shows up defects in the product being documented
- Case Study at OHN
- inspection exercise piggybacked on an existing software development activity
51. Mechanics: tools
- Comments about tools here are focused on code
- use tools to make inspection more productive
- reduce the tedious trivia burden
- for code clean up, restructuring
- enforce code styles
- e.g. enforce comment style consistency
- some style errors slip past
- automatic generation of documentation
- e.g. call trees, flowcharts, UML diagrams, ...
- find some syntactic errors
- tools cannot find errors that result in correct syntax
- tools can get confused in complex code
- report false positives
- miss real issues
- still need someone to read the code
- asynchronous inspections / virtual meetings
- can reduce inspection intervals, but no synergy
- current research on Computer Supported Collaborative Work (CSCW) may improve inspections
- still very much in the research phase
52. Inspection Metrics: defect data
- Robert Grady of HP, 1996 [34]
- "software defect data is the most important available management information source for software process improvement decisions"
- ignoring defect data can lead to serious consequences for an organization's business
53. Inspection Metrics: What can be collected?
- Time used
- hardware support for time tracking would help?
- e.g. palmtop
- Findings
- may generate change requests
- only warranted when the change is large
- minor changes can be fixed in the inspection process
- Others: counts, location, sources, impact, time to fix, role (user, developer, inspector, ...), phase (see the sketch below)
- Data collection effort must be proportional to the value of the data
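As a minimal sketch of what one tracked finding could look like; the field names are ours, since the slide only lists the kinds of data worth collecting:

```python
# Hypothetical record for a single inspection finding, covering the
# data named above: location, source, impact, time to fix, role of
# the finder, and phase.

from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    location: str        # where the issue was seen, e.g. file:line
    source_phase: str    # phase that injected it (requirements, design, ...)
    found_phase: str     # phase in which it was detected
    impact: str          # e.g. "major" / "minor"
    found_by_role: str   # user, developer, inspector, ...
    fix_minutes: int     # time to fix, recorded once known

f = Finding(
    description="loop bound off by one",
    location="parser.c:212",
    source_phase="coding",
    found_phase="code inspection",
    impact="minor",
    found_by_role="inspector",
    fix_minutes=20,
)
# A minor finding like this is fixed inside the inspection process;
# only large changes warrant a separate change request (see above).
print(f)
```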
54. Inspection Metrics: When There Is No Time
- Measurement gathering must be an integral part of the software development process
- must involve as little human effort as possible
- must be supported with tools if at all possible
- minimize the number of measurements gathered
- those gathered must be useful and useable
- there must be a payback on the metrics chosen, or the time needed to collect them is better spent elsewhere
- be willing to change the metrics being collected
55. Metrics: Defect Classification
- Why classify and track defects?
- Sample defect categorization schemes
- Definition is a form of classification: What is a defect?
- IEEE Standard 1044: Classification for Software Anomalies
- Defect Classification for Defect Causal Analysis
- What to do about classification when there is no time?
56. Why classify and track defects? (1)
- Project management reasons
- allocation of resources
- allocation of priorities
- monitoring project progress
- Project execution reasons
- guide the selection of test cases
- guide what to inspect for
- Responsiveness to customers
- nothing falls off the list
- don't make the same mistake again
- which types of defects cause the greatest downtime?
57. Why classify and track defects? (2)
- Process improvement reasons
- identify what parts of the process are injecting defects
- analyze faulty work processes and improve them
- identify activities which
- capture specific defect classes
- capture more defects
- identify effective personal practices
- analysis of steps in the process
- do the steps meet their goals?
- evaluate the impact of changes to the development process
- defect classification as part of a measurement program
58. Why classify and track defects? (3)
- Product improvement reasons
- defect profiles for specific components/modules
- statistical assessment of software
- Research
- evaluation of methodologies
- evaluation of V&V techniques
- identifying different types of defects
- differentiating types of defects found by different inspection techniques
59. Sample defect classification schemes
- IBM ODC
- http://www.research.ibm.com/softeng/ODC/DETODC.HTM
- Hewlett Packard
- Boris Beizer
- Defect Classification for Computational Codes
60. IBM ODC Categories
- Defect Removal Activities
- Triggers
- Impact
- Target
- Defect Type
- Qualifier
- Age
- Source
61. IBM ODC defect types for code and design
- 1. Assignment/Initialization
- 2. Checking
- 3. Algorithm/Method
- 4. Function/Class/Object
- 5. Timing/Serial
- 6. Interface/O-O Messages
- 7. Relationship
62. HP's Categorization Scheme, 1987 (diagram)
63. Beizer's taxonomy (1)
- Developed as a statistical framework for determining testing strategy
- Four levels of detail, with eight categories at the top level
- 1. Requirements and Features
- defects related to the requirements specification
- 2. Functionality and Features
- completeness, correctness, input domain, error messages
- 3. Component Structure
- control flow, case selection, initialization, processing
- 4. Data
- definition, structure, or use of data
64. Beizer's taxonomy (2)
- 5. Implementation
- typographical errors, standards violation, documentation
- 6. Integration
- integration and interfaces between components
- 7. System and Software Architecture
- architectural errors that affect the entire system
- 8. Test Definition and Execution
- definition, design, execution of tests or data used in tests
- Example (decoded in the sketch below)
- 4232 = Data (4), Access and Handling (2), Dimension (3), Initialization (2)
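A minimal sketch of how such a four-digit code decodes, using only the fragment of the taxonomy quoted above (the lookup table is deliberately incomplete):

```python
# Decode a Beizer-style defect code digit by digit: each added digit
# selects a finer category under its prefix. Only the entries needed
# for the 4232 example are filled in; the real taxonomy has four
# full levels of detail.

taxonomy = {
    "4":    "Data",
    "42":   "Access and Handling",
    "423":  "Dimension",
    "4232": "Initialization",
}

def decode(code: str) -> str:
    parts = [taxonomy[code[: i + 1]] for i in range(len(code))]
    return " / ".join(parts)

print(decode("4232"))
# Data / Access and Handling / Dimension / Initialization
```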
65. Defect Classification for Computational Codes DC-CC (1)
- Description
- structured to have multiple levels of detail
- five categories at the top level
- documentation
- calculation/logic
- error checking
- support code
- external resources
- as currently defined, three more levels, each providing more detail
- five defect qualifiers associated with each category
- missing, wrong, superfluous, inconsistent, obscure
66. Defect Classification for Computational Codes DC-CC (2)
- Additional classification as a tool for research
- DC-CC is overlaid with levels of understanding
- Comparative level (C)
- consistency of code with documentation
- Identifier level (I)
- consistency of variable naming with use
- Structural level (S)
- data and logical structures that support the code architecture
- Logical level (L)
- calculation and logic are correct
67. Definitions
- What is a Defect?
- too many terms
- examples
- Simple and complex definitions
- Our definition
68. What is a Defect? Too many terms
error, issue, botch, problem, defect, incident, delusion, fault, bug, elision, finding, complaint, goof, slip, mistake, flaw, failure, trouble, boner, anomaly, gripe, glitch, blunder, howler, oversight
69. What is a Defect? Examples (1)
- Difficulty in categorizing [35]
- one-character error in a source statement (sketched below)
- passes syntax checking
- data is corrupted in another area of the system
- function then performs incorrectly
- is this a typing error, coding error, data error or functional error?
- is the error due to inadequate data validation, inadequate code review?
- Defect or opportunity?
- Help system could be improved
- Scale issues
- e.g. typing error in an internal comment
- group all typing errors as a single defect?
- e.g. system fails to complete computations in the required time limit
- several components may be contributing to the problem
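A hypothetical Python illustration of the first example: a one-character slip that passes every syntax check, silently corrupts data elsewhere, and resists a single classification:

```python
# One-character error: the author meant totals[i] but typed totals[1].
# The code is syntactically valid, so tools accept it; instead it
# silently corrupts an unrelated accumulator, and the failure only
# shows up later, in a function that reads the corrupted data.

def accumulate(samples, totals):
    for i in range(len(samples)):
        totals[1] += samples[i]      # BUG: should be totals[i]

def report(totals):
    # fails here, far from the defect: every slot except [1] is zero
    return [t / max(sum(totals), 1) for t in totals]

totals = [0, 0, 0]
accumulate([5, 7, 9], totals)
print(report(totals))   # [0.0, 1.0, 0.0] instead of the real distribution

# Typing error, coding error, data error, or functional error?
# The classification is genuinely ambiguous.
```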
70. What is a Defect? Examples (2)
- Definition/scope/details of a defect may change after the initial report is filed
- e.g. a variable is used multiple times but only guarded (e.g. range checked) in one case (sketched below)
- missing guards?
- unnecessary guard?
- missing documentation for a special case?
- e.g. the setting of a flag indicating a fatal error is commented out
- mistake?
- undocumented change of error handling style?
- kluge to stop the program from terminating?
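Hypothetical sketches of both examples (all names invented): a value guarded at only one of its use sites, and a fatal-error flag whose setting has been commented out:

```python
# Example 1: `index` is range checked in lookup() but not in update().
# Missing guard, unnecessary guard, or an undocumented special case?
# The answer decides what the defect actually is.

def lookup(table, index):
    if 0 <= index < len(table):    # guarded here ...
        return table[index]
    return None

def update(table, index, value):
    table[index] = value           # ... but not here

# Example 2: the line that records a fatal error is commented out.
# Mistake, undocumented change of error-handling style, or a kluge
# to keep the program from terminating?

def process(record, state):
    if record is None:
        # state["fatal_error"] = True
        return
    state["processed"] += 1
```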
71. What is a Defect?
- An argument may be made that it's not a defect since it is ...
- not a fault
- defects may not lead to failures
- e.g. an incorrect comment
- but this may lead indirectly to a failure
- code is changed to be consistent with the comment
- not an error
- defects may be inadvertent
- e.g. a new version of a component or a platform
- sometimes not clear
- e.g. dead code that may have a purpose
- really an enhancement
- e.g. limitations on current functionality
72. What is a defect? Definitions (1)
- Simple definitions
- Gilb [4] (1)
- "a defect is identified as something that is incorrect"
- Gilb [4] (2), for the defect prevention process
- "anything that impacts or interferes with maintaining or improving quality and production"
- Fredericks and Basili [29], trying to find a consistent terminology
- "any fault, failure, or error occurring in a software system"
73. What is a defect? Definitions (2)
- Definitions from standards [36]
- "The non-fulfillment of intended usage requirements." ISO 8402; QMPP (Quality Management for Projects and Programs, Lew Ireland, Project Management Institute)
- "Any non-conformance of a characteristic with specified requirements." MIL-STD 105; QMPP
- "Any condition or characteristic in any supplies or services furnished by the contractor under the contract that is not in compliance with the requirements of the contract." GAT (Glossary of Acquisition Terms, US federal govt)
- "A substandard condition." CSM (Centre for Systems Management)
74. What is a defect? Definitions (3)
- Definition from Robert Grady at HP [37]
- "A defect is a deviation from the product specification or an error in the specification if the error could have been detected and would have been corrected. If the error could not possibly have been detected, or it could have been detected and would not have been corrected, then it is an enhancement, not a defect. Defects do not include typographical or grammatical errors in the engineering documentation."
75. What is a defect? Our Definition
"A defect is an observable property of code or other work product that could diminish the desired level of any quality factor defined for the software system."
Quality factors ("ilities"): Timeliness, Functionality, Cost, Correctness, Reliability, Maintainability, Usability, Flexibility, Adaptability, Testability, Readability, Portability, Device independence, Self containedness, Reusability, Interoperability, Efficiency, Device efficiency, Integrity, Security, Accuracy, Robustness, Completeness, Consistency, Accountability, Accessibility, Human Engineered, Self descriptive, Fault tolerant, Structuredness, Conciseness, Legibility, Augmentability, Modifiability, Understandability, Clarity, Resilience, Validity, Generality, Minimality, Modularity, Expandability, ...
76. IEEE Standard 1044 - Classification for Software Anomalies [38]
- Describes classification itself as a process with four steps
- recognition
- when the anomaly is found
- investigation
- propose solutions
- action
- all activities to prevent occurrence of similar anomalies
- disposition
- At each step
- record
- classify
- identify impact
77. Defect Classification for Defect Causal Analysis (DCA) [39]
- Make use of the most widely available type of quality information: the software problem report
- Classify or group problems
- classify the problem by programmer when implementing the fix
- define or choose a scheme before starting DCA
- most useful dimensions for classification
- when was the defect injected into the software?
- when was the defect detected?
- what type of defect was introduced?
- Identify systematic errors
- Determine the principal cause
- Develop corrective actions
78. What to do about classification when there is no time?
- Keep the classification system very simple
- e.g. Major, Minor
- Track only defects in progress
- don't keep historical data unless it has value
- small defects that can be quickly fixed are not tracked, other than by the existence of new versions
- Classify only when there is a very clear purpose
- stop classifying when the purpose is achieved
- Use classification to save time
- make judicious use of defect causal analysis on subsets of defects
79. Eight Maxims of Inspection
- Observations from the three experiments conducted at Ontario Hydro
- 1. Specify the goals of the inspection
- define all terminology
- e.g. What does maintainability or correctness mean?
- e.g. What is a defect?
- specify style guides, if any
- determine the scope of the inspection
- do inspectors look at associated pieces of the product?
- 2. Identify required skills
- specific technical skills linked to goals
- communications skills
- 3. Clean up the product to be inspected
- support judicious use of the inspectors' time
- use tools to clean up the product under inspection
- supply complete and consistent documentation
- if possible, make the product author available as a resource
80. Eight Maxims of Inspection
- 4. Use structured inspection techniques
- provides guidance for breadth and depth of understanding of the product
- encourages better coverage of the product
- 5. Build a process based on inspection
- product author knows the product will be inspected
- infrastructure in place to leverage the benefits of inspection
- identify the best time to carry out the inspection
- 6. Give inspectors responsibility and authority
- a sense of ownership encourages dedicated work
- allow inspectors access to all resources needed
81. Eight Maxims of Inspection
- 7. Ensure inspectors have the space to inspect
- mental space to be able to focus
- physical space free from distractions
- schedule space to get the job done
- 8. Use metrics cautiously when assessing the effectiveness of inspection activities
- defect count is an ambiguous and ill-defined metric
- too many variables affect time-to-inspect
- what about subconscious thought time?
Diane Kelly and Terry Shepard, "Eight Maxims for Software Inspectors", Software Testing, Verification and Reliability, Volume 14, Issue 4, pp. 243-256, Dec. 2004
82. How to do Inspections When there is No Time
If you don't have time to do it right ... you must have time to do it over.
83. References 1
1. Philip M. Johnson, "Reengineering Inspection", Communications of the ACM, Feb. 1998, Vol. 41, No. 2, pp. 49-52
2. Oliver Laitenberger, Khaled El Emam, Thomas Harbich, "An Internally Replicated Quasi-Experimental Comparison of Checklist and Perspective-Based Reading of Code Documents", IEEE Transactions on Software Engineering, May 2001, Vol. 27, No. 5, pp. 387-421
3. Lawrence G. Votta, "Does the Modern Code Inspection Have Value?", presentation at the NRC Seminar on Measuring Success: Empirical Studies of Software Engineering, March 1999, http://www.cser.ca/seminar/ESSE/slides/ESSE_Votta.pdf
84. References 2
4. Tom Gilb and Dorothy Graham, Software Inspection, Addison-Wesley, 1993; see also http://www.result-planning.com/
5. Lawrence G. Votta, "Does Every Inspection Need a Meeting?", SIGSOFT '93: Proceedings of the 1st ACM SIGSOFT Symposium on Foundations of Software Engineering, ACM Press, New York, 1993, pp. 107-114
6. David L. Parnas, David M. Weiss, "Active Design Reviews: Principles and Practice", Proceedings 8th International Conference on Software Engineering, Aug. 1985
85. References 3
7. J.C. Knight, E.A. Myers, "An Improved Inspection Technique", Communications of the ACM, Nov. 1993, Vol. 36, No. 11, pp. 51-61
8. Adam A. Porter, Harvey P. Siy, Carol A. Toman, Lawrence G. Votta, "An Experiment to Assess the Cost-Benefits of Code Inspections in Large Scale Software Development", IEEE Transactions on Software Engineering, Vol. 23, No. 6, June 1997, pp. 329-346
9. Y. Chernak, "A Statistical Approach to the Inspection Checklist Formal Synthesis and Improvement", IEEE Transactions on Software Engineering, 22(12), pp. 866-874, Dec. 1996
86. References 4
10. University of Maryland, Notes on Perspective-based Scenarios, online: http://www.cs.umd.edu/projects/SoftEng/ESEG/manual/pbr_package/node8.html, Nov. 1999
11. Adam A. Porter, Lawrence G. Votta, Victor R. Basili, "Comparing Detection Methods for Software Requirements Inspections: A Replicated Experiment", IEEE Transactions on Software Engineering, Vol. 21, No. 6, June 1995, pp. 563-575
12. Diane Kelly, Terry Shepard, "Task-Directed Software Inspection Technique: An Experiment and Case Study", Proceedings IBM CASCON, Nov. 2000
87. References 5
13. Glen W. Russell, "Experience with Inspection in Ultralarge-Scale Developments", IEEE Software, Jan. 1991, pp. 25-31
14. R. Chillarege, et al., "Orthogonal Defect Classification - A Concept for In-Process Measurements", IEEE Transactions on Software Engineering, Vol. 18, No. 11, Nov. 1992, pp. 943-956
15. Kent Beck, Extreme Programming Explained: Embrace Change, Addison-Wesley, 1999
16. Robert B. Grady, Successful Software Process Improvement, Prentice Hall, 1997
88. References 6
17. Philip M. Johnson and Danu Tjahjono, "Does Every Inspection Really Need A Meeting?", Journal of Empirical Software Engineering, 4, 1, pp. 9-35, Jan. 1998
18. David A. Wheeler, Bill Brykczynski, and Reginald N. Meeson Jr., Software Inspection: An Industry Best Practice, IEEE CS Press, 1996
19. Gregory Abowd, Len Bass, Paul Clement, Rick Kazman, Linda Northrop, Amy Zaremski, "Recommended Best Industrial Practice for Software Architecture Evaluation", Technical Report CMU/SEI-96-TR-025, January 1997
89. References 7
20. Stan Rifkin, Lionel Deimel, "Program Comprehension Techniques Improve Software Inspections: A Case Study", Proceedings IWPC '00, IEEE, 2000
21. Steve McConnell, "Quantifying Soft Factors", IEEE Software, Nov/Dec 2000, pp. 9-11
22. Terry Shepard, Margaret Lamb, and Diane Kelly, "More Testing Should be Taught", Communications of the ACM, June 2001, 44 (6), pp. 103-108
23. Michael Cusumano, Richard Selby, Microsoft Secrets, Simon & Schuster, 1995
90. References 8
24. Edward Kit, Software Testing in the Real World: Improving the Process, Addison-Wesley, 1995
25. Robert L. Glass, "Inspections - Some Surprising Findings", Communications of the ACM, April 1999, Vol. 42, No. 4, pp. 17-19
26. Adam Porter, Lawrence Votta, "Comparing Detection Methods for Software Requirements Inspections: A Replication Using Professional Subjects", Empirical Software Engineering Journal, 1997
91. References 9
27. Victor Basili, Scott Green, Oliver Laitenberger, Filippo Lanubile, Forrest Shull, Sivert Sorumgard, Marvin Zelkowitz, "The Empirical Investigation of Perspective-Based Reading", Empirical Software Engineering: An International Journal, 1(2), 1996, pp. 133-164
28. Oliver Laitenberger, Thomas Bell, "An Industrial Case Study to Examine a Non-traditional Inspection Implementation for Requirements Specifications", Proceedings 8th IEEE Symposium on Software Metrics, June 2002, pp. 97-106
92. References 10
29. Michael Fredericks, Victor Basili, Using Defect Tracking and Analysis to Improve Software Quality: A DACS State-of-the-Art Report, Rome, NY, 1998
30. M.E. Fagan, "Design and Code Inspections to Reduce Errors in Program Development", IBM Systems Journal, Vol. 15, No. 3, 1976, pp. 182-211
31. M.E. Fagan, "Advances in Software Inspections", IEEE Transactions on Software Engineering, Vol. 12, No. 7, July 1986, pp. 744-751
32. Watts S. Humphrey, A Discipline for Software Engineering, Addison-Wesley, 1995
93. References 11
33. Stefan Biffl, Michael Halling, "Investigating the Influence of Inspector Capability Factors with Four Inspection Techniques on Inspection Performance", Eighth IEEE Symposium on Software Metrics, 2002, pp. 107-117
34. Robert B. Grady, "Software Failure Analysis for High-Return Process Improvement Decisions", Hewlett-Packard Journal, 47(4), August 1996
35. Boris Beizer, Software Testing Techniques, 2nd ed., Van Nostrand Reinhold, NY, 1990
94. References 12
36. http://www.pmforum.org/library/glossary/PMG_D00.htm
37. Robert Grady, Deborah Caswell, Software Metrics: Establishing a Company-Wide Program, Prentice-Hall, 1986
38. IEEE Std 1044-1993, IEEE Standard Classification for Software Anomalies
39. David Card, "Learning from our Mistakes with Defect Causal Analysis", IEEE Software, Jan-Feb 1998, pp. 56-63
95. References 13
40. http://www.research.ibm.com/softeng/ODC/DETODC.HTM and FAQ.HTM#concepts
41. Barry Boehm, Victor Basili, "Software Defect Reduction Top 10 List", IEEE Computer, January 2001, pp. 135-137
42. Ram Chillarege, Inderpal Bhandari, Jarir Chaar, Michael Halliday, Diane Moebus, Bonnie Ray, Man-Yuen Wong, "Orthogonal Defect Classification - A Concept for In-Process Measurements", IEEE Transactions on Software Engineering, Vol. 18, No. 11, Nov. 1992, pp. 943-956
96. References 14
43. Jarir Chaar, Michael Halliday, Inderpal Bhandari, Ram Chillarege, "In-Process Evaluation for Software Inspection and Test", IEEE Transactions on Software Engineering, Vol. 19, No. 11, Nov. 1993, pp. 1055-1070
44. Karl Wiegers, Process Impact, Review Checklists, http://www.processimpact.com/process_assets/review_checklists.doc
45. Diane Kelly, Terry Shepard, "A Case Study in the Use of Defect Classification in Inspections", Proceedings IBM CASCON, Toronto, November 2001
97. References 15
46. Victor Basili, David Weiss, "A Methodology for Collecting Valid Software Engineering Data", IEEE Transactions on Software Engineering, Vol. SE-10, No. 6, Nov. 1984
47. Mary Jean Harrold, "Testing: A Roadmap", Proceedings ICSE 2000
48. Khaled El Emam, "Software Practices and Project Success", Cutter Consortium, Agile Project Management Advisory Service Executive Update, Vol. 6, No. 17, 2005