Empirical Assessment in Software Engineering
1
Empirical Assessment in Software Engineering
  • CSEM01 SE Evolution Management
  • Anne Comer, Helen Edwards
  • (with special thanks to Cornelia Boldyreff,
    Durham University)

2
What is True?
  • Authority (truth is given to us by someone with
    more knowledge than ourselves)
  • God
  • Expert
  • Reason (truth can be proven using the rules of
    deductive logic)
  • Experience (truth is what is encountered through
    one or more senses)
  • Sw. Eng. seems to be based on a combination of
    anecdotal experience and human authority

3
Experience and Empiricism
  • "Nothing is in the intellect which was not first
    in the senses" (Thomas Aquinas)
  • Empirical methods dominate natural and social
    sciences
  • Empirical methods dominate technology
  • Our ideas and theories need to be tested against
    reality and not be affected by preconceived
    notions
  • Wikipedia (http://en.wikipedia.org/wiki/Empiricism)

4
  • Evaluation and Assessment in Software Engineering
    (EASE) provides a forum for empirical
    researchers to present their latest research,
    discuss issues related to evaluation and
    empirical studies. followed by a half-day post
    graduate school aimed at teaching the critical
    skills associated with evaluating current
    research.
  • research papers on any aspect of product and
    process evaluation and assessment both
    qualitative and quantitative including
  • Experiments (laboratory and field)
  • Replications of empirical studies
  • Case studies, Surveys, Observational studies,
    Field studies
  • Action research
  • Evaluation methodology
  • Systematic reviews and meta-analysis (see the
    sketch below)
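As one concrete instance of the study types above, here is a minimal sketch of a fixed-effect (inverse-variance) meta-analysis, the pooling step at the heart of many systematic reviews. The study names, effect sizes, and variances are invented purely for illustration, not drawn from any real review.

import math

# Hypothetical effect sizes (e.g., standardised mean differences) and their
# variances from three primary studies -- illustrative numbers only.
studies = [
    {"name": "Study A", "effect": 0.42, "variance": 0.04},
    {"name": "Study B", "effect": 0.18, "variance": 0.09},
    {"name": "Study C", "effect": 0.55, "variance": 0.02},
]

# Fixed-effect pooling: weight each study by the inverse of its variance,
# so more precise studies contribute more to the pooled estimate.
weights = [1.0 / s["variance"] for s in studies]
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled effect.
ci_low = pooled - 1.96 * pooled_se
ci_high = pooled + 1.96 * pooled_se
print(f"Pooled effect: {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")

A random-effects model would additionally account for between-study heterogeneity; the fixed-effect version is shown only because it is the simplest to follow.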

5
Software Engineering Research Issues
  • Balancing theory and practice
  • Sw. Eng. research has historically been of patchy
    quality
  • Models for SE research
  • The role of empirical studies
  • Evidence-based software engineering

6
Linking research with practice
  • Why, after 25 years of SE, has SE research failed
    to influence industrial practice and the quality
    of the resulting software?
  • Potts argues that this failure is caused by
    treating research and its application by industry
    as separate, sequential activities.
  • He calls this the research-then-transfer
    approach; the solution he proposes is the
    industry-as-laboratory approach.

Colin Potts, Software Engineering Research
Revisited, IEEE Software, 1993
7
Research-then-Transfer (R>T)
[Diagram: Problem and Research Solution sit on opposite sides of a wide gulf, bridged only by indirect, anecdotal knowledge and by hard, but often inappropriate, technology; the problem evolves before the research community can respond, while research solutions are incrementally refined.]
  • Both research and practice evolve separately
  • Match between current problems in industry and
    research solutions is haphazard
  • No winners

8
Disadvantages of R>T
  • Research problems are described and understood in
    terms of a solution technology (whatever is the
    current research fashion, e.g. formal methods),
    so the connection to practice is often tenuous.
  • Lacks industrial need as the focus, so effort may
    be wasted.
  • Evaluation is difficult as research solutions
    often use technology not commonly available in
    industry.
  • Often, delays to evaluation mean the problem
    researchers are solving has evolved through
    changes in business practice, technology, etc.
  • Transfer is difficult because industry has little
    confidence in the proposed research solution.

9
Industry-as-Laboratory
[Diagram: Problem V1 paired with Research Solution V1, Problem V2 with Research Solution V2, and so on through V4; the problem and the research solution co-evolve in close iteration.]
10
Industry-as-Laboratory emphasizes Real Case
Studies
  • Real-world case studies have advantages over the
    lab:
  • Scale and complexity: small, simple (even
    simplistic) cases are avoided, as these often
    bear little relation to real problems.
  • Unpredictability: assumptions are thrown out as
    researchers learn more about real problems.
  • Dynamism: a real case study is more vital than a
    textbook account.
  • The real-world complications of industrial case
    studies are more likely to throw up
    representative problems than research laboratory
    examples influenced by the researchers'
    preconceptions.

11
Need to consider Human/Social Context in SE
research
  • Not all solutions in software engineering are
    purely technical.
  • There is a need to examine organisational, social
    and cognitive factors also.
  • Many problems are people problems, and require
    people-orientated solutions.

12
Theoretical SE research
  • Although it is still appropriate that there is
    innovative, blue-skies research in Software
    Engineering, what really needs to be rewarded
    are practical, industry-led studies.
  • These various forms of research ideally
    complement one another; neither is particularly
    successful if it ignores the other.
  • Research that is too industrially focused may
    lack adequate theory, just as research that is
    too academically focused may miss the practice.

13
Research models for SE
  • Problem highlighted by Glass
  • Most SE research in the 1990s was advocacy
    research; better research models are needed.
  • The software crisis provided the platform on
    which most 90s research was founded.
  • SE research largely ignored practice; the lack of
    practical application and evaluation left gaping
    holes in most SE research.
  • Appropriate research models for SE are needed.
  • Robert Glass, The Software-Research Crisis, IEEE
    Software, November 1994

14
Methods underlying Models
  • Scientific method
  • Engineering method
  • Empirical method
  • Analytical method

15
Scientific Method

Observe the real world
Propose a model or theory of some real-world phenomenon
Measure and analyse the above
Validate the hypotheses of the model or theory
If possible, repeat
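As an illustration of the measure-analyse-validate steps, here is a minimal sketch that tests a hypothesis about two development techniques with a two-sample (Welch) t-test. The defect counts, technique names, hypothesis, and significance level are all assumptions introduced for illustration, not part of the original slides.

from scipy import stats

# Hypothetical defect counts per module under two development techniques.
# Invented numbers for illustration, not real measurements.
defects_technique_a = [3, 5, 2, 4, 6, 3, 4, 2, 5, 3]
defects_technique_b = [6, 8, 5, 7, 9, 6, 7, 5, 8, 6]

# Welch's two-sample t-test: do the techniques differ in defects per module?
t_stat, p_value = stats.ttest_ind(defects_technique_a,
                                  defects_technique_b,
                                  equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the techniques appear to differ.")
else:
    print("No significant difference detected at the 5% level.")

In a real study, effect sizes, confidence intervals, and threats to validity would be reported alongside the p-value before any claim is made.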
16
Engineering Method

Observe existing solutions
Propose better solutions
Build or develop better solution
Measure, analyse, and evaluate
Repeat until further improvements are impossible
17
Empirical Method


Propose a model
Develop statistical or other basis for the model
Apply to case studies
Measure and analyse
Validate and then repeat
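As a sketch of the "develop a statistical basis, then measure and analyse" steps, the following fits a simple power-law effort model, effort = a * size^b (in the spirit of COCOMO-style models), to invented project data. The sizes and efforts are illustrative assumptions, not measurements from any real case study.

import numpy as np

# Hypothetical project data: size in KLOC, effort in person-months.
# Invented for illustration, not drawn from any real case study.
size_kloc = np.array([10.0, 25.0, 40.0, 60.0, 90.0, 120.0])
effort_pm = np.array([24.0, 65.0, 110.0, 180.0, 290.0, 410.0])

# Fit effort = a * size**b by ordinary least squares in log-log space:
# log(effort) = log(a) + b * log(size).
b, log_a = np.polyfit(np.log(size_kloc), np.log(effort_pm), 1)
a = np.exp(log_a)
print(f"Fitted model: effort ~ {a:.2f} * size^{b:.2f}")

# Apply the fitted model to a new (hypothetical) 75 KLOC project.
predicted = a * 75.0 ** b
print(f"Predicted effort for 75 KLOC: {predicted:.0f} person-months")

Validation would then compare the model's predictions against projects that were not used in the fit before the cycle is repeated.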
18
Analytical Method


Propose a formal theory or set of axioms
Develop a theory
Derive results
If possible, compare with empirical observations
Refine theory if necessary
19
Need to move away from purely analytical method
  • The analytical method was the most widely used in
    mid-90s SE research, but other methods may be
    more appropriate in some SE research.
  • Good research practice combines elements, and
    examples, from all these approaches.

20
Four important phases for any SE research
project (Glass)
  • Informational phase - Gather or aggregate
    information via
  • reflection
  • literature survey
  • people/organization survey
  • case studies
  • Propositional phase - Propose and build a
    hypothesis, method or algorithm, model, theory,
    or solution
  • Analytical phase - Analyze and explore the
    proposal, leading to demonstration and/or
    formulation of a principle or theory
  • Evaluation phase - Evaluate the proposal or
    analytic findings by means of experimentation
    (controlled) or observation (uncontrolled, such
    as a case study), leading to a substantiated
    model, principle, or theory.

21
ICSE 2002 types of study
  • Analysis
  • Evaluation
  • Experience
  • Example
  • Persuasion
  • (blatant) Assertion
  • The most successful studies were based on
    analysis and real-world experience
  • Writing Good SE Research Papers, Mary Shaw, CMU,
    2002

22
Evidence-Based Software Engineering for
Practitioners
  • Convert a problem into an answerable question
  • Search the literature for the best available
    evidence
  • Critically appraise the evidence for its
    validity, impact, and applicability
  • Integrate the appraised evidence with practical
    experience and the customer's values and
    circumstances to make decisions about practice
  • Evaluate performance and seek ways to improve it
  • Dyba, Kitchenham, Jorgensen, IEEE Software,
    Jan/Feb 2005.