Interactive Goal Model Analysis Applied - Systematic Procedures versus Ad hoc Analysis



1
Interactive Goal Model Analysis Applied -
Systematic Procedures versus Ad hoc Analysis
  • Jennifer Horkoff1
  • Eric Yu2
  • Arup Ghose1
  • Department of Computer Science1
  • Faculty of Information2
  • jenhork@cs.utoronto.ca, yu@ischool.utoronto.ca,
    arup.ghose@utoronto.ca
  • University of Toronto
  • November 10, 2010
  • PoEM'10

2
Goal Modeling
  • Used as a tool for system analysis and design in
    an enterprise
  • Captures socially driven goals that motivate
    design or redesign
  • First sub-model of Enterprise Knowledge
    Development (EKD) method
  • Used in several Requirements Engineering
    frameworks
  • i* (Yu, 97)
  • Tropos (Bresciani et al., 04)
  • GBRAM (Antón et al., 98)
  • KAOS (Dardenne & van Lamsweerde, 93)
  • GRL (Liu & Yu, 03)
  • Etc.

3
Goal Model Analysis
  • Work has argued that more utility can be gained
    from goal models by applying systematic analysis
  • Many different types of analysis procedures have
    been introduced (metrics, model checking,
    simulation, planning, satisfaction propagation)
  • Most of the work in goal model analysis focuses
    on the analytical power and mechanisms of the
    procedures
  • What are the benefits of goal model analysis?
  • Do these benefits apply only to a systematic
    procedure? Or also to ad-hoc (no systematic
    procedure) analysis?
  • Focus: interactive satisfaction propagation

4
Hypotheses: Benefits of Systematic, Interactive
Goal Model Analysis
  • Previous work by the authors has introduced
    interactive, qualitative goal model analysis
    aimed at early enterprise analysis (CAiSE'09
    Forum, PoEM'09, IJISMD)
  • Hypotheses concerning benefits of interactive
    analysis were developed through application in
    several case studies (PoEM'09, PST'06, REFSQ'08,
    HICSS'07, RE'05)
  • Analysis: aids in finding non-obvious answers to
    domain analysis questions
  • Model Iteration: prompts improvements in the
    model
  • Elicitation: leads to further elicitation of
    information in the domain
  • Domain Knowledge: leads to a better understanding
    of the domain
  • In this work we design and administer studies to
    test these hypotheses

5
Background: i* Models
  • We use i* as an example goal modeling framework

6
Real Example: inflo Case Study
7
Background: Interactive Satisfaction Analysis
  • Forward: a question/scenario/alternative is
    placed on the model and its effects are
    propagated forward through model links
  • Interactive: user input (human judgment) is used
    to decide on partial or conflicting evidence:
    what is the resulting value? (a sketch of the
    procedure follows below)
  • Publications: CAiSE'09 Forum, PoEM'09, IJISMD
  • Additional procedure for backward analysis
    allows "is this possible?" questions
  • Publications: iStar'08, ER'10

(Slide figure: a "What if?" scenario propagated forward through a model, with human judgment prompted at two points)
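The interactive forward procedure can be illustrated with a minimal sketch. The Python below is only an illustration under simplifying assumptions: it is not the authors' OpenOME implementation, the label set and link types ("help"/"hurt" only) are reduced, and the element names and console-based judgment callback are invented for this example.

```python
LABELS = ["denied", "partially denied", "unknown", "conflict",
          "partially satisfied", "satisfied"]

# 'hurt' contributions invert positive and negative evidence.
INVERT = {"satisfied": "denied", "partially satisfied": "partially denied",
          "denied": "satisfied", "partially denied": "partially satisfied",
          "unknown": "unknown", "conflict": "conflict"}

def topological_order(links):
    """Order elements so every link source precedes its target
    (the goal graph is assumed acyclic in this sketch)."""
    nodes = {s for s, _, _ in links} | {t for _, _, t in links}
    order, visited = [], set()
    def visit(node):
        if node in visited:
            return
        visited.add(node)
        for source, _, target in links:
            if target == node:
                visit(source)
        order.append(node)
    for node in nodes:
        visit(node)
    return order

def propagate_forward(links, initial, judge):
    """links: (source, link_type, target) triples, link_type in {'help', 'hurt'}.
    initial: labels placed on leaf elements for the 'what if?' scenario.
    judge: callback used whenever combined evidence is partial or conflicting."""
    labels = dict(initial)
    for element in topological_order(links):
        incoming = [(s, lt) for s, lt, t in links if t == element and s in labels]
        if element in initial or not incoming:
            continue
        evidence = [labels[s] if lt == "help" else INVERT[labels[s]]
                    for s, lt in incoming]
        if len(set(evidence)) == 1:
            labels[element] = evidence[0]               # unanimous evidence propagates directly
        else:
            labels[element] = judge(element, evidence)  # human judgment resolves the rest
    return labels

def console_judge(element, evidence):
    """Interactive prompt standing in for the modeller's judgment."""
    print(f"Evidence for '{element}': {evidence}")
    return input(f"Resulting label ({', '.join(LABELS)})? ")

if __name__ == "__main__":
    # Tiny invented example: one alternative helps one goal and hurts another.
    links = [("Attend all sessions", "help", "Learn about sustainability"),
             ("Attend all sessions", "hurt", "Have free time"),
             ("Skip some sessions", "help", "Have free time")]
    scenario = {"Attend all sessions": "satisfied",
                "Skip some sessions": "partially satisfied"}
    print(propagate_forward(links, scenario, console_judge))
```

Running this propagates the unanimous links automatically and prompts the user only where the evidence for "Have free time" is mixed, which is the interactive step described on this slide.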
8
Case Study Design
  • One group study involving inflo, a
    back-of-the-envelope calculation and modeling
    tool (case: group)
  • Four grad students, 1 professor, and 1
    facilitator
  • Three two-hour modeling sessions plus a one-hour
    analysis session
  • Most of each session devoted to developing the
    model and to discussion, with analysis at the end
    of each session
  • Ten two-hour sessions with an individual and a
    facilitator (case: individual)
  • Five used systematic forward and backward
    analysis implemented in OpenOME
  • Five were allowed to analyze the models as they
    liked
  • Individual study design was modified midway
    through
  • Divided into Round 1 and Round 2
  • Studies were both exploratory and confirmatory

9
Individual Studies (Round 1)
  • Participants: students who had i* experience
    from system analysis courses or through
    i*-related projects
  • Purposive selection: wanted subjects with some
    i* knowledge but not much analysis experience
  • Training
  • Participants given 10 minutes of i* training
    (including analysis labels)
  • Systematic participants given 15 minutes of
    analysis training using the tool
  • Model Domain: ICSE Greening models, large to
    medium models created by others
  • Analysis Questions: 12 questions provided
  • 2 for each analysis direction (forward, backward)
    per model, across 3 models

10
ICSE Greening Example: Conference Experience Chair
11
Individual Studies
  • Intermediate (Round 1) results
  • Models were too complicated
  • Too many analysis questions
  • Participants unfamiliar with domain
  • Didn't care about judgment decisions
  • Made very few changes to models (too afraid to
    change others' work? too intimidated to change
    complex models?)

12
Individual Studies (Round 2)
  • Round 2 Changes (last 4/10 participants)
  • Model Domain: asked participants to create their
    own models describing student life
  • Group case study showed that participants had
    trouble finding analysis questions over their own
    model
  • Created Analysis Methodology to help guide the
    analysis
  • Extreme test conditions (all alternatives/targets
    satisfied/denied)
  • Analyze likely alternatives/targets
  • Analyze domain-driven questions

13
Data Capture
  • Analysis: captured answers to analysis questions
  • Model Iteration: quantitative counts of model
    changes for each stage in the studies
  • Elicitation: captured lists of questions asked
    about the domain in each stage
  • Domain Knowledge: follow-up questions about
    experience
  • Recorded and analyzed other interesting
    qualitative findings

14
Results
15
Analysis
  • Analysis: aids in finding non-obvious answers to
    domain analysis questions
  • Some participants gave explicit answers, others
    had difficulty producing answers
  • Some referred to analysis labels in the model as
    answers to the question
  • Only some participants were able to interpret
    analysis results in the context of the domain
  • Generally, difficulty in mapping the model to the
    domain
  • Conclusion: knowledge of i* and the domain may
    have a significant effect on the ability to apply
    and interpret analysis

16
Model Iteration & Elicitation
  • Model Iteration: prompts improvements in the
    model
  • Elicitation: leads to further elicitation of
    information in the domain
  • Few changes; few differences between ad hoc and
    systematic, familiar and unfamiliar domains,
    forward and backward (totals are sketched after
    the table below)

  Treatment    Partic.   Model Changes          Questions Asked         Round
                         (Fwd Q.)   (Bwd Q.)    (Fwd Q.)   (Bwd Q.)
  Ad-hoc       P1        59         10          10         1            1
  Ad-hoc       P4        0          0           1          0            1
  Ad-hoc       P5        5          13          6          6            1
  Ad-hoc       P7        2          5           0          0            2
  Ad-hoc       P9        0          5           0          0            2
  Systematic   P2        0          0           2          3            1
  Systematic   P3        0          0           2          0            1
  Systematic   P6        0          3           5          1            1
  Systematic   P8        0          0           2          2            2
  Systematic   P10       0          0           0          1            2
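To read the table side by side, a short script can total model changes and elicited questions per treatment. The values are transcribed from the table above; the aggregation itself is only an illustration, not an analysis reported in the study.

```python
# Rows transcribed from the table above:
# (treatment, participant, changes during forward Qs, changes during backward Qs,
#  questions asked during forward Qs, questions asked during backward Qs)
rows = [
    ("Ad-hoc", "P1", 59, 10, 10, 1), ("Ad-hoc", "P4", 0, 0, 1, 0),
    ("Ad-hoc", "P5", 5, 13, 6, 6),   ("Ad-hoc", "P7", 2, 5, 0, 0),
    ("Ad-hoc", "P9", 0, 5, 0, 0),
    ("Systematic", "P2", 0, 0, 2, 3), ("Systematic", "P3", 0, 0, 2, 0),
    ("Systematic", "P6", 0, 3, 5, 1), ("Systematic", "P8", 0, 0, 2, 2),
    ("Systematic", "P10", 0, 0, 0, 1),
]

totals = {}
for treatment, _, cf, cb, qf, qb in rows:
    changes, questions = totals.get(treatment, (0, 0))
    totals[treatment] = (changes + cf + cb, questions + qf + qb)

for treatment, (changes, questions) in totals.items():
    print(f"{treatment}: {changes} model changes, {questions} questions asked")
```

Note that most of the ad-hoc model changes come from the single outlier participant P1.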
17
Model Iteration & Elicitation
  • Conflicts with previous results (PoEM'09, PST'06,
    etc.). Why?
  • Underlying theory: interactive analysis prompts
    users to notice differences between their mental
    domain model and the physical model
  • Evaluation did not reveal differences between the
    mental and physical models, or these differences
    existed but were not used to modify the model

18
Model Iteration & Elicitation
  • Previous studies were conducted by i*/modeling
    experts who had a commitment to the project
  • Conclusion: model iteration and elicitation
    rely on
  • More extensive knowledge of syntax and analysis
    procedures
  • More extensive knowledge of the domain
  • buy-in/caring about a real problem

19
Domain Knowledge
  • Domain Knowledge: leads to a better understanding
    of the domain
  • Follow-up question: do you feel that you have a
    better understanding of the model and the domain
    after this exercise?
  • 7/10 participants said yes (a mix of ad-hoc and
    systematic participants)
  • Conclusion: both ad-hoc and systematic analysis
    can help improve domain knowledge

20
Additional Findings
  • Promoted Discussion in Group Setting: human
    judgment caused discussion among participants
  • Example: what is meant by "Flexibility"?
  • Model Interpretation Consistency
  • i* syntax leaves room for interpretation
  • Results show a variety of interpretations when
    propagating analysis labels with ad-hoc analysis
  • Conclusion: systematic analysis provokes a more
    consistent interpretation of the model
  • Coverage of Model Analysis
  • Results show significant differences in the
    coverage of analysis across the model with
    systematic vs. ad-hoc analysis
  • Model Completeness and Analysis
  • Analysis may not be useful until the model is
    sufficiently complete
  • Some participants noticed incompleteness in the
    model(s) after applying analysis

21
Conclusions and Future Work
  • Designed and administered studies to test
    perceived benefits of interactive goal model
    analysis
  • Initial Hypotheses: Analysis, Model Iteration,
    Elicitation, Domain Knowledge
  • Benefits are dependent on:
  • Knowledge of i* and i* evaluation
  • Presence of an experienced facilitator
  • Domain expertise/buy-in
  • The presence of a real motivating problem
  • Discovered benefits: Interpretation Consistency,
    Coverage of Model Analysis, Model Completeness
  • Several threats to validity (construct, internal,
    external, reliability) described in the paper
  • Future Work
  • More realistic action-research type studies
  • Better tool support: make the tool the expert?

22
Thank you
  • Questions?
  • jenhork@cs.utoronto.ca
  • www.cs.utoronto.ca/jenhork
  • yu@ischool.utoronto.ca
  • www.cs.utoronto.ca/eric
  • arup.ghose@utoronto.ca
  • OpenOME
  • https://se.cs.toronto.edu/trac/ome

23
Outline
  • Goal Modeling
  • Goal Model Analysis
  • Hypotheses: Benefits of Systematic, Interactive
    Goal Model Analysis
  • Background: i* Syntax
  • Background: Interactive Goal Model Analysis
  • Case Study Design
  • Group study
  • Individual Studies
  • Results
  • Threats to Validity
  • Conclusions and Future Work

24
Goal Model Analysis
  • Work has argued that more utility can be gained
    from goal models by applying systematic analysis
  • Many different types of analysis procedures have
    been introduced
  • Metrics (Franch, 06) (Kaiya, 02)
  • Model checking (Fuxman et al., 03) (Giorgini et
    al., 04)
  • Simulation (Gans et al., 03) (Wang & Lesperance,
    01)
  • Planning (Bryl et al., 06) (Asnar et al., 07)
  • Satisfaction Propagation (Chung et al., 00)
    (Giorgini et al., 05)
  • Most of this work focuses on the analytical power
    and mechanisms of the procedures
  • What are the benefits of goal model analysis?
  • Do these benefits apply only to a systematic
    procedure? Or also to ad-hoc (no systematic
    procedure) analysis?

25
inflo (Group) Case Study
  • inflo: a back-of-the-envelope calculation and
    modeling tool
  • Supports informed debate over issues like carbon
    footprint calculations
  • Four grad students, 1 professor, and 1
    facilitator
  • Three two-hour modeling sessions plus a one-hour
    analysis session
  • Most of each session devoted to developing the
    model and to discussion
  • Used systematic model analysis at the end of each
    session

26
Individual Studies (Round 1)
  • Analysis Questions: 12 questions provided
  • 4 per model (3 models)
  • 2 for each analysis direction (forward, backward)
    per model
  • Example (forward)
  • If every task of the Sustainability Chair and
    Local Chair is performed, will goals related to
    sustainability be sufficiently satisfied?
  • Example (backward)
  • What must be done in order to "Encourage informal
    and spontaneous introductions" and "Make
    conference participation fun"?

27
Analysis Methodology
  • 1. Alternative Effects (Forward Analysis)
  • a) Implement as much as possible: all leaves are
    satisfied
  • b) Implement as little as possible: all leaves
    are denied
  • c) Reasonable Implementation Alternatives:
    evaluate likely alternatives
  • 2. Achievement Possibilities (Backward Analysis)
  • a) Maximum targets: all roots must be fully
    satisfied. Is this possible? How?
  • b) Minimum targets: lowest permissible values
    for the roots. Is this possible? How?
  • c) Iteration over minimum targets: try
    gradually increasing the targets in order to find
    the maximum targets which still allow a solution
  • 3. Domain-Driven Analysis (Mixed)
  • a) Use the model to answer interesting
    domain-driven questions (a set-up sketch for
    steps 1 and 2 follows below)
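As a rough sketch of how the extreme set-ups in steps 1 and 2 could be prepared mechanically for a goal graph: the helper names, the example links, and the use of "partially satisfied" as a stand-in for the domain-specific minimum targets are assumptions made for this illustration, not part of the authors' methodology or tooling.

```python
def leaves_and_roots(links):
    """Identify leaves (no incoming links) and roots (no outgoing links) of a
    goal graph given as (source, link_type, target) triples."""
    sources = {s for s, _, _ in links}
    targets = {t for _, _, t in links}
    return sources - targets, targets - sources

def extreme_conditions(links):
    """Build the extreme analysis set-ups of steps 1a/1b (forward: all leaves
    satisfied/denied) and 2a/2b (backward: targets on all roots). Minimum
    targets are domain-specific; 'partially satisfied' is only a placeholder."""
    leaves, roots = leaves_and_roots(links)
    return {
        "1a forward, all leaves satisfied": {leaf: "satisfied" for leaf in leaves},
        "1b forward, all leaves denied":    {leaf: "denied" for leaf in leaves},
        "2a backward, maximum targets":     {root: "satisfied" for root in roots},
        "2b backward, minimum targets":     {root: "partially satisfied" for root in roots},
    }

# Invented example graph; the resulting leaf assignments (1a/1b) would be fed
# into the forward procedure, and the root targets (2a/2b) into the backward one.
links = [("Provide vegetarian catering", "help", "Reduce conference footprint"),
         ("Reduce travel", "help", "Reduce conference footprint")]
for name, setup in extreme_conditions(links).items():
    print(name, setup)
```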

28
Threats to Validity
  • Construct Validity
  • Model changes may not be beneficial
  • Internal Validity
  • Presence of facilitator
  • Think-aloud protocol
  • Choice of model domain
  • External Validity
  • Used students
  • Used i*: does this generalize to other goal model
    frameworks?
  • Reliability
  • Facilitator was an i* evaluation expert