Introductory Remarks

Transcript
1
Introductory Remarks: Points of Departure
  • What we do here will overlap to some extent with
    classical statistics and research design courses,
    but
  • We will focus specifically on evaluation
  • We will cover both objectivist (quantitative)
    and subjectivist (qualitative) approaches
  • We will stress issues indigenous to informatics
    and use informatics examples, including a case
    study
  • We will use the term "information resource" very
    generally to refer to the kinds of interventions
    one might wish to study

2
The Plan
  • Session 1: Intro and Overview
  • Session 2: Objectivist (Quantitative) Studies
  • Session 3: Subjectivist (Qualitative) Studies

3
Session 1 Objectives
  • Describe the purposes and key features of
    evaluation in medical informatics.
  • List several factors that can make it difficult
    to do evaluation in medical informatics
    effectively, as well as strategies that can be
    employed to address these difficulties.
  • Distinguish objectivist (quantitative) and
    subjectivist (qualitative) approaches, describe
    the assumptions that underlie them, and explain
    why both methods are used.
  • Identify the major steps in the process of
    conducting objectivist and subjectivist studies.
  • Identify the individuals or groups comprising the
    audience for an evaluation study.
  • Describe how the methods and purposes of a study
    can be matched to the level of maturity of an
    information resource.

4
Things to Keep in Mind
  • We expect too much of evaluation. We expect
    studies to be definitive: to tell us exactly what
    to do, to pass ultimate judgment on an
    information resource, and to appeal to a
    universal audience.
  • In doing so, we set ourselves up to fail.
    Successful evaluations need only be helpful to an
    identified audience for which the study is
    performed. They need to inform decisions, not
    dictate them.
  • Rarely, if ever, is a single study (even a
    randomized trial) definitive.

5
Definition of Evaluation
  • After House (1980)
  • Evaluation leads to the settled opinion that
    something is the case, usually, but not always,
    leading to a decision to act in a certain way.

6
Key Features of Evaluation
  • Evaluations are done for some group or
    groups--the audience(s)
  • Evaluations answer questions of interest to the
    audience(s)
  • Evaluations answer questions with data that can
    take many forms
  • Evaluation is an empirical process, using the
    methods of science
  • Evaluations are successful if they are
    informative to the audience(s)

7
What Folks Often Want to Know Before and During
Development
  • Is there a need for the resource?
  • What are the needs?
  • What functions should be built into the resource
    to meet the identified needs?
  • How can specific features be optimally designed
    for intended users?
  • Based on performance of prototypes, does the
    resource have potential to meet the needs?

8
What Folks Often Want to Know After Deployment
  • Is it working as intended?
  • How can it be improved?
  • Does it make any difference?
  • Are the differences it makes beneficial?
  • Are the differences those envisioned by the
    developers?
  • Add to all of the above: Why or why not?

9
Why Addressing These Questions Can Be Difficult
  • We live in a pluralistic world. There will be
    many points of view on need, benefit, quality.
  • It is hard to know in advance what is really
    going to be important and how much evaluation is
    enough.
  • T(system change) < T(study execution): systems
    often change faster than studies of them can be
    executed. (It's often unacceptable to freeze a
    system long enough to study it.)
  • When the focus of study is real people using
    deployed technology, things can go wrong for very
    complicated reasons

10
Why Addressing These Questions Can Be Difficult
(More)
  • Fitting a study into a complex work environment
    and obtaining compliance can be difficult. (The
    paradox of changing a resource in order to study
    it.)
  • Audiences often have unrealistic expectations of
    evaluation
  • Sometimes they don't really want to know...

11
A Formula for Success: The Evaluator's Mindset
  • Evaluation, like politics, is an art of the
    possible
  • Have realistic goals: informative, not definitive
  • Tailor the study to the problem and collect
    information to address questions posed by the
    audience
  • Be
    • Focused (always have a plan)
    • Open (to intended and unintended effects)
    • Flexible (prepared to change your plan)
  • Be open to doing both lab and field studies. If
    possible, study the resource while under
    development and after deployment.

12
A Formula for Success: The Evaluator's Mindset
(More)
  • Match the methods to the stage of resource
    development.
  • Understand your relationship with the developers,
    which can take many forms. You may be the
    developer.
  • Keep careful records of everything you do
  • Understand the tradeoffs and resign yourself to
    them (see next slide)

13
The General Process of Evaluation
[Figure: Negotiation → "Contract" → Questions → Investigation → Report]
14
Roles in Evaluation: The Playing Field
[Figure: the evaluation team (director, staff) and the development
team (director, staff), each with its own funder, surrounded by
public interest groups and professional societies and by those who
use similar resources]
15
The Process Expanded: Negotiation and Contract
  • Identify the primary audience(s) and interact
    with them
  • Set general goals and purposes of the study
  • Identify, in general, the methods to be used
  • Identify permissions, accesses, confidentiality
    issues and other key administrative aspects of
    the study (IRB considerations)
  • Describe the result reporting process
  • Reflect this in a written agreement (a sketch of
    its elements follows)
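
A minimal sketch (not from the original slides) of the agreement's
elements as a checklist-style Python data structure; the field names
and sample values are illustrative assumptions, not a standard
schema:

    from dataclasses import dataclass

    @dataclass
    class EvaluationAgreement:
        primary_audiences: list[str]        # who the study is for
        general_goals: list[str]            # purposes of the study
        planned_methods: list[str]          # e.g., an objectivist lab study
        permissions_and_access: list[str]   # IRB approval, data access, confidentiality
        reporting_process: str              # how and when results are communicated

    # Hypothetical usage:
    agreement = EvaluationAgreement(
        primary_audiences=["development team", "funder"],
        general_goals=["inform further design of the resource"],
        planned_methods=["pre-deployment laboratory function study"],
        permissions_and_access=["IRB approval", "participant consent"],
        reporting_process="preliminary briefing, then written final report",
    )
    print(agreement.reporting_process)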

16
The Process Expanded: Questions
  • More specific questions derive from the general
    purposes of the study
  • They will be grounded in the particulars of the
    information resource and its intended audience
  • Questions should be 5-10 in number
  • They do not have to be stated as hypotheses
  • Depending on methods used, the questions can
    change over the course of the study

17
The Process Expanded: Investigation
  • Choose data collection methods that can address
    the study questions
  • There are two major families of investigational
    approaches: objectivist and subjectivist
  • Although some studies use both families,
    typically you will choose one or the other

18
The Process Expanded: Report
  • Think of a report as the process of communicating
    findings; reporting is often done in stages
  • It doesn't have to be a written document
    exclusively; it could also include private
    meetings and town meetings
  • Communication must be targeted at the audience(s)
    and conveyed in language they can understand
  • The report must conform to ground rules set out
    in the evaluation agreement
  • A published paper is not necessary and may be
    inappropriate in some cases

19
Doing Studies: The Great Schism and Some Bad
Terminology
  • Objectivist Approaches
    • Don't call them
      • "quantitative approaches" (bad)
      • "objective approaches" (worse)
  • Subjectivist Approaches
    • Don't call them
      • "qualitative approaches" (bad)
      • "subjective approaches" (worse)

20
Clarification of Terms
  • Objectivist and subjectivist: differing
    approaches based on differing philosophical
    assumptions
  • Qualitative and quantitative: different kinds of
    data that can be collected
  • Objective and subjective: descriptive qualities
    of data

21
Objectivist Approaches: Underlying Assumptions
  • Properties inhere in the object under study
  • An investigator can measure these properties
    without affecting the object. The result should
    be independent of the observer.
  • Everyone agrees, or can be brought to consensus,
    on what is good and right
  • Numerical measurement is prima facie superior to
    verbal description

22
The General Process of Evaluation
[Figure: Negotiation → "Contract" → Questions → Investigation → Report]
23
Anatomy of an Objectivist Study
[Figure: a linear investigative sequence: Negotiation → "Contract" →
Questions → Instrumentation → Preliminary Report → Final Report]
24
Staging of Objectivist Studies
[Figure]
25
Example of Objectivist Study: Clinician Decision
Support System
  • See de Bliek et al., SCAMC Proceedings, 225-228,
    1988.
  • Negotiation: The goal of the study was to inform
    the further design of patient-specific advisories
    relating to drug therapy. The decision support
    system was well developed but not yet deployed.
    The study arose out of a close tie between the
    evaluators and the development team.
  • Staging: laboratory design/function study
    (pre-deployment)

26
Example of Objectivist Study: Clinician Decision
Support System
  • Questions
  • 1) Do users prefer informational or educational
    formats?
  • 2) Do preferences vary by user type or medical
    content?
  • 3) How satisfactory is the preferred format?
  • 4) What is the overall, pre-deployment
    receptivity to the system?
  • Investigation: Sample (simulated) but realistic
    advisories were shown to clinicians from the
    eventual user group. The same advisory content
    was presented in different formats, and the
    clinicians expressed preferences (see the
    analysis sketch below).
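
A minimal analysis sketch (hypothetical, not from de Bliek et al.):
questions 1 and 2 could be addressed with a chi-square test of
independence between practitioner group and preferred format; the
counts below are invented for illustration:

    from scipy.stats import chi2_contingency

    # Rows: practitioner groups; columns: preferred advisory format
    # (informational vs. educational). All counts are hypothetical.
    counts = [
        [18, 7],   # physicians
        [15, 9],   # residents
        [12, 6],   # pharmacists
    ]

    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
    # A non-significant result would be consistent with the reported
    # finding that preferences were consistent across groups.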

27
Example of Objectivist Study: Clinician Decision
Support System
  • Results
  • 1) Preference for terse, informational formats.
  • 2) Preferences consistent across practitioner
    groups.
  • 3) Preferred format highly acceptable.
  • 4) Mid- to high receptivity to the system
    overall.
  • Report: meetings of the evaluators and the design
    team; SCAMC paper
  • Decision: adoption of the preferred format.

28
Underlying Assumptions: Subjectivist Approaches
  • When phenomena involve people and become complex,
    there is no single truth about them
  • Different observers will disagree
  • Individuals and groups legitimately hold very
    different perspectives on what is good and right.
  • Verbal description is essential to portraying
    these varying perspectives

29
The General Process of Evaluation
[Figure: Negotiation → "Contract" → Questions → Investigation → Report]
30
Anatomy of a Subjectivist Study
[Figure: an iterative investigative loop: Negotiation → "Contract" →
Immersion/Initial Questions, then repeated cycles of Data Collection →
Analysis → Reflection/Reorganization, leading to a Preliminary Report
and a Final Report]
31
Progressive Focusing in Subjectivist Studies
[Figure: a broad initial set of issues is progressively narrowed
through successive rounds of observation and interviews to a small
set of focal issues]
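
A minimal sketch (with invented issue names and weights) of
progressive focusing: a broad issue set accrues support from rounds
of observation and interviews, and the study narrows to the issues
that attract evidence:

    # Broad initial set of open issues; values accumulate support.
    open_issues = {
        "patients want explanations": 0,
        "patients distrust computers": 0,
        "tailor material by patient type": 0,
        "tailoring affects compliance": 0,
    }

    # Each round of fieldwork lends weight to some open issues.
    fieldwork_rounds = [
        ("observation", {"patients want explanations": 2,
                         "tailor material by patient type": 1}),
        ("interviews", {"patients want explanations": 3,
                        "tailoring affects compliance": 1}),
    ]

    for method, evidence in fieldwork_rounds:
        for issue, weight in evidence.items():
            open_issues[issue] = open_issues.get(issue, 0) + weight

    # The focus narrows to the issues that attracted support.
    focal_issues = [k for k, v in open_issues.items() if v > 0]
    print("focal issues:", focal_issues)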
32
Example Subjectivist Study: Patient Education
System
  • See Forsythe, SCAMC Proceedings, 505-508, 1992.
  • Negotiation: The study was to inform the design
    of a patient education system to explain
    migraine. The system was at a very early stage of
    development.
  • Initial Questions
  • 1) Do migraine patients request explanatory
    material?
  • 2) What do they want to know?
  • 3) Are there patient types to which this
    material should be tailored?
  • 4) Will tailoring information increase compliance?

33
Example Subjectivist Study: Patient Education
System
  • Investigation: observation of physician-patient
    interactions and interviews of patients. Results
    were contrasted with the original assumptions of
    the system developers.
  • Results: Two of the four assumptions were shown
    to be wrong; the other two needed modification.
  • Report: meetings of the evaluators and the design
    team; several papers
  • Decision: substantial rethinking of the system
    design.

34
Main Take Home Points
  • We expect too much of evaluation. We expect
    studies to be definitive: to tell us exactly what
    to do, to pass ultimate judgment on an
    information resource, and to appeal to a
    universal audience.
  • In doing so, we set ourselves up to fail.
    Successful evaluations need only be helpful to an
    identified audience for which the study is
    performed. They need to inform decisions, not
    dictate them.
  • Rarely, if ever, is a single study (even a
    randomized trial) definitive.