Title: Evaluation in Medical Informatics
1. Evaluation in Medical Informatics
- Kevin Johnson, MD, MS, Vanderbilt University
- Joan Ash, PhD, MLS, MBA, Oregon Health & Science University
2. What Questions Would You Like Addressed?
3. Case Report
- 1999: Development of a tool that used speech recognition technology to complete an endoscopy summary
- 2001: Deployment of this tool in the endoscopy suite
- 2003: Developer of the tool, though not needed to maintain it, leaves
- 2004: Tool removed from endoscopy
- 2005: New tool evaluated. You are asked to support the purchase of this tool.
4. Questions
- What do you think happened the first time around?
- What could have been done to prevent this outcome?
- What were the issues you would want to resolve before purchasing?
5. Issues
- What happened?
- Lack of support
- Poor hardware, software, networking
- Lack of training
- Changing expectations
- Natural history of adopting this technology
- ???
6. Issues (2)
- How to prevent this?
- Recruit new physician champion
- Force use of system
- Recognize foolishness of implementing the system in the first place
- Could not prevent this without more information!
7. Issues (3)
- Before purchasing
- commitment from users
- commitment from vendor
- costs
- benefits
- in short,
8. Evaluation
- Evaluation is the systematic application of social research procedures to judge and improve the way information resources are designed and implemented.
- Step back and assess value
- Rigorous methods, stick to a plan, reduce bias, hire an evaluator
- Value broadly defined
9. Why We Evaluate
10. Why evaluate information technology?
- Speech recognition technology
- Outcomes research databases
- Scanner systems
- Personal digital assistants
- Decision aids
- The Internet
- Computer-based patient records
- Expert systems
- Physician order entry
- "Nowadays people know the price of everything and the value of nothing." -- Oscar Wilde
11. Current Issues in IT System Evaluation
- Lack of measurement tools (Friedman and Abbas, 2003)
- Studies are technical and descriptive ("what I did last summer") and ignore users, organizations, and outcomes (Darbyshire, 2004)
- Evaluation involves more than the technology itself, including changing attitudes, cultures, and healthcare practices. Realistic evaluation can provide context-mechanism-outcome configurations that explain the underlying relationships, helping us understand why and how a program or intervention works. (Oroviogoicoechea C, J Clin Nurs, 2007)
12. Your Expertise is Required!
- You are implementing a PDA-based tool to help with deciding what medication to prescribe, creating the prescription, and routing it to a pharmacy. This tool would be integrated with the electronic health record. Your institution is willing to fund this project for 5 years, and you get additional funding to evaluate it over that period. Your results will determine whether the institution supports the tool for the long term. Note that your institution understands the importance of patient safety to its overall bottom line.
13. Assignments (5 minutes)
- Row 1: Given 1 month before the tool is built, what might you evaluate, and how?
- Row 2: Now that you've seen the tool, there is a decision to pilot it for a month with highly motivated users. What might you evaluate, and how?
- Row 3: The pilot is over and was successful. A few pilot users wished there had been more training, but by the end they were all good at using it. What might you do to help the next phase?
- Row 4: The tool is about to roll out. What would you like to evaluate in the next 2 years, and how?
14. See you in 10 minutes
15. Results
- Row 1: Given 1 month before the tool is built, what might you evaluate, and how?
- Row 2: Now that you've seen the tool, there is a decision to pilot it for a month with highly motivated users. What might you evaluate, and how?
- Row 3: The pilot is over and was successful. A few pilot users wished there had been more training, but by the end they were all good at using it. What might you do to help the next phase?
- Row 4: The tool is about to roll out. What would you like to evaluate in the next 2 years, and how?
16. Typical Questions
- Does it work? With what effect?
- Assign a value: sell it, decide to buy it
- How to make it better?
- Formative vs. summative
- Where does it work best?
- What problems does it cause?
- Uncover basic principles of medical informatics
17. What can be studied?
Need for resource
Development process
Resource development
Plans for change
Feedback
Deployment
Stabilization in workplace
18. Stead Evaluation Framework
(Stead, JAMIA 1994)
19. Small Ball!
- "Smallball" evaluations have several important advantages over "powerball" evaluations: before system development, they ensure that information resources address real community needs; during deployment, they ensure that the systems are suited to the capabilities of the users and to community constraints; and, after deployment, they enable as much as possible to be learned about the effects of the intervention in environments where randomized studies are usually impossible.
20. Why study IT resources?
- Promotional studies: demonstrate safety, efficacy, benefits
- Scholarly studies: increase understanding of structure, function, and effects of IT resources
- Pragmatic studies: learn what works and what doesn't work
- Ethical studies: assure that IT resources are safe and cost-justified
- Medicolegal studies: reduce risk of liability
21. Promotional Rationalization
- Safety
- Benefit (and to whom)
- cost-effective
- reengineering impact
22. Scholarly Rationalization
- Potential of technology
- Need for further research
- Uncovering the principles of Medical Informatics
23. Pragmatic Rationalization
- enabling comparisons
- enabling decisions
- defining mistakes
24. A Pragmatic Tool: The Request for Proposals
- Specifies a (business) need and a rationale for that need
- Specifies the criteria by which a solution to the need will be chosen
- May be sent to all potential solution providers, who can choose to reply if they feel that their product is aligned with the needs of the requestor
- The response is a tool for EVALUATING SOLUTIONS
25. RFP Components
- Detailed requests for information (hardware, software specifications, people, process, other)
- Specify minimum and mandatory information needed
- Specify where additional information may be provided
- Introduction and background
- Invitation to submit proposal and ground rules
- Timeline
- Evaluation methodology (a scoring sketch follows below)
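In practice, the evaluation methodology often amounts to weighted scoring of vendor responses against the stated criteria. Here is a minimal sketch in Python; the criteria, weights, vendors, and scores below are hypothetical, not drawn from any real RFP.

```python
# Minimal sketch of weighted scoring of RFP responses.
# Criteria, weights, and vendor scores are hypothetical examples.

# Weights should sum to 1.0; scores are on a 1-5 scale from the review team.
WEIGHTS = {
    "functional_fit": 0.35,
    "technical_architecture": 0.20,
    "vendor_viability": 0.15,
    "implementation_support": 0.15,
    "total_cost_of_ownership": 0.15,
}

responses = {
    "Vendor A": {"functional_fit": 4, "technical_architecture": 3,
                 "vendor_viability": 5, "implementation_support": 3,
                 "total_cost_of_ownership": 2},
    "Vendor B": {"functional_fit": 3, "technical_architecture": 4,
                 "vendor_viability": 3, "implementation_support": 4,
                 "total_cost_of_ownership": 4},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores into one weighted total."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

# Rank vendors by weighted total, highest first.
for vendor, scores in sorted(responses.items(),
                             key=lambda kv: weighted_score(kv[1]),
                             reverse=True):
    print(f"{vendor}: {weighted_score(scores):.2f}")
```

Publishing the weights in the RFP itself is one way to keep the evaluation transparent to bidders.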
26. Ethical Rationalization
- REALLY understanding safety
- REALLY justifying it over other resources and
innovations that compete for the same budget
27. Medicolegal
- Reduce liability risk of developer and users
- Establish role of resource in the spectrum of decision-support tools
- Product liability
- Professional services liability
28. Consider the stakeholders
- Developers
- Purchasers
- Users
- Patients
- Payers
29. Study Designs
30. Descriptive study
(Acad Med 2004;79:557-63)
- Uncontrolled
- Look at a single group to assess some property at one point in time
[Schematic: a single intervention group, observed once]
31. Historically controlled
- Before-after studies
- Single cohesive unit (ICU alerting system)
- Other explanations (new antibiotic, awareness)
- Simultaneous control
[Schematic: a control period followed by an intervention period]
32. Time series
[Schematic: alternating control and intervention periods; an analysis sketch follows]
(Ann Intern Med. 2004;141:196-204)
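A common analysis for this design is segmented regression on the interrupted time series, which separates a level change at the interruption from the underlying trend. A minimal sketch with simulated monthly data; the model and all numbers are illustrative, not taken from the cited study.

```python
import numpy as np

rng = np.random.default_rng(0)

# 24 simulated months: 12 control months, then 12 intervention months
# with a level drop of 2 units on top of a slow upward trend.
t = np.arange(24)
post = (t >= 12).astype(float)          # 1 after the intervention starts
y = 10 + 0.1 * t - 2.0 * post + rng.normal(0, 0.5, 24)

# Segmented regression: intercept, baseline trend, level change at the
# interruption, and trend change afterward.
X = np.column_stack([np.ones(24), t, post, post * (t - 12)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"baseline trend={coef[1]:.2f}, level change={coef[2]:.2f}, "
      f"trend change={coef[3]:.2f}")
```

The recovered level change should be close to the simulated -2.0, illustrating how repeated on/off periods protect against secular-trend explanations.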
33. Simultaneous randomized controls
- McDonald CJ. Reminders to physicians from an introspective computer medical record. Ann Intern Med 1984.
- Physicians respond to outpatient reminders
- Physician teams were randomized to study and control (an allocation sketch follows)
- Can infer causality
[Schematic: study subjects randomized to intervention and control arms]
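Note that the unit of randomization here is the physician team, not the individual order. A minimal sketch of that kind of allocation; the team names are hypothetical.

```python
import random

random.seed(42)  # fixed seed so the allocation is reproducible and auditable

# Hypothetical physician teams; the team, not the order, is the unit
# of randomization, as in the McDonald design.
teams = [f"Team {i}" for i in range(1, 11)]
random.shuffle(teams)

half = len(teams) // 2
arms = {"intervention (reminders)": teams[:half],
        "control (no reminders)": teams[half:]}
for arm, members in arms.items():
    print(arm, "->", sorted(members))
```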
34. Randomized crossover study
- McDonald CJ. Physician response to computer reminders. JAMA 1980.
- Randomized crossover: C-S1 (no lit)-S2 (lit) vs. S1-C-S2
- Comparing subjects to themselves gives better power
- Worry about carryover (learning)
- Physicians respond to outpatient reminders
- The effect is not learned
- Physicians never used the associated literature
[Schematic: subjects cross over between control and intervention periods]
35. Survey research
- Basic survey design
- Example item: "How important is it for you to know that the results of your self-testing can be reviewed in the medical center immediately after the test?"
- Response options: Extremely important / Very important / Uncertain / Not important at all
- Measurement (paper ruler)
36. Error
- Reliability
- Precision (vs. random error)
- Are you measuring something?
- Validity
- Accuracy (vs. systematic error)
- Are you measuring what you want? (a simulation sketch follows)
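The distinction becomes concrete in a simulation: a biased but precise instrument is reliable yet invalid, while an unbiased but noisy one is accurate on average yet unreliable. A minimal sketch with invented numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
true_value = 120.0  # e.g., a patient's true systolic blood pressure

# Instrument A: precise (small random error) but systematically 15 units high.
a = true_value + 15 + rng.normal(0, 1, 1000)
# Instrument B: unbiased but with large random error.
b = true_value + rng.normal(0, 10, 1000)

for name, x in [("A (reliable, not valid)", a),
                ("B (valid, not reliable)", b)]:
    print(f"{name}: mean={x.mean():.1f}, sd={x.std():.1f}")
```

Instrument A's small standard deviation reflects reliability; its shifted mean reflects a validity problem that repetition alone cannot detect.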
37. Hierarchy of populations
- External population (what you wish to generalize to)
- ↑ external validity
- Target population (what you tried to sample)
- ↑ internal validity
- Actual population (what you sampled)
- ↑ statistical inference
- Study population (your actual subjects)
38. (Validity)
- Statistical inference
- Is there an association?
- p-value (a worked example follows)
- Internal validity
- Are the conclusions (cause and effect) valid within the setting of the study?
- e.g., reminders reduced the infection rate by 30%
- External validity
- Can the conclusions be applied in other settings?
- e.g., we can expect a reduction in other hospitals
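For the "is there an association?" step, a contingency-table test is a usual starting point. A minimal sketch using invented counts; the reminder system and infection rates are hypothetical.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = reminders on/off, columns = infection yes/no.
table = [[30, 970],   # with reminders: 30 infections in 1000 admissions
         [45, 955]]   # without reminders: 45 infections in 1000 admissions

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")

# A small p-value supports an association (statistical inference). Whether
# the reminders *caused* the difference is an internal-validity question;
# whether it would hold elsewhere is an external-validity question.
```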
39. (Image-only slide; no transcript available)
40. Anything Wrong With This?
- You are studying a new rule in your decision support system that should improve formulary-based prescribing. You have a data set including, for each row: patient ID, provider ID, order date, and whether or not the order was picked up. Your unit of analysis is the order. You find no difference in whether the prescription was picked up pre and post. (A sketch of the unit-of-analysis problem follows.)
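One likely problem: orders from the same provider are correlated, so treating each order as an independent observation misstates the effective sample size, and order-level inference can be badly miscalibrated. A minimal simulation sketch; all rates are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# 20 providers, each with their own baseline pickup rate, 100 orders each.
provider_rates = rng.uniform(0.4, 0.9, size=20)
orders = rng.binomial(1, np.repeat(provider_rates, 100))

p_hat = orders.mean()
n = orders.size

# Naive standard error pretends all 2000 orders are independent.
se_naive = np.sqrt(p_hat * (1 - p_hat) / n)

# Cluster-aware standard error: treat the provider, not the order,
# as the independent unit of analysis.
provider_means = orders.reshape(20, 100).mean(axis=1)
se_cluster = provider_means.std(ddof=1) / np.sqrt(20)

print(f"naive SE={se_naive:.4f}, cluster-aware SE={se_cluster:.4f}")
# The cluster-aware SE is several times larger, so any test that ignores
# clustering overstates how much information the 2000 orders contain.
```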
41. Anything Wrong With This?
- You are asked to determine whether a new method for information retrieval is superior to older methods. You develop a test set consisting of 50 articles that should be retrieved and 50 that should not be. You find the new method superior, with a positive predictive value of 90% compared with the published data from the old method, with a PPV of 45%. (A sketch of why this comparison is shaky follows.)
42. Anything Wrong With This?
- A study conducted over a 6-month period evaluates the impact of CPOE on the mortality rate in an academic medical center's intensive care unit. You see a 2-fold, statistically significant increase in the mortality rate compared to the previous 6 months (before CPOE was instituted).
43. What are qualitative methods?
- The design plan is iterative and flexible
- Data are words rather than numbers (usually)
- Larger context is considered
44. Qualitative and quantitative are not in conflict
- Different ways of seeking truth
- Inductive vs. deductive
- Perceptive/Subjective vs. objective
- Contextual vs. generalized
45.
- Qualitative research: INDUCTIVE
- Purposes
- generate theory from observation
- oriented to discovery, exploration
- Procedures
- emergent design
- flexible timeframe
- Quantitative research: DEDUCTIVE
- Purposes
- tests theory through observations
- oriented to cause and effect
- Procedures
- predetermined design (protocol, frozen)
- fixed timeframe
46. Qualitative methods can yield discoveries
- "Not everything that can be counted counts, and not everything that counts can be counted."
- -- Albert Einstein (1879-1955)
47. Qualitative methods take many forms
- Particularly appropriate for answering certain research questions
- Interviews, focus groups, participant observation, artifact and document analysis, and various combinations like case studies
48. Strategies for rigor enhance trustworthiness
- Reflexivity (know thyself)
- Triangulation
- Member checking
- Saturation in the field
- Audit trail
49. Interviews can be effective
- Interviews are not conversations
- Can be structured, semi-structured, or fairly unstructured
- Hear the human voices of assessment
- Discover motives, uncover multiple perspectives
50. Use purposive selection, and don't ignore the curmudgeons
51. Focus groups are another way to gather data
- Are more than group interviews
- Benefit from synergy and energy
- Are not easier than interviews
52. Observation is another major method
- "Where observation is concerned, chance favors only the prepared mind."
- -- Louis Pasteur
53. Use your antennae
- "You can observe a lot just by watching."
- -- Yogi Berra
54. Get others to use their antennae
- "I only wish they'd come spend a day with us."
- -- a physician, about administrators
- "You're so lucky you get to watch so much."
- -- a chief information officer, to me
55. Observation can verify interview data
- Can watch typical daily tasks in context
- Can be more or less participative
- Must be planned and rigorous
56. Action research is a mixed methods approach
- Definitions:
- A collaborative approach to research that provides people with the means to take systematic action in an effort to resolve specific problems
- Action research is an approach that aims to both take action and create knowledge or theory about that action
- Researchers get to do research (publish) and insiders get help
57. Rapid Ethnographic Assessment (REA) is another mixed methods approach
- Also called Quick Ethnography
- Speedy but effective ethnography
- Uses careful project management and mixed-method ethnographic tools
58. Case study research is also a mixed methods approach
- An in-depth look at one or more organizations
- Exploratory
- Descriptive
- Explanatory
59. Our CPOE study is a good example of using multiple qualitative methods
- In 1997, computerized physician order entry was touted as the answer to medical error problems
- We did national surveys in 1997 and 2003 to measure diffusion
- We mistrusted the results
60. We used observation, interviews, and focus groups
- Interns look and feel like this
- So there's a need to be unobtrusive
61. Good field notes were critical
- Handwritten during observation and interviews
- Improved upon and typed in full soon after
62. Transcripts and field notes were carefully analyzed by individuals
- Coding the transcripts
- Use of software
- Building themes (a toy coding sketch follows)
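Coding software essentially maintains a mapping from transcript excerpts to codes, and lets the team tally and retrieve coded segments while building themes. A toy sketch of that bookkeeping; the excerpts and codes are invented for illustration.

```python
from collections import Counter

# Each coded segment pairs a transcript excerpt with one or more codes.
coded_segments = [
    ("The alerts fire so often I just click through them", ["alert fatigue"]),
    ("Nobody showed us how to use the order sets", ["training"]),
    ("I trust it more now than during go-live", ["trust", "time"]),
    ("We made our own cheat sheets", ["training", "workarounds"]),
]

# Tally code frequencies across transcripts as raw material for themes.
counts = Counter(code for _, codes in coded_segments for code in codes)
for code, n in counts.most_common():
    print(f"{code}: {n}")

# Retrieve every excerpt coded "training" while drafting that theme.
print([text for text, codes in coded_segments if "training" in codes])
```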
63. We conducted further analysis during multidisciplinary team meetings
64. The results were rich descriptions and useful insights
- Principles for successful implementations and types of unintended consequences of CPOE
65. Case 2
- Project 2: Assessing the Impact of Bedside Information Tools (Electronic Textbooks)
- Goal: Measure the impact of the tools on health care quality
- Project summary: This project will take the best resources from Project 1 and attempt to increase their use at the point of care, perhaps by supplying patient-specific links in the clinical information system (infobuttons) or by providing easy access from bedside terminals. It should be designed as an experiment, with an intervention and a control group. The measured outcomes should be desirable and likely to improve.
66. Case
- Describe program
- Stakeholders
- Purpose
- What to evaluate
- Needs
- Designs, skills
- Structure
- Function
- Impact
- Metrics, insights
- Methods
- Lab trial (function)
- Field trial (RCT)
- Survey instruments
- Ethnographic
- Focus group
- Cognitive
- Case studies
- Rapid ethnographic assessment
- Action research
67. Evaluation: The Rubber Meets the Road!
68. Your Expertise is Required!
- You are implementing a PDA-based tool to help with deciding what medication to prescribe, creating the prescription, and routing it to a pharmacy. This tool would be integrated with the electronic health record. Your institution is willing to fund this project for 5 years, and you get additional funding to evaluate it over that period. Your results will determine whether the institution supports the tool for the long term. Note that your institution understands the importance of patient safety to its overall bottom line.
69. Assignments (5 minutes)
- Row 1: Given 1 month before the tool is built, what might you evaluate, and how?
- Row 2: Now that you've seen the tool, there is a decision to pilot it for a month with highly motivated users. What might you evaluate, and how?
- Row 3: The pilot is over and was successful. A few pilot users wished there had been more training, but by the end they were all good at using it. What might you do to help the next phase?
- Row 4: The tool is about to roll out. What would you like to evaluate in the next 2 years, and how?
70. Well Done!