1
Embracing Technology is DUMB. Embracing
Well-Designed Technology is Smart
  • Dr. Ben-Tzion (Bentzi) Karsh
  • Associate Professor
  • Industrial and Systems Engineering Department
  • Systems Engineering Initiative for Patient Safety
  • University of Wisconsin-Madison
  • (AHRQ R01 HS013610, PI: Karsh)
  • (NIH 1R01LM008923-01A1, PI: Karsh)

2
Take Away Messages
  • We know that patient safety problems are the
    result of the interaction between people, the
    technology they use, and the system in which they
    work, SO LEARN HOW TO UNDERSTAND YOUR SYSTEM
    (don't focus on the technology)
  • There is a science devoted to understanding how
    to design technology to support human performance
    (physical and cognitive). It's called human
    factors engineering (HFE). LEARN ABOUT IT!
  • Technology needs to be designed to be a team
    player. ANYTHING LESS IS UNACCEPTABLE
  • HFE research makes clear that technologies are
    NOT solutions; they should be ASSISTIVE DEVICES!

3
Main Take Away Message
  • THE ROAD TO PATIENT SAFETY AND HIGH-QUALITY
    PATIENT CARE RUNS THROUGH THE PERFORMANCE OF YOU
    AND YOUR STAFF
  • So if your technology is bad, your performance
    will be bad. If your performance is bad, quality
    and safety suffer.

4
Is there evidence HIT improves patient safety or
quality?
  • CPOE and CDSS?
  • Yes (Kaushal & Bates, 2001; Kaushal, Shojania, &
    Bates, 2003; Mekhjian et al., 2002; King, Paice,
    Rangrej, Forestell, & Swartz, 2003; Potts, Barr,
    Gregory, Wright, & Patel, 2004)
  • Bar coding?
  • Yes (Poon et al., 2006; Kaushal, Barker, & Bates,
    2001; Puckett, 1995; Wald & Shojania, 2001)
  • EMRs?
  • Yes (Mitchell & Sullivan, 2001; Gill et al.,
    2001; Legler & Oates, 1993; Ornstein & Bearden,
    1994; Solomon & Dechter, 1995; Garrison et al.,
    2002)

5
But something isn't right
  • EMR
  • Concerns related to cost, a lack of tested
    systems, the number of steps needed to complete
    tasks, problems with data entry, inexperienced
    vendors, confidentiality, and security have been
    reported (Ornstein & Bearden, 1994; Wager et al.,
    2000; Mitchell & Sullivan, 2001)
  • Entire systems have been abandoned because of
    problems with poor reliability, poor credibility,
    poor consistency, and problematic user interface
    design (Lawler et al., 1996)

6
But…
  • CPOE
  • Systems have been abandoned (Prabhu, 2003)
  • CPOE can increase the incidence rate of errors,
    adverse events, and mortality (Koppel et al., 2005;
    Nebeker et al., 2005; Thompson et al., 2005; Han
    et al., 2005)
  • Bar coding
  • Nurses don't use it as they are supposed to, and
    mistakes can still be made (McDonald, 2006;
    Patterson et al., 2002, 2006)

7
But…
  • Smart IV pumps
  • "We found no measurable impact on the serious
    medication error rate … technological and nursing
    behavioral factors must be addressed if these
    pumps are to achieve their potential for
    improving medication safety" (Rothschild et al.,
    2005)
  • CDSS in CPOE
  • Physicians override up to 90% of drug alerts
    (Weingart et al., 2003)

8
  • Oh my

9
Confession
  • We don't know to what extent technology DESIGN
    has caused patient safety problems.
  • In existing studies, it is nearly impossible to
    determine what was related to design, to
    implementation, or to new workflow.

10
What's the Problem? The Prevailing Paradigm for
healthcare technology? (stolen from Matt
Scanlon, MD)
11
How do we get beyond this state of "technology
will save me" and technological determinism?
  • A few propositions

12
Proposition 1: Stop blaming everything on human
error!
13
  • Human error?

14
If the wrong number is keyed in, is it human
error?
  • At the bar code data committee meeting yesterday,
    we discussed the fact that nurses sometimes need
    to scan several times before the bar code
    "takes." The nurse educator said that she
    teaches nurses that if they don't succeed on the
    first couple of scans, to key in the number from
    the med bottle.

15
Human Error?
16
Proposition 2: Workarounds and violations of
technology protocols are GOOD! When you see
them, you should thank the violator!
17
Heroes or Dummies?
18
Heroes or villains?
  • Maybe workarounds/violations are the right choice
    when the technology is not appropriate for the
    situation?
  • Maybe workarounds are responses to poorly
    designed technologies and are therefore symptoms
    of the actual problems?

19
Proposition 3: Healthcare professional rejection
of technology is wonderful!
20
  • Maybe rejection is a sign that smart people don't
    want to be forced to work in ineffective ways
  • Maybe the technology is not designed to meet,
    SIMULTANEOUSLY, the needs of:
  • Staff
  • Patients
  • Other technologies
  • Physical layout

21
Proposition 4: Only purchase well-designed
technology
22
Technology as Assistive Device
  • What is the goal of a walking cane?
  • What makes a cane well-designed for a
    particular person?

23
Technology as Assistive Device?
  • What are the goals of a bar code scanner?
  • What makes a bar code scanner well-designed
    for the users?

24
Well-designed technology
  • Better feedback to the user
  • Better cooperation with the user
  • Better visibility and transparency of what the
    technology is doing
  • Better matching of designs to mental models of
    the USER, not the designer and not the purchaser

25
Example
  • An A300 crashed in Nagoya, Japan, after the
    pilots inadvertently engaged the autopilot's
    go-around mode. The pilots countered the
    unexpected pitch-up by making manual inputs,
    which turned out to be ineffective. Essentially,
    the pilot attempted to continue the approach by
    manually deflecting the control column, which in
    all other aircraft (and in this aircraft in all
    modes except the approach mode) would normally
    disconnect the autopilot. However, in this
    particular aircraft and in this particular mode,
    the autopilot had to be manually deselected and
    could not be overridden by control column inputs.
    Consequently, a power struggle developed between
    the pilot and the autopilot, with the pilot
    attempting to push the nose down through elevator
    control and the autopilot attempting to lift the
    nose up through trim control. This caused the
    aircraft to become so far out of trim that it
    could no longer be controlled.

26
Proposition 5: Use Human Factors Engineering
Expertise
27
Use Human Factors Engineering Design
Thinking (modified from Sanders and McCormick,
1993)
  • Technologies need to be designed for and to work
    with people
  • Systems must be designed to accommodate the range
    of users
  • How systems are designed will influence human
    behavior and therefore system performance
  • Design needs to be evidence-based, not common
    sense or designer driven
  • All design must take into account the system of
    use

28
For more information
  • Karsh, B. and Scanlon, M. (2007). When is a
    defibrillator not a defibrillator? When it is
    like a clock radio. The challenge of usability
    and patient safety in the real world. Annals of
    Emergency Medicine, 50, 433-435.
  • Holden, R. J. and Karsh, B. (2007). A theoretical
    model of health information technology behavior.
    Behaviour and Information Technology, 1-17. DOI:
    10.1080/01449290601138245. URL:
    http://dx.doi.org/10.1080/01449290601138245
  • Karsh, B., Holden, R. J., Alper, S. J., and Or,
    K. L. (2006). A human factors engineering
    paradigm for patient safety: designing to
    support the performance of the health care
    professional. Quality and Safety in Health Care,
    15(Suppl I), i59-i65.

29
For more information
  • Carayon, P. (Ed.) (2007). Handbook of Human
    Factors in Health Care and Patient Safety.
    Lawrence Erlbaum Associates, Mahwah, New Jersey.
  • Nielsen, J. (1993). Usability Engineering. New
    York: AP Professional.
  • Norman, D. A. (1998). The Design of Everyday
    Things. New York: Doubleday.
  • Salvendy, G. (Ed.) (2006). Handbook of Human
    Factors and Ergonomics (3rd ed.). John Wiley and
    Sons.
  • Sanders, M. S. and McCormick, E. J. (1993). Human
    Factors in Engineering and Design (7th ed.).
    McGraw-Hill, Inc.
  • Spath, P. (2000). Error Reduction in Health Care:
    A Systems Approach to Improving Patient Safety.
    Jossey-Bass/Wiley.
  • Wickens, C. D., Lee, J. D., Liu, Y., and Becker,
    S. E. G. (2004). An Introduction to Human Factors
    Engineering (2nd ed.). Prentice Hall.
  • Woods, D. D. and Hollnagel, E. (2006). Joint
    Cognitive Systems. CRC Press.

30
THANK YOU! QUESTIONS???
  • Ben-Tzion Karsh, Ph.D.
  • Associate Professor
  • Department of Industrial Engineering
  • UW-Madison
  • Contact Information: Industrial Engineering,
    University of Wisconsin-Madison, 1513 University
    Avenue, Room 3218, Madison, WI 53706. Tel:
    608-262-3002. Fax: 608-262-8454. E-mail:
    bkarsh@engr.wisc.edu
  • www.engr.wisc.edu/mesh