Transcript and Presenter's Notes

Title: Huw T. O. Davies, PhD


1
Huw T. O. Davies, PhD
  • Professor of Health Care Policy & Management
  • A response to the draft strategy from a research
    perspective
  • Centre for Public Policy & Management
  • Research Unit for Research Utilisation
  • University of St Andrews

[Image: University of St Andrews crest]
2
Road-testing policy against research evidence
"There is nothing a politician likes so little as
to be well-informed. It makes decision making so
complex and so difficult." (John Maynard Keynes)
3
Issues in using comparative clinical indicators
  • Developing quality indicators,
  • that engage active local attention
  • without inducing dysfunctions.
  • (within a nuanced and appropriately
    socialised view of service dynamics)

4
Challenges in indicator development
  • Data inadequacies
    - validity, reliability, completeness, etc.
  • Interpretation difficulties
    - case-mix adjustment, up-staging, context,
      statistical variation, etc.
  • Presentation challenges
    - creating actionable communications that allow
      ready identification of signal from noise.

Much in the Draft Strategy is aimed at rising to these
challenges, and today's workshops are oriented
here too, BUT...
5
Still some risk of interpretation drift?
"When we can't measure what we want to measure,
we usually measure something else." (John Wilder Tukey)
i.e. Quality/Performance → Clinical Indicators
6
From the Draft Strategy Document
"Clinical indicators rarely, if ever, provide a
direct measure of quality" - some reassurance
here, but...
  • "Indicators are typically presented at the level
    of NHS Board, Operating Division or hospital,
    allowing comparisons to be made between service
    providers." - a recurrent theme.
  • Such language smuggles in expectations that
    comparisons can be fair and valid, but...

7
Challenges revisited: need for vigilance!
  • Data inadequacies
    - validity, reliability, completeness, etc.
  • Interpretation difficulties
    - case-mix adjustment, up-staging, context,
      statistical variation, etc.
  • Presentation challenges
    - creating actionable communications that allow
      ready identification of signal from noise.
  • And also: these are often dated data-sets,
    historical rather than contemporary accounts.

8
Rightly cautions against strong conclusions,
emphasises learning rather than judgement
  • Plenty in the Strategy to support this
    orientation, e.g. "Clinical indicators do not
    provide answers, they can rarely be used to
    make reliable external judgements about clinical
    quality/performance" and "The indicators should be
    used in the context of learning rather than
    judgement."
  • But not always right off this particular fence!
    e.g. "Indicators can be used as part of
    performance management and accountability
    activities and also to inform strategic review
    and planning of services."
  • USE implies USERS, so: who are the indicators
    for?

9
Evidence: Users and Use of Indicators
  • Do consumers/users use these data?
  • Do purchasers/commissioners use these data?
  • Do referring physicians use these data?
  • Do provider organisations, such as hospitals,
    use these data?
  • What about unintended impacts?

10
Can measurement have unintended consequences?
  • Research suggests unwanted organisational
    responses, such as: tunnel vision,
    sub-optimization, short-termism, convergence,
    risk-aversion, gaming, misrepresentation.
    RECOGNISED, but addressed only rather weakly.
  • More broadly, does the Strategy contribute to
    "The Audit Society" ("In God We Trust, All
    Others Bring Data"), and does that matter?

11
The Biggest Challenge: connecting national data
to local service delivery
  • Balancing top-down/bottom-up drivers
  • Gaining deeper clinical buy-in; addressing
    patient/user perspectives
  • Aligning both intrinsic and extrinsic
    motivators
  • Getting (customised) indicator support
    activities aimed at aiding self-directed local
    activities
  • Understanding the various actors and their
    different types of use
  • Curbing misuse/abuse.

12
Summing up: a balancing act!
  • Strategy needs to think through:
    - Who really is/are the intended audience(s)?
    - The extent to which differences are interpretable,
      and the clear communication of this.
    - The incentives needed to garner attention.
    - How the information can be actioned locally,
      and how local learning/change can be supported.
    - How dysfunctional responses can be identified,
      tracked and minimised.
    - How all of the above relates to any indicator set.

13
And finally, a plea: a commitment, please, to
ongoing research and evaluation of all aspects of
the new strategy. Studying not just indicator
development/testing, but also implementation
processes and impacts (the good, the bad and the
ugly). Thank you.
14
END
15
Checking ↔ Trusting
External controls ↔ Internal controls
The accountability pendulum...
16
  • Oversight orientation
    - performance is scrutinised
    - comparison with explicit standards
    - incentives: rewards and punishment
    - drives a compliance culture (of
      opportunists?)

Extrinsic motivation
Intrinsic motivation