Chris Graham - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
The Health Perspective Patient Surveys
  • Chris Graham
  • Patient Survey Manager,
  • Healthcare Commission
  • 13th November 2007

2
Agenda
1 Background to patient survey programme
2 Model used to carry out patient surveys
3 Design and methodology
4 Lessons learnt
5 Development work
6 Review: Jason Cox, Department of Health
3
1. Background to the patient survey programme

4
History of the survey programme
  • NHS Plan (2000): commitment to collect the views
    of patients
  • All sectors expected to do at least one survey:
    acute, primary care, mental health
  • To date, 24 surveys carried out
  • Responses from around one and a half million
    patients, across different sectors

5
Why conduct national surveys of patients?
  • Patient experience
  • Standard method allows comparison of performance
    locally and nationally
  • Provides a national picture of services
  • Centrally developed surveys: consistency across
    the survey programme, constant revision of methods
    for improvement, quality controls

6
Purposes/applications
  • Trusts: improving quality of services
  • Healthcare Commission/DH: regulation, rating
  • Research: generating new knowledge
  • Patients: information on services

7
National NHS Patient Survey Programme
  • Provides valuable information to Trusts in terms
    of performance
  • Overview of services nationally
  • Data used extensively across Healthcare
    Commission and by stakeholders in assessment of
    services
  • Research into differences across patient groups

8
2. Model used to carry out patient surveys

9
How surveys are carried out
  • Depending on purpose, either:
  • National sample, implemented centrally
    (typically smaller sample, national results only)
  • Trust-based sample, devolved model
    (very large samples, local-level results)
  • Here I will focus on the latter approach

10
Devolved, trust-based surveys
  • Co-ordination centre oversees implementation and
    fieldwork
  • Surveys carried out by Approved Contractors
  • 13 survey companies, approved by Healthcare
    Commission
  • Avoids need for trusts to follow same process
    for awarding contracts
  • Provides assurance of quality and cost of work

11
Approved Contractors
  • Print questionnaire and other materials from
    Coordination Centre website
  • Mail out questionnaires and reminders
  • Monitor responses
  • Receive completed questionnaires
  • Enter survey data
  • Send dataset to Coordination Centre

12
Survey process (flow)
  • Coordination Centre: develop and test
    questionnaire, design materials, run pilot
  • Ethics committee (MREC) approval
  • NHS Trust: draw sample
  • Coordination Centre: sample checks
  • Approved Contractor: fieldwork
  • Coordination Centre: data cleaning and checks
13
Patient Confidentiality
  • Honorary contracts for Approved Contractor staff
  • Sampled patients: involvement of the Caldicott
    Guardian
  • Respondents: addresses kept separate from
    responses
  • Trusts do not receive patient-identifiable
    results
  • All surveys obtain MREC approval

14
3. Design and methodology
15
Questionnaire design
  • Principles
  • Focus on experiences, not satisfaction
  • Ask about specific reportable events
  • Focus on issues that matter to patients and
    service users
  • Follow accepted canons of good questionnaire
    design

16
Problems with measuring satisfaction
  • Lack conceptual/methodological rigour
  • Results tend to be highly positive
  • Lack discriminative ability
  • Overly subjective
  • The influence of expectations
  • Difficult to interpret: don't identify causes
  • Not useful for quality improvement: don't
    identify priorities

17
Problems with measuring satisfaction
18
Measuring experiences
  • Focus on recent care episodes
  • Minimises telescoping
  • Define scope of work
  • Look at specific aspects of care
  • Ask patients to report on what actually happened,
    NOT to evaluate how satisfied they were

19
Good practice in design
  • Simple, direct questions
  • Ask about one thing at a time
  • Plain English
  • Avoid potentially leading wording
  • Give respondents opportunity to indicate if
    question is not relevant to them or if they
    cannot remember

20
Example questions (Adult inpatients survey 2007)
21
Involve patients in design
  • Focus groups and importance studies
  • Identify the issues that matter most to patients
  • Cognitive interviews
  • Ensure comprehensibility and acceptability
  • Iterative design process
  • Pilot testing

22
Cognitive interviews
  • Detailed interviews with patients
  • Invite patients to go through the questionnaire
    and probe:
  • Comprehension
  • Retrieval
  • Evaluation
  • Response
  • Iterative design process until saturation occurs

23
Methodology
  • Sampling
  • Representative sample of patients / service
    users
  • Trust-based surveys - usually 850 patients /
    service users sampled
  • Sampling procedure designed by specialists at
    Coordination Centres

24
Examples of sampling procedures
  • Inpatient survey: retrospective (flow) sample,
    taking consecutive discharges backwards until 850
    are reached
  • PCT survey: population (stock) sample, a random
    sample of all people on the PCT list (systematic
    sample by age and gender)
  • Diabetes survey: sample from 10 practices per
    PCT, with the sample at each practice
    proportionate to practice list size

25
Questionnaire mailing
  • NSTS check carried out (1st time)
First questionnaires mailed out
  • Respondents removed from mailing list
  • NSTS check carried out (2nd time)
2-week interval
Reminder slips mailed out
  • Respondents removed from mailing list
  • NSTS check carried out (3rd time)
2-week interval
Second questionnaire mailed out
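The staged mailing protocol above can be sketched as a small schedule plus a list-pruning step. The two-week intervals and the respondent/NSTS removals are from the slides; the function names and data representation are illustrative assumptions.

```python
from datetime import date, timedelta

def mailing_schedule(first_mailing):
    """Three mailings: questionnaire, reminder slip two weeks later,
    second questionnaire two weeks after that."""
    return [
        ("first questionnaire", first_mailing),
        ("reminder slip", first_mailing + timedelta(weeks=2)),
        ("second questionnaire", first_mailing + timedelta(weeks=4)),
    ]

def prune_mailing_list(current, responded, flagged_by_nsts_check):
    """Before each mailing: remove patients who have already
    responded and those flagged by the NSTS tracing check."""
    exclude = set(responded) | set(flagged_by_nsts_check)
    return [p for p in current if p not in exclude]
```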
26
Analysis
  • Data entered locally, collated and cleaned
    centrally
  • Rigorous data standards
  • All technical details published
  • Data standardised to take account of the effect
    of confounding variables (e.g. age, sex,
    admission)
  • Organisations benchmarked against each other
  • Year-on-year trends shown nationally
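Standardising for confounders can be illustrated with a direct-standardisation sketch: compute the mean score within each band of a confounder, then weight the band means by a reference profile so organisations with different case mixes are comparable. The band labels, weights, and scoring scale here are hypothetical; the published technical details describe the actual method.

```python
def standardised_mean(records, reference_weights):
    """Direct standardisation over one confounder (e.g. age band).
    `records` is (score, band) pairs; `reference_weights` must give
    a weight for every band present in `records`."""
    by_band = {}
    for score, band in records:
        by_band.setdefault(band, []).append(score)
    return sum(
        reference_weights[band] * (sum(scores) / len(scores))
        for band, scores in by_band.items()
    )
```

The real programme adjusts for several variables at once (age, sex, admission), but the weighting principle is the same.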

27
4. Lessons learnt

28
The value of patient surveys
  • Extremely rich, robust source of information
  • Useful at both a local and a national level
  • Understandable and actionable both for
    professionals and patients
  • Need to ensure buy-in amongst providers

29
Unique strengths of survey data
  • Surveys yield otherwise inaccessible data: the
    personal experiences of patients and service
    users
  • Locally: identifying areas for improvement, and
    testing implementation
  • Nationally: exploring the relationship between
    clinicians' and patients' experiences, building
    a holistic understanding of how care is working

(Cycle diagram: Measure, Improve, Remeasure)
30
Understanding variations in experiences
  • Different groups of patients have different
    experiences of healthcare - understanding these
    is critical to reducing inequalities
  • e.g. younger people and BME groups give less
    favourable reports of their experiences
  • Looking in detail at these differences, both
    nationally and locally, can help to identify
    areas where services would benefit from changes

31
(No Transcript)
32
Developing support and buy-in
  • Research with providers shows increasing
    understanding and appreciation of surveys
  • "It would be awful if the surveys were stopped
    now. It takes 2-5 years for a new initiative to
    be accepted. The surveys are really starting to
    get accepted now. A few years ago, there was a
    lot of resistance."
  • But acceptance depends on good methods
  • "We have nothing else that is so sophisticated
    and would give us such useful data."

33
5. Development work

34
Why undertake development work?
  • Tradition of development work since the
    conception of the survey programme
  • Has helped to ensure
  • High standards
  • Methodological rigour
  • Accountability
  • Academic credibility

35
Content of development work
  • Both planned and ad hoc development work
  • Research projects recently undertaken to
  • Improve methodology
  • Make surveys more accessible to wider audience
  • Increase response rates
  • Improve reliability and data quality

36
Planned development work
  • Utilises many social research methods such as
  • Literature review
  • Stakeholder consultation
  • Focus groups
  • Qualitative interviews

37
Ad hoc analysis
  • As well as data about patients' experiences,
    surveys provide paradata about the surveys
    themselves
  • This can be used for ad hoc analyses to improve
    future surveys, for example:
  • Improving data quality
  • Improving response rates (extended fieldwork
    period)
  • Aim to be truly evidence-based

38
Thank you
  • Any questions?