Paths for All Scheme Coordinators Day: Transcript and Presenter's Notes

1
Paths for All Scheme Coordinators Day
  • Who is benefiting from your scheme?
  • What do we need to know to sustain our schemes and
    improve our local and national outcomes?
  • avril.blamey@ntlworld.com

2
Outline
  • What do we know about the impact of physical
    activity interventions in Scotland?
  • How can we begin to know more (and make a bigger
    impact)?
  • What is evaluation and why do it?
  • What are we evaluating and who for?
  • Tools that answer these questions (aid planning
    and evaluation)
  • How do we address these issues in Paths for All?
  • What is the theory behind your scheme?
  • What key assumptions do you need to address?

3
Recent research
  • Work completed with Jacki Gordon and SPARcoll
  • Review of existing plans, reports and evaluations
    of key physical activity initiatives
  • Review of contribution of these initiatives to
    National Physical Activity Strategy and the
    context within which they were commissioned
  • Interviews with Programme Leads regarding
    recommendations to improve planning and
    evaluation

4
Key Initiatives
  • Girls on the Move
  • Fit for Girls
  • Y Dance
  • Jog Scotland
  • Play @ Home
  • Workplace
  • Paths for Health

5
What do we know about the impact?
  • Programmes not meant as key mechanisms to
    increase population physical activity
  • Large scale projects established
  • Substantial reach/enhanced opportunities
  • Where there are independent evaluations - some
    evidence of short-term increases in physical
    activity among participants who attend/adhere
  • Some evidence in some projects of engagement of
    those from deprived areas

6
Key findings
  • Limited evidence of
  • widespread increases in physical activity
  • long-term increases in physical activity
  • whether we are actually engaging the low active
    or inactive
  • whether inequalities will be widened or reduced
  • Programmes may be successful but we have limited
    proof

7
Limitations in planning
  • Clarity of outcomes and linkages between outcomes
    and activities
  • The absence of anticipated timelines and
    thresholds of change
  • Over-ambitious aims and objectives
  • Limitations in the use of population health and
    demographic data for planning
  • Lack of information on how services are tailored
    and targeted
  • Tensions between reach and quality or intensity
    of support
  • Limitations in the use of the existing evidence
    base

8
Limitations in monitoring and evaluation
  • The lack of appropriate baseline information
  • A lack of focus in evaluations
  • Monitoring outputs rather than outcomes
  • Self-reported rather than objective/observational
    data
  • Limited contextual information to aid attribution

9
Suggested solutions
  • Standardised proforma for planning
  • Indicative core data set
  • Improved process for commissioning, support and
    reporting
  • Capability building
  • Suggestions reviewed with Programme Leads

10
Feedback
  • Lack of influence on end user/target group
    (inactive/low active)
  • Multi-site programmes
  • Reliance on volunteers or employed leaders
  • Delivered in partnerships
  • Detract from delivery
  • Limited capacity/capability in programmes
  • Limited field measures for physical activity

11
Standard but Flexible Proforma
  • Encouraging planning around OFP tools (the Re-Aim
    framework/ Logic Models)
  • Clarity re
  • Target groups and reach
  • Types of inequality
  • Outcomes and linkages to activities
  • Timescales/ thresholds
  • Adoption and maintenance
  • Use and adaptation of evidence
  • Intended monitoring and evaluation
  • Funding

12
Indicative data set (stage A or B)
  • Focus on intermediaries
  • Reach as % of agreed target group
  • Inequalities (excluded groups/SIMD)
  • Adoption
  • Maintenance
  • Changes in attitudes, knowledge and ideally
    practice
  • (project-specific measures)
  • Focus on end users
  • Reach as % of agreed target group
  • % inactive
  • Adherence
  • Changes in activity levels
  • (BHF question)
  • Mock-ups provided in report (a minimal illustrative
    record is sketched below)
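For illustration only, here is a minimal sketch of what one end-user record
and two of the core measures could look like in practice. The field and
function names below are hypothetical; the actual indicative data set and
mock-ups are those provided in the report.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EndUserRecord:
    """One participant row in a hypothetical core data set (field names are illustrative)."""
    scheme_id: str
    in_target_group: bool                          # counts towards reach as % of agreed target group
    inactive_at_baseline: bool                     # feeds the % inactive figure
    sessions_attended: int = 0                     # used to judge adherence
    simd_quintile: Optional[int] = None            # 1 = most deprived ... 5 = least deprived
    baseline_mins_per_week: Optional[int] = None   # e.g. from the BHF activity question
    followup_mins_per_week: Optional[int] = None

def reach_as_pct_of_target(records: List[EndUserRecord], agreed_target_group_size: int) -> float:
    """Reach expressed as % of the agreed target group."""
    engaged = sum(1 for r in records if r.in_target_group)
    return 100.0 * engaged / agreed_target_group_size

def pct_inactive_at_baseline(records: List[EndUserRecord]) -> float:
    """% of those engaged who were inactive when recruited."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for r in records if r.inactive_at_baseline) / len(records)
```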

13
Improved process
  • Funding announcements/warning
  • Completion of short proforma
  • Short listing and feedback
  • Support/training to complete full proforma
    (critical friend)
  • Expert panels
  • Agreement of whether project can influence end
    user
  • Evaluability assessment
  • Agreement on where responsibilities lie for
    monitoring and evaluation and availability
  • Agreed reporting mechanism

14
Key links for HIPM resources
  • http://www.healthscotland.com/scotlands-health/evaluation/hi-performancemanagement-nhs.aspx
  • http://www.improvementservice.org.uk/health-improvement/health/tools-for-soa-processes/

15
What is evaluation?

16
What is evaluation?
  • "Evaluation is the systematic assessment of
    the operation and/or the outcomes of a programme
    or policy, compared to a set of explicit or
    implicit standards, as a means of contributing to
    the improvement of the programme or policy."
  • Weiss 1997
  • More specifically, evaluation researchers
    (evaluators) use social research methods to
    study, appraise, and help improve social
    programmes in all their important aspects,
    including the diagnoses of the social problems
    they address, their conceptualisation and design,
    their implementation and administration, their
    outcomes, and their efficiency.
  • Rossi, P.H., Freeman, H.E. and Lipsey, M.W. (1998)

17
Key elements in definitions
  • Systematic (robust)
  • Process and/or outcomes of real life programme
  • Assesses merit or worth against criteria
  • Purpose is programme or policy improvement and
    social betterment

18
Why evaluate?
19
Why evaluate?
  • Programme improvement/ learning
  • Accountability
  • To gain funding

20
Why do we evaluate?
  • Are the changes in outcomes found due to the
    intervention? (attribution)
  • Is it worth the money?
  • Will it work for everyone or only some?
  • Does it work in all contexts?
  • Should it continue, be ended or changed?
  • Mid-course corrections?
  • Choosing between options to improve health?
  • Programme improvement

21
Why do we evaluate?
  • What is the problem (scope, location, who and how
    it affects)?
  • What are the feasible interventions and
    appropriate target groups?
  • What is the programme or policy?
  • How is the programme conducted?
  • What is it actually doing and to whom?
  • Does it have any unintended/negative
    consequences?
  • Is it following guidelines/good practice?
  • What outcomes is it achieving?
  • Is it addressing the purpose for which it was
    established?

22
(No Transcript)
23
Who wants evaluation?
24
Who wants evaluation?
  • Funders: to ensure funds are being put to good
    use?
  • Policy staff/national programmes: to see what is
    happening in the field/localities?
  • Programme managers: want to know how effective
    their programmes are and to compare effectiveness?
  • Evaluators?
  • Generally for the higher tiers in organisations
  • Policy makers interested in overall effects
  • Participants?
  • Managers may be more interested in improvement
    and the differential effects of different
    strategies

25
What are you evaluating?
26
The real challenge of evaluation!
A man in a hot air balloon realised he was lost.
He reduced altitude and spotted a woman below. He
came lower and shouted, "Excuse me, can you help?
I promised a friend I would meet him an hour ago,
but I don't know where I am." The woman below
replied, "You're in a hot air balloon hovering
approximately 30 feet above the ground. You're
between 40 and 41 degrees north latitude and
between 59 and 60 degrees west longitude." "You
must be a researcher," said the balloonist. "I
am," replied the woman, "how did you know?"
"Well," answered the balloonist, "everything you
told me is technically correct, but I've no idea
what to make of your information and the fact is
I am still lost. Frankly, you've not been much
help at all. If anything, you've delayed my
trip." The woman below responded, "You must be a
manager." "I am," replied the balloonist, "but
how did you know?" "Well," said the woman, "you
don't know where you are or where you are going.
You have no map and no compass. You have risen to
where you are due to a large quantity of hot air.
You made a promise which you have no idea how to
keep, and you expect the people beneath you to
solve your problems. The fact is you are in
exactly the same position you were before we met,
but now, somehow, it's my fault."
Taken from "Research, Policy and Practice: Worlds
Apart", Social Policy and Society 3(4), 2004.
27
Examples of tools to aid planning and evaluation
  • Logic Models
  • Re-Aim
  • Help us to identify what to evaluate, why and for
    whom?

28
(No Transcript)
29
(No Transcript)
30
(No Transcript)
31
Is it plausible?
  • Will the Passport to Recreation scheme increase
    PA levels?
  • Assumptions
  • Money is the key barrier (rather than
    time/access/distance/motivation)?
  • Limited available evidence about concessionary
    schemes
  • Marketed mainly in centres or at one-off events.
  • (May impact on existing participants' maintenance
    rather than on adopters/the inactive?)
  • Sphere of influence

32
Is it doable?
  • Financial implications of high passport uptake on
    income generation
  • Staffing levels and skills needed to encourage
    adherence

33
Is it testable?
  • Limited data that can indicate repeat usage by
    individuals.
  • Performance indicators (PIs) count
    swims/attendances per 100,000 population (a worked
    example follows below)
  • Activity of passport holders can be tracked, but
    is this data used or shared with health partners?
  • SMART
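As a worked illustration (with made-up figures), the kind of performance
indicator mentioned above reduces to simple arithmetic; the limitation for
testability is that it counts visits, not distinct individuals, so it cannot
show repeat usage.

```python
def attendances_per_100k(total_attendances: int, population: int) -> float:
    """Attendances (e.g. swims) per 100,000 population: counts visits, not distinct people."""
    return total_attendances * 100_000 / population

# Hypothetical figures: 60,000 recorded swims in an area of 150,000 residents
print(attendances_per_100k(60_000, 150_000))  # 40000.0 per 100,000 population
```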

34
(No Transcript)
35
Criteria for maximising population health impact
  • Reach large numbers of individuals most in need
    and likely to benefit (maximise
    exposure/saturation)
  • Can be adopted by majority of appropriate
    settings/organisations
  • Can be consistently implemented by staff with
    moderate levels of expertise/training
  • Can produce replicable (consistent) and long
    lasting results at reasonable cost
  • No unintended harm

36
How do we address these issues in Paths for All?
  • Clarity on the learning that we need to improve
    schemes locally or nationally?
  • Must haves
  • Like to have
  • Agreement on core data and link to national data
    set?
  • Identifying means to gather learning for other
    areas within or across schemes?

37
What's the theory behind your scheme?
38
(No Transcript)
39
What are the key assumptions that need to be tested?
  • We can recruit and sustain enough (volunteer)
    leaders?
  • We can recruit the low active and inactive?
  • We can get a % of the above to adhere?
  • We can find ways to sustain their walking?

40
Using logic modelling to aid planning?
  • Convince local partners that your scheme is
    contributing to the SOA and National government
    outcomes
  • Identify the different roles and contributions
    from different partners
  • Agree timescales
  • Specify realistic targets (a sketch of how this
    might be recorded follows below)
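A minimal sketch, with illustrative names and placeholder targets only, of how
a scheme's logic model could be written down so that partners' contributions,
timescales and targets are explicit:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Outcome:
    """One outcome in a scheme's logic model (fields are illustrative, not a PfA template)."""
    description: str
    contributing_partners: List[str]
    timescale_months: int   # agreed timescale for the change to become visible
    target: str             # a realistic, measurable threshold of change

@dataclass
class LogicModel:
    scheme: str
    activities: List[str]
    short_term: List[Outcome] = field(default_factory=list)
    long_term: List[Outcome] = field(default_factory=list)

# Hypothetical walking-scheme example linking activities to SOA/national outcomes
model = LogicModel(
    scheme="Local health walk scheme",
    activities=["Recruit and train volunteer walk leaders", "Run weekly led walks"],
    short_term=[Outcome("Inactive adults recruited to led walks",
                        ["NHS", "Local authority leisure"], 12,
                        "X% of recruits inactive at baseline")],
    long_term=[Outcome("Sustained increase in walking among participants",
                       ["Community planning partnership"], 36,
                       "X% still walking independently at follow-up")],
)
```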

41
Ninewells Cross-sector Contributions (logic model diagram, summarised)
  • Inputs/partners: Scottish Govt, NHS
  • Activities: system and training for referrers;
    key messages on benefits of activity and
    greenspace; brief advice and referral; media
    campaigns
  • Target groups: those at risk or with CHD/mental
    health problems; general public (targeted)
  • Short-term outcomes: uptake and engagement with
    opportunities; understanding of risks, attitudes
    to inactivity and greenspace
  • Intermediate outcomes: People (increased physical
    activity, increased social interaction, people
    value and use greenspace); Place (high quality
    open spaces meeting the needs of residents,
    business and visitors)
  • Long-term outcomes: improved physical health,
    fitness and mental health; reduced health and
    social inequalities
42
Using logic modelling to aid evaluation?
  • What do we want to learn and for whom?
  • What are we already monitoring and for whom?
  • What other things might we want to learn and why?
  • What are the assumptions that underpin our model?
  • Can we use existing data?
  • Do we need new information?
  • Who should do it (our scheme or others)?

43
What outcomes are we already monitoring?
  • What are the key outcomes that are vital for your
    scheme to report on so that national and local
    SOA targets are met?
  • Use of the PfA database for this purpose.

44
(No Transcript)
45
What areas might we want to learn about locally?
  • Are our partners signing up to the same outcomes?
  • Why don't our target group walk in our area?
  • Why might they walk (health, weight, social,
    mental health)?
  • Are we marketing walks to the right people/places?
  • Are our walks tailored to the needs of the
    inactive/low active?
  • Who goes on to become walk leaders and why?
  • Is maintenance best achieved by moving on to other
    groups or by sustaining existing groups?
  • What % of those engaging are inactive?
  • Are we getting different outcomes for different
    groups?

46
(No Transcript)
47
What areas might we want to learn about
nationally?
  • Reach
  • what % of the inactive are we engaging?
  • what % of the inactive are adhering?
  • Efficacy
  • Why are some schemes/areas more successful than
    others?
  • Can the good practice be transferred?
  • Adoption
  • What areas are running schemes (and which are not)?
  • What sorts of people are volunteering?
  • Implementation
  • What practice models are used/adapted?
  • What means are used to increase adherence among
    participants and volunteers?
  • Maintenance
  • How long are schemes running for?
  • Why are some stopping and others sustained?
  • How do we get people walking independently or
    sustained, to ensure we reach the inactive? (A
    sketch of how these could be summarised follows
    below.)
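A hedged sketch of how these national questions could be rolled up into
Re-Aim style indicators across schemes; the dictionary keys and counts are
hypothetical and are not the PfA database schema.

```python
from typing import Dict, List

def reaim_summary(schemes: List[Dict[str, float]]) -> Dict[str, float]:
    """Summarise reach, adherence, adoption and maintenance across schemes.

    Each scheme dict is assumed to hold hypothetical counts: participants,
    inactive_at_baseline, adhering, active_areas, eligible_areas, months_running.
    """
    participants = sum(s["participants"] for s in schemes)
    return {
        # Reach: what % of those engaged were inactive to start with?
        "pct_inactive_engaged": 100.0 * sum(s["inactive_at_baseline"] for s in schemes)
                                / max(participants, 1),
        # Implementation: what % of participants adhere?
        "pct_adhering": 100.0 * sum(s["adhering"] for s in schemes) / max(participants, 1),
        # Adoption: what share of eligible areas are running schemes?
        "pct_areas_adopting": 100.0 * sum(s["active_areas"] for s in schemes)
                              / max(sum(s["eligible_areas"] for s in schemes), 1),
        # Maintenance: how long have schemes been running, on average?
        "avg_months_running": sum(s["months_running"] for s in schemes) / max(len(schemes), 1),
    }
```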

48
Prioritise!!!
49
How do we prioritise them?
  • Use logic modelling and Re-Aim as a means of
    discussing priorities?
  • What's needed by PfH
  • What's needed by the CPP
  • Is this learning that will improve the programme?
  • Identify the core information needed (link to the
    database/SOA)
  • Case studies to uncover other areas
  • Cross-cutting learning

50
Who evaluates?
  • Not everyone measuring everything
  • In-house
  • Commissioned
  • Differentiate monitoring and evaluation

51
Possible case studies?
  • Main motivations for becoming a walk leader?
  • Will identifying clearer links to the SOA help
    unite partners and gain commitment?
  • Identifying why some areas can get more
    participants walking independently?
  • Differential outcomes for men/women and for
    different activity levels?

52
What would help you?
53
Why bother?
  • SOA context important for funding
  • Learning and improving programme locally and
    nationally
  • Better experiences for your participants
  • Better results (walking and health)
  • That is why we are here!!!