Research in SEE
1
Research in SEE: volunteering and innovation
  • Including results from "The impact of long-term youth voluntary service in Europe"

Steve Powell, proMENTE social research, Sarajevo
2
Objectives
  • Present briefly:
  • Evaluation of SEEYN workcamps
  • AVSO/proMENTE review of impact studies
  • Main findings
  • Main challenges in evaluating the impact of youth voluntary service programs
  • Intercultural learning: generic or dyadic?
  • Main results of the research on volunteerism done by you. We want to ensure the audience understands that volunteerism contributes to social inclusion and can serve different purposes and goals.

3
Our review
  • Presented in the European Parliament, June 2007
  • www.promente.org/avsoreview
  • Funders included Global Service Institute / Center for Social Development

4
Studies reviewed
  • Over 300 documents analysed: published and unpublished research and evaluation studies
  • 40 directly relevant to the impact of youth long-term voluntary service in Europe

5
European perspective
  • Long history and tradition of YVS
  • Intensive, long-term placements, sometimes alone
  • EC-funded EVS program
  • Very heterogeneous, country-specific
  • Surprisingly poor research base and tradition
  • Poor responses from EVS national agencies

6
  • In spite of some encouraging results, overall the
    research conducted in Europe to date on the
    impact of voluntary service has had neither the
    methodological teeth nor the mandate to really
    test whether voluntary service works as
    advertised.
  • But whose job is it to demonstrate impact?

7
The studies: sources
  • Some evaluation studies used many different sources.
  • All included information from volunteers and/or from implementing organisations
  • None included direct data from users / beneficiaries

8
The studies: data and designs
  • 25 studies: qualitative data collection and analysis
  • 30 studies: quantitative data and analysis
  • Some studies used both kinds of data / analysis.
  • Very few referred to any evaluation framework or toolkit
  • Almost none considered drop-outs, attrition
  • None used validated / composite scales or instruments

9
Toolkits and frameworks
  • Looking for
  • a meta-framework to present our findings
  • frameworks useful for individual studies

None of the studies made much use of toolkits or
frameworks
  • Model of voluntary activities and civic learning
    (Mutz and Schwimmbeck 2005)
  • VIVA (Gaskin 2003)
  • Framework for organising service-related research
    (Perry and Imperial 2001)
  • The functional approach (Snyder, Clary et al.
    2000).
  • AmeriCorps general theory of change model
    (Jastrzab, Giordono et al. 2004)
  • Independent Sector / UNV "Measuring volunteering: a practical toolkit" (Dingle 2001)
  • GSI (Tang, Moore McBride et al. 2003)
  • Council of Europe and European Commission "International Voluntary Service T-kit" (Amorim, Constanzo et al. 2002)
  • UNV (Daniel, French King 2006)
  • IVR (Institute for Volunteering Research 2004)

Great diversity in types and purposes of study and approaches; focus on internal validity
10
IVR framework (diagram)
11
But ...
  • Low comparability of studies
  • Poor designs and reporting standards
  • Lack of comparison groups, counterfactuals
  • Not possible to conduct a formal meta-analysis

  Design                      Number of studies
  Retrospective impressions   Nearly all (a few used narrative follow-ups)
  Comparison group            5
  Before/after                2
  Quasi-experiment            0
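The gap between before/after designs and comparison-group designs can be illustrated with a small simulation (hypothetical numbers, not data from the review): if non-volunteers also improve over a year, a before/after design credits that background trend to the program, while a comparison-group design subtracts it out.

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical parameters: everyone matures a little over a year
# (secular trend); volunteers additionally gain from the program.
TREND = 0.30           # 1-year change seen even in non-volunteers
PROGRAM_EFFECT = 0.50  # true added effect of voluntary service

def score_change(is_volunteer):
    """One participant's measured 1-year change on some outcome scale."""
    noise = random.gauss(0, 0.1)
    return TREND + (PROGRAM_EFFECT if is_volunteer else 0.0) + noise

volunteers = [score_change(True) for _ in range(200)]
non_volunteers = [score_change(False) for _ in range(200)]

# A before/after design attributes the whole change to the program:
naive_estimate = mean(volunteers)

# A comparison-group design subtracts the change non-volunteers show:
adjusted_estimate = mean(volunteers) - mean(non_volunteers)

print(f"before/after estimate:     {naive_estimate:.2f}")    # ~0.80, inflated
print(f"comparison-group estimate: {adjusted_estimate:.2f}")  # ~0.50, near truth
```

The point is not the particular numbers, which are invented, but that without some counterfactual even a perfectly measured before/after difference cannot separate program impact from ordinary maturation.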
12
Overview of findings
  • Compare with results from today's 3 studies?

(Slide diagram: outcomes grouped by strength of evidence)
  • Evidence levels shown: many studies, at least some with good designs; many studies, narrative / economic evidence; many studies, weaker evidence; anecdotal or mixed evidence
  • Outcomes shown: future volunteers; reduced costs?; improved staff skills?; improved organisational climate?; value of services delivered; social communication skills; generic skills and work experience; employment?; active citizenship; employability; tolerance / intercultural competence; personal growth; life-long learning; broadened horizons; better education and career choices; interest in social studies; decreased career indecision; basic European identity; less wasted time at school; intention to continue to volunteer?; bridging social capital?; improved discipline?; job creation?; potential to involve disadvantaged groups
13
Main results
  • Compare with results from today's 3 studies?
  • You get the impact you program for.
  • So voluntary service regularly produces those kinds of impact for which it, by its nature, provides the input:
  • Personal growth, independence
  • Career orientation, etc.

14
Good news
  • Compare with results from today's 3 studies?
  • Voluntary service works for everyone!
  • Often disadvantaged youth benefit more
  • Ceiling effect? → tailor programs
  • Artefact? → improve instruments
  • Differential effects → matchmaking
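The ceiling-effect possibility raised above can be sketched in a few lines (a toy model with made-up numbers, not data from any study): when an instrument tops out, youth who start near the maximum cannot show their full gain, so disadvantaged youth appear to benefit more even when the true gain is identical.

```python
# Toy model of a ceiling-effect artefact: both groups gain the same
# true amount, but a capped scale truncates the measured gain of
# youth who already start near the top of the instrument.
SCALE_MAX = 10.0  # hypothetical instrument maximum
TRUE_GAIN = 2.0   # identical true gain for everyone

def measured_gain(baseline):
    """Gain as recorded by an instrument that cannot exceed SCALE_MAX."""
    before = min(baseline, SCALE_MAX)
    after = min(baseline + TRUE_GAIN, SCALE_MAX)  # instrument tops out
    return after - before

advantaged_baseline = 9.0     # already near the ceiling
disadvantaged_baseline = 5.0  # plenty of headroom

print(measured_gain(advantaged_baseline))     # 1.0 (truncated)
print(measured_gain(disadvantaged_baseline))  # 2.0 (full gain visible)
```

This is why "improve instruments" is the remedy listed above: only a scale with headroom at the top can distinguish a genuine differential effect from this measurement artefact.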

15
Cold shower of numbers?
  • Compare with results from today's 3 studies?
  • Even more objective methods do indeed reveal some significant benefits
  • But always much less encouraging than retrospective reports
  • And there are some disappointments
  • ... Takes courage

16
Evaluation functions
Studies varied greatly in their functions: learning / management, control / accountability, demonstrating impact.
  • Individual projects / programs: more room for ad-hoc, emic approaches; individual studies? RBM? Or (partial) adherence to a model?
  • Policy level: ditto; occasional, high-quality studies → standards and models
17
Main challenges: mainstream YVS impact evaluation culture
Fixable?
  • Subjective
  • The whole gamut of biases
  • Vested interests
  • Stronger internal validity and very varied approaches at program level
  • Low motivation to evaluate
  • Idealism: the deep fear of the cold shower
  • Bad influence of RBM / LFA (every implementation has to demonstrate hard-to-measure impact)
  • Non-science: lack of counterfactuals, even before/after comparisons
  • Pre-science: lack of comparability/generalisability, seminal papers, and an accepted research paradigm and tools

18
Final suggestions
  • Impact measurement toolkit, not a one-size-fits-all framework
  • Wide selection of free, documented, validated impact measurement tools and methods (e.g. leverage WVS)
  • Don't require evaluations at project level! ...
  • ... Fewer, higher-quality impact studies at sector / country level ...
  • ... which can do some of the work for program-level studies
  • Validate the tools
  • Standardise some program elements
  • Answer research questions:
  • Typical 1-year change in non-volunteers on key measures like job self-efficacy
  • Is better improvement on key outcomes in disadvantaged youth an artefact?
  • Cost benefits of reducing career indecision?
  • ....

19
  • Thanks!
  • steve@promente.org