Working with Qualitative Evidence - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Working with Qualitative Evidence


1
  • Working with Qualitative Evidence
  • Workshop for SCDC
  • LEAP Support Unit
  • May 2006
  • Cathy Sharp
  • info@research-for-real.co.uk
  • www.research-for-real.co.uk

2
Learning outcomes
  • Participants will:
  • Develop a fuller understanding of the validity of
    qualitative evidence in evaluation.
  • Have confidence to challenge taken-for-granted
    assumptions about the value and roles of 'hard'
    and 'soft' data in evaluation.
  • Understand the usefulness and specific
    contribution that qualitative data makes to
    evaluation.
  • Explore a range of ways of using stories in
    evaluation.

3
  • Small group exercise
  • How do you know
  • what you know?

4
(No Transcript)
5
Different kinds of knowing
  • Experiential knowing: direct face-to-face
    encounter with a person, place or thing; empathy,
    resonance, in-depth knowing. Almost impossible
    to put into words.
  • Presentational knowing: grows out of experiential
    knowing; provides the first form of expression
    through story, drawing, sculpture, movement,
    dance etc.
  • Propositional knowing: draws on concepts and
    ideas; knowing 'about' something, expressed in
    informative statements.
  • Practical knowing: knowing how to do something -
    a skill, knack or competence.
  • A Layperson's Guide to Co-operative Inquiry, Peter
    Reason and John Heron
  • http://www.bath.ac.uk/carpp/layguide.htm

6
  • Knowing will be more valid - richer, deeper, more
    true to life and useful - if these four ways of
    knowing are congruent with each other.
  • If research and evaluation generate only
    propositional knowledge (explicit, objective,
    scientific, able to measure outputs)...
  • ...it will not easily translate into action (tacit,
    subjective, experiential, feelings, acceptance).

7
Arguments for qualitative evidence
  • Why is qualitative evidence so easily dismissed
    or ignored?
  • Task: think of three main reasons and propose a
    counter-argument.

8
  • Qualitative evidence is often dismissed as:
  • lacking legitimacy and validity
  • 'anecdotal', an 'isolated case', 'just one person's
    view', not representative
  • lacking quantity (a small number of accounts) and
    measurement
  • lacking objectivity
  • carrying an assumed 'risk' of subjective
    interpretation
  • unable to demonstrate causality

9
What do we mean by representative?
  • Representation is often a democratic goal (giving
    everyone a chance to have their say) but gets
    mixed up with statistical meanings.
  • True statistical representation is difficult and
    costly to achieve; efforts to achieve statistical
    representativeness may hinder the use of
    approaches that might be more realistic,
    meaningful and useful.
  • Depends on purpose: do we actually need/want to
    generalise?

10
Meaning rather than measurement
  • There is a lack of confidence in the validity of
    other research approaches and methods that might
    be more meaningful and useful.
  • Using statistics is not the only way of being
    representative and of being able to generalise
    from research findings.
  • Qualitative research uses these concepts
    differently.

11
Representativeness in qualitative research:
'meaningful' representativeness
  • Sampling is based on selecting whatever phenomena
    will provide valuable information on the issues
    of importance - people are sampled on the basis
    that they are 'experts' who are able to shed
    light on what's going on.
  • Generalisability in qualitative research is
    gained through constantly comparing between
    different events or groups undergoing the same
    processes.
  • This enables common themes and differences across
    different groups to be identified.

12
Not just 'what works'
  • ...but why a programme works, for whom, in what
    circumstances?
  • "There is no objective reality out there
    waiting to reveal its secrets. There are no
    recipes or formulas, no checklist or expert
    advice that describe reality. If context is
    crucial... then nothing really transfers;
    everything is always new and different and unique
    to each of us."
  • "We should move away from arguing
    about who's right and who's wrong and instead
    focus our concerns on issues of effectiveness, on
    reflective questions of what happened and what
    actions might have served us better. We should
    stop arguing about truth and get on with figuring
    out what works best." (Margaret Wheatley)

13
  • "So many of the things in organisations that we
    argue and worry about come from our belief in
    objective reality. Something is out there, we
    believe, challenging our skills of analysis and
    perception. We just have to hire the right
    experts in order to see it clearly. But this
    search for discernible, objective futures has
    been, if we can admit it, a great cosmic joke."
  • Margaret Wheatley, Leadership and the New
    Science, 1999

14
An example of changing attitudes: Realist
Evaluation
  • Instead of asking if an initiative works or not
    (or comparing it to some other initiative),
    realist evaluation tries to develop an
    understanding of why a programme works, for whom,
    in what circumstances.

15
If context is all, how can we generalise?
  • "It is important to understand clearly that
    social research concerns itself with two rather
    different types of questions, namely the study of
    general laws of group life and the diagnosis of a
    specific situation. These laws do not tell what
    conditions exist locally, at a given place at a
    given time. In other words, the laws don't do
    the job of diagnosis which has to be done
    locally. Neither do laws prescribe the strategy
    for change." (Kurt Lewin, 1946)
  • "It is possible to generalise from research at a
    certain level. It's possible to generalise about
    processes, rather than deriving general laws of
    group life. So that it's possible to say
    something like: that in particular settings, with
    such people, with these forms of interventions,
    it's likely that..."
  • Eliot Stern, Tavistock Institute, UKES Seminar,
    Edinburgh 2004.

16
Using stories in evaluation
  • What is a story?
  • A first-person descriptive account of something
    that happened.
  • Find a suitable term in context.
  • Stories can be collected and used individually or
    collectively.
  • Can be collected from staff or service users.

17
Benefits of using stories
  • Challenge perceptions and assumptions and provide
    new insights
  • Provide deeper understanding of how services are
    experienced by users and staff, e.g. partnership
    working; show how complex, cross-cutting
    services work in practice.
  • Help to explore 'distance travelled'
  • Help to notice and explore unanticipated outcomes
  • Provide strong motivating energy for change
  • Reinforce the value given to the perspectives of
    the story tellers.
  • Good for inclusive practice, but can be used with
    any group.
  • Can be used in the dissemination process.

18
Practice pointers in using stories
  • Need to be systematic and rigorous.
  • Be clear about the purpose with which the story
    is told/collected.
  • Consider how much structure is needed to guide
    storytellers.
  • Consider whether to give an 'appreciative' steer:
    ask about what has worked well.
  • Ethical concerns, e.g. anonymity safeguards.

19
Alternative criteria for judging quality and
credibility
  • Acknowledgement of subjectivity - reflexivity
  • Trustworthiness
  • Authenticity
  • Triangulation: capturing and respecting multiple
    perspectives
  • Demonstration of lived experience
  • Opening up the world in some way
  • Particularity: do justice to the integrity of
    unique cases and understand the wider
    significance of particular cases
  • Enhancement: deepening of understanding
  • Contribution made to dialogue
  • Vitality, aesthetic qualities, creativity,
    provocation, voice, connection with audience,
    feels 'true' or 'real'

20
Contrasting approaches
  • The classic qualitative approach: the evaluator
    interviews individuals and writes up a case study.
  • Analysis and interpretation usually done by the
    evaluator.
  • Thematic analysis across case studies is usual.
  • Can be adapted for peer evaluation: interviewing
    each other, using staff to collect stories.
  • Analysis could be done through wider discussion,
    e.g. a conference.

21
More participatory and group approaches
  • Use participatory methods to prompt stories of
    the 'journey' or experience.
  • Facilitation approach (rather than evaluator):
    enabling others to tell their own story.
  • Video, drama, arts, Photovoice etc.
  • Benefits: builds analysis, interpretation and
    validation into the process.
  • Supports collective storytelling and sharing.

22
Photovoice
  • Photovoice blends a grassroots approach to
    photography and social action.
  • It provides cameras not to health specialists,
    policy makers, or professionals, but to people
    with least access to those who make decisions
    affecting their lives.

23
Photovoice has three goals
  • Enables people to record and reflect their
    community's strengths and problems.
  • Promotes dialogue about important issues through
    group discussion and photographs.
  • Engages policymakers.
  • "What experts think is important may not match
    what people at the grassroots think is
    important." Caroline C. Wang

24
Flexible and adaptable method
  • Can be used with different communities on
    different issues, e.g. needs assessment, asset
    mapping and evaluation; mental illness,
    reproductive health, homelessness.
  • Participants choose the photographs, tell stories
    about what they mean, and identify issues, themes
    or theories through discussion. A good way of
    promoting participation in data analysis.
  • Avoids the distortion of fitting data into a
    predetermined paradigm; people make meaning
    themselves - what matters to them, what is
    worth remembering and what needs to be changed.

25
Adapting Photovoice - using visual material
  • Can use visuals to produce storyboards as input
    and feedback into a continuing inquiry process,
    e.g. CitiStat.

26
(No Transcript)
27
Visions of Success
"There is a finishing line that they're aiming
for; it's being clear about what our finishing
line is going to be."
"There is a big element of good luck, in the hope
you don't get asked a question you haven't
prepared for, and an element of the outcomes,
because there are several threads through the
system that have to somehow hang together for it
to work."
"It's really important for me to make sure I get
all the information in and it's accurate and
valid and that I will get it on time, and that I
pass it on in the same way."
"Whatever foundation that we are laying is for the
future, for the children that will follow on
behind."
"These two cats probably were quite adversarial
and probably defended their own territory and
maybe had a few battles along the way, but then
they'd probably realised that they were working
to the same aim and they've decided to do things
together."
"There's a bottom and an end to this if we want to
make the journey."
"It might seem a bit overpowering but there's
definitely a destination and that's where we want
to get to."
"If we are really going to address the issue of
inequalities and the whole business of an
integrated whole system for the NHS then this
kind of approach, where the tails are
intertwined, is of paramount importance."
"It's a mass of people but actually they're all
individuals and what we need to be looking at is
their individual needs and we need to be
listening to them and looking at their quality of
life."
"How can you get this person out of a jam jar? And
the success of it is actually finding that
solution and a way forward for doing it."
"CitiStat is more about clearing the air and
breaking down the boundaries between primary care
and acute care."
"It's not about how skilled you are explaining
yourself, it's about the footprints you leave
behind and the impact you make."
"In the midst of all this data it's remembering
that what we're doing is about helping people be
healthy and staying at home, making the most of
their lives."
"CitiStat joins a lot of functions, groups,
processes within the organisation and it also
joins up a lot of things: acute, primary care,
the board, operating divisions, those sort of
things."
"We want this baby to grow into being a really
good strong healthy adult, knowledgeable, so
therefore the inputs have to be really good so
that the outcomes at the end of the day are good."
"There's a direction of travel there, we're making
steady progress and we're measuring how far we've
got... there's a clear sense of getting closer to
our eventual outcomes."
"It's a weed, it gets poisoned and it always
survives. It has the most effective mechanism of
propagating and spreading itself, which I think is
one of the things I have to carry in terms of
CitiStat."
"We need some very contemporary, modern thinking
if we're to move forward. So let's stop looking
back, let's move forward as we mean to go on."
28
"It might seem a bit overpowering but there's
definitely a destination and that's where we want
to get to."
29
Storyboards
30
(No Transcript)
31
Story-dialogue
  • A systematic, structured way of collecting and
    analysing stories in group settings.
  • Storytelling in small groups.
  • Structured dialogue based around four types of
    questions:
  • What? (description)
  • Why? (explanation)
  • So what? (synthesis)
  • Now what? (action)
  • Development of insight cards: each member of the
    group writes a few of their insights (or
    'a-has') down to share with the rest of the
    group.
  • Devising categories: a collective process of
    analysing and sorting insight cards to identify
    themes.
  • Plenary discussion.

32
(No Transcript)
33
Imagine... the implications for evaluation practice
  • Principles and design issues
  • Allow those most closely involved to determine
    what is of importance - a challenge to funders and
    commissioners.
  • Don't be concerned with prediction and control;
    allow meaning and interpretation to evolve and
    change; adapt to the questions that arise.
  • Remain open to challenge and surprise.
  • Value self- and participatory evaluation; use
    external input to facilitate the process and be a
    critical friend, not an expert.
  • Make every effort to ensure that all
    project/programme beneficiaries or stakeholders
    have a number of opportunities to give their
    views, if they wish.
  • Ensure the explicit, conscious practice of your
    values and those values espoused by the
    organisation.
  • Don't expect to be perfect, but encourage the
    development of ways of working and behaving which
    help us all to 'walk our talk'.
  • Retain a critical perspective.

34
  • Practice/methodological issues
  • Draw on the stories and experiences of local
    people and staff.
  • Adapt methods appropriately for the participants
    to maximise participation.
  • Use diverse methods, conducted in ways that are
    as inclusive as possible and designed to provide
    actionable information for practitioners and
    decision makers.
  • Don't apologise for small sample sizes!
  • Don't judge qualitative research by quantitative
    standards.

35
  • Definitely achieved this for me: green sticker
  • Achieved partially - there are still some issues
    for me: amber sticker
  • Definitely did not achieve this for me: red
    sticker
  • Comments: write on post-its