Transcript and Presenter's Notes

Title: Was it all worth it?


1
Was it all worth it?
  • Is your evaluation and research gathering dust
  • OR
  • influencing strategies and programme
    implementation?

2
Can you think of one thing that you have done or
observed that helped evaluation and research
findings to be used?
3
Some 'back of email' research
  • Emailed a dozen people with the following three
    questions:
  • 1. What things in your experience have been
    some of the barriers to findings from evaluation
    and market research being applied in programmes
    aimed at social and behaviour change?
  • 2. What things in your experience have
    increased the chances of evaluation and market
    research findings being applied in programmes
    aimed at social and behaviour change?
  • 3. How many years have you been involved in
    programmes aimed at social and behaviour change?

4
Heard back from 10
  • 4 researchers/evaluators
  • 4 project managers
  • 2 communications / advertisers
  • Total of 137 years' experience

5
Themes
  • Why are we doing it (do we really want to know)?
  • Time, timing and not enough of it
  • Methodology (have we got it right?)
  • Analysis and expertise
  • Are we able to take it on board and make changes?

6
WHY ARE WE DOING IT (DO WE REALLY WANT TO KNOW?)
7
  • My observation is that evaluations of services
    are often done because there is a requirement at
    senior management level to demonstrate that
    services are effective or prove that something
    has 'worked' - the evaluation is often used to
    justify expenditure rather than as a service
    development tool. The phrase "the evaluation was
    very positive" is often all that is used in
    decision making.

8
  • In the social sector we are also not that keen
    on conflict, which is inherent in finding fault.
    Therefore organisations (and sometimes their
    advisors) may cling to the positive aspects of
    the research findings and overlook the negatives.

9
  • Organisational bravery: the willingness to
    accept when something wasn't perfect first time
    and amend accordingly, and the willingness to
    make changes that might not be simple,
    straightforward or politically acceptable

10
TIME, TIMING AND NOT ENOUGH OF IT
11
  • Timeliness: research findings not keeping up
    with the decision timeframes of the programme

12
  • A contracting environment where the research team
    is able to influence the nature and objectives of
    the research to be undertaken

13
  • The difficulty of measuring behaviour change...
    the short duration of programmes, which means
    that you don't have the time to measure
    long-term change

14
METHODOLOGY (HAVE WE GOT IT RIGHT?)
15
  • Lack of confidence in the data (quantitative
    especially) - some questions can elicit ambiguous
    responses

16
  • In qualitative research, segmenting only along
    gender or ethnic lines is not useful: some/many
    behaviours are better examined by segmenting
    along attitudinal or behavioural lines

17
  • Confidence in the questions, good segmentation,
    recommendations we can apply, good information
    about the key audiences (often Maori, PI, etc.),
    experienced researcher, understanding of the
    sector and environment we work in.

18
ANALYSIS AND EXPERTISE
19
  • Managers/sponsors having adequate knowledge of
    evaluation and allowing for it in project
    planning and budgeting

20
  • The researcher really exploring what is behind
    the more obvious impressions of an animatic...
    Presenting findings and debating the detail of
    what people said and how they interpreted the
    concepts... The client interpreting the results
    and looking for opportunities to adapt and
    improve, not seeing them as just a safety net
    for decisions

21
  • Taking evaluation findings and simplifying and
    generalising them would assist in getting
    findings applied across different programmes
22
ARE WE ABLE TO TAKE IT ON BOARD AND MAKE CHANGES?
23
  • Lack of an infrastructure to implement the
    findings: we are a sole office in Wellington
    relying on the goodwill of others to deliver
    locally; we don't have a supply chain like
    Coca-Cola!

24
  • It's just too hard: negative findings mean you
    have to accept you got it wrong, which can be
    hard enough, but you also have to then fix it.
    That is sometimes more effort (and resource)
    than organisations can face or afford

25
  • The relevance of evaluation practice to
    outcomes. Perhaps an answer lies in increasing
    the capacity within organisations to
    self-evaluate and adopt reflective practice. The
    processes of strategic planning and evaluation
    should sit together.

26
  • The problem: people are too busy to network
    and talk