1
Putting It All Together
  • Summing Up/Starting Up
  • TESC MPA Winter 2008
  • Dr. Gail Johnson

2
Critical Thinking Skills
  • To be open to the emerging situation, as more
    information is brought to the table
  • To find at least three alternatives or possible
    explanations or strategies to any given problem
  • To search for new ways of looking at the world,
    ourselves and others

3
What is Research?
  • A systematic search for answers to questions.
  • To search: to uncover, examine, find by exploration, to investigate, to inquire.
  • Research: "the systematic inquiry into a subject in order to discover or revise facts, theories ..."
  • Uses the scientific method to reduce bias.

4
Guiding Values
  • Technically Correct: use the right tools for the right situations
  • Reliable and Valid Measures
  • Objectivity: a worthy goal
  • The point is to find the most accurate and honest data that can be used to inform programs and policies, not to win your argument
  • Do No Harm: no one should be harmed by your research

5
Guiding Values
  • Tell the truth--always.
  • Interpretations can have spin; there may not be agreed criteria for what is good.
  • Build in checks to assure accuracy.
  • Be honest about the limitations of your research.
  • Do not conclude more than you can.
  • Don't take cheap shots at other people's research; don't accuse anyone of wrongdoing without evidence.

6
Social Justice
  • Given the difficulty of proving that things work using social science tools, we run the risk of wrongly assuming that programs don't work just because we can't measure an effect.
  • We have merely failed to find an effect that might be there.
  • Why do we seek to prove that social service programs are effective, but not tax loopholes or payments to corporations?

7
Steps in the Research Process
  • Planning (The Design Logic)
  • 1. Determining Your Questions
  • 2. Selecting a Research Design
  • 3. Identifying Your Measures and Measurement Strategy
  • 4. Developing Your Data Collection Strategy
    a. Methods
    b. Sample
  • 5. Identifying Your Analysis Strategy
  • 6. Reviewing and Testing Your Plan

8
Steps in the Research Process
  • Doing
  • 1. Gathering the data
  • 2. Preparing data for analysis
  • 3. Analyzing and interpreting the data

9
Steps in the Research Process
  • Reporting the results: telling the story
  • 1. Executive summary
  • 2. Reports and articles
  • 3. Use of charts and tables
  • 4. Oral briefings

10
Putting the Research Planning Steps Together
  • End of Fall and Early Winter Term
  • Time to finalize your research plan
  • All pieces of your design should connect
  • An iterative process: make changes to earlier ideas and plans as you obtain new information
  • Some questions are not researchable

11
Step 1: Types of Questions
  • Descriptive: what is, how many, etc.
  • Normative: is a target being met?
  • Impact/Causal (cause and effect), which requires:
  • logical theory
  • time-order
  • co-variation
  • excluding all other rival explanations

12
Step 2: Narrow Design
  • Experimental
  • Random assignment to treatment or control
  • Quasi-experimental
  • Non-random assignment, maybe no assignment
  • May not be able to control the treatment
  • Correlational: using statistical controls to create comparison groups
  • Non-experimental
  • One-shot design

13
Question-Design Connection
  • One-shot designs make sense for descriptive
    questions
  • However, impact or cause-effect questions should
    use at least a quasi-experimental design and
    ideally an experimental design.
  • In public administration, experiments are hard to do; sometimes one-shot is as good as it gets.
  • The tale of IRCA

14
Broader Design
  • Time series: single, multiple, interrupted
  • Cross-sectional design
  • Statistical controls
  • Panel study: same people over time
  • Longitudinal: different people over time

15
Broader Design
  • Case study
  • Meta-analysis
  • Content analysis
  • Cost-benefit analysis
  • Policy analysis

16
Step 3: Developing a Measurement Strategy
  • Conceptual definition
  • Key terms
  • Operational definition
  • How it will be measured in numbers
  • The operations that translate a concept or idea (or construct) into a measurable phenomenon
  • Boundaries
  • Who, time frame, geographic locations

17
Class Discussion
  • Suppose we framed a question asking whether the quality of instruction in the MPA program is adequate.
  • What terms need to be defined?
  • How would we operationalize
  • "quality of instruction"?
  • "adequate"?

18
Step 4: Data Collection Options
  • The decision depends upon:
  • What you want to know: numbers or stories
  • Where the data resides: environment, files, people
  • Resources available: time, money, staff to conduct the research

19
The Big Choice
  • Quantitative
  • Use when you want to do statistical analysis,
    want to be precise, know exactly what you want to
    measure and/or want to cover a large group
  • Qualitative
  • Use when you want anecdotes or in-depth
    information, when you are not sure what you want
    to measure, and/or there is no need to quantify

20
Multiple Methods
  • Quantitative and qualitative data collection
  • Available data with surveys
  • Surveys with observations
  • Observations with available data
  • Surveys with focus groups

21
Step 4A: Data Collection Methods
  • Locate sources of information
  • Data collection methods
  • available data
  • archives, documents
  • observation
  • surveys, interviews, focus groups

22
Discussion: Data Collection Options
  • You want to look at the qualifications of those admitted to the MPA program. Assume you have already decided on how to measure "qualification."
  • One option is to gather the data from their admission files.
  • Another option is to survey the MPA students and ask them about their qualifications.
  • Which would you choose, and why?

23
Step 4B: How Many and How Selected? Two Big Options
  • Non-random samples
  • Quota
  • Accidental
  • Snowball
  • Judgmental
  • Convenience
  • Random samples
  • Based on probability: every item has an equal chance of being selected
  • I believe, I believe!!

24
Random Sample
  • A random sample allows us to make estimates about
    the larger population based on what we learn from
    the sample.
  • Each person has an equal chance of being
    selected.
  • Eliminates selection bias.
  • Challenge
  • To locate a complete listing of the entire
    population from which to select a sample.
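  • A minimal sketch (not from the original slides) of how simple random selection can be done in Python; the population list and sample size here are hypothetical:

    import random

    # Hypothetical sampling frame: a complete listing of the population.
    population = ["student_{:03d}".format(i) for i in range(1, 541)]

    # random.sample draws without replacement; every member has an
    # equal chance of selection, which eliminates selection bias.
    sample = random.sample(population, k=50)
    print(sample[:5])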

25
Random Samples Are Imperfect
  • Random samples have a probability of error.
  • Statistics estimate the probability that the sample results are not representative of the population as a whole.
  • If you use a random sample, you will also include tests of statistical significance in your analysis plan.
  • What do tests of statistical significance tell you?
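  • Roughly, a test of statistical significance tells you how likely a result at least as large as the one observed would be if sampling error alone were at work. As a hedged illustration (not from the slides), this Python sketch computes the 95% margin of error for a sample proportion using the normal approximation; the numbers are hypothetical:

    import math

    n = 400   # hypothetical sample size
    p = 0.62  # hypothetical proportion observed in the sample

    # Standard error of a proportion; 1.96 is the z-value for a
    # 95% confidence level under the normal approximation.
    se = math.sqrt(p * (1 - p) / n)
    moe = 1.96 * se
    print("{:.0%} +/- {:.1%}".format(p, moe))  # prints: 62% +/- 4.8%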

26
Step 5: Analysis Plan
  • This will be clearer after we go through the statistics refresher.
  • There is a connection between the data collection method, the sampling choice, and the data analysis.

27
Research Process: The Proposal
  • The whole plan: a blueprint
  • Ready, aim, fire
  • Most errors in research are made in the planning phase!
  • Fancy statistics will not correct design errors.

28
Writing the Research Proposal (aka the Design)
  • Class Discussion
  • What was your experience in writing your proposal
    for the end of the fall term?
  • What might you do differently?
  • What might help make it easier?

29
Design Matrix
  • A tool that can help you focus on all the details
  • It is a visual aid
  • The focus is on content, not writing style
  • It is a living document
  • Planning is an iterative process
  • This is a generic format
  • Change it to fit your style

30
Design Matrix
  • Follows the methodology planning steps
  • Basic categories:
  • Questions
  • Design
  • Information required
  • Data sources
  • Data collection approaches
  • Data analysis approaches
  • Generic Design Matrix Handout
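  • The handout itself is not reproduced here; as a purely hypothetical illustration, one filled-in row of the matrix, drawing on the teacher college case later in the deck, might read:

    Question:         Has the teacher college met its goal of 540 students per year?
    Design:           Non-experimental (descriptive, one-shot)
    Information:      Annual enrollment counts
    Data sources:     College registration records
    Data collection:  Available data (administrative records)
    Data analysis:    Compare yearly enrollment against the 540-student target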

31
Guidelines
  • Work on one question at a time.
  • Leave blanks, fill in as more information becomes
    available.
  • Use comments to identify areas that require more
    information or state your assumptions.

32
Case: Evaluating the Education Program
  • Instituted a teachers college in hopes of improving the quality of teachers
  • Create a model:
  • Institute a teachers college → graduates trained teachers → hired in the primary schools → improved teaching → better performance by students

33
Case: Evaluating the Education Program
  • Issue: How has the program worked in terms of improving the quantity and quality of primary school teaching?

34
Case: Evaluating the Education Program
  • Question 1: Is the teacher college (TC) being fully used?
  • a. Has it met its goal of 540 students per year?
  • Question 2: How many graduate or drop out each year?
  • Question 3: What proportion of graduates are hired by local schools?

35
Case: Evaluating the Education Program
  • Question 4: Are students in primary school performing better?
  • a. Are test scores improving?
  • b. Are more students going on to secondary schools?
  • c. Do school officials see a difference in the performance of graduates?

36
Case: Evaluating the Education Program
  • Review the Education Handout.
  • Look at how each question was addressed.
  • Note how the questions form the outline for the
    final report.

37
Class Exercise
  • A new MPA director wants to know how effective
    the MPA program is. He has asked you to develop 3
    research questions and a rough outline of what a
    research design might look like.
  • Form groups of 3 and complete the design matrix
    as best you can. Note where you are making
    assumptions or need to check for information.
  • Put names on matrix and turn in at end of class.

38
Step 6: Putting It All Together
  • Test all your data collection instruments and
    plans to make sure they work the way you expect
    them to
  • Pre-test in real settings
  • Expert review
  • Cold-reader review

39
Putting It All Together
  • It is worth the time spent planning and testing your plan.
  • Begin with the end in mind.
  • If you do not know where you are going, you can
    wind up anywhere.

40
Applying to Your Research Project
  • Can you see how you might use this?
  • Questions?