Understanding the Terminology: What is Evidence-Based and Research-Based Programming

1
Understanding the Terminology: What is
Evidence-Based and Research-Based Programming?
  • Theresa M. Ferrari, Ph.D.
  • CYFAR Pre-Conference
  • May 24, 2005

Acknowledgements: Slides on evidence-based
programming adapted from Danny Perkins, Penn
State University
2
Evidence Based Practices
  • What are they?
  • What are the pros and cons?
  • What is the situation for evidence-based
    practices in the afterschool field?
  • What's an educator to do? How do you deal with a
    lack of strong evidence?

3
The Situation: Mandate for Evidence-Based Programs
  • No Child Left Behind Act of 2001 and many grant
    programs require use of scientifically-based
    research to decide which interventions to use
    and those that will be funded.
  • Federal government and private funders are
    demanding increased accountability.

4
The Gold Standard: Strong Evidence of
Effectiveness
  • Strong evidence of effectiveness:
  • Randomized controlled trials that are well
    designed and implemented (e.g., use valid
    measures, implemented with fidelity, low
    attrition rate)
  • Trials showing effectiveness in two or more
    settings (including a setting similar to that in
    which the program will be implemented); at least
    300 students or 50-60 classrooms.
  • Quality + Quantity = Strong Evidence
  • (U.S. Department of Education, 2003)
  • www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf

5
Possible Evidence of Effectiveness
  • Randomized controlled trials that are good
    quality, but fall short of strong evidence
  • Comparison-group studies in which the
    intervention and comparison groups are very
    closely matched (academic achievement,
    demographics, and other characteristics)
  • (U.S. Department of Education, 2003)

6
Interventions Not Supported by Meaningful
Evidence
  • Pre-post studies with no comparison group.
  • Comparison-group studies in which the
    intervention and comparison groups are not
    closely matched.
  • Meta-analyses that include the results of such
    lower-quality studies.
  • (U.S. Department of Education, 2003)

7
Defining Scientific Quality
  • Use of the scientific method, with emphasis on
    experimental design.
  • Replication of results, with multiple studies by
    different investigators.
  • Ability to generalize from sample to the
    population.
  • Fulfillment of rigorous standards, with an
    emphasis on peer review.
  • Convergence (consistency) of results among
    studies.
  • (Bouffard, 2003)

8
Implementation Issues
  • Implementing with fidelity is critical to success
    of evidence-based programs.
  • Many implementation challenges threaten fidelity.
  • Meaningless to know whether a program has
    achieved a certain outcome unless the process by
    which this outcome has been achieved is
    documented.
  • Many studies have gaps in the reporting of basic
    elements (demographics and intervention
    parameters).

9
Where to Find Evidence-Based Interventions
  • The International Campbell Collaboration
    http://www.campbellcollaboration.org
  • The What Works Clearinghouse
    http://www.w-w-c.org/
  • The Promising Practices Network
    http://www.promisingpractices.net/
  • Blueprints for Violence Prevention
    http://www.colorado.edu/cspv/blueprints
  • Social Programs That Work
    http://www.evidencebasedprograms.org

10
Gold Standard Evidence
  • One-on-one tutoring by qualified tutors for
    at-risk readers in grades 1-3
  • Life Skills Training for junior high school
    students
  • Reducing class size in grades K-3
  • Instruction for early readers in phonemic
    awareness and phonics
  • (U.S. Department of Education, 2003)

11
Preliminary Evidence
  • High-quality, educational child care and
    preschool for low-income children
  • (U.S. Department of Education, 2003)

12
Arguments for Evidence-Based Practices
  • Rigorous review of research provided by a panel
    vs. a single individual
  • Systematic approach
  • Provide a starting point for designing programs;
    have shifted practice
  • (Dunifon et al., 2004)

13
Criticism of Evidence-Based Practices
  • Infrastructure needed to conduct systematic
    reviews is resource intensive
  • Context-dependent nature of many programs
  • Culture of autonomy at local level
  • (Dunifon et al., 2004)

14
The Problem: State of Current Research in the
Afterschool Field
  • Bottom line: Very little meets the gold
    standard.
  • Lack of rigorous designs
  • Presence of selection bias
  • Most evaluations assess overall program impact
    rather than linking specific program activities
    with outcomes. (Little & Harris, 2003)

15
Problems Applying the Gold Standard to the
Afterschool Field
  • Promotes exclusive use of one research design.
  • Challenges of implementing this design in applied
    settings.
  • Bias toward quantitative vs. qualitative
    methodology.
  • Appropriateness of methodology for the research
    question.

16
Problems Applying the Gold Standard to the
Afterschool Field
  • Afterschool is an emerging field that has only
    lately attracted attention of researchers.
  • Reviews are complicated by wide view of what
    could possibly be included as afterschool.
  • Programs are diverse; there are differing program
    philosophies and goals, and thus differing
    outcomes to be expected.
  • Differing motivations and levels of participation
  • Programs offer intangibles that are hard to
    quantify. (Miller, 2003)

17
Evidence-Based vs. Research-Based
  • 4-H curricula are research-based
  • Content is research-based
  • Does it promote life skill development?
  • Beginning to use evidence-based curricula in 4-H.
  • Applying what we know from research to guide
    program development and implementation

18
What Do We Do in the Meantime?
  • Good outcomes require good programming.
  • Examine what contributes to program quality.
  • Document program quality.

19
The Evidence Suggests . . .
  • Youth benefit from after-school programs,
    provided they involve
  • Consistent participation
  • Quality, well-run programs
  • Thus, defining quality is an important part of
    what we need to do.

20
Be There or Be Square: Participation Matters
  • If you're not there, you can't benefit!
  • Significant positive relationship between time
    spent in a program and the desired outcomes.
  • Limited or sporadic attendance is not likely to
    produce desired effects.

21
Measuring Attendance: Dosage
  • Absolute attendance: yes or no
  • Frequency/Intensity: how often
  • Duration: how long
  • Breadth: in what variety of activities
  • (Fiester, Simpkins, & Bouffard, 2005; Harvard
    Family Research Project, 2004)

22
Program Activities: What You Do Matters
  • Variety of activities
  • Flexibility of programming
  • Positive emotional climate
  • (Rosenthal & Vandell, 1996)
  • Collecting activity implementation data is a
    critical first step in evaluation.

23
Program Activities: What You Do Matters
  • Organized activities:
  • Involve voluntary participation
  • Contain structure
  • When engaging in these activities, youth
    experience:
  • High intrinsic motivation
  • Positive mood
  • Cognitive engagement
  • When these conditions exist:
  • Develop initiative
  • (Dworkin, Hansen, & Larson, 2003; Hansen, Larson,
    & Dworkin, 2003; Larson, 2000; Larson, Hansen, &
    Walker, 2005)

24
Engagement within Various Youth Contexts
25
Why Is This Important?
  • High intrinsic motivation
  • High concentration
  • Time
  • INITIATIVE

26
Staff: Who You're With Matters
  • It does make a difference who is interacting
    with youth.
  • Caring youth-staff relationships may be the most
    critical element to program success.
  • (Rhodes, 2004; Shortt, 2002)
  • The ability of the staff member leading the
    activity may be more important than the activity
    itself.
  • (Grossman et al., 2002)

27
Staff: Who You're With Matters
  • Evidence from mentoring studies:
  • Engaging in social activities
  • Engaging in academic activities
  • Meeting regularly
  • Using youths' interests to guide interaction
  • Seeking input and making decisions jointly
  • Taking a less judgmental approach
  • (Jekielek, Moore, Hair, & Scarupa, 2002;
    Rhodes, 2004)

28
Staff: Who You're With Matters
  • Processes by which staff mediate effects of
    after-school programs:
  • By enhancing youths' social skills and emotional
    well-being
  • By improving youths' cognitive skills through
    instruction and conversation
  • By serving as role models and advocates
  • (Rhodes, 2004)

29
Academic Achievement
  • Indirect contribution
  • By increasing student engagement in learning
  • Greater school attendance
  • Improved work habits and behavior
  • Positive attitudes toward school
  • (Miller, 2003)

30
If you expect certain content to be delivered . .
.
  • It must be reflected in the program design and
    implementation in intentional ways.
  • Use of evidence-based practices to enhance
    effectiveness.
  • Success ultimately rests with the staff's ability
    to deliver the content effectively.

31
How Do You Bring Existing Evidence to Bear?
  • Don't base decisions on the results of one study.
  • Look at original research when possible.
  • Rely on others' reviews when necessary, but take
    this filtering into account.
  • Look at types of studies conducted.
  • Look for conceptual frameworks that help organize
    what we know.

32
How Do You Bring Existing Evidence to Bear?
  • Look for similarities and differences between
    settings.
  • Ask: What is this an example of?
  • Look at literature in related areas that will
    help to understand the underlying processes at
    work.
  • Look for how the evidence stacks up.

33
Don't Rely on One Study
  • Example: the Mathematica study
  • Found no effect for 21st CCLCs.
  • Focused on some findings and not others.
  • Criticism: its results were used as the basis
    for a 40% funding-reduction recommendation.

34
Rely on Original Research
  • Allows you to evaluate similarities or
    differences compared with your situation.
  • Not on others' interpretations of it (e.g.,
    reviews of literature).

35
Rely on Others' Reviews
  • Meta-analyses:
  • Have criteria for inclusion.
  • Need enough high-quality studies that include
    enough information about the intervention.
  • Effect size (calculated from posttest means and
    SDs of the treatment and control groups).
  • Can show conditions or settings in which an
    intervention is most effective.
  • Literature reviews:
  • e.g., New Directions for Youth Development
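The effect size computed from posttest means and SDs is typically a standardized mean difference (Cohen's d): the treatment-control difference in posttest means divided by the pooled standard deviation. A minimal sketch (the function name is illustrative):

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between a treatment and a control
    group, using the pooled posttest standard deviation."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Example: treatment posttest M = 105, control M = 100, both SD = 15,
# n = 50 per group, gives d = 5 / 15, about 0.33.
```

Meta-analyses then pool these per-study d values, weighting by precision, which is why studies must report means, SDs, and sample sizes to be usable.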

36
Look at Types of Studies
  • What can you get from quantitative vs.
    qualitative studies?
  • What can you get from small vs. large studies?
    Single site vs. multi-site?
  • What can you get from longitudinal studies?

37
Look for Conceptual Frameworks
  • Help to organize what we know.
  • Suggest relationships among variables.

38
Theory of Change - Study of Promising
After-School Programs
  • [Logic-model diagram] Program structural and
    institutional features, program processes and
    content, and program dosage, together with
    family background and child prior functioning,
    lead to improved work habits, school attendance,
    and social skills and reduced misconduct, and in
    turn to improved grades, achievement, and future
    orientation and reduced risky behaviors.
  • Vandell, D. L., Reisner, E. R., Brown, B. B.,
    Pierce, K., Dadisman, K., & Pechman, E. M.
    (2004). Retrieved from
    www.wcer.wisc.edu/childcare/statement3.html
39
Look for Similarities and Differences
  • What is the pattern of evidence that emerges from
    reviewing multiple studies?
  • Converging?
  • Conflicting?
  • For some but not others (e.g., male vs. female
    differences)?
  • Gaps: what is missing?

40
How Does the Evidence Stack Up?
  • Synthesis
  • What more do you know after reading several
    studies than if you had read only one of them?

41
What Is This an Example Of?
  • Directs us to other literature
  • Helpful when limited research is available in a
    specific area.
  • Provides a broader theoretical framework.
  • Example:
  • Camp counseling
  • Is an example of cross-age teaching (to
    understand developmental processes at camp)
    (McNeely, 2004)
  • Is an example of transfer of learning (to
    understand long-range benefits beyond immediate
    setting) (Digby, 2005)
  • In addition to what we know about camp
    counselors, there is a body of literature in each
    of these areas that can inform how we think about
    what we do.

42
Look for Related Literature to Understand the
Underlying Processes at Work
  • Example:
  • The experiential learning model and its effect
    on:
  • Motivation
  • Self-regulated learning
  • Transfer of learning

43
Example: Peer-Assisted Learning Meta-Analysis
  • Positive effect sizes indicating increases in
    achievement
  • Most effective for these audiences:
  • Younger students
  • Urban
  • Low-income
  • Minority
  • Under these conditions:
  • Interdependent reward contingencies
  • Ipsative (individual) evaluation procedures
  • Provided students with more autonomy

44
References
  • Bouffard, S. (2003). Doing what works:
    Scientifically based research in education.
    Evaluation Exchange, 9(1), 15, 17. Retrieved from
    http://www.gse.harvard.edu/hfrp/evalu/issue21/bbt1.html
  • Digby, J. K. (2005). The experience of a
    lifetime: Alumni perceptions of the development
    and transfer of life and workforce skills in the
    Ohio 4-H camp counselor program. Unpublished
    master's thesis, The Ohio State University,
    Columbus.
  • Dunifon, R., Duttweiler, M., Pillemer, K.,
    Tobias, D., & Trochim, W. M. K. (2004).
    Evidence-based Extension. Journal of Extension
    On-line, 42(2). Available at
    http://www.joe.org/joe/2004april/a2.shtml
  • Dworkin, J. B., Larson, R. W., & Hansen, D. M.
    (2003). Adolescents' accounts of growth
    experiences in youth activities. Journal of Youth
    and Adolescence, 32(1), 17-26.
  • Fiester, L. M., Simpkins, S. D., & Bouffard, S.
    (2005). Present and accounted for: Measuring
    attendance in out-of-school time programs. In New
    Directions for Youth Development (no. 105, pp.
    91-107). Hoboken, NJ: Wiley.
  • Grossman, J. B., Price, M. L., Fellerath, V.,
    Jucovy, L. Z., Kotloff, L. J., Raley, R., &
    Walker, K. E. (2002). Multiple choices after
    school: Findings from the extended-service
    schools initiative. Philadelphia: Public/Private
    Ventures. Retrieved from
    http://www.ppv.org/ppv/publications/assets/116_publication.pdf
  • Hansen, D. M., Larson, R. W., & Dworkin, J. B.
    (2003). What adolescents learn in organized youth
    activities: A survey of self-reported
    developmental experiences. Journal of Research on
    Adolescence, 13(1), 25-55.
  • Jekielek, S. M., Moore, K. A., Hair, E. C., &
    Scarupa, H. J. (2002). Mentoring: A promising
    strategy for youth development. Washington, DC:
    Child Trends.
  • Larson, R. W. (2000). Toward a psychology of
    positive youth development. American
    Psychologist, 55(1), 170-183.
  • Larson, R., Hansen, D., & Walker, K. (2005).
    Everybody's gotta give: Adolescents' development
    of initiative within a youth program. In J.
    Mahoney, R. Larson, & J. Eccles (Eds.), Organized
    activities as contexts of development:
    Extracurricular activities, after-school and
    community programs (pp. 159-183). Mahwah, NJ:
    Erlbaum.

45
References
  • Little, P. M. D., & Harris, E. (2003). A review
    of out-of-school time program quasi-experimental
    and experimental evaluation results
    (Out-of-School Time Evaluation Snapshot 1).
    Cambridge, MA: Harvard Family Research Project.
    Retrieved from
    http://www.gse.harvard.edu/hfrp/content/projects/afterschool/resources/snapshot1.pdf
  • McNeely, N. N. (2004). The Ohio 4-H camp
    counseling experience: Relationship of
    participation to personal, interpersonal, and
    negative experiences. Unpublished doctoral
    dissertation, The Ohio State University,
    Columbus.
  • Miller, B. M. (2003). Critical hours: Afterschool
    programs and educational success. Brookline, MA:
    Nellie Mae Education Foundation. Available at
    http://www.nmefdn.org/uimages/documents/Critical_Hours.pdf
  • Rhodes, J. (2004). The critical ingredient:
    Caring youth-staff relationships in after-school
    settings. In G. G. Noam (Ed.), After-school
    worlds: Creating a new social space for
    development and learning (New Directions for
    Youth Development, no. 101, pp. 145-161).
    Hoboken, NJ: Wiley.
  • Rosenthal, R., & Vandell, D. L. (1996). Quality
    of school-age child care programs: Regulatable
    features, observed experiences, child
    perspectives, and parent perspectives. Child
    Development, 67, 2434-2445.
  • Shortt, J. (2002). Out-of-school time programs:
    At a critical juncture. In G. G. Noam & B. M.
    Miller (Eds.), Youth development and after-school
    time: A tale of many cities (New Directions for
    Youth Development, no. 94, pp. 119-124). Hoboken,
    NJ: Wiley.
  • U.S. Department of Education. (2003). Identifying
    and implementing educational practices supported
    by rigorous evidence: A user friendly guide.
    Washington, DC: U.S. Department of Education,
    Institute of Education Sciences, National Center
    for Education Evaluation and Regional Assistance.
    Retrieved from
    http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf
  • Vandell, D. L., Reisner, E. R., Brown, B. B.,
    Pierce, K., Dadisman, K., & Pechman, E. M.
    (2004). Retrieved from
    www.wcer.wisc.edu/childcare/statement3.html