1
Ain't No Mountain High Enough
  • Climbing the Peaks of Program Excellence

Facilitators: Christina Borbely and Kerrilyn
Scott-Nakai. Produced and conducted by the Center
for Applied Research Solutions, Inc. for the
California Department of Alcohol and Drug
Programs. SDFSC Workshop-by-Request, March 22, 2006,
Ventura County. Authored by Christina J.
Borbely, Ph.D., Safe and Drug Free Schools and
Communities Technical Assistance Project.
2
Trail Map
  • Why Are We Doing This?
  • Value
  • Opportunities
  • Opportunity for Recognition
  • Advanced Program Essentials
  • Program Essentials
  • Key Considerations
  • Advancing Program Through Evaluation
  • Methodology, Design & Instrumentation
  • Data Plan & Analysis
  • Reporting

3
Why Are We Doing This?
  • The Value of Advancing Programs
  • Opportunities for Advancing Programs

4
Value
  • Replicating innovative strategies
  • Filling in gaps
  • Integrating the latest science and/or practice
  • Making a contribution through dissemination
  • Participating in the science-service dialog
  • Advancing the field
  • Providing an effective program to others

5
Opportunities
  • Expansion
  • Demonstrate need/value of new or additional
    funding
  • Bolster capacity to sustain programming
  • Recognition
  • Validation from field
  • Potential for supplemental support/resources
  • Publications

6
Opportunity for Recognition
  • Validation from the Prevention Field
  • Service to Science
  • NREPP
  • Exemplary Programs

7
National Registry of Effective Prevention
Programs (NREPP)
  • NREPP is coordinated by the Center for Substance
    Abuse Prevention (CSAP) under the federal
    Substance Abuse and Mental Health Services
    Administration (SAMHSA)
  • NREPP is a system designed to support informed
    decision making and to disseminate timely and
    reliable information about interventions that
    prevent and/or treat mental and substance use
    disorders.
  • http://modelprograms.samhsa.gov/template.cfm?page=nreppover

8
Original NREPP Designations
  • A program will be considered 'Model' if the NREPP
    review team has designated it an effective
    program and the agency agrees to participate in
    CSAP's dissemination efforts. Model programs also
    provide training and technical assistance to
    practitioners who wish to adopt the program, in
    order to ensure that it is implemented with
    fidelity.
  • A program is considered 'Effective' if it is
    science-based and produces consistently positive
    patterns of results. Only programs positively
    affecting the majority of intended recipients or
    targets are considered effective.
  • A program will be considered 'Promising' if it
    provides useful and scientifically defensible
    information about what works in prevention, but
    has yet to gather sufficient scientific support
    to meet the standards set for effective/model
    programs. Promising programs are sources of
    guidance for prevention practitioners, although
    they may not be as prepared as Model programs for
    large-scale dissemination.

9
(No Transcript)
10
NEW NREPP Eligibility Criteria
  • Open submission review based on alignment of
    intervention with NREPP priorities
  • SAMHSA's three Centers -- the Center for Mental
    Health Services, the Center for Substance Abuse
    Prevention, and the Center for Substance Abuse
    Treatment -- will establish priorities for the
    types of interventions to be reviewed and
    highlighted on NREPP.
  • Priorities will be established and provided to
    the public annually through notices on the NREPP
    Web site.
  • These priorities are based on dialogues with
    treatment and prevention stakeholders as well as
    with SAMHSA's Federal partners.

11
NEW NREPP Review Criteria
  • The sole requirement for potential inclusion in
    the NREPP review process is for an intervention
    to have demonstrated one or more significant
    behavioral change outcomes.

12
NEW NREPP Review Process
  • A trained Ph.D.-level evaluation specialist works
    with applicants to assure that adequate materials
    have been submitted before initiating an NREPP
    review.
  • The evaluation specialist serves as collaborator
    in the application process and liaison to the
    reviewers.
  • A scientific review of the intervention is
    conducted by two independent Ph.D.-level
    reviewers.
  • Completed review summaries, including
    descriptive components, reviewer ratings, and
    explanations are provided to the applicant for
    approval before they are posted on the NREPP Web
    site.

13
NEW NREPP Application Process
  • Application materials include one or more of the
    following types of documents:
  • formal evaluation reports,
  • published and unpublished research articles,
  • narrative sections of grant applications,
  • training materials, and
  • implementation or procedural manuals.
  • A concise summary of the intervention that
    includes the intervention name, a description of
    its main components, the population(s) targeted,
    and the behavioral outcomes targeted.

14
The Exemplary Program Awards
  • The Exemplary Program Award is designated by CSAP
  • The Exemplary Awards program recognizes
    prevention programs in two tracks: Promising
    Programs, those that have positive initial
    results but have yet to verify outcomes
    scientifically, and Model Programs, those that
    are implemented under scientifically rigorous
    conditions and demonstrate consistently positive
    results.
  • The Exemplary Awards recognize prevention
    programs that are innovative and effective and
    that successfully respond to the needs of their
    target populations, both as Promising Programs
    and Model Programs.

15
Exemplary Program Award Review Process
  • A multifaceted procedure is used to identify and
    select Promising Programs to receive an Exemplary
    Substance Abuse Prevention Program Award
    annually. All nominated programs submit to a
    three-level review process.
  • First, state agency personnel and national
    organizations submit their formal nominations.
  • Applications are then reviewed by experts in the
    field of substance abuse prevention and former
    Exemplary Substance Abuse Prevention Program
    Award winners.
  • Finally, the National Review Committee reviews
    and scores the top applications according to
    eight criteria and recommends those that merit an
    Exemplary Substance Abuse Prevention Program
    Award. Final selections are made jointly by
    NASADAD, CADCA, and SAMHSA/CSAP.

16
Exemplary Program Award: Application Process
  • Applications for the Innovative Programs may be
    obtained from State Alcohol and Drug Agencies,
    the NASADAD/NPN Web page (www.nasadad.org) and
    office.
  • Applicants must submit their application to their
    national nominating organization (see application
    appendix) for sign-off. Applicants should then
    return the original signed, completed application
    (including cover sheet) and three copies to the
    NASADAD/NPN central office in Washington, D.C.
    For more information about the application
    process, call or write:
  • NASADAD/NPN
  • 808 17th Street, NW, Suite 410
  • Washington, DC 20006
  • Attention: Exemplary Programs
  • Web page: www.nasadad.org
  • E-mail: amoghul@nasadad.org
  • Phone: (202) 293-0090, Fax: (202) 293-1250

17
Exemplary Program Award: 8 Review Criteria
  • Philosophy
  • Background and need (program planning)
  • Goals and objectives
  • Population(s) to be served
  • Activities and strategies
  • Community coordination
  • Evaluation
  • Program management

18
Service to Science
  • Service to Science is a national initiative
    supported by SAMHSA/CSAP to enhance the
    evaluation capacity of innovative programs and
    practices that address critical substance abuse
    prevention or mental health needs.
  • http://captus.samhsa.gov/northeast/special_projects/service_to_science/main.cfm

19
Service to Science Academy
  • Designed to enhance the capacity of
    community-based prevention strategies, programs,
    or practices to demonstrate effectiveness.
  • Each Academy is customized to support the needs
    of the groups/organizations and programs accepted
    to attend.
  • Emphasis on the development of a strong
    evaluation and/or research design.
  • Participants receive training and technical
    assistance that helps them move along the
    evidence-based continuum.

20
Service to Science Academy: Eligibility Criteria
  • 1. Primarily focused on ATOD prevention, but may
    also address the prevention of violence,
    HIV/AIDS, STDs, etc. Expected outcomes or areas
    of focus include, but are not limited to, efforts
    to decrease high-risk behaviors by children or
    adults; eliminate use of illicit drugs; reduce
    underage use of alcohol, tobacco, and other
    drugs; and decrease DUI/DWI rates.
  • 2. Nominated for recognition by a State Alcohol
    and Drug Agency, by the Community Anti-Drug
    Coalitions of America (CADCA), or by other
    national organizations or their affiliates.
  • 3. Able to document and demonstrate success in
    the form of quantifiable outcome data.
  • 4. In operation for a minimum of two (2) years.

21
Service to Science Academy: Review Criteria
  • Philosophy
  • Needs Assessment
  • Population Served
  • Goals & Objectives
  • Activities & Strategies
  • Evaluation
  • Program Management
  • Community Coordination

22
Service to Science Academy Application Process
  • The application to attend a Service to Science
    Academy is a modified National Association of
    State Alcohol and Drug Abuse Directors (NASADAD)
    application for Innovative/Exemplary Programs.
  • Applications are reviewed by a panel that makes
    recommendations for acceptance to the Academy.

23
Application Criteria as Program Practice
  • Live it!
  • SDFSC Santa Cruz County: Service to Science
    Academy
  • Santa Cruz County submitted an application and
    was awarded a program slot with the current
    cohort for the Service to Science Academy. The
    Santa Cruz team will receive a series of
    trainings and technical assistance to assist them
    in moving their program towards being recognized
    as a model or promising program.
  • SDFSC Butte County: NPN Exemplary Program Award
  • Butte County submitted three of their prevention
    programs for review: Friday Night Live Mentoring,
    Friday Night Live, and Youth Nexus. Two of these
    programs are being recognized nationally, with
    only six programs receiving this national
    recognition from the National Prevention Network.
  • Andrea Taylor, Ph.D.: NREPP Model Program Status
  • Andrea Taylor developed a local program, Across
    Ages, an intergenerational mentoring program that
    promotes positive youth development and helps
    prevent school failure, substance abuse, and teen
    pregnancy, into an NREPP Model Program that is
    implemented nationwide. The process spanned
    1991 to 1998.

24
Advanced Program Essentials
  • Put Your Finger On It
  • Logic Model
  • Core Components
  • Documented Need and Value
  • Defining Population
  • Defining Need for Service within the Community

25
Logic Model
  • "A logic model is a systematic and visual way
    to present and share your understanding of the
    relationships among the resources you have to
    operate your program, the activities you plan,
    and the changes or results you hope to achieve."
  • (W.K. Kellogg Foundation, Logic Model Development
    Guide, 2004)
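A logic model is often drawn as a table or flow chart, but the same chain can be captured in a few lines of code for planning or documentation. This is a minimal sketch in Python; the program details and every entry are hypothetical placeholders, not content from the Kellogg guide or the SDFSC programs discussed here.

# Hypothetical logic model for a generic ATOD prevention program,
# expressed as a simple mapping so the resources-to-results chain
# can be reviewed (or exported) at a glance.
logic_model = {
    "resources": ["trained facilitators", "curriculum", "SDFSC funding"],
    "activities": ["10 weekly skill-building sessions", "parent workshops"],
    "outputs": ["sessions delivered", "youth served"],
    "short_term_outcomes": ["improved refusal skills", "ATOD risk knowledge"],
    "long_term_outcomes": ["reduced 30-day alcohol use"],
}

for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")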

26
Value of a Logic Model
  • A picture is worth a thousand words.
  • Builds understanding about what the program is,
    what it's expected to do, and what measures of
    success will be used.
  • Provides a research-based theory behind your
    strategies
  • Promotes communication and a common understanding
    amongst staff and funders

27
Core Program Components
  • What are the active ingredients in the formula
    for program success?
  • In theory, core components must be implemented
    precisely as intended in order to achieve
    demonstrated outcomes.
  • Core components cannot be adapted.

28
Define Core Components
  • Core components might be:
  • program structure (e.g. the sequence of sessions
    or context of delivery),
  • program content (e.g. specific concepts or skill
    sets), or
  • method of delivery (e.g. homework assignments,
    classroom infusion, or youth-led group
    activities).

29
Define Population
  • Institute of Medicine (IOM) Classifications
  • Universal preventive interventions are
    activities targeted to the general public or a
    whole population group that has not been
    identified on the basis of individual risk.
  • Selective preventive interventions are activities
    targeted to individuals or a subgroup of the
    population whose risk of developing a disorder is
    significantly higher than average.
  • Indicated preventive interventions are activities
    targeted to individuals in high-risk
    environments, identified as having minimal but
    detectable signs or symptoms foreshadowing
    disorder, or having biological markers indicating
    predisposition for disorder, but not yet meeting
    diagnostic levels.

30
Defining Need for Service
  • Integrating key stakeholders in process
  • Bonus points for youth
  • Representative of community
  • Strategic Prevention Framework
  • Needs/Resource Assessment
  • Strategic Planning
  • Evidence-based Implementation

31
Key Considerations
  • Advancing Programming
  • What's the yardstick?
  • How do I measure up?
  • Where do I want to go from here?

32
Considerations: Participation
  • Recruitment
  • Are we meeting target numbers consistently?
  • Are we using strategic recruitment methods?
  • Retention
  • Do we have sufficient completion rates?
  • Have we defined a program graduate/drop-out?
  • What do we do to encourage retention?

33
Considerations: Fidelity
  • Fidelity
  • To what degree are we consistently implementing
    core components? Is this sufficient?
  • What system do we use to reflect on areas of
    challenge? How does that inform our process?
  • What method do we use to monitor implementation
    across sites? Are we vigilant enough? Does
    feedback get incorporated?

34
Considerations: Innovation
  • The degree to which the program is novel,
    cutting-edge, and innovative.
  • How is this different from what's already
    available?
  • What aspects of the program are unique?
  • Grounded but innovative: program alignment with
    already-proven models of service
  • What proven methods are incorporated in what we
    do?
  • Did we take an evidence-based strategy to the
    next level or use it in a novel way?

35
Considerations: Population
  • How culturally appropriate are services to the
    identified population?
  • Program content
  • Program materials (e.g. translation)
  • Staff (training and protocol)
  • Tested across ethnic/cultural groups
  • Link to evidence-based strategies demonstrated
    with specific populations

36
Considerations: Marketing
  • Have materials/curriculum been packaged?
  • Sequencing
  • Branding
  • Training protocol tested and
    established/documented

37
Considerations: Replication
  • Protocol
  • Program curriculum
  • Training process
  • Evaluation
  • Packaged program materials
  • Curriculum
  • Evaluation
  • Strategic replication
  • Varied populations
  • Varied context

38
Advancing Programs through Evaluation
  • Rigor
  • Methodology
  • Data Plan & Analysis
  • Reporting

39
Increasing Evaluation Rigor Across the Board
  • Methodology/Design
  • Instrumentation to Analysis
  • Reporting

40
Tips for Optimal Evaluation Rigor
  • Use an external evaluator to lend credibility
  • Especially valuable for publishing findings
  • Conduct evaluation of replication sites
  • Evidence of impact in varied settings &
    populations
  • Evaluate program effect and sustainability of
    effect
  • Pre/post demonstrates immediate effects
  • Follow-up (longitudinal) assessment shows whether
    those effects are sustained.

41
Advancing Methodology
  • Process & Outcome
  • Evaluation Design
  • Tips for Optimal Design

42
Role of Process and Outcome Methods
  • Process
  • Allows for continuous learning about how the
    program is working as it is implemented
  • Focuses on clearly describing and assessing
    program design and implementation.
  • Makes it possible to answer questions concerning
    why and how programs operate the way they do
    and what can be done to improve them.
  • Outcome
  • The outcome evaluation focuses on producing clear
    evidence concerning the degree of program impact
    on program participants.
  • Assesses the immediate or direct effects of
    program activities (as compared to long-term
    impact).

43
Level of Rigor: Outcome Evaluation Design
44
NREPP Source of Evidence Hierarchy
45
Tip for Optimal Design: Matched Data
  • Making a Match
  • Requires tracking of individuals
  • Allows for analysis of individual-level impact,
    not just aggregate level
  • Can control for dosage or other factors
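A minimal sketch of what matched data look like in practice, assuming pre- and post-test records are keyed by a confidential ID; the IDs, column names, and scores below are hypothetical.

import pandas as pd

# Illustrative pre/post records keyed by a confidential participant ID.
pre = pd.DataFrame({"id": ["A01", "A02", "A03"], "pre_score": [2.1, 3.0, 2.6]})
post = pd.DataFrame({"id": ["A01", "A03"], "post_score": [3.4, 3.1]})

# Inner merge keeps only participants with both a pre- and a post-test,
# so change can be analyzed at the individual level rather than in aggregate.
matched = pre.merge(post, on="id", how="inner")
matched["change"] = matched["post_score"] - matched["pre_score"]
print(matched)
print("Matched n =", len(matched), "of", len(pre), "pre-tests")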

46
Tip for Optimal Design: Longitudinal Data
  • Looking at the long run
  • The majority of programs use a pre/post
    assessment schedule.
  • The use of follow-up points is recommended,
    based on the length of the program
  • Consider a follow up point at 1, 3, 6, 9, or 12
    months after completion.
  • Programs with continuous enrollment (vs. cohorts
    of youth) need:
  • strong tracking systems
  • a continuous evaluation schedule (e.g. every 3 or
    6 months)

47
Tip for Optimal Design: Comparison Groups
  • Shall I compare thee to a summer's day?
  • Comparison groups can sometimes be fairly easy to
    develop
  • School data
  • Low dosage service groups can sometimes be
    utilized; make the distinction between program
    drop-out versus evaluation drop-out
  • Use standardized measures and compare program
    groups to school, district, state results.
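To illustrate the last bullet, the sketch below compares participants' post-test scores on a standardized measure against a published district mean. Both the scores and the district value are assumed for the example.

from scipy import stats

# Hypothetical post-test scores for program participants on a standardized measure
program_scores = [3.1, 3.4, 2.9, 3.6, 3.0, 3.3, 2.8, 3.5]
district_mean = 2.8  # assumed district-level result for the same measure

# One-sample t-test: does the program group differ from the district norm?
t_stat, p_value = stats.ttest_1samp(program_scores, district_mean)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")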

48
Tip for Optimal Design: Control Groups
  • Control freak!
  • Control groups require resources and may deter
    participants due to randomization.
  • The trick is in the approach and the ability to
    provide services at a later date.

49
Advancing Instrumentation
  • Standardized v. Locally Developed
  • Tips for Optimal Instrumentation

50
Survey Options
Standardized
  • Pros
  • Already developed, lots of choices
  • Psychometrics established
  • Allows for comparison of results at national,
    state, and district levels
  • Scoring and analysis sometimes available
  • Cons
  • Cost
  • May not be specific to your population
  • May not capture novel aspects of program
Locally Developed
  • Pros
  • Can tap into novel program aspects/impact
  • Can be tailored to population
  • No cost
  • Cons
  • Don't know reliability/validity
  • Doesn't allow for comparison

51
Tip for Optimal Instrumentation
  • Take locally developed measures to the next
    level:
  • establish performance measures/psychometrics of
    instruments
  • reliability & validity (done by your evaluator)
  • Track at individual level
  • Confidential IDs
  • Develop comprehensive database
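Psychometric work is normally the evaluator's job, but one common piece, internal consistency, can be illustrated briefly. The sketch below computes Cronbach's alpha for a locally developed scale from a participants-by-items score matrix; the responses are hypothetical.

import numpy as np

def cronbach_alpha(items):
    # Internal-consistency reliability for a rows-by-items score matrix
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses: 5 participants x 4 survey items on the same scale
responses = [[3, 4, 3, 4],
             [2, 2, 3, 2],
             [4, 4, 4, 5],
             [1, 2, 2, 1],
             [3, 3, 4, 3]]
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")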

52
Advancing Data Management & Processing
  • Data Plan
  • Sample Size
  • Data Analysis

53
Data Plan
  • Develop plan for analyzing data based on proposed
    outcomes (logic model)
  • What questions to ask of the data?
  • What piece of the data answers each question?
  • Potential sub-group comparisons
  • (e.g. by gender, dosage, site)

54
Tips for Optimal Data Plan
  • Specify cutoff points for baseline
    assessment (defined for the program)
  • e.g. Baseline assessments are defined as those
    completed prior to session 2 of the curriculum.
  • Define completers vs. dropouts
  • e.g. Parents attending 85% or more of sessions
    are defined as program completers; those
    attending less than 10% are defined as dropouts
    (see the sketch below).
  • Ensure matched pre/post
  • Individual vs. aggregate level findings
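A minimal sketch of how data-plan rules like these can be written down explicitly. The thresholds (baseline before session 2, 85% attendance for completers, under 10% for dropouts) and all IDs and values are assumptions for illustration, not required cutoffs.

import pandas as pd

# Hypothetical attendance records
attendance = pd.DataFrame({
    "id": ["A01", "A02", "A03", "A04"],
    "baseline_session": [1, 1, 3, 1],         # session at which baseline was taken
    "pct_sessions_attended": [92, 40, 88, 6],
})

# Rule 1: a baseline counts only if completed before session 2
attendance["valid_baseline"] = attendance["baseline_session"] < 2

# Rule 2: classify completers vs. dropouts by attendance percentage
def status(pct):
    if pct >= 85:
        return "completer"
    if pct < 10:
        return "dropout"
    return "partial"

attendance["status"] = attendance["pct_sessions_attended"].apply(status)
print(attendance)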

55
Planning a Sample Size
  • How Much Wood should a Woodchuck Chuck?
  • Sample size (N) refers to the population
    participating or being measured (e.g. number of
    participants or number of sites)
  • Power: the probability of finding a true effect
  • Type I error: stating a finding when there isn't
    one (a false positive)
  • Type II error: stating no finding when there is
    one (a false negative)
  • Sample size & power
  • Influences types and sensitivity of analysis
  • Larger sample size increases power
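One way to make the sample size/power relationship concrete is a standard power calculation. The sketch below uses statsmodels and assumes a medium effect size (Cohen's d = 0.5) for a two-group comparison; the effect size, alpha, and power targets are placeholder values, not program-specific estimates.

from statsmodels.stats.power import TTestIndPower

# How many participants per group are needed to detect an assumed medium
# effect (d = 0.5) with alpha = .05 and 80% power?
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"Participants needed per group: {n_per_group:.0f}")  # roughly 64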

56
Tips for Optimal Data Plan: Strategic Sample Size
  • Calculate necessary sample size for appropriate
    statistical power
  • http://www.surveysystem.com/sscalc.htm
  • http://www.macorr.com/ss_calculator.htm
  • Resource limitations? Consider using a strategic
    sub-sample
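Calculators like the ones linked above typically apply the standard sample-size formula for estimating a proportion. A minimal sketch of that formula, using an assumed population of 600 and a 5% margin of error purely as an example:

import math

def sample_size(margin_of_error=0.05, z=1.96, p=0.5, population=None):
    # n0 = z^2 * p(1-p) / e^2, with an optional finite population correction
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    if population:
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# e.g. surveying a school of 600 students at a 5% margin of error
print(sample_size(margin_of_error=0.05, population=600))  # about 235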

57
Data Analysis: Beyond Percentage Reporting
  • Means with standard deviation
  • SD reflects the variability of values
  • Tests of significance (comparative analysis)
  • Correlations
  • As participation level increases, attendance rate
    significantly increases.
  • Chi-square analysis
  • Youth demonstrated statistically significant
    improvements in communication skills over time.
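The sketch below runs each of these analyses on small hypothetical data sets (matched pre/post scores, session counts as a dosage measure, and a 2x2 outcome table); every number is illustrative only.

import numpy as np
from scipy import stats

# Hypothetical matched pre/post scores
pre = np.array([2.1, 2.6, 3.0, 2.4, 2.8, 2.2])
post = np.array([3.0, 3.1, 3.4, 2.9, 3.3, 2.6])

# Means with standard deviations
print(f"pre  M = {pre.mean():.2f}, SD = {pre.std(ddof=1):.2f}")
print(f"post M = {post.mean():.2f}, SD = {post.std(ddof=1):.2f}")

# Test of significance on the matched pre/post data
t, p = stats.ttest_rel(pre, post)
print(f"paired t = {t:.2f}, p = {p:.3f}")

# Correlation between dosage (sessions attended) and post-test score
sessions = np.array([10, 9, 12, 7, 11, 6])
r, p_r = stats.pearsonr(sessions, post)
print(f"r = {r:.2f}, p = {p_r:.3f}")

# Chi-square on a categorical outcome (improved vs. not, by completion status)
table = np.array([[18, 7], [9, 14]])
chi2, p_c, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_c:.3f}")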

58
Tips for Optimal Data Analysis: Techniques &
Strategies
  • Leverage variability in data/dosage to program
    advantage
  • e.g. Youth who completed the program were more
    likely to have negative attitudes toward use than
    youth who did not complete the program.
  • Identify potential comparison data sets (e.g.
    school records)
  • e.g. School records show that participating youth
    had significantly fewer discipline referrals than
    the general student population.

59
Advancing Reporting Methods
  • Venues for Dissemination
  • Cater to the Crowd
  • Tips for Optimal Reporting

60
Where to Disseminate
  • Evaluation Reports
  • Summary Reports
  • Applications
  • Grants
  • Press Release
  • Professional Publications
  • Academic/research Publications

61
Cater to the Crowd
  • What information is relevant to your audience?
  • Note preferred models/frameworks and rhetoric
  • Highlight information that is of value to them
  • To what extent is detail or brevity important to
    your audience?
  • Are pieces of program information weighted
    differently (e.g. a reviewer point system)?
  • Work with your evaluator in developing a brief
    findings report as well as a full evaluation
    report

62
Tips for Optimal Reporting: Frame It
  • The evaluator is responsible for providing the
    full and objective picture
  • The Program Director may choose to highlight the
    most positive findings when reporting to funders
    or stakeholders, if appropriate
  • Wording can make a difference! The same findings
    can be written in a variety of ways; be conscious
    of the wording.

63
Tips for Optimizing Reporting: The Message
  • Say It In Pictures
  • The appropriate use of charts and graphs can be a
    powerful tool in conveying findings.
  • Bring It Home
  • The use of personal quotes and case examples can
    be powerful when they are used to supplement key
    quantitative findings.
  • Personal experiences make the impact real to the
    reader
  • However, when misused, they can make the
    evaluation seem less credible.
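A minimal sketch of "saying it in pictures": a simple pre/post bar chart built with matplotlib. The measure names and means are hypothetical placeholders.

import matplotlib.pyplot as plt

# Hypothetical pre/post means for three outcome measures
measures = ["Refusal skills", "ATOD knowledge", "School bonding"]
pre_means = [2.3, 2.6, 3.0]
post_means = [3.1, 3.4, 3.2]

x = range(len(measures))
width = 0.35
plt.bar([i - width / 2 for i in x], pre_means, width, label="Pre-test")
plt.bar([i + width / 2 for i in x], post_means, width, label="Post-test")
plt.xticks(list(x), measures)
plt.ylabel("Mean score (1-5 scale)")
plt.title("Participant outcomes, pre vs. post")
plt.legend()
plt.savefig("findings_chart.png", dpi=150)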

64
Climbing the Mountain: What's Your Next Step?
  • Action Planning Exercise
  • Defining short-term, intermediate, and long-term
    goals (e.g. 1-year, 3-year, and 5-year goals)
  • Programmatic Goals
  • Evaluation Goals
  • Opportunity Goals
  • How can we support you in your climb to the top?
  • Customized TA and Training plans

65
(No Transcript)