1
Navigating 21st CCLC Program Evaluation Requirements
Evaluation authorized by the Pennsylvania Department of Education
  • Yolanda Yugar
  • Leslie Kirby
  • Allegheny Intermediate Unit

2
Overview
  • Overview of evaluation
  • Evaluator characteristics & tasks
  • 21st CCLC evaluation roles
  • 21st CCLC reporting requirements
  • Tips & tricks for evaluation

3
What is Evaluation?
  • Evaluation is the process by which organizations
    examine the implementation and outcomes of an
    intervention to determine whether or not (and to
    what extent) the intervention contributed to a
    change.

4
Understanding the Differences
  • Compliance Monitoring
  • Checklist, documentation review
  • Required by funding agency
  • Separate from the evaluation
  • Research
  • Uses research methods such as control groups,
    well-matched comparison groups, or
    quasi-experimental designs to determine
    effectiveness

5
Caution: The Notion of Causation
  • It is common to think about a program as causing
    change. Programs do not occur in a vacuum. A
    single intervention usually cannot be the sole
    cause of an outcome when dealing with people.
    Other programs, interventions, personal
    situations, and even chance contribute to the
    outcomes discovered during evaluation.

6
Evaluations are used to
  • Demonstrate accountability
  • Fulfill reporting requirements
  • Assess needs
  • Improve programs
  • Determine cost/benefit relationship
  • Determine feasibility & replicability

7
Types of Evaluation
  • Program implementation: To what extent was the
    program implemented as designed?
  • Program outcomes: To what extent has the
    initiative influenced behaviors or practices?
  • Program impact: To what extent has the
    initiative influenced student achievement?

8
Formative Evaluation
  • occurs while the intervention is in progress
    (usually during the developmental phase)
  • allows management to improve program for current
    term
  • concentrates on implementation and feedback from
    participants and/or program staff

9
Summative Evaluation
  • compilation or summary of what occurred during
    the intervention or a particular term, the
    results of the intervention, and whether goals
    were achieved
  • assesses overall impact

10
Mixed Method Approach
  • Most evaluations combine these forms of
    evaluation to gain a clear understanding of the
    program and to give outcomes a context. Using
    multiple methods also lets the evaluator examine
    a single question with several sources of data,
    strengthening interpretation (triangulation).

11
Types of Data
  • Participant demographics
  • Participant achievement
  • Processes
  • Perceptions
  • (adapted from Victoria Bernhardt)

12
Quantitative Data
  • Quantitative data are data that can be counted or
    measured.
  • Includes
  • Assessment/test (achievement) data
  • Participation and attendance rates
  • Forced-choice survey questions (i.e., select an
    answer from a list)

13
Qualitative Data
  • Qualitative data are data that cannot be easily
    counted. Instead, data are used to identify
    themes or trends.
  • Includes
  • Focus group responses and interviews
  • Open-ended survey questions
  • Portfolios or compilations of materials
  • Observations

14
Common Evaluator Tasks
  • Develop or help to develop the evaluation plan.
  • Help staff understand evaluation and how it can
    help a program plan & implement more effective
    interventions.
  • Involve & train project staff to carry out
    specific evaluation tasks.
  • Help staff use data gathering methods/instruments
    in a reliable way.

15
Common Evaluator Tasks
  • Build capacity to collect, store, analyze, and
    use data to inform decision-making.
  • Help staff interpret data to make decisions
    consistent with results.
  • Write reports that facilitate local use of data
    collected for decision-making.
  • Serve as liaison to state evaluator.

16
Selecting an Evaluator
  • You should select an evaluator who
  • Meets the needed characteristics or
    requirements
  • Understands the organization he/she is evaluating
  • Is willing to work collaboratively to identify or
    develop an evaluation that meets the requirements
    of the initiative and the needs of the
    organization

17
Characteristics of an Evaluator
  • Understands confidentiality of student data.
  • Ability to develop or find appropriate assessment
    instruments.
  • Capacity to collect, manage, analyze, and
    interpret data & produce written reports based on
    those data.
  • Capacity to guide programs in using data for
    decision-making.
  • Understands state and local needs.

18
Where would I find an evaluator?
  • University or college
  • Intermediate Unit
  • Internal staff with specialized skills
  • Independent contractor
  • For-profit evaluation company

19
Establishing an Evaluation Team
  • Organization leadership
  • Program implementers
  • Evaluator(s)
  • Other stakeholders (partners, support services,
    etc.)

20
21st Century State Evaluation
21
Role of AIU Evaluation Team
  • Support grantees & their local evaluator(s) in
    data collection, analysis, & reporting
  • Prepare state reports that summarize data from
    on-site monitoring and grantee results
  • Provide grantees/evaluators with resources for
    using report results for decision-making

22
Role of Your Local Evaluator
  • Help grantees identify, collect, & analyze data
    for APR & state reporting
  • Help grantees use APR/monitoring reports for
    decision-making
  • Other tasks particular to the individual
    grantee's needs

23
Required Reporting
  • Federal: Grantee Profile & Annual Performance
    Report (APR) through the Profile and Performance
    Information Collection System (PPICS)
  • due September 30, 2008
  • State: Grantee State Report
  • due September 30, 2008
  • Monitoring: formal monitoring occurs once per
    funding cycle

24
Grantee Profile & APR
  • Center information
  • Partner information
  • Operations
  • Activities information
  • Program attendance
  • Teacher survey results
  • Report card grade results

25
Grantee State Report
  • Assessment/report card grade results
  • Feedback from parents, students, center staff,
    school administrators, partners
  • Comparison group results
  • School attendance results
  • Discipline & behavior results
  • Teacher survey results

26
Monitoring
  • Documentation review
  • Compliance with the grantee's contract
  • Compliance with program standards/regulations
  • Interviews with staff, participants, managers

27
Performance Measures
28
PM 1 Academic Gains: Data Collection
  • Individual student data
  • Report card grades
  • Assessment data
  • Program attendance data
  • Feedback data
  • Grantee-specific data

29
Evaluating Academic Gains
  • Report card grades: compare fall grades to spring
    grades (see the sketch after this list)
  • Report card grades: compare spring grades for the
    prior year to spring grades for the current year
  • Assessment data: compare fall/baseline score to
    spring score
  • Assessment data: compare spring score for the past
    year to spring score of the current year
  • Results from teacher survey
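A minimal sketch of the fall-to-spring comparison, in Python with pandas; the table and column names (fall_math, spring_math) are hypothetical stand-ins for illustration, not the actual APR data layout.

```python
import pandas as pd

# Hypothetical roster: one row per student, fall and spring math
# grades on a 0-100 scale. Column names are illustrative only.
grades = pd.DataFrame({
    "student_id":  ["S01", "S02", "S03"],
    "fall_math":   [68, 74, 81],
    "spring_math": [75, 72, 90],
})

# Fall-to-spring change per student, then a simple summary.
grades["change"] = grades["spring_math"] - grades["fall_math"]
improved = (grades["change"] > 0).sum()
print(grades)
print(f"{improved} of {len(grades)} students improved "
      f"(mean change {grades['change'].mean():+.1f} points)")
```

The same pattern covers the spring-to-spring and assessment-score comparisons; only the column pair changes.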

30
PM 2 Attendance, Classroom Performance, &
Discipline: Data Collection
  • Individual student data
  • Program attendance data
  • Feedback data
  • Grantee-specific data

31
Evaluating Attendance, Classroom Performance, &
Discipline
  • Attendance: compare prior year attendance (# days
    absent/tardy) to current year (# days
    absent/tardy); a comparison sketch follows this
    list
  • Discipline: compare prior year discipline (#/type
    of incidents) to current year (#/type of
    incidents)
  • Classroom performance: results from teacher
    survey, including homework completion, class
    participation, and attentiveness
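A sketch of the year-over-year attendance and discipline comparison, again in Python with pandas; the column names are hypothetical, not a prescribed state reporting format, and negative change values indicate improvement.

```python
import pandas as pd

# Hypothetical records; column names are illustrative only.
records = pd.DataFrame({
    "student_id":        ["S01", "S02", "S03"],
    "absent_prior":      [12, 5, 20],  # days absent, prior year
    "absent_current":    [7, 6, 11],   # days absent, current year
    "incidents_prior":   [3, 0, 5],    # discipline incidents, prior year
    "incidents_current": [1, 0, 2],    # discipline incidents, current year
})

# Negative change = fewer absences/incidents this year.
records["absence_change"] = records["absent_current"] - records["absent_prior"]
records["incident_change"] = records["incidents_current"] - records["incidents_prior"]
print(records[["student_id", "absence_change", "incident_change"]])
```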

32
PM 3 Social & Behavioral: Data Collection
  • Individual student data
  • Feedback data
  • Grantee-specific data

33
Evaluating Social & Behavioral Benefits
  • Results from teacher survey
  • Behaving in class
  • Getting along with others
  • Motivated to learn

34
Recommendations from Evaluators
  • The majority of students served should have an
    academic need (reading/math)
  • Ensure that students are served according to
    their individual needs
  • Code students at the beginning of the program so
    they are consistently attributed to the correct
    cohort
  • When providing a breakdown of students within a
    category, ensure that the disaggregations add up
    to the total (a simple check is sketched after
    this list)
  • When in doubt, ask questions
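A minimal consistency check for disaggregated counts, in Python; the category names and numbers are made up for illustration.

```python
# Hypothetical APR-style counts: a reported total and its
# disaggregation by grade band (names/numbers illustrative only).
total_served = 120
by_grade_band = {"K-2": 35, "3-5": 48, "6-8": 37}

# The disaggregated counts should add up to the reported total;
# flag any discrepancy before submitting the report.
subtotal = sum(by_grade_band.values())
if subtotal != total_served:
    print(f"Mismatch: disaggregation sums to {subtotal}, "
          f"but the reported total is {total_served}")
else:
    print("Disaggregation matches the reported total.")
```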

35
Evaluation Resources for Grantees
  • Data collection & parent permission form template
    for non-school grantees
  • Data safeguarding statement template
  • Evaluation planning framework
  • Spreadsheet templates
  • Links & information about national & state
    reporting

www.aiu3.net/evaluations (click the 21st CCLC link)
36
Contacting PA 21st CCLC State Evaluators
  • Allegheny Intermediate Unit
  • Yolanda.Yugar@aiu3.net
  • Leslie.Kirby@aiu3.net
  • www.aiu3.net/evaluations