1
Aggregating Outcomes for Effort and Effect: What NTAC Learned from its Site Review
  • Ella L. Taylor, Ph.D.
  • NTAC
  • Teaching Research Institute
  • Western Oregon University

2
Agenda
  • Site review response to evaluation
  • Revisions to evaluation plan
  • Collecting effort and effect data
  • Aggregating data
  • Questions

3
Site Review Concerns about Evaluation
  • Very complex
  • Less likely to succeed than a simpler plan
  • Needs to be simplified
  • Needs to be made more realistic and appropriate

4
Constraints of Evaluation
  • Everyone wants evaluation to be seamless and
    transparent
  • Reluctance to see evaluation's direct benefit for improving the project
  • Evaluation is seen as cumbersome and confusing

5
Primary purpose of evaluation
  • Did we do what we said we were going to do?
    (effort)
  • What was the impact of what we did? (effect)

6
Effort vs. Effect
  • Effort: actions carried out by the project
  • Satisfaction data
  • Numbers of participants, events, etc.
  • Effect: impact of the actions
  • What outcome resulted from the activity?
  • Change of awareness
  • Change of knowledge
  • Change in skill/implementation (service provider, family, systems)
  • Change in child (child change data); a code sketch of this tagging follows
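
To make the effort/effect distinction concrete in data terms, here is a minimal Python sketch; the field and measure names are hypothetical illustrations, not NTAC's actual schema:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class Measure:
    """One collected data point, tagged as effort or effect."""
    kind: Literal["effort", "effect"]  # did we do it (effort) vs. what changed (effect)
    name: str                          # e.g., "satisfaction", "change_of_knowledge"
    value: float                       # a count or a scale score

# Effort: activity counts and satisfaction with the activity itself
events = Measure("effort", "number_of_events", 12)
satisfaction = Measure("effort", "satisfaction", 4.5)

# Effect: change produced by the activity
awareness = Measure("effect", "change_of_awareness", 3.1)
child_change = Measure("effect", "child_change", 2.7)
```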

7
Alignment
8
Did we do what we said we were going to do?
(EFFORT)
  • Grant objectives
  • Met: explain how we met them
  • Not met: explain why not
  • Data
  • Number of events
  • Number of participants
  • Satisfaction with effort

9
What was the impact of what we did? (EFFECT)
  • What to measure?
  • How to measure?
  • How to report succinctly?
  • How to aggregate?

10
What was the impact of what we did? (EFFECT)
  • Outcome and Performance Indicators (NTAC's OPIs)
  • Outcome
  • A statement of a measurable condition or an expected result or change (e.g., increase, improvement, progress toward).
  • Performance Indicator
  • A statement that helps quantify the outcome and
    indicates whether the outcome has been achieved.
    Often, multiple indicators may provide better
    evidence of the achievement of an outcome.

11
Kudos to
  • John Killoran
  • Kathy McNulty
  • Paddi Davies
  • Many, many hours of development and refinement of NTAC's OPIs

12
OPIs
  • Comprehensive outcomes for children (15),
    families (9), service providers (19) and systems
    (6)
  • Embed in all aspects of planning, delivery and
    evaluation
  • Align needs assessment, project activities, and measurement of impact
  • On the web (http://www.tr.wou.edu/ntac/evalforms/)

13
How we are using OPIs
  • Planning
  • Identify the stakeholder group (service provider,
    child, family and/or systems)
  • Identify the outcomes you will target
  • Identify the performance indicators that will
    help you determine attainment of targeted
    outcome(s)
  • Delivery of service
  • Implement TA that is targeted to the outcomes
    selected
  • Evaluation
  • Tailor assessment/evaluation measures to targeted
    outcomes and performance indicators

14
An Example
15
Webinar Example: Planning
  • Needs Assessment
  • Comments at the Project Directors' Meeting during self-evaluation breakout sessions
  • Conversations with state project directors, coordinators and staff
  • NTAC Advisory Committee meeting
  • Outcome Goal: The use of formative and summative evaluation of the systems change and/or capacity building has increased (S4)
  • Performance Indicator: Uses outcomes measures (S4g)

16
Webinar Example: Delivery of Service
  • Activity: Webinar
  • Align the intensity of our evaluation with the intensity of the activity
  • A one-time activity results in a less intense evaluation than sustained professional development.
  • Align needs with the activity, and the activity with the evaluation

17
Webinar Example: Evaluation (change of awareness)
18
A slightly more complicated example
19
Example 2
  • Planning/Needs: Ongoing training and support
  • Outcome: Use of formative and summative evaluation (Systems 4)
  • Performance Indicators:
  • Uses satisfaction measures (S4d)
  • Uses awareness, knowledge or skills measures (S4e)
  • Uses outcomes measures (S4g)
  • Activity: Series of webinars for one region
  • Evaluation: Change of knowledge/skill

20
Evaluation: Change of knowledge/skill
21
Another more complicated example
22
Example 3
  • Planning/Needs: Development, implementation, and analysis support in the use of formative and summative evaluation
  • Outcome: Use of formative and summative evaluation (Systems 4)
  • Performance indicators:
  • Uses participant demographic data (S4c)
  • Uses satisfaction measures (S4d)
  • Uses change in awareness, knowledge or skills
    measures (S4e)
  • Uses outcomes measures (S4g)
  • Uses formative and summative evaluation
    information for ongoing feedback and continuous
    improvement (S4j)
  • Disseminates evaluation results of the systems
    change or capacity building activities (S4k)

23
Sustained Professional Development
  • Activity: Multiple visits by TAS and an evaluation specialist to 2 states
  • Evaluation:
  • Change of knowledge/skill
  • Follow-up

24
Follow-up
25
Aggregating the Data
26
Data Collection
  • What data did we collect?
  • Number of events/activities (effort)
  • Satisfaction (effort)
  • Change of awareness (effect)
  • Change of knowledge/skill (effect)
  • Follow-up evaluation (effect)
  • On what did we collect data?
  • Outcome: Use of formative and summative evaluation (Systems 4)
  • Performance indicators:
  • demographic data (S4c)
  • satisfaction data (S4d)
  • awareness, etc. (S4e)
  • outcomes measures (S4g)
  • ongoing feedback (S4j)
  • dissemination (S4k)

27
Data
28
Aggregate the data
  • Option 1: each event carries equal weight
  • Option 2: assign a different weight to each event because events have different levels of importance
  • For example, the follow-up evaluation should carry more weight because the intensity of the effort was greater (see the sketch below)
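
As a minimal sketch of the two options (the event names and scores are illustrative, not the project's actual data), equal weighting is a plain mean and differential weighting is a weighted mean:

```python
# Illustrative event scores on the converted 4-point scale
scores = {"national_webinar": 2.8, "regional_series": 3.0, "follow_up": 3.3}

# Option 1: every event carries equal weight (plain mean)
equal = sum(scores.values()) / len(scores)

# Option 2: weight events by importance; here follow-up counts most
weights = {"national_webinar": 0.3, "regional_series": 0.3, "follow_up": 0.4}
weighted = sum(scores[e] * weights[e] for e in scores)

print(f"equal: {equal:.2f}, weighted: {weighted:.2f}")
# equal: 3.03, weighted: 3.06
```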

29
Conversion to a 4-point scale
  • Convert the 5-point scale to 4-point (both mappings are sketched as lookup tables below):
  • 5 (strongly agree) → 4 (achieved)
  • 4 (agree) → 3 (nearly)
  • 3 (neither) → 2 (emerging)
  • 2 or 1 (disagree) → 1 (non-existent)
  • Convert the 3-point scale to 4-point:
  • 3 (substantial) → 4 (achieved)
  • 2 (some) → 3 (nearly)
  • 1 (no) → 1 (non-existent)
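
A minimal sketch of these conversions as lookup tables; the function name is mine, but the mappings are exactly the ones listed above:

```python
# The two conversion tables from this slide, as lookup dicts
FIVE_TO_FOUR = {5: 4, 4: 3, 3: 2, 2: 1, 1: 1}  # strongly agree ... disagree
THREE_TO_FOUR = {3: 4, 2: 3, 1: 1}             # substantial / some / no

def to_four_point(raw, scale):
    """Map a raw rating from a 5- or 3-point instrument onto the
    common 4-point scale (4 = achieved ... 1 = non-existent)."""
    table = {5: FIVE_TO_FOUR, 3: THREE_TO_FOUR}[scale]
    return table[raw]

assert to_four_point(5, scale=5) == 4  # strongly agree -> achieved
assert to_four_point(3, scale=5) == 2  # neither -> emerging
assert to_four_point(2, scale=3) == 3  # some -> nearly
```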

30
(No Transcript)
31
What does this mean?
  • Effort
  • NTAC conducted one national webinar, a series of webinars for one region, and several onsite consultations with two states to increase the states' capacity to use formative and summative evaluation systems. Across the trainings, participants indicated 90% satisfaction with the skill of the consultants and the content of the activities.

32
What does this mean?
  • Effect
  • Across the trainings and consultations, participants report that they are very near achieving the ability to develop, implement, and analyze formative and summative evaluation measures to increase capacity and systems change (M = 3.03 on a 4.0 scale).
  • Could elaborate by listing performance indicators
    if needed.

33
Response to NTAC Site Review: 2004-2005 Field Test
  • 1) Embed Outcomes and Performance Indicators
    (OPIs) in planning and delivery of service.
  • 2) Embed OPIs in all evaluation measures.
  • 3) Share our evaluation systems and data with our state/multi-state partners.

34
Addressing the constraints
  • Aligning needs, delivery and evaluation through
    the OPIs yields a more seamless system
    (Constraint 1)
  • Sharing data facilitates the use of data
    (Constraint 2)
  • Consistency helps diminish confusion (Constraint
    3)

35
Questions?
36
Contact
  • Region 1
  • Shawn Barnard
  • Paddi Davies
  • Region 2
  • Jon Harding
  • Barb Purvis
  • Region 3
  • Nancy Donta
  • Amy Parker
  • Region 4
  • Kathy McNulty
  • Therese Madden Rose

37
Additional Examples
  • The following information will not be shared
    during the discussion, but is being provided as
    additional material.

38
Weighting the events differently
  • Using the previous examples, let's say that we believe the follow-up data should carry more weight since it indicates more long-term implementation and attainment of the outcome.
  • We want the follow-up evaluation to carry 40% of the weight (a worked sketch follows).
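
The worked chart on the next slide is not in the transcript; as a hedged reconstruction (illustrative scores and a hypothetical helper function), fixing the follow-up share at 40% and splitting the remaining 60% evenly across the other events looks like this:

```python
def weighted_score(scores, follow_up_share=0.4):
    """Give the follow-up evaluation a fixed share of the weight;
    split the remainder evenly across the other events."""
    others = [k for k in scores if k != "follow_up"]
    weights = {k: (1 - follow_up_share) / len(others) for k in others}
    weights["follow_up"] = follow_up_share
    return sum(scores[k] * weights[k] for k in scores)

# Same illustrative scores as before, on the converted 4-point scale
print(round(weighted_score({"webinar": 2.8, "series": 3.0, "follow_up": 3.3}), 2))
# 3.06 (vs. 3.03 when all three events carry equal weight)
```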

39
(No Transcript)
40
Embed in basic evaluation (Service Provider 1a)
  • Satisfaction data
  • "I was satisfied with my opportunity to learn about the impact of deaf-blindness on an individual's overall development (e.g., social, emotional, cognitive)."
  • Change of awareness
  • "I have increased my awareness of the impact of deaf-blindness on an individual's overall development (e.g., social, emotional, cognitive)."
  • Change of knowledge/skill
  • "As a result of the training, I can use my knowledge about the impact of deaf-blindness on an individual's overall development (e.g., social, emotional, cognitive) to plan instruction."

41
Embed in Follow-up
  • Service Providers
  • Based on the recent training provided on understanding how a combined vision and hearing loss impacts learning and social/emotional development, please indicate your progress in performing the following tasks:
  • Child change
  • Three months ago, you received technical assistance on understanding how a combined vision and hearing loss impacts learning and social/emotional development. As a result of that training, please indicate any progress the student has made in the following skills: