1
Partnerships, Alliances, and Coordination
Techniques
  • Building Capacity to Evaluate Partnership Initiatives
  • February 2008
  • Facilitated by the National Child Care Information and
    Technical Assistance Center (NCCIC)
  • NCCIC is a service of the Child Care Bureau

Presented by The National Child Care Information
Center
2
Presenter
3
Today's Agenda
4
It is messy, but doable and could be fun!
Evaluation of Partnership Initiatives
  • This module will show you how to keep your head
    without losing your shirt!

5
Session Objectives
  • Participants will be able to:
  • Understand the basics of evaluation approaches,
    data collection, and analysis.
  • Conduct an assessment of current capacity for
    evaluating partnership initiatives.
  • Determine the purpose and scope of their
    evaluation.
  • Understand the role of partners in making meaning
    of and communicating evaluation results.

6
PACT
  • PACT is an initiative of NCCIC, a service of the
    Child Care Bureau, U.S. Department of Health and
    Human Services
  • PACT gives State, Territory, and Tribal
    policymakers, particularly Child Care and
    Development Fund Administrators and their
    partners, the resources they need to build more
    comprehensive and collaborative early care and
    school-age programs for serving children and
    families

7
PACT Materials
  • PACT Collaborative Leadership Strategies: A Guide
    for Child Care Administrators and Their Partners
  • The Web-based guide contains an introduction and
    six training modules:
  • Fundamentals of Collaborative Leadership
  • Creating, Implementing, and Sustaining
    Partnerships
  • Communication Strategies
  • Management Strategies for Successful Partnerships
  • Financing
  • Building Capacity to Evaluate Partnership
    Initiatives

8
Objective 1: The Basics of Evaluation
Getting Your Feet Wet!
9
Goals of Evaluation
  • Evaluation is a strategy to identify, monitor,
    and track progress of the implementation and
    expected outcomes of a collaborative project.
  • The evaluation plan serves as a guide for
    partners, staff, and others in both day-to-day
    activities and long-range planning.

It is critical to be clear on the purpose of the
evaluation and to match approaches and measures
to the purpose!
10
Benefits of Evaluation
  • On the plus side…
  • A Good Evaluation:
  • Sets clear targets and goals
  • Provides objective information
  • Assists in project management
  • Builds public awareness and support
  • Improves performance
  • Impacts outcomes
  • Increases funding

Source: Child Care Partnership Project. (2000).
Using results to improve the lives of children
and families: A guide for public-private
partnerships. Washington, DC: Child Care Bureau,
Administration for Children and Families, U.S.
Department of Health and Human Services.
11
Considerations and Cautions
  • On the downside…
  • An Ineffective Evaluation:
  • Sets demands for significant results too quickly
  • Makes unrealistic assumptions about what caused
    change
  • Makes it difficult to collect appropriate data
    given the current state of early childhood
    measurement tools
  • Causes unintended harm to children or families if
    results are used inappropriately
  • Results in a redirection, realignment, or removal
    of program activities

Source Child Care Partnership Project. (2000).
Using results to improve the lives of children
and families A guide for public-private
partnerships. Washington, DC Child Care Bureau,
Administration for Children and Families, U.S.
Department of Health and Human Services.
12
The Language of Evaluation
  • The ABCs of Evaluation
  • Accountability
  • Assessment
  • Aggregate
  • Beta Level
  • Control Group

What Terms Confuse You?
Source: Child Care & Early Education Research
Connections. (n.d.). Research glossary. Retrieved
March 25, 2008, from
www.researchconnections.org/Discover?displayPage=resources\researchglossary.jsp
13
Why Work with Partners to Build Capacity for
Evaluation?
  • Accountability is being required in many sectors:
  • In Head Start
  • In Child Care
  • In Prekindergarten/Education
  • In Early Intervention
  • Multiple partners are increasingly working
    together to align initiatives and programs to
    increase access to, and the effectiveness of,
    early care and education services.
  • A number of States and communities are designing
    early childhood systems initiatives or developing
    cross-sector initiatives to meet the multiple
    needs of families and children and provide more
    comprehensive services.

14
Objective 2: Building Capacity for Evaluation
Is the Water Warm Enough?
15
Why is Evaluation Important to You/Your
Collaborative Project?
  • What specific needs do you have that you would
    like the evaluation to address?
  • What are your goals?
  • What are each partner's goals?
  • What do you think are the benefits?
  • To your organization? To children, families, and
    practitioners?
  • What do you think are the challenges?
  • Are costs, capacity, resources available?
  • What are your fears about evaluation?

16
Considerations in Assessing Your Project's
Capacity for Evaluation
  • What progress do you expect?
  • What information will help you document gains?
  • What data is already available and what data is
    needed?
  • What capabilities do you have now? What do you
    need?
  • How much time will it take to get the system
    working well?
  • How much funding will it require?

17
Six Key Strategies to Build Capacity for
Evaluation
  • Establish a culture of accountability
  • Develop a long-range strategic plan
  • Partner with researchers and experts
  • Ensure data quality
  • Engage families, business, and legislators
  • Communicate results simply and often

18
Assessing Your Project's Capacity for Evaluation
  • From your small group discussion on building
    capacity:
  • What surprised you?
  • What elements are your strengths?
  • What elements need to be addressed?
  • What next steps have you identified?
  • Common Issues to Address in Building Capacity
  • Evaluation Expertise
  • Costs

19
Ensuring You Have Evaluation Expertise
  • Key partners or an executive committee provide
    oversight to the evaluation team
  • Options for the Evaluator's Role:
  • An outside evaluator (which may be an individual,
    research institute, or consulting firm) who
    serves as the team leader and is supported by
    in-house staff.
  • An in-house evaluator who serves as the team
    leader and is supported by program staff and an
    outside consultant.
  • An in-house evaluator who serves as the team
    leader and is supported by program staff.

Source: Office of Planning, Research and
Evaluation, Administration for Children and
Families, U.S. Department of Health and Human
Services. (2006). Chapter 2: What will evaluation
cost? In The program manager's guide to
evaluation. Retrieved March 25, 2008, from
www.acf.hhs.gov/programs/opre/other_resrch/pm_guide_eval/reports/pmguide/chapter_2_pmguide.html
20
Evaluation Cost Considerations
  • Evaluation costs are driven by:
  • Evaluation design
  • The number of participants assessed
  • Standardized measures (number used, assessor
    training and reliability practices, frequency of
    assessment)
  • Data availability and quality (including
    automation of data entry and analyses)
  • Methods of reporting and communicating results
  • Infrastructure for data collection, level of
    analyses, printing, etc. (a rough cost sketch
    follows the source below)

Source: Golin, S., Mitchell, A., & Gault, B.
(2004). The price of school readiness: A tool for
estimating the cost of universal preschool in the
states. Retrieved February 24, 2008, from
www.iwpr.org/pdf/G713.pdf
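
To make these drivers concrete, here is a minimal back-of-envelope
sketch in Python. It assumes a simple multiplicative cost model; the
function name, rates, and dollar figures are illustrative assumptions,
not figures from the cited source.

    # Hypothetical evaluation cost model: direct assessment costs scale
    # with participants, measures, and frequency; training and
    # infrastructure are treated as fixed annual costs.
    def estimate_evaluation_cost(
        participants: int,           # number of participants assessed
        measures: int,               # number of standardized measures used
        assessments_per_year: int,   # frequency of assessment
        cost_per_assessment: float,  # administration/scoring cost per measure
        assessor_training: float,    # training and reliability practices
        infrastructure: float,       # data systems, analyses, printing
    ) -> float:
        """Return a rough annual evaluation cost."""
        direct = participants * measures * assessments_per_year * cost_per_assessment
        return direct + assessor_training + infrastructure

    # Example: 200 children, 2 measures, assessed twice a year at $25 each,
    # plus $8,000 for training/reliability and $15,000 for infrastructure.
    print(estimate_evaluation_cost(200, 2, 2, 25.0, 8_000, 15_000))  # 43000.0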
21
From Assessing Capacity to Strategic Action
  • You have conducted a baseline assessment of the
    current capacity for evaluation
  • You have considered costs and expertise needed
  • Now you are ready to
  • Develop a strategic plan for building capacity
    for evaluation

22
Objective 3: Choosing an Evaluation Approach
Wallowing in the Mud!
23
Considerations for Determining the Scope of Your
Evaluation
  • What mandates or expectations for evaluation does
    your partnership project have?
  • What is the current status of your evaluation
    capacity, including resources for funding the
    evaluation?
  • What lessons learned/strengths of the partnership
    can be used in developing an evaluation approach?
  • What are the challenges or "sticky issues" that
    may impact the success of the evaluation?
  • What data do you have for a baseline and tracking
    outcomes over time, and across agencies?

24
Stages of Evaluation Approaches
  • Research: Do participants do better than
    non-participants? Is one programmatic approach
    more effective than another?
  • Outcome Evaluation: Does the program achieve
    intended outcomes? For whom? Did organizational
    or system structure impact policy, resources, or
    outcomes?
  • Implementation Evaluation: Did you implement the
    program as planned? If not, why not? What changes
    were made?
Source: Oregon State University Family Policy
Program & Oregon Child Care Research Partnership
Project. (2000). Results accountability
guidebook. Retrieved February 24, 2008, from
www.hhs.oregonstate.edu/familypolicy/occrp/publications/2000-Results-Accountability-Guidebook.pdf
25
Match Goal and Purpose to Evaluation Approach
  • The fundamental principle: evaluation approaches
    should match the purpose and goals of the
    partners and the initiative.
  • They can be as simple or complex as needed.
  • The following examples show the range of
    complexity and rigor that exist in the field of
    early care and education.
  • What best meets your needs is up to you!

26
State Approaches to Evaluation
  • Leading the Way to Quality Early Care and
    Education
  • CD-ROM
  • Literacy and Early Learning/Assessment and
    Evaluation
  • Florida discusses evaluation of school readiness
    initiatives.
  • Ohio discusses the use of a Logic Model approach
    in evaluating an infant-toddler initiative.
  • California discusses their Desired Results
    Accountability System for child care and early
    education services.

27
California Desired Results for Children and
Families
  • Multi-purpose, multi-year, state-level
    accountability system to inform instruction,
    target technical assistance, and monitor trends
    in publicly funded programs
  • Developmental observation profiles for children
    birth to age 14 to inform instruction
  • Family surveys and program self-assessments to
    target technical assistance
  • State level aggregated data to monitor trends
  • Conducted in partnership with a university and
    the training system

Source: California Department of Education.
(2007). Introduction to desired results.
Retrieved March 25, 2008, from
www.cde.ca.gov/sp/cd/ci/desiredresults.asp
28
Oklahoma's Quality Rating System: Reaching for
the Stars
  • A longitudinal study, with multiple phases and
    purposes, conducted by the Early Childhood
    Collaborative of Oklahoma and others
  • 1999: observational study of implementation
  • 2001-2002: validation study of centers
  • 2003: outcome study to determine the impact of
    tiered rates on quality and the relative impact
    of specific indicators on overall quality
  • 2004: validation study of family child care homes

Source: Norris, D., Dunn, L., & Dykstra, S.
(2003). Reaching for the stars: Center
validation study executive summary. Retrieved
February 24, 2008, from
www.ou.edu/ecco/Executive_Summary.pdf
29
Maryland's Model of School Readiness
  • Multi-purpose, multi-year, state-level
    accountability system to inform instruction,
    target technical assistance, and monitor trends
    in publicly funded programs
  • Each fall, all kindergarten teachers assess
    children using a modified version of the Work
    Sampling System and report this data to the
    Department of Education. 
  • The Department of Education submits a report
    based on this and other data to the General
    Assembly each November about the level of school
    readiness Statewide.
  • The Department of Education, which includes child
    care, partners with a nonprofit to deliver and
    assess the training that supports this
    accountability effort.

Source: Maryland State Department of Education.
(n.d.). Maryland model for school readiness.
Retrieved March 25, 2008, from
www.mdk12.org/instruction/ensure/MMSR/index.html
30
Ohio Child Care/Head Start Partnership Project
  • This is a research study, funded by the Child
    Care Bureau, conducted in collaboration with
    State policymakers.
  • The goal of the partnership project is to provide
    high-quality, seamless services to families with
    low incomes and their children.
  • The longitudinal survey research is designed to
    examine the nature and benefit of partnerships,
    and the impact on outcomes for centers, teachers,
    and children.

31
A Systemic View of Child and Family Outcomes in
Context
32
Assessment and Evaluation
  • Lessons from Research and Professional Wisdom
    from the Field
  • Clips from Child Care Works: Research to
    Practice, Assessment and Evaluation Module
  • Involving stakeholders in program evaluation
  • Developing systems of assessment
  • Challenges of measuring quality

33
10 Steps to the Information You Need to Make Good
Decisions (and convince others too!)
  • Determine the purpose and scope
  • Agree on results
  • Select measures
  • Establish a baseline and objective
  • Determine and implement strategies aimed at
    positive change
  • Develop a performance agreement among groups
    responsible
  • Collect data
  • Analyze the data
  • Assess progress and modify strategies and
    resources
  • Publicize results (a checklist sketch follows
    the source below)

Source: The Finance Project. (2002).
Accountability systems: Improving results for
young children. Retrieved February 24, 2008, from
www.financeproject.org/Publications/accountability.pdf
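
The ten steps above can be kept as an ordered checklist in even the
simplest tooling. The Python sketch below is a minimal illustration:
the step wording comes from this slide, while the tracking logic and
names are assumptions for the example.

    # The ten steps from the slide, kept in order.
    STEPS = [
        "Determine the purpose and scope",
        "Agree on results",
        "Select measures",
        "Establish a baseline and objective",
        "Determine and implement strategies aimed at positive change",
        "Develop a performance agreement among groups responsible",
        "Collect data",
        "Analyze the data",
        "Assess progress and modify strategies and resources",
        "Publicize results",
    ]

    def remaining_steps(completed: set) -> list:
        """Return the steps not yet completed, preserving their order."""
        return [step for step in STEPS if step not in completed]

    done = {"Determine the purpose and scope", "Agree on results"}
    for step in remaining_steps(done):
        print("TODO:", step)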
34
What to Measure in a Partnership Project?
  • It is important to be clear:
  • Is increased collaboration a GOAL or an outcome
    in and of itself?
  • AND/OR
  • Is increased collaboration/resource sharing a
    STRATEGY to achieve goals?
  • AND/OR
  • Is effective administration of a project by
    multiple partners a CONDITION (theory of change)
    for success?

35
Short- and Intermediate-Term Objectives
  • The Core Services describe activities that are
    designed to meet short- and intermediate-term
    objectives on the way to meeting the long-term
    goal
  • Tip/Challenge: As you identify program services,
    activities, and short- and intermediate-term
    objectives, you must continually recheck and loop
    back to be sure that each element is aligned and
    reasonably linked to the long-term goal.

36
What Is a Theory of Change Logic Model?
  • It is a TOOL to develop a common understanding of:
  • Goals
  • Vision of how the program will effect change
  • Program Services
  • Outcomes
  • It serves as a dynamic process to guide program
    development, implementation, and
    evaluation/accountability.

37
How to Develop a Logic Model
  • Gather key stakeholders' perspectives on:
  • Long-term outcomes
  • Theory of change
  • Program services and activities
  • Short- and intermediate-term outcomes
  • Indicators/evidence of progress in meeting
    outcomes (see the sketch below)
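
Because a logic model is essentially a structured record of these
elements, sketching it as a small data structure can help partners see
what is missing. The Python dataclass below is a minimal illustration;
the field names and example values are assumptions drawn from the
bullets above, not an established schema.

    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        """Minimal logic-model record mirroring the elements above."""
        long_term_outcomes: list
        theory_of_change: str          # why you believe change will occur
        services_and_activities: list
        short_and_intermediate_outcomes: list
        indicators: dict = field(default_factory=dict)  # outcome -> evidence

    model = LogicModel(
        long_term_outcomes=["Children enter school ready to learn"],
        theory_of_change="Higher program quality leads to better child outcomes",
        services_and_activities=["Joint teacher training", "Shared coaching"],
        short_and_intermediate_outcomes=["Improved classroom quality ratings"],
        indicators={"Improved classroom quality ratings": "Annual observations"},
    )
    print(model.theory_of_change)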

38
Objective 4: Collecting Data and Reporting Findings
Making Mudpies!
39
Data Collection
  • Identify data currently being collected to
    determine the fit with indicators chosen.
  • Review the quality of the data and identify gaps
    in data needed to measure progress on the
    indicators.
  • Start small. It's very easy, and pretty common,
    to go way overboard on data collection! It will
    keep you sane, and keep costs reasonable, if you
    choose a few data sources that have the intent
    and power to give you the information you need.

40
Multiple Levels of Data Collection
  • System-Level Data: data on key system or
    partnership indicators
  • Program/Service-Level Data: implementation data
    in the first stages and program outcome data in
    the second stage
  • Individual-Level Data: data on adults, children,
    or families, often from a sample, and best
    collected over time, with multiple measures
    (see the sketch below)
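
One way to keep the three levels distinct in a shared data system is
to tag every record with its level. The Python sketch below is an
illustrative assumption, not a prescribed schema.

    from dataclasses import dataclass
    from enum import Enum

    class Level(Enum):
        SYSTEM = "system"          # key system or partnership indicators
        PROGRAM = "program"        # implementation, then program outcomes
        INDIVIDUAL = "individual"  # adults, children, or families, over time

    @dataclass
    class DataPoint:
        level: Level
        indicator: str
        period: str    # e.g., "2008-Q1"
        value: float

    points = [
        DataPoint(Level.SYSTEM, "agencies sharing data", "2008-Q1", 4),
        DataPoint(Level.PROGRAM, "classrooms implementing curriculum", "2008-Q1", 12),
        DataPoint(Level.INDIVIDUAL, "children assessed", "2008-Q1", 180),
    ]
    for point in points:
        print(point.level.value, point.indicator, point.value)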

41
Collect Powerful Data
  • Data Power: What are the most accurate and
    reliable data sources available?
  • Proxy Power: Are the indicators clearly within
    the control of the program, and have they been
    shown in previous research to predict later
    gains?
  • Communication and Political Power: What outcomes
    are most important to key stakeholders?
    (A simple scoring sketch follows the source
    below.)

Source: Child Care Partnership Project. (2000).
Using results to improve the lives of children
and families: A guide for public-private
partnerships. Washington, DC: Child Care Bureau,
Administration for Children and Families, U.S.
Department of Health and Human Services.
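
The three "powers" can double as a simple screening rubric when
choosing among candidate data sources. The Python sketch below scores
each candidate on all three; the 1-to-3 scale, the equal weighting,
and the example sources are illustrative assumptions.

    # Score candidate data sources on the three powers from the slide.
    candidates = {
        "Kindergarten assessment data": {"data": 3, "proxy": 2, "communication": 3},
        "Parent satisfaction survey":   {"data": 2, "proxy": 1, "communication": 3},
        "Classroom observation scores": {"data": 3, "proxy": 3, "communication": 2},
    }

    def total_power(scores: dict) -> int:
        """Sum data, proxy, and communication power (equal weights)."""
        return scores["data"] + scores["proxy"] + scores["communication"]

    # Rank candidates from strongest to weakest overall.
    ranked = sorted(candidates, key=lambda name: total_power(candidates[name]), reverse=True)
    for name in ranked:
        print(total_power(candidates[name]), name)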
42
Decision Points and Options for Data Sources and
Analysis
43
Measuring Outcomes in Early Care and Education
  • Not all measures used to assess child outcomes
    have predictive ability for later outcomes, and
    they may not be sensitive to young children's
    dynamic growth or to cultural and linguistic
    differences.
  • Observational measures of program quality are not
    applicable to all settings and may not adequately
    capture the nuances and complexity of quality.
  • Measures of partnership effectiveness, systemic
    impact, and system integration are sparse, and it
    is difficult to adequately attribute causality or
    impact.
  • Choosing measures and methods to document
    outcomes is a fine art: balancing what is
    available, appropriate, and useful!

44
Findings → Meaning → Action
It is all too easy to collect data, but much
harder to analyze the findings appropriately,
make meaning of the findings, and use the
findings to take (appropriate) action
Source: Hebbeler, K. (2006, May). Now comes the
fun part: Gleaning meaning from early childhood
outcome data. Retrieved March 27, 2008, from
www.fpg.unc.edu/ECO/pdfs/Data%20Meeting%205-24-06.ppt
45
Findings
  • Findings are the numbers, the scores on measures,
    the summary of quarterly reports, which in and of
    themselves are meaningless!
  • While numbers are not debatable, it is important
    to include enough information about the numbers
    (and the context of the initiative) to make them
    meaningful

"Data add substance to what could otherwise be
dismissed as anecdotes, while stories add a
personal element to cold numbers on a page"
(Using Results to Improve the Lives of Children
and Families, p. 7)
Hebbeler, 2006
46
Meaning
  • The interpretation put on the numbers
  • Is this finding good news? Bad news? News we
    can't interpret?
  • Meaning is debatable, and reasonable people can
    reach different conclusions from the same set of
    numbers
  • Stakeholder involvement can be helpful in making
    sense of findings

"Meaning is derived from the goals and your
theory of change (why you believe you can achieve
results)."
Hebbeler, 2006
47
Reporting Results Tell the Story
  • Identify areas where changes may be needed for
    future implementation.
  • Inform policy and/or funding decisions by telling
    the "story" of program implementation and
    demonstrating the impact of the program on
    participants.
  • Build public awareness and support with
    legislators, parents, and community members.
  • Choose a report format that is consistent with
    your program purpose and appeals to the target
    audience.

48
Take Powerful Action
  • A key role of the partnership team is
    communicating results and determining how the
    evaluation results are used:
  • To improve the program
  • To get more funding
  • To build public awareness
  • To plan next steps in the evaluation approach

49
In Summary Building Capacity for Evaluation
  • You have expertise and resources available to
    assist you
  • You can take a thoughtful, planned approach to
    getting the information and data you need
  • You, and your partners, play a key role in
    determining the purpose, gathering appropriate
    resources, providing oversight, and ensuring
    information is meaningful and useful

50
Closing
  • Personal learning plan
  • Quality improvement
  • Session evaluation

51
Reflections
  • I learned…
  • I relearned…
  • I will apply…
  • I would like to know more about…
  • I am surprised by…

52
Thank you!
PACT is an initiative of NCCIC, a service of the
Child Care Bureau
Facilitated by the National Child Care
Information and Technical Assistance Center
10530 Rosehaven Street, Suite 400, Fairfax, VA 22030
Phone: 800-616-2242  Fax: 800-716-2242  TTY: 800-516-2242
Email: info@nccic.org  Web: http://nccic.acf.hhs.gov