Title: Quality Rating and Improvement Systems
1 - Quality Rating and Improvement Systems
- Presentation to the ECIC QRIS Pilot Implementation Meeting - January 10, 2008
- Gerrit Westervelt, Ph.D., The Build Initiative
2 - What is the Build Initiative?
- A multi-state partnership of funders, agencies, policymakers, and NPOs
- Created by the Early Childhood Funders Collaborative (ECFC)
- Supports state leaders who make policy, provide services, and advocate for children ages 0-5
- Goal: the youngest children are safe, healthy, eager to learn, and ready to succeed
3 - State Early Childhood Development System
- Early Learning: early care and education opportunities in nurturing environments where children can learn what they need to succeed in school and life.
- Health, Mental Health and Nutrition: comprehensive health services that meet children's vision, hearing, nutrition, behavioral, and oral health needs as well as medical needs.
- Family Support: economic and parenting supports to ensure children have nurturing and stable relationships with caring adults.
- Special Needs/Early Intervention: early identification, assessment, and appropriate services for children with special health care needs, disabilities, or developmental delays.
4 - What are Quality Rating and Improvement Systems (QRIS)?
- A key component of a comprehensive early childhood system
- Tools that assess, monitor, improve, and succinctly communicate levels of early childhood program quality to providers, parents, policymakers, and funders
5 - Common QRIS Elements
- Tiered program standards (state licensing is often the baseline; national accreditation is the ceiling)
- For providers, serve as a quality improvement and monitoring measure, and enable them to access quality improvement coaching and support
- Help parents select quality child care and, ultimately, change child care purchasing patterns
- Are linked to differential child care reimbursement strategies (e.g., tiered subsidy reimbursement)
- Provide an accountability measure to child care funders
- (adapted from NCCIC, 2005)
6 - QRIS: The National Landscape
- Currently 14 states and DC are implementing QRIS with these common measurement elements; many more are in the pilot stage:
- Environmental Assessment (92%)
- Early Childhood Environment Rating Scale (67%)
- Parent Involvement (75%)
- Staff Professional Development/Business Practices (100%)
- Program Accreditation (67%)
- Parent Satisfaction (42%)
- Licensing Status (100%)
- (adapted from The RAND Corporation, 2005)
- Many QRIS require a program to be licensed to participate, others give points for being licensed, and others give points for compliance history.
7 - Lessons Learned: Five Key Areas
- Provider Outreach
- Measuring Quality
- Delivery Infrastructure
- Quality Improvement
- Evaluation
8 - Lessons Learned: Provider Outreach
- Provide incentives for participation (e.g., mini-grants for materials, or staff bonuses)
- Explain your measures and processes, early and often. Be transparent!
- Enlist the support of key providers (e.g., local association leaders) to get rated and promote the process
- Provide personal consultations on the rating results
- If possible, do not publish first-year results without permission; public accountability can be a real barrier to participation in the early years
9 - Lessons Learned: Measuring Quality
- Self-assessment measures can be very problematic
- Ensure that your learning environment measure is aligned with desired child outcomes
- A classroom sampling approach to the learning environment may be cost effective
- Professional development is very challenging to measure accurately; resources are needed to review transcripts and arrive at a reliable score
10 - Lessons Learned: Measuring Quality
- Ratios/group sizes are dynamic throughout the day; multiple time samples should be collected
- Staffing patterns are not an accurate proxy for actual counts; classroom ratio sampling does not provide an accurate representation
- Licensing compliance is usually not a good proxy for aspects of structural quality (e.g., ratios) or process quality (e.g., health and safety)
- Resist linking high stakes to QRIS results until after pilot work has been completed
11 - Lessons Learned: Delivery Infrastructure
- Involve licensing officials in the QRIS development process to promote future collaboration
- Classroom observation measures need solid infrastructure to support inter-rater reliability
- Seasoned practitioners sometimes find it more challenging to become reliable raters than those newer to the field
- Local or regional delivery system as you scale up?
- Develop web-based data collection technology
12 - Lessons Learned: Quality Improvement
- Quality improvement is a relatively new field, and many states have been working to:
  - Develop coaching and training professional standards that have associated educational requirements and proficiencies
  - Develop training institutes to support these new professionals in their mentoring and technical assistance skills
13 - Lessons Learned: Quality Improvement
- Provide personal debriefing on rating results
- Do not assume that directors can translate the results to classroom staff
- Involve coaches in consultations, and assure that coaches understand the measures and rating process
- Assure that raters do not use an "expert" approach to consultation, but can:
  - Translate results/standards into practice
  - Engage providers in a problem-solving approach to developing a QI plan to address areas for improvement
14 - Lessons Learned: Quality Improvement
- In-classroom mentoring is particularly helpful for low-quality programs
- Involve directors in the mentoring process to build leadership skills
- Director-level coaching is helpful, especially for higher-quality programs
- Lower-quality programs need more guidance in prioritizing QI spending
15 - Lessons Learned: Quality Improvement
- Facilitated director groups and home provider peer groups appear to be a cost-effective QI strategy
- Formal scholarships appear to be more effective when staff have been in the program for at least 3 years (and under 10), and when programs are not at a very low quality level
- Process and structural quality improve at different rates and require different resources
- Consider measuring process quality (e.g., Learning Environment) for all programs at the outset
16 - Lessons Learned: Evaluation
- Formative evaluations to improve QRIS measures and processes can be very useful; the QRIS will change as lessons are learned
- Collect as much data as feasible; work with an evaluator during the initial stages of QRIS development
- Collect consistent data on QI interventions, in order to:
  - Show how quality has improved over time
  - Understand which interventions work for which types of programs
- Hold off on child outcome studies until AFTER a few years of pilot work
17 - Michigan's Opportunities
- Piloting several different QRIS and understanding what works in different contexts
- Creating a statewide QRIS based on best practices
- Establishing model communities that are able to train new communities
- Availability of QI funding to enhance quality and encourage programs to participate
- Assessment and QI expertise at High/Scope
18 - Michigan's Challenges
- Developing and implementing QRIS without stable funding sources
- Different QRIS make results hard to compare
- Building local capacity to do ratings and QI
- Developing multiple databases and technology platforms
- Local preferences may make the consensus QRIS harder to create than you think
19 - The Build Initiative
- For more information, visit www.buildinitiative.org
- Contact: Gerrit Westervelt, Ph.D., Executive Director
- gwestervelt@buildinitiative.org
- 303-929-5011