Title: California's SIG
1. California's SIG
- Evaluating Training and Technical Assistance
2. Cheryl Li Walter, Ph.D.
- Director of Evaluation and Research
- California Institute on Human Services
- Sonoma State University
- California's SIG Evaluator
3. CA's SIG Evaluation Team
- SIG Evaluator: Cheryl Li Walter
- SIG Contract Monitor: Janet Canning
- CalSTAT Managers: Linda Blong, Anne Davin
- CalSTAT Evaluation Staff: Kelly Bucy
- SIG Evaluation Task Force (small group)
4. SIG Evaluation Team
- Has been together for the 6 years of the SIG
- Has taken a "Do and Develop" approach
- Actively committed to using data to inform the system change process
  - In a way that's accessible to all participants
5. SIG Evaluation Task Force
- A representative group of stakeholders
- 10-15 members
- Meets twice annually for a day
- Facilitated by the SIG Evaluator
- Reviews activities data
- Reviews outcomes data
- Makes recommendations to the PCSE
6. Partnership Committee on Special Education (PCSE)
- Approx. 100 stakeholder partners
- Meets annually for 1-2 days
- Communication with and between partners
- Informs ongoing implementation of the SIG
- Discusses activity and outcomes data
- Considers and makes recommendations
- Grapples with issues
- Transparency and accountability of process
7. CA's SIG2 Goals
- Improved quality of personnel working with students with disabilities
- Improved educational service coordination for students with disabilities
- Improved academic outcomes for students with disabilities
- Improved behavioral supports and outcomes for students with disabilities
- Improved participation of parents/family members of students with disabilities
- Improved data collection and data dissemination
8. CA SIG Training and TA
- Dissemination of research-based core messages to the field
  - Articulate critical research findings and essential components of effective application
- Training and technical assistance provided in various forms to teachers, administrators, parents, teacher aides, program specialists, and other professionals
9. Core Message Areas
- Reading
- Positive Behavioral Supports
- Collaboration
- Family-School Partnerships
- Transition
- IDEA
- LRE
10. Vehicles for Training/TA
- Regional Coordinating Councils (RCCs)
  - Groups that receive funding to put on trainings for teachers and administrators in their region
- Technical Assistance by Request (TA)
  - Mini-grants for training and/or follow-up coaching at a site, or to visit another site
- Leadership Institutes
  - Statewide and regional events bringing site teams together to learn system change and content skills
- BEST Cadre Trainings (BEST)
  - Local staff trained to provide behavioral support training to school site teams in their area
11. Purpose of the Evaluation
- To monitor the effort and effect of SIG training and TA activities
- To provide feedback to organizers and presenters, enabling them to improve and build upon their efforts
- To engage participants in reflection
  - Evaluation as an intervention
- To link activities to outcomes
12. End-of-Event Evaluations
- How is the activity rated overall?
- Have participants increased their knowledge?
- Do participants anticipate implementing what they learned?
- What was most beneficial?
- What could be improved?
- What else is needed?
15. What We Learned
- From over 70,000 participant responses:
- The fewer questions the better
  - More questions did not give more info
- 4.5 average rating (on a 1-5 scale)
- Can compare individual training ratings
- 1-point average gain in knowledge (on a 5-point scale)
16. Follow-up Evaluations
- Are people implementing what they learned?
- If so, how is it working?
- If not, what are the barriers to implementation?
- Are people sharing what they've learned with others?
17. Follow-up Emails
- Email addresses gathered at event registration (from approx. 25% of participants)
- Follow-up email sent approx. 3 months after the training (often up to 6 months later)
- Click a link to answer a few questions
- Up to 3 emails sent to get a response
- Approx. 40% response rate on good email addresses (10% of event participants)
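The funnel on this slide works out to roughly one in ten event participants being reached at follow-up. A quick sketch of that arithmetic, using the slide's approximate figures (the exact rates are assumptions, not precise SIG data):

```python
# Follow-up email funnel, using the slide's approximate figures.
share_with_email = 0.25   # ~25% of participants supply an email at registration
response_rate = 0.40      # ~40% of good addresses respond (after up to 3 emails)

# Effective reach as a share of all event participants.
effective_reach = share_with_email * response_rate
print(f"Effective follow-up reach: {effective_reach:.0%} of participants")
```

This is why the slide can report a 40% response rate on good addresses but only about 10% of all event participants.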
18. What We Learned
- Over 80% of participants reported having implemented what they learned
- Over 50% reported having implemented their learning repeatedly
- Over 80% of participants reported having shared what they learned
- Over 60% reported having shared their learning repeatedly
20. What We Learned (continued)
- Knowing whether participants were using what they learned took us a step toward linking activities and outcomes
- The support of administrators was often cited as having facilitated the implementation of learning
- Lack of support from administrators was often cited as a barrier to implementing learning
21. Participant Roles
- Registration forms and event evaluation forms ask participants their role (SE teacher, GE administrator, parent, etc.)
- Pie charts show the distribution of roles present at an event
- Need to know who is being reached
- Need to attract who needs to be there
23. Mapping the Areas Served
- CA is a BIG state
- Having a distribution of activities throughout the state is important
- Identifying where the concentrations and gaps are can be done using geographic information system (GIS) software
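GIS software does the actual mapping; as a minimal, library-free sketch of the same concentration-and-gap idea, one can tally events by region and flag regions with none. The county names and counts below are purely illustrative, not SIG data:

```python
from collections import Counter

# Hypothetical event locations (illustrative only).
events = ["Sonoma", "Sonoma", "Los Angeles", "Fresno", "Sonoma", "San Diego"]
# Regions we want the SIG activities to cover (also illustrative).
regions_to_cover = ["Sonoma", "Los Angeles", "Fresno", "San Diego",
                    "Shasta", "Imperial"]

counts = Counter(events)
concentrations = [r for r in regions_to_cover if counts[r] >= 2]
gaps = [r for r in regions_to_cover if counts[r] == 0]

print("High concentration (2+ events):", concentrations)
print("Gaps (no events):", gaps)
```

A GIS layer adds the geography on top of exactly this kind of count, shading served regions and leaving gaps visibly blank.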
25. Making the Process Useful
- The evaluation process must be useful to the front-line people we're asking to collect data
- They must see the results,
- receive them in a form that is accessible and understandable to them,
- and be able to use the results in what they're doing
26. TED
- Local sites and regional organizations putting on trainings often had no convenient mechanism for looking at evaluation results (they were adding things up with a calculator)
- SIG developed a training/TA evaluation database (TED) designed to enable tracking of event info and automated evaluation reports, so the local level can use the data immediately for its own purposes
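TED itself is a database application, but the per-event summary it automates (the one people were previously producing with a calculator) amounts to grouping responses by event and averaging. A sketch of that aggregation, with made-up event names and ratings:

```python
from statistics import mean

# Hypothetical end-of-event evaluation rows:
# (event, overall rating 1-5, self-rated knowledge before, after).
responses = [
    ("Reading RCC Training", 5, 2, 4),
    ("Reading RCC Training", 4, 3, 4),
    ("BEST Cadre Training",  5, 3, 4),
    ("BEST Cadre Training",  4, 2, 3),
]

# Group (rating, knowledge gain) pairs by event.
events = {}
for event, rating, before, after in responses:
    events.setdefault(event, []).append((rating, after - before))

# Automated per-event report: count, average rating, average knowledge gain.
for event, rows in events.items():
    print(f"{event}: n={len(rows)}, "
          f"avg rating {mean(r for r, _ in rows):.1f}, "
          f"avg knowledge gain {mean(g for _, g in rows):.1f}")
```

The point of TED is that this report comes out of the database immediately after the forms are entered, rather than weeks later.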
31. TED Demonstration
- Wednesday's Poster Session
- 5:30-7 pm
- Pick up a demo CD to take home and check out
- Don't need to have FileMaker Pro to use it
- Populated with test data
- Fully functioning runtime model
32. Collaborative Sites Survey
- Over 100 sites requested and received TA around SE/GE collaboration
- This was learned after the fact by looking at the data
- A brief survey was sent to the sites at the end of SIG1; 42 sites responded
- Academic Performance Index (API) scores were examined for the sites
36. SIG1 Outcomes Evaluation
- Designed to look at statewide outcomes
- Over the course of the SIG we looked at and clarified statewide outcomes
- It was often difficult to get the data in a timely fashion
- General activities provided to broad populations could not be linked to outcomes
- With the Collaborative Sites we could begin to link activities and outcomes
37. SIG2 Evaluation Planning
- To link activities and outcomes:
- The focus needs to be at the school site level (or district level when scaling up)
- There needs to be a sufficient concentration of services to make a difference
- There needs to be sufficient time allowed to see a difference
38. Focus at School Site Level
- Measurement of change needs to happen in an identifiable environment that is small enough to be impacted
- We need to be able to link activity participants to that environment
- There need to be specific measures of change (that the site cares about)
  - Student achievement, number of suspensions, degree of collaboration
39. Concentration of Services
- There needs to be a sufficient concentration of services to make a difference
- Enough training or TA days
  - A minimum of 3 days of contact
- Enough staff from the site involved
  - Teams of 5-6 or more people from a site, including teachers, administrators, and parents who are committed to working together to create change
40. Sufficient Time
- There needs to be sufficient time given to see a difference
- Trainings/TA need to happen over a period of time to allow the process of change to unfold
- There needs to be time allowed for the changes to have an impact on the outcomes being measured (1-2 years)
41. Services Directed to Sites
- Technical Assistance by Request for Sites
  - Training in content areas
  - Follow-up coaching
  - Site-to-site TA and visits
- Leadership Institutes for Site Teams
  - Statewide and regional
- BEST Cadre Trainings (BEST)
  - Local staff trained to provide behavioral support training to school site teams in their area
- Broad regional training (not funded by SIG)
42. Objective 3: To increase the academic performance of students with disabilities, as demonstrated by:
- 3.a. Increasing proficiency in reading for middle/high school students, resulting in an average five percentage point increase in all students, and students with disabilities as a subgroup, who score proficient/advanced on the California Standards Test, English Language Arts, at all reading Leadership Sites and school/district sites that receive at least three days of TA in reading and have at least two years of involvement in these SIG2 activities
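The 3.a criterion is a five percentage point gain in the proficient/advanced share, checked for all students and for students with disabilities as a subgroup. A sketch of that check with entirely hypothetical site figures (the site name and percentages below are made up for illustration):

```python
# Objective 3.a check: gain of >= 5 percentage points in the share of
# students scoring proficient/advanced on the CST English Language Arts.
TARGET_GAIN = 5.0  # percentage points

# Hypothetical site data: (% proficient/advanced at baseline, % after
# two years of SIG2 reading TA). Illustrative values only.
sites = {
    "Site A (all students)": (31.0, 37.5),
    "Site A (students with disabilities)": (12.0, 16.0),
}

results = {}
for site, (before, after) in sites.items():
    gain = after - before
    results[site] = gain >= TARGET_GAIN
    status = "met" if results[site] else "not met"
    print(f"{site}: {gain:+.1f} percentage points -> target {status}")
```

Note the target is stated in percentage points, not percent growth: going from 31.0% to 37.5% is a 6.5-point gain, which meets the objective even though the subgroup in this example does not.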
44. Reading Proficiency Chart
- Developed an Excel version
- Statewide, district, or school data can be entered
- The chart is generated by pushing a button
- Sites use the chart to compare themselves with similar sites
- Sites use it to ground discussion and focus on their goals
- Poster Session demo (sign-ups for a copy of the file)
46. Outcome Measures
- Aligned with NCLB
- What the sites are already focused on improving
- Publicly available data
- Comparable with similar school sites
- CA Standards Test / English Language Arts proficiency
- Statewide and Similar Schools rankings
- Academic Performance Index (API) scores
47. SIG3 Pre-Planning
- Testing a research-based middle and high school reading approach focused on student fluency and decoding
- Training and ongoing coaching for site teams
- Student placement assessments
- Intensive interventions with pre/post testing
- Organizing student-level data for teachers to use in the classroom
- Linked to school site level outcomes
48. CA's Evaluation Approach
- Try something
- Learn from it
- Build on it
- Design the activity and evaluation to work together
- Present the data in visual form
- Make sure everyone understands what we're focused on trying to change