1 NRS Regional Spring 2007 Training on Desk Monitoring: Improving Program Performance
- American Institutes for Research
- and the
- U.S. Department of Education
- Office of Vocational and Adult Education
2 Introductions and Expectations
Take a moment to look forward to
the end of this training, two days from
now. Make a note of two
things you hope to learn and take with you
from this training on desk monitoring.
Then, as you introduce yourself to the group,
please give your name, title, state or
territory you represent, and two expectations you
have of this training.
Refer to H-1
3 Objectives of the Training
By the end of the training, participants will be able to:
- Discuss the purposes and benefits of desk monitoring
- Identify measures, performance standards, and evaluation tools
- Design a desk monitoring tool and summative rubric
- Develop a plan for implementing and using desk monitoring, and
- Evaluate the desk monitoring system and make refinements, as needed.
4 Welcome, introductions, expectations, objectives
- Purposes of local program monitoring
- Why do desk monitoring? Relationship between on-site and desk monitoring; how monitoring fits into the program improvement process
- Warm-up activity: current state of the states re on-site and desk monitoring
- Steps to Desk Monitoring
- Step 1. Design a Desk Monitoring Tool
- States select process and outcome measures
- Small group review and feedback
5 Step 1. Design a Desk Monitoring Tool (Cont.)
- Demonstration of monitoring tool and template
- State teams use desk monitoring tool to develop templates
- Activity on evaluating a summative rubric
- Planning time for states to develop their own rubrics
- Total group processing of issues related to developing rubrics
6 Jigsaw Activity on Steps 2, 3, and 4
- Step 2. Plan and Develop Implementation Procedures
- Step 3. Implement Your Desk Monitoring Tool
- Step 4. Use and Evaluate Your Desk Monitoring Tool
- State teams plan for implementation and use
- Questions
- State reports
- Next Steps and Adjourn
7 States Monitor Data on
Financial Information
Program Staffing
Student Outcomes
Instruction
Contact Hours
Professional Development
Student Characteristics
- to ensure that local programs are
- Meeting grant requirements
- Following required procedures and policies
- Providing quality services.
8 States Use Information from Local Program Monitoring to
- Meet state and NRS accountability requirements (student outcome data)
- Implement and evaluate progress toward state policy objectives
- Promote program improvement (identify lower performing programs for T/A)
- Identify effective programs and practices (and transfer them to other programs).
9 What is Desk Monitoring?
An approach to reviewing and tracking local program performance by using data from your state system and other quantitative data, as appropriate.
10 Warm-up Activity
Taking a snapshot of current use of desk monitoring
Refer to H-2
11 Reciprocal Relationship between Outcomes and Program Practice
Good Program Practices (effective instruction, good data collection and reporting procedures, professional development for teachers, valid and reliable assessment) and Good Outcomes reinforce each other.
12 But it's a Tenuous Relationship between Process and Outcomes
Because of the complexities of the following:
- Uneven resources from year to year
- Varying staffing patterns
- Students with widely diverse skills and goals.
13 Why do Desk Monitoring?
- Guide program review activities
- Facilitate tracking and evaluation of local program performance
- Both enhance and inform on-site monitoring
- Plan and provide technical assistance
- Establish a mechanism for regular communication between state and local programs re needs for technical assistance
- Promote continuous program improvement.
14 Desk Monitoring
Advantages:
- Programs can submit data and reports electronically.
- The state can make efficient use of staff time, with reviews built into regular workloads over time.
- Data are quantitative and can be compared with previous years' data or with state standards. Most data are already collected for the NRS.
- No travel required: no time out of the office, no travel expenses.
Disadvantages:
- Assumes accurate data; requires good data quality and trust in the data.
- Does not allow for the voices/perspectives of multiple local staff members.
- Cannot collect qualitative measures or measures of interaction (e.g., intake processes, professional development activities, and instructional measures).
- Gives a static view of the program without dynamic interaction with local program staff in context.
15 Relationship between Desk and On-site Monitoring
Desk Reviews: Use findings to focus the more-intensive on-site reviews and to identify T/A needs.
On-site Reviews: Use findings to identify the need for more-intensive desk monitoring.
Coordinate both approaches to promote program improvement efforts.
16 Fitting Desk Monitoring into a Program Improvement System
Desk monitoring, on-site monitoring, and technical assistance all feed into program improvement.
17 I-P-O
Input/Givens: Elements that are givens, usually beyond our immediate control, e.g., student demographics, student skill levels, teacher characteristics
Process/System: Elements that describe actions programs plan for and implement to produce the outcomes they want, e.g., curriculum, instructional strategies, materials
Outcome/Results: Elements that describe the results of the agency's processes, given the input, e.g., student achievement, attendance, completion rates
18 Examples you can cite of
- Programs that had highly positive student outcomes one year and not-so-positive outcomes the next year?
- Possible reasons for the disparity?
- Actions states can take to help local programs examine data and verify the root causes of the disparity?
- Actions the state can take, if any, to provide assistance and get them back on track?
Refer to H-3
19 Steps to Desk Monitoring
1. Design a Desk Monitoring Tool
2. Plan and Develop Monitoring Procedures
3. Implement Desk Monitoring
4. Use and Evaluate Desk Monitoring
5. Refine Process, as Needed
20 Step 1. Designing a Desk Monitoring Tool
Essential elements of a good desk monitoring system:
- Measures tied to state priorities that accurately reflect performance
- Evaluative standards or other benchmarks that define good performance
- A summative rubric that characterizes overall program performance.
21 Step 1. Designing a Desk Monitoring Tool
Three elements of a good desk monitoring system:
A. Measures
B. Evaluative Standards
C. Summative Rubric
22 First Step in Designing the Tool
Decide on the measures you will use to evaluate program performance.
- Some measures are those required by the NRS and other accountability needs
- Include other measures to help you understand local programs' procedures and how the programs collect data
- Select a manageable number of measures.
23 Three Measures Needed for an Effective Data Monitoring Instrument
- Student outcome measures: the central element (educational gain and follow-up measures); what students are achieving
- Data process measures: how programs collect outcome measures; how procedures affect data quality.
- Program process measures: what happens in the program to produce outcomes; the procedures and services that affect outcomes.
Step 1.
24 Measures for Desk Monitoring Tool
1. Outcome Measures
- Educational gain
- Secondary credential
- Entry into postsecondary education
- Entered and retained employment
- Additional state measures
2. Data Process Measures
- Assessment procedures
- Goal setting and orientation
- Follow-up procedures
3. Program Process Measures
- Recruitment and enrollment
- Retention and attendance
- Professional development and staff
- Curriculum and instruction
- Support services
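As a rough illustration only (the NRS tool is an Excel-based template, not code), the sketch below shows one way a state team might record the measures it selects from these three categories. All category and measure names are hypothetical placeholders drawn from the lists above.

```python
# Hypothetical selection of measures for a desk monitoring tool.
# Category and measure names are illustrative placeholders, not NRS data elements.
selected_measures = {
    "outcome": [
        "educational_gain",
        "secondary_credential",
        "entered_employment",
    ],
    "data_process": [
        "pct_pretested_by_set_time",
        "pct_pre_and_posttested",
        "avg_contact_hours_between_tests",
    ],
    "program_process": [
        "total_enrollment",
        "proportion_scheduled_hours_attended",
        "expenditure_per_student",
    ],
}

# "Select a manageable number of measures": a quick self-check.
total = sum(len(measures) for measures in selected_measures.values())
print(f"{total} measures selected across {len(selected_measures)} categories")
```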
25 To Select Measures, Your State Needs
- A clear policy and direction for programs.
- A clear idea of the outcomes you want programs to achieve that will define quality.
For example, state policies on the following can guide selection of measures:
- contact hours,
- number and type of students a program should enroll,
- outcomes expected (e.g., completing educational levels or promoting passage of the GED Tests).
You measure what you treasure.
26 "Not everything that can be counted counts, and not everything that counts can be counted."
- Albert Einstein
27 A Word about Data Quality
Make sure that local programs are able to collect and report information for desk monitoring in a valid and reliable way.
CAUTION: The measures will hinder efforts at program improvement if they do not accurately reflect performance.
Step 1.
28 1. Student Outcome Measures
- Educational level completion (one for each NRS or state level, or combined)
- Receipt of secondary credential (adult high school diploma or GED)
- Entry into postsecondary education
- Entered employment
- Retained employment
Note: You do not need to select all of these for your desk monitoring tool.
Step 1.
29 2. Data Process Measures
- Number and percentage of students pre-tested by a set time
- Number and percentage pre- and posttested
- Average contact hours between pre- and posttest
- Number of students who completed the goal-setting process
- Time the goal setting was completed
- Number of students with goals
- Number of students contacted by a set time
- Number of students with SSNs (a computation sketch follows below)
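To make these measures concrete, here is a minimal Python sketch of how they might be computed from per-student records. The record fields, values, and the pre-test deadline are assumptions for illustration only; they are not NRS data elements.

```python
from statistics import mean

# Hypothetical per-student records; field names and values are assumptions.
students = [
    {"pretest_week": 2, "posttested": True,  "contact_hours": 65, "has_goal": True,  "has_ssn": True},
    {"pretest_week": 5, "posttested": False, "contact_hours": 28, "has_goal": True,  "has_ssn": False},
    {"pretest_week": 1, "posttested": True,  "contact_hours": 80, "has_goal": False, "has_ssn": True},
]

PRETEST_DEADLINE_WEEK = 4  # the "set time" from state assessment policy (assumed)

n = len(students)
pretested_on_time = sum(s["pretest_week"] <= PRETEST_DEADLINE_WEEK for s in students)
pre_and_post = sum(s["posttested"] for s in students)
avg_hours_tested = mean(s["contact_hours"] for s in students if s["posttested"])

print(f"Pre-tested by week {PRETEST_DEADLINE_WEEK}: {pretested_on_time} ({pretested_on_time / n:.0%})")
print(f"Pre- and posttested: {pre_and_post} ({pre_and_post / n:.0%})")
print(f"Average contact hours for pre- and posttested students: {avg_hours_tested:.1f}")
print(f"Students with goals: {sum(s['has_goal'] for s in students)}")
print(f"Students with SSNs: {sum(s['has_ssn'] for s in students)}")
```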
Step 1.
30 2. Data Process Measures (Cont.)
A. State Assessment Policy Procedures
- Which pre- and posttest standardized assessments to use
- When to administer pre- and posttests
- How to match scores on the assessment to NRS levels
State staff can monitor:
- Number and percentage of students pre-tested by a set time
- Number and percentage pre- and posttested
- Average contact hours between pre- and posttest.
Step 1.
31 2. Data Process Measures (Cont.)
B. Goal Setting and Orientation Procedures
- Set appropriate follow-up goals with students
- Have a system in place for tracking students
- For collecting follow-up measures:
  - If survey method, a process for contacting students and achieving a sufficient response rate for a valid survey
  - If data matching, a process for collecting student SSNs accurately.
Step 1.
32 2. Data Process Measures (Cont.)
B. Goal Setting and Orientation Procedures
State staff can monitor:
- Number of students who completed the goal-setting process
- Time that goal setting was completed relative to student enrollment
- Number of students with goals
Step 1.
33 2. Data Process Measures (Cont.)
C. Follow-up Procedures
- Phone survey
- Data matching
- Combination of methods
State staff can monitor:
- Number of students contacted by a set time
- Number of students with SSNs.
Step 1.
34 3. Program Process Measures
- Total enrollment
- Enrollment by subgroups (e.g., educational level, demographic variables)
- Average attendance hours
- Average hours by subgroups
- Proportion of scheduled hours attended
- Number of credentialed teachers
- Number of teachers by ethnicity
- Average hours of professional development received
- Number and types of classes offered by educational functioning level or subject
- Number of instructional hours offered for types of classes
- Number of students receiving or referred to support services
- Number and type of agencies with which programs partner for support services
- Expenditure per student
- Expenditure per outcome
- Expenditure per instructional hour
- Expenditure per student instructional hour
Step 1.
35 3. Program Process Measures (Cont.)
- Offer clues to the reasons for good and poor performance
- Help guide program improvement efforts
- Can include model indicators of program quality in the areas of:
  - Recruitment and Enrollment
  - Retention and Attendance
  - Professional Development and Staff
  - Curriculum and Instruction
  - Support Services
  - Expenditures
Step 1.
36 What do you Gain by Tracking
A. Recruitment and Enrollment?
- Do programs serve students at all levels or primarily those at the higher levels?
- Do ABE classes enroll native-English-speaking learners or primarily foreign-born students who have completed ESL instruction?
Note: Measures on student enrollment can provide an indicator of recruitment practices as well as evidence that programs need to improve recruitment practices.
Step 1.
37 What do you Gain by Tracking
B. Retention and Attendance?
- Do students attend programs long enough to realize gains?
- Do programs provide the number of instructional hours required by the grant?
- Which students exit before completing a level?
- What is attendance relative to scheduled class hours?
Step 1.
38 What do you Gain by Tracking
C. Professional Development and Staff?
- Number of credentialed teachers
- Number certified in adult education or TESOL
- Number of teachers by ethnicity
- Mean hours of professional development events that instructional staff attended or completed
Step 1.
39 What do you Gain by Tracking
D. Curriculum and Instruction?
Too complex an issue for desk monitoring, but you can monitor:
- Number and types of classes offered by educational functioning level or subject, and
- Number of instructional hours offered for types of classes
Note: These data are useful for monitoring compliance with grant requirements. They also can provide clues to performance patterns; e.g., few instructional hours and classes offered may explain low retention and educational gain.
Step 1.
40 What do you Gain by Tracking
E. Support Services?
- Number of students receiving or referred to services, such as
  - Transportation
  - Child care
  provided either by the program or by program partner agencies
Step 1.
41 What do you Gain by Tracking
F. Expenditures?
- Which funds to include (state, federal, local funds, in-kind contributions)
- Unit of analysis (see the arithmetic sketch below):
  - Expenditure per student
  - Expenditure per outcome
  - Expenditure per instructional hour
  - Expenditure per student instructional hour
Step 1.
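Each unit of analysis is a simple ratio of total expenditure to a different denominator. A worked sketch with made-up totals for one local program (the figures are illustrative only):

```python
# Illustrative totals for one local program; all numbers are made up.
expenditure = 250_000.0               # funds the state decided to include
students = 400
outcomes_achieved = 290               # e.g., level completions plus credentials
instructional_hours_offered = 2_500
student_instructional_hours = 32_000  # total hours attended across all students

print(f"Expenditure per student:                    ${expenditure / students:,.2f}")
print(f"Expenditure per outcome:                    ${expenditure / outcomes_achieved:,.2f}")
print(f"Expenditure per instructional hour:         ${expenditure / instructional_hours_offered:,.2f}")
print(f"Expenditure per student instructional hour: ${expenditure / student_instructional_hours:,.2f}")
```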
42 Determining Measures for your Desk Monitoring Tool
Refer to H-4a-d
43 Determining Measures for your Desk Monitoring Tool
- State teams report on their measures in small groups
- Two other states in the small group ask clarifying questions and make suggestions/recommendations
- Process observers report what they observed and conduct a small group discussion
- The reporter shares with the larger group the similarities and differences among the teams, the types of clarifying questions asked, and whether the questioning process caused any team to re-think or refine the measures it selected.
Refer to H-5
44 Reflection on Day 1: Activities and Learning
- Questions?
- Individual Reflection
- Pluses and Deltas
Refer to H-6
45 NRS Desk Monitoring Tool
- Excel-based template
- Includes 25 measures of student outcomes, data collection procedures, and program processes
- States select the measures most relevant to their needs and design reports to include trends, comparison data, or performance targets to evaluate local performance.
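As an illustration of the kind of report a state might design around such a template (this sketch is not the Excel tool itself), the following compares one program's current-year values with last year's values and with a performance target. The measure names, values, and targets are invented.

```python
# Hypothetical report rows: each measure with last year's value, this year's
# value, and a state performance target. None of these figures are real.
program_measures = {
    "educational_gain_rate":  {"last_year": 0.42, "this_year": 0.47, "target": 0.45},
    "pct_pre_and_posttested": {"last_year": 0.71, "this_year": 0.64, "target": 0.70},
    "ged_pass_rate":          {"last_year": 0.55, "this_year": 0.58, "target": 0.60},
}

for name, m in program_measures.items():
    trend = m["this_year"] - m["last_year"]
    status = "meets target" if m["this_year"] >= m["target"] else "below target"
    print(f"{name:24s} {m['this_year']:.0%} ({trend:+.0%} vs. last year, {status})")
```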
Step 1.
46 Performance Standards and Performance Evaluation
Congratulations! You've identified the measures for your monitoring tool.
But you're not finished: now you need to define an acceptable level of performance.
47 Performance Standards and Performance Evaluation
The measures a state selects for its desk monitoring tool:
- Define what the state believes is important about program quality and performance
- Reflect state policy: they are the basis by which the state judges and funds adult literacy services.
Remember: Standard setting is a powerful method for changing behavior to implement state policy.
Step 1.
48 Three Models of Performance Standard-setting
- Continuous Improvement: trends over time
- Relative Ranking: comparisons with mean performance
- External Criteria: uniform standards
Step 1.
49 Performance Standard-setting Models
Model: Continuous Improvement
Policy/Strategy: Standard is based on the program's past performance; designed to make all programs improve compared with themselves.
Advantage: Works when there is stability and a history of performance on which to base the standard.
Disadvantage: A ceiling is reached over time, so that little additional improvement is possible.
Example: The program will show a 10% increase from last year in the number of students advancing to low-intermediate ESL.
Step 1.
50 Performance Standard-setting Models (Cont.)
Model: Relative Ranking
Policy/Strategy: Standard is the mean or median state performance; used for relatively stable measures, for which median performance is acceptable.
Advantage: Focuses program improvement mainly on low-performing programs.
Disadvantage: Higher performing programs have little incentive to improve.
Example: The percentage of students passing the GED in the program will be equal to or greater than the state average.
Step 1.
51 Performance Standard-setting Models (Cont.)
Model: External Criteria
Policy/Strategy: Standard is set by formula or external policy criteria; promotes adoption of a policy goal to achieve a uniform, higher standard of performance.
Advantage: Ensures an accepted level of quality; useful when large-scale improvements are needed over the long term.
Disadvantage: If set without considering past performance, may be unrealistic and unachievable.
Example: Programs will achieve at least a 50 percent response rate on the follow-up survey.
Step 1.
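The three models reduce to three different comparison rules. A minimal sketch follows; the thresholds (10% growth, the state average, a 50% response rate) come from the example rows above, and everything else is invented for illustration.

```python
# Three standard-setting models expressed as simple pass/fail checks.

def continuous_improvement(this_year: float, last_year: float, required_growth: float = 0.10) -> bool:
    """Standard is based on the program's own past performance."""
    return this_year >= last_year * (1 + required_growth)

def relative_ranking(program_value: float, state_average: float) -> bool:
    """Standard is the mean or median state performance."""
    return program_value >= state_average

def external_criteria(program_value: float, criterion: float = 0.50) -> bool:
    """Standard is set by formula or external policy, e.g., a 50% survey response rate."""
    return program_value >= criterion

# Illustrative checks for one hypothetical program.
print(continuous_improvement(this_year=120, last_year=100))      # students advancing: True
print(relative_ranking(program_value=0.58, state_average=0.55))  # GED pass rate vs. state: True
print(external_criteria(program_value=0.46))                     # follow-up response rate: False
```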
52 Putting it All Together: A Summative Rubric
Rubric:
- A scheme for classifying products or behaviors into categories that vary along a continuum
- A vehicle for describing varying levels of quality, from excellent to poor, or from "meets expectations" to "does not meet expectations" or "unacceptable"
- Examples of rubrics we are familiar with: letter grades, or the GED essay scoring guide
Step 1.
53 Advantages of Scoring Rubrics
- Clarify expectations about the characteristics of quality programs
- Set standards for program performance
- Provide an objective measure for examining and evaluating programs
- Can provide formative feedback to programs
- Can be used by programs for self-assessment and improvement
- Can lead to shared standards among staff and the public about what makes a good program.
Step 1.
54 Steps to Developing a Summative Scoring Rubric
1. Categorize measures
2. Combine or reduce measures
3. Develop scale
4. Assign values to measures
5. Test and refine
(State policy informs these steps.)
55 Steps to Developing a Summative Rubric
- Identify the measures you want to use; they should reflect state policy and priorities.
- Combine the measures and organize them into categories that match your state's interests and priorities.
- Develop a scoring scale by:
  - Defining an unacceptable level, e.g., <25 out of 100
  - Defining the lowest level of acceptable performance, e.g., 25-50 out of 100
  - Defining an intermediate level between the highest and lowest scores, e.g., 50-75 out of 100
Step 1.
56 Steps to Developing a Summative Rubric (Cont.)
- Assign values or scores to each measure to fit into the scale and to indicate whether a program met, did not meet, or exceeded the standard.
- You can weight scores to indicate the relative importance of each measure, e.g., more points for exceeding performance on educational gain measures.
- Weighting should reflect state policy and the direction you want programs to take.
Step 1.
57 Steps to Developing a Summative Rubric (Cont.)
- Test and refine the rubric (see the scoring sketch below):
  - Apply the categories and scoring rubric to data from your monitoring tool and note the distribution of your programs. Examine performance to see if it matches your scoring scheme.
  - The rubric should define performance along a continuum, e.g., does not meet/meets/exceeds expectations.
  - Your programs should not all cluster at either the high or the low end of the scale.
  - If they do, adjust the scale or the rubric.
Step 1.
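Putting the scale, values, and weights together, the sketch below scores two hypothetical programs and maps their totals onto the 100-point bands defined earlier (the top band, 75-100, is implied by the scale). The measures, weights, and performance judgments are illustrative only, not any state's actual rubric.

```python
from collections import Counter

# Illustrative weights for four measures (assumed, not a real state rubric).
WEIGHTS = {"educational_gain": 40, "ged_completion": 20, "pre_post_testing": 20, "enrollment": 20}

def score_measure(performance: str, weight: int) -> int:
    """Award zero, half, or full weight depending on performance against the standard."""
    return {"not_met": 0, "met": weight // 2, "exceeded": weight}[performance]

def rubric_score(program: dict) -> int:
    return sum(score_measure(program[m], w) for m, w in WEIGHTS.items())

def performance_level(score: int) -> str:
    """Bands from the scoring scale: <25, 25-50, 50-75, and 75-100 (implied top band)."""
    if score < 25:
        return "unacceptable"
    if score < 50:
        return "acceptable"
    if score < 75:
        return "intermediate"
    return "exceeds expectations"

programs = {
    "Program A": {"educational_gain": "exceeded", "ged_completion": "met",
                  "pre_post_testing": "met", "enrollment": "not_met"},
    "Program B": {"educational_gain": "met", "ged_completion": "not_met",
                  "pre_post_testing": "met", "enrollment": "met"},
}

# Test and refine: programs should spread along the continuum, not cluster at one end.
distribution = Counter()
for name, performance in programs.items():
    total = rubric_score(performance)
    distribution[performance_level(total)] += 1
    print(f"{name}: {total} points, {performance_level(total)}")
print("Distribution:", dict(distribution))
```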
58 Example from Colorado
- Enrollment: Up to 10 points for meeting enrollment projections and up to 10 points for enrollment of students most in need (20 points)
- Expenditure per learner instructional hour: Up to 25 points for expenditure of federal funds relative to other programs (25 points)
- Level completion: Points awarded according to the program's overall level-completion performance compared with the state average and according to the program's meeting of its individual targets, adjusted for enrollment in levels (45 points)
- GED completion: Up to 5 points for meeting targets for students obtaining an adult secondary diploma or passing the GED tests (5 points)
- Pre- and posttesting: Up to 5 points for meeting targets for the percentage of students pre- and posttested (5 points)
Step 1.
59 Example from Ohio
- Student achievement: Points awarded according to the program's meeting targets for level completion (each NRS level taught), GED, postsecondary transition, entered and retained employment, family literacy measures, enrollment, students pre- and posttested, and pre-test data; bonus points for exceeding the state overall completion average and exceeding the pre- and posttesting target (67 points)
- Administrative requirements: Points awarded according to the program's timely completion of state administrative requirements (33 points)
Step 1.
60 Example from Massachusetts
- Attendance: Points awarded according to actual and planned attendance hours (3 points)
- Average hours attended: Points awarded for meeting or exceeding state standards (3 points)
- Pre- and posttesting percentage: Points awarded for meeting or exceeding state standards (3 points)
- Learner gains: Total percentage of students who show learner gains that meet or exceed state standards (9 points)
Step 1.
61 Remember
- A desk monitoring tool can provide states with a handy mechanism for encouraging local programs to adhere to state policies.
- Once developed, the tool is not static and fixed for all time.
- States can change the measures in the monitoring tool when they want to focus local programs' attention on specific issues or measures.
62 Steps to Developing a Summative Rubric
Activity on reviewing and evaluating rubrics
Refer to H-7
63 Developing a Summative Rubric: Summary
- Categorize similar measures according to policy or importance to the state (e.g., retention, EFL gains)
- You don't need to use all measures
- Assign points based on performance (e.g., one point for meeting or exceeding standards; more points for meeting or exceeding, based on quartile rankings)
- Consider weighting or bonus points to stress importance
64 Steps to Developing a Summative Rubric
Your Turn! Developing Your State's Summative Rubric
Refer to H-8
65 Reflection on Day 2: Activities and Learning
- Questions on the monitoring tool or rubric development?
- Individual Reflection
- Pluses and Deltas
Refer to H-9
66 Congratulations!
You've completed the development of your monitoring tool, including standards and rubric.
But you're not finished: now you need to plan for its implementation and use.
67 Planning, Using, and Implementing Desk Monitoring
Jigsaw Readings:
- Group A: Planning and Development, pages 27 through 30 (up to Implementation)
- Group B: Implementation, pages 30 through 33 (up to Use and Results)
- Group C: Use and Results, pages 33 through 36 (up to Summary)
- All Groups: Summary section, pages 36 to 39
Refer to H-10
68 "Data analysis should not be about just gathering data. It's easy to get 'analysis paralysis' by spending time pulling data together and not spending time using the data."
- Victoria Bernhardt
69 Desk Monitoring: Planning, Development, Implementation, and Use
Planning and Development (Step 2)
- Review and refine
- Collaborate
- Field test
- Train staff
Implementation (Step 3)
- Provide sufficient resources
- Support with a data system
- Build acceptance and buy-in
- Evaluate and modify
Use (Step 4)
- Identify high and low performing programs
- Reward and sanction
- Provide technical assistance
- Improve local data use and state planning
70 Steps to Desk Monitoring
1. Design a Desk Monitoring Tool
2. Plan and Develop Monitoring Procedures
3. Implement Desk Monitoring
4. Use and Evaluate Desk Monitoring
5. Refine Process, as Needed
Step 2
71 Tips for Planning and Development
- Build components of desk review into your overall administrative system.
- Start an accountability task force to help plan your measures and approach.
- Have a plan for using every piece of information requested.
- Produce templates and blank drafts of graphs and reports before finalizing your system.
- Pilot test: try out desk monitoring for a year without tying it to rewards or sanctions.
- Have ongoing training for local staff.
- Provide reports with graphs and highlighted points, and walk through the reports with staff.
Step 2
72 Steps to Desk Monitoring
1. Design a Desk Monitoring Tool
2. Plan and Develop Monitoring Procedures
3. Implement Desk Monitoring
4. Use and Evaluate Desk Monitoring
5. Refine Process, as Needed
Step 3
73 Tips for Implementation
- Know that implementation takes three years or more.
- Assign state staff to focus on monitoring programs.
- Use a real-time data system: a web-based system or one that can be accessed across the state.
- Ensure that program directors have the ability to run their own reports on an as-needed basis.
- Provide comparisons or the ability to choose comparisons, with options for comparisons to other programs, classes, etc.
- Transparency in how the system works is important to local buy-in.
- Watch out for unintended effects.
Step 3
74 Steps to Desk Monitoring
1. Design a Desk Monitoring Tool
2. Plan and Develop Monitoring Procedures
3. Implement Desk Monitoring
4. Use and Evaluate Desk Monitoring
5. Refine Process, as Needed
Step 4
75 Tips for Use and Results
- Identify the highest and lowest performing programs, based on the rubric (see the sketch below).
- Resolve data quality issues.
- Provide rewards for good performance and sanction low-performing programs; recognition can be just as motivating as money.
- Have a strategy to determine which programs get technical assistance, using the data from desk monitoring to determine technical assistance needs.
- Consider using a peer system to provide technical assistance.
Step 4
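As one way to act on the first tip, the sketch below ranks programs by their summative rubric scores and flags the top and bottom quartiles for recognition and for technical assistance. The program names and scores are made up.

```python
# Hypothetical summative rubric scores (0-100) for eight local programs.
scores = {
    "Program A": 82, "Program B": 40, "Program C": 67, "Program D": 23,
    "Program E": 55, "Program F": 74, "Program G": 48, "Program H": 91,
}

ranked = sorted(scores, key=scores.get, reverse=True)
quartile_size = max(1, len(ranked) // 4)

recognize = ranked[:quartile_size]   # candidates for rewards or recognition
assist = ranked[-quartile_size:]     # candidates for targeted technical assistance

print("Recognize:", recognize)
print("Target for technical assistance:", assist)
```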
76 Steps to Desk Monitoring
1. Design a Desk Monitoring Tool
2. Plan and Develop Monitoring Procedures
3. Implement Desk Monitoring
4. Use and Evaluate Desk Monitoring
5. Refine Process, as Needed
Step 5
77 Data Analysis is a Team Sport
State and local staff work together to review program data, identify areas of excellence and areas needing attention, and develop a plan for program improvement.
78 Quote of the Day
"It's easy to get the players. Getting 'em to play together, that's the hard part."
- Casey Stengel
79 Next Steps?
- How can we help you?
- www.nrsweb.org
80 Best Wishes in Your Desk Monitoring and Program Improvement Efforts
Thank You and Good Luck!