Title: The Ultimate Question: Does Your Technology Program Work?

1. The Ultimate Question: Does Your Technology Program Work?
- Elizabeth Byrom, Principal Investigator
- Anna Li, Evaluator
2. Objectives
- Think about the context for evaluating technology programs
- Identify the key elements of an evaluation model
- Walk through steps for developing an evaluation plan
- Identify evaluation resources
3. Why is evaluating technology programs a challenge?
- Differences among adopters
- Scale effects
- Geography
- Media as systems
- Rapid change
- Trail of use
- (B. Bruce, 2001)
4. Why is evaluation a challenge?
- Re-creation of technology
- New roles for teachers and students
- Technical characteristics
- Access
- (B. Bruce, 2001)
5. Observations from SEIRTEC
- Evaluation is often the weakest part of a technology program.
- Competing priorities
- Expertise
- Policymakers often have unrealistic expectations.
- Traditional measures do not always apply.
6. Some Things to Consider
- It takes four or five years for most teachers to become highly proficient in teaching with technology.
- Effective use of technology usually requires changes in teaching strategies.
7. Some Things to Consider
- It's the combined effect of good teaching, appropriate technologies, and a conducive environment that makes a difference in student achievement.
- Good technology does not make up for poor teaching.
8. Assessment and Evaluation
- Assessment: the measurement of knowledge, skills, and performance (e.g., student assessment, self-assessment)
- Evaluation: ways of examining overall technology programs as well as specifics of the program
9. Evaluation Process
- Select an evaluation model
- Identify performance indicators
- Identify or develop data collection methods or instruments
- Collect data
- Analyze data
- Write evaluation report(s)
- Use evaluation results to revise, maintain, augment, or eliminate program elements
10. Key Elements of an Evaluation Plan
- Logic map(s)
- Evaluation questions
- Indicators of success
- Information sources
- Criteria and benchmarks
- Outcomes
12. Professional Development Map
Inputs:
- Plans
- Needs assessments
- Mandates, policies
- Research, best practices
13. Evaluation Questions
- At least one question per objective
- Questions on:
- Accountability
- Quality
- Impact
- Sustainability
- Lessons learned
14. Indicators
(Diagram linking questions, indicators, methods, criteria, and outcomes)
15. Kinds of Questions
- Accountability: Is the program doing what it is supposed to do?
- Quality: How well are we implementing program activities and strategies? How good (useful, effective, well received) are products and services?
16. Kinds of Questions
- Impact: Is the program making a difference? What effects are services and products having on target populations?
- Proximal effects
- Distal effects
17. Proximal and Distal Effects
18. Kinds of Questions
- Sustainability: What elements are, or need to be, in place for a sustained level of improvement in teaching and learning with technology to occur?
- Lessons learned: What lessons are we learning about the processes and factors that support or inhibit the accomplishment of objectives?
19. Sample Questions
- To what extent are teachers using technology to increase the depth of student understanding and engagement?
- How have students been impacted by technology integration?
- How effective has our professional development been in helping teachers attain basic technology proficiency? In helping them learn effective teaching practices?
20. Indicators
- Definition: a statement of what you would expect to find out or see that demonstrates particular attributes
- Focus on:
- Quality, effectiveness, efficacy, usefulness,
client satisfaction, impact
21. Information Sources
- Self-reports
- Questionnaires
- Interviews
- Journals and anecdotal accounts
- Products from participants
- Samples of work, tests, and portfolios
- Observations
- Media: videotape, audiotape, photographs
- Archives
22. Sources of Information: Tracking Tools
- Milken Exchange Framework
- CEO Forum STaR Chart
- Learning with Technology Profile Tool (NCRTEC)
- SEIRTEC Technology Integration Gauge for Success
- Profiler (HPRTEC)
23. profiler.hprtec.org
24. Methods/Strategies for Collecting Data
- Questionnaire
- Survey
- Interview
- Focus group
- Observation
- Archival records
25. Criteria and Benchmarks
- Stick a stake in the ground and say, "We are here today"
- Likert-type scales or rubrics
- STaR Chart
- SEIRTEC Progress Gauge
- Percentages, e.g., a 75% passing rate
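A percentage criterion like the one above is straightforward to check programmatically. The Python sketch below is a minimal illustration: the 75% criterion comes from the slide, while the helper names and sample results are invented for the example.

```python
# Hypothetical sketch: compare an observed passing rate to a benchmark.
# The 75% criterion is from the slide; the student results are invented.

def passing_rate(results):
    """Fraction of students who passed, as a percentage."""
    return 100.0 * sum(results) / len(results)

def meets_benchmark(results, criterion=75.0):
    """True if the observed passing rate meets or exceeds the criterion."""
    return passing_rate(results) >= criterion

# Example: 8 of 10 students passed (True = pass).
sample = [True] * 8 + [False] * 2
print(passing_rate(sample))     # 80.0
print(meets_benchmark(sample))  # True
```

The same pattern applies to any "stake in the ground": record the benchmark once, then re-run the comparison as new data comes in.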
27. Outcomes
- Decisions are made about maintaining, changing, or eliminating aspects of the program
- Convincing evidence is gathered for proposals and plans
- Products are developed and distributed
- Reports
- Plans
28. Data Analysis
- Quantitative data
- Use an Excel spreadsheet
- SPSS for Windows or Mac
- Qualitative data
- Content analysis
- Look for emerging themes and summarize
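For small evaluations, both analysis paths above can also be done with a short script instead of Excel or SPSS. This Python sketch (the survey scores, theme keywords, and interview excerpts are all invented sample data) shows descriptive statistics for quantitative data and a crude keyword-count content analysis for qualitative data:

```python
# Sketch of the two analysis paths, using only the standard library.
# All sample data below is invented for illustration.
import statistics
from collections import Counter

# Quantitative: descriptive statistics on Likert-type scores (1-5).
scores = [4, 5, 3, 4, 4, 2, 5, 4]
print("mean:", statistics.mean(scores))
print("stdev:", round(statistics.stdev(scores), 2))

# Qualitative: a crude content analysis, counting how often coded
# themes appear across interview excerpts.
themes = {"access": ["lab", "computers", "internet"],
          "training": ["workshop", "training", "mentor"]}
excerpts = ["The workshop helped, but our lab has too few computers",
            "We need more training and reliable internet"]

counts = Counter()
for text in excerpts:
    words = text.lower().split()
    for theme, keywords in themes.items():
        counts[theme] += sum(words.count(k) for k in keywords)
print(dict(counts))
```

A real content analysis would code passages by hand or with more careful text processing; the keyword count here just illustrates the "look for emerging themes and summarize" step.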
29. Using Evaluation Results
- Make data-informed decisions
- Make a case for continued funding
- Inform research and the public
30. Evaluation Data on Impact
- SEIRTEC professional development models, listed by greatest impact (values are percentages):

Model          High Quality   Met Needs   Timely   Important Resource
Institutes     100            100         100      100
Academies      100            97.4        97.4     96.5
Core Groups    94.7           86.2        92.9     96.3
Workshops      89             84          87.5     85.9
Presentations  89.3           78.8        83.7     79
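One way to confirm the ordering by impact is to average each model's four ratings. The sketch below transcribes the figures from the table; using the simple mean of the four columns is my assumption about how "greatest impact" was ranked, not something the slide states.

```python
# Average the four ratings per professional development model and sort.
# Figures are transcribed from the slide; the simple mean of the four
# columns is an assumed ranking method, chosen for illustration.
ratings = {  # model: (high quality, met needs, timely, important resource)
    "Institutes":    (100, 100, 100, 100),
    "Academies":     (100, 97.4, 97.4, 96.5),
    "Core Groups":   (94.7, 86.2, 92.9, 96.3),
    "Workshops":     (89, 84, 87.5, 85.9),
    "Presentations": (89.3, 78.8, 83.7, 79),
}

averages = {model: sum(vals) / len(vals) for model, vals in ratings.items()}
for model, avg in sorted(averages.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{model:<13} {avg:.1f}")
```

Under this assumption, the ranking matches the slide's order, with Institutes first and Presentations last.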
31. Hints for Successful Evaluation
- Think positively: evaluation is an opportunity to learn.
- Try to be objective.
- Make evaluation an integral part of the program.
- Involve stakeholders in the evaluation process.
- Brag on your successes, however small.
- Ask for help when you need it.
32. Recommended Books
- King, Morris, and Fitz-Gibbon, How to Assess Program Implementation. Sage, 1987.
- Patton, Utilization-Focused Evaluation, 3rd edition. Sage, 1996.
- Patton, How to Use Qualitative Methods in Evaluation. Sage, 1987.
- Joint Committee on Standards for Educational Evaluation, The Program Evaluation Standards, 2nd edition. Sage, 1994.
- Campbell and Stanley, Experimental and Quasi-Experimental Designs for Research. Houghton Mifflin, 1963.
33. Evaluation Resources
- SEIRTEC web site: http://www.seirtec.org
- US Department of Education:
- http://www.ed.gov/pubs/EdTechGuide/
- http://www.ed.gov/Technology/TechConf/1999/whitepapers/paper8.html
- Education Evaluation Primer: http://www.ed.gov/offices/OUS/eval/primer1.html
34. Evaluation Resources
- Muraskin, Understanding Evaluation: The Way to Better Prevention Programs. http://www.ed.gov/PDFDocs/handbook.pdf
- National Science Foundation, User-Friendly Handbook for Program Evaluation: www.ehr.nsf.gov/EHR/RED/EVAL/handbook/handbook.htm
- W.K. Kellogg Foundation Evaluation Handbook: http://www.wkkf.org/Publications/evalhdbk/default.htm
35. For Further Information, Contact
- Elizabeth Byrom, Ed.D.: ebyrom@serve.org
- Anna Li: ali@serve.org
- SouthEast Initiatives Regional Technology in Education Consortium
- 1-800-755-3277, www.seirtec.org