Title: The Search for Meaningful Evaluation of Online Programs
1. The Search for Meaningful Evaluation of Online Programs
- Marcie J. Bober, Ph.D.
- Dept. of Educational Technology
- San Diego State University
- bober@mail.sdsu.edu
- http://edweb.sdsu.edu/courses/ed791b/necc99/index.htm
2. What's fueling the move to online program delivery in higher ed?
- In some cases, Keegan (1996) argues:
- aging students who face multiple commitments (e.g., work and family).
- high maintenance costs associated with facilities and infrastructure.
- competition with large corporations or peripheral educational institutions with the expertise and the financial wherewithal to produce attractive instructional programs and products.
3. The legitimacy of online institutions ...
- Complicating the growing marketplace: a new breed of degree-mill operators who use attractive websites to ensnare potential customers.
- Degree mills tend to operate in states where the laws governing school operations are lax, e.g., Hawaii, Louisiana.
- Degree mills fill the growing demand for alternatives to traditional education, and our insatiable appetite for getting things done quickly.
4. Are we confusing quality with accreditation?
- The best way to check on a school's validity: find out which agency accredited it.
- The agency SHOULD be sanctioned by the US DoE OR the Council on Higher Ed Accreditation (http://www.chea.org).
- But accreditation is no assurance of quality, only legitimacy.
5. What do students have to say?
- Several questions posed to students enrolled in an ed tech master's program at SDSU:
- Experience with online learning
- Level of satisfaction w/ courses taken
- Interest in more such courses or an entire program
- Preferences for instructional strategies
6. What do students have to say?
- Several questions posed to students:
- Views on Internet security
- Comfort with posting work online
- Indicators of course success
- Feedback preferences
- Interactivity preferences
- Enticements to attend class online
7. Against what consistent criteria are program worth and merit assessed?
- There are few instructional or evaluative models to which faculty and administrators may turn.
- Those associated with the training or professional development fields (though well-respected and useful in their own right) may be inappropriate or inadequate.
8. Against what consistent criteria are program worth and merit assessed?
- We tend to over-rely on case study methodology, where individual faculty and staff design and conduct a course, and then report on its strengths and weaknesses.
9. The point of this session ...
- To foster dialogue:
- articulating criteria to help students determine which of several online programs meet their instructional needs (Porter, 1997).
- determining what factors--other than the semblance of convenience and flexibility--attract students to online programs.
10. The point of this session ...
- To foster dialogue:
- thinking systemically and systematically about program evaluation.
- thinking systemically and systematically about the ways in which credits or units are earned (and mastery is recognized).
11. Let's browse some sites
- Weber State University Online
- Western Governors University
- Christopher Newport University
- University of Phoenix
- Azusa Pacific University
- Nova Southeastern University
12. Let's browse some sites
- Michigan State University
- George Washington University
- Concord University School of Law
- Jones International University
13. Some definitions of evaluation ...
- According to the Merriam-Webster Dictionary (online), evaluate means "to determine the significance, worth, or condition of something, usually by careful appraisal and study."
- Weiss defines evaluation "... as the systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy" (p. 4).
14. An evaluation framework: some of the more obvious criteria to consider ...
- The degree to which the class website is well-constructed (Mitchell, in Slattery, 1998).
- The level and type(s) of student interactions (Gibbs & Fewell, 1997).
- The extent to which the user interface is intuitive and well-designed.
15. An evaluation framework: other dimensions of relevance ...
- The clarity and achievability of the course objectives.
- The quality (and breadth) of the content.
- Course pacing.
- System accessibility and stability.
- Accommodation of different learning styles.
16. An evaluation framework: other dimensions of relevance ...
- Support and scaffolding.
- Environmental richness.
- Interplay of technologies (ensuring that students learn about, not just from, technological systems).
- Appeal to adults (e.g., incorporation of andragogical principles).
17. What won't work
- Systems or structures that are:
- Simplistic
- Generic
- Overly focused on principles of design
18. An ongoing process ...
- Dean's Grant
- Ongoing discussion with peers
- Ongoing discussion with students
- Active participation in university policy-setting
- Thoughtful reflection