Title: Development and Deployment of a Web-Based Course Evaluation System
1. Development and Deployment of a Web-Based Course Evaluation System
- Jesse Heines and David Martin
- Dept. of Computer Science, Univ. of Massachusetts Lowell
Miami, Florida, May 26, 2005
2. The All-Important Subtitle
- Trying to satisfy ...
- the Students
- the Administration
- the Faculty
- and the Union
- presented in a slightly different order from
that listed in the paper
5. Paper-Based System Reality
- Distributed and filled out in classrooms
- Thus, virtually all students present that day fill them out
- However, absentees never fill them out
6. Paper-Based System Reality
- Distributed and filled out in classrooms
- Collected but not really analyzed
- At best, Chairs look them over to get a general feel for students' reactions
- Professors simply don't bother with them
- lack of interest and/or perceived importance
- simple inconvenience of having to go get them and
wade through the raw forms
7. Paper-Based System Reality
- Distributed and filled out in classrooms
- Collected but not really analyzed
- Lose valuable free-form student input because those comments are often ...
- downright illegible
- so poorly written that it's simply too difficult to try to make sense of them
8. Paper-Based System Reality
- Distributed and filled out in classrooms
- Collected but not really analyzed
- Lose valuable free-form student input
- However, these comments have the greatest
potential to provide real insight into the
classroom experience
9. Paper-Based System Reality
- Distributed and filled out in classrooms
- Collected but not really analyzed
- Lose valuable free-form student input
- However, these comments have the greatest potential to provide real insight
- Bottom Line 1: The paper-based system pays little more than lip service to the cry for accountability in college teaching
10. Paper-Based System Reality
- Bottom Line 2: We're all already being evaluated online whether we like it or not ...
11. (No transcript)
12. Web-Based System Goals
- Collect data in electronic format
- Easier and faster to tabulate
- More accurate analysis
- Possibility of generating summary reports
13. Web-Based System Goals
- Collect data in electronic format
- Easier and faster to tabulate
- More accurate analysis
- Possibility of generating summary reports
- Retrieve legible free-form responses
- Allow all students to complete evaluations
anytime, anywhere, at their leisure, and even if
they miss the class in which the evaluations are
distributed
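A minimal sketch of the kind of tabulation the electronic format makes possible (Python, with hypothetical field names; the talk does not show the system's actual storage layout or reporting code):

    from collections import defaultdict
    from statistics import mean

    # Hypothetical records: one per student answer to a rated question.
    responses = [
        {"question_id": "Q1", "rating": 5},
        {"question_id": "Q1", "rating": 4},
        {"question_id": "Q2", "rating": 3},
    ]

    def summarize(responses):
        """Count and average ratings per question -- the tabulation that
        is slow and error-prone with paper forms."""
        by_question = defaultdict(list)
        for r in responses:
            by_question[r["question_id"]].append(r["rating"])
        return {q: {"n": len(v), "avg": round(mean(v), 2)}
                for q, v in by_question.items()}

    print(summarize(responses))
    # e.g. {'Q1': {'n': 2, 'avg': 4.5}, 'Q2': {'n': 1, 'avg': 3}}

Once responses are grouped this way, generating the per-course and per-question summary reports mentioned above is a straightforward formatting step rather than a manual transcription job.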
14. What We Thought
- If we build it, they will come ...
- ... but we were very wrong!
15. Student Issues
- Maintain anonymity
- Ease of use
- Speed of use
16. Student Issues
- Maintain anonymity
- Ease of use
- Speed of use
- We guessed wrong on the relative priorities of
these issues.
17. Student Issues
- Our main concern
- Prevent students from stuffing the ballot box
- One Student One Survey Submission
18. Student Issues
- Our main concern
- Prevent students from stuffing the ballot box
- One Student One Survey Submission
- Major concern that appeared after the system was deployed
- Simply getting students to participate
- There appeared to be a great deal of apathy,
particularly in non-technical courses
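The slides do not spell out how the login scheme reconciles anonymity with one student, one survey submission. One common approach, sketched below purely as an illustration (the names and the hashing scheme are assumptions, not a description of the UML Lowell implementation), is to record that a student has submitted separately from what they submitted:

    import hashlib

    submitted = set()   # records (survey_id, token): who has already voted
    responses = []      # anonymous answers: no student identifier stored here

    def token_for(student_id, survey_id, secret="server-side-secret"):
        """One-way token, so the 'already voted' record never holds the raw ID."""
        raw = f"{secret}:{student_id}:{survey_id}".encode()
        return hashlib.sha256(raw).hexdigest()

    def submit(student_id, survey_id, answers):
        """Accept at most one submission per student per survey."""
        tok = token_for(student_id, survey_id)
        if (survey_id, tok) in submitted:
            return False                      # ballot-box stuffing blocked
        submitted.add((survey_id, tok))
        responses.append({"survey_id": survey_id, "answers": answers})
        return True

    assert submit("s123", "survey-42", {"Q1": 5})
    assert not submit("s123", "survey-42", {"Q1": 1})   # second attempt rejected

Because the stored answers carry no link back to the token, anonymity is preserved; the trade-off is that a submission cannot later be traced or retracted on a per-student basis.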
19. Student Login Evolution
Fall 2003
20. Student Login Evolution
21. Student Login Evolution
22. Student Login Evolution
23. Administration Issues
- System quality and integrity
- Buy-in from the deans
- But the real issue was ...
- Dealing with the faculty union
24. Faculty Issue 1
- Control of which courses are evaluated
- Contract wording
- The evaluation will be conducted in a single
section of one course per semester. ... At the
faculty member's option, student evaluations may
be conducted in additional sections or courses.
25. Union Issue 1
- In 2004, all surveys were turned on by default, that is, they were all accessible to students on the Web
- This was a breach of the contract clause stating that evaluation will be conducted in a single section of one course
- In 2005, the default is inaccessible
- Use of the system thus became voluntary
- As of May 20, 2005 (end of final exams), 95 professors (25% of the faculty) in 40 departments had made 244 course surveys accessible to students
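Seen from the software side, the 2005 remedy amounts to flipping a per-section default from on to off; a hypothetical sketch (the talk does not show the system's actual schema or field names):

    from dataclasses import dataclass

    @dataclass
    class SectionSurvey:
        section_id: str
        # 2004 behavior: every survey accessible unless turned off.
        # 2005 behavior: accessible = False unless the instructor opts in,
        # which keeps use of the system voluntary and within the
        # "single section of one course" contract language.
        accessible: bool = False

    def open_survey(survey: SectionSurvey) -> None:
        """Explicit, instructor-initiated opt-in."""
        survey.accessible = True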
26. Faculty Menu
27. Faculty Issue 2
- Control of what questions are asked
- Contract wording
- Individual faculty members in conjunction with
the Chairs/Heads and/or the personnel committees
of academic departments will develop evaluation
instruments which satisfy standards of
reliability and validity.
28. Union Issue 2
- In 2004, deans could set questions to be asked on all surveys for their college
- This was a breach of the contract clause stating that faculty would develop questions in conjunction with the Chairs/Heads and/or department personnel committees
- In 2005, all college-level questions are now at the department level so that only Chairs can specify required questions
- Deans then had essentially no access to the system unless they were teaching themselves or were the acting chair of a department
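The change can be thought of as a question-scope rule: the former college scope (settable by deans) is gone, and required questions attach only at the department or instructor level. A hypothetical sketch of that permission check (role and scope names are assumptions, not the system's actual code):

    # Scopes remaining after the 2005 change; "college" is intentionally absent.
    ALLOWED_SCOPES = {
        "chair":      {"department"},   # Chairs set department-wide required questions
        "instructor": {"instructor"},   # Faculty add questions to their own surveys
        # "dean" has no entry: no access unless also teaching or acting as chair
    }

    def may_add_question(role: str, scope: str) -> bool:
        return scope in ALLOWED_SCOPES.get(role, set())

    assert may_add_question("chair", "department")
    assert not may_add_question("dean", "college")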
29. Faculty Menu
30. Faculty Question Editor
31. Faculty Question Editor
32. Faculty Add Question Form
33. Survey as Seen by Students
34. Faculty Issue 3
- Control of who sees the results
- Contract wording
- Student evaluations shall remain at the
department level. At the faculty member's
option, the faculty member may submit student
evaluations or a summary of their results for
consideration by various promotion and tenure
review committees. The faculty member shall
become the sole custodian of these student
evaluations at the end of every three academic
years and shall have the exclusive authority and
responsibility to maintain or destroy them.
35. Results as Seen by Faculty
36. Union Issue 3
- Data was collected without faculty consent
- This was a breach of the contract clause stating that student evaluations shall remain at the department level
- All survey response data for the Fall 2004 semester were deleted on February 15, 2005, unless the faculty member explicitly asked that it be kept
- What's going to happen with this semester's data has not yet been determined
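Purely as an illustration of the February 15, 2005 deletion (the actual purge procedure is not described in the slides), the retention rule could be expressed like this, assuming hypothetical record and field names:

    def purge_semester(responses, semester, keep_requests):
        """Drop all response data for the given semester except for surveys
        whose faculty member explicitly asked that the data be kept."""
        return [r for r in responses
                if r["semester"] != semester or r["survey_id"] in keep_requests]

    data = [
        {"survey_id": "A", "semester": "Fall 2004", "answers": {}},
        {"survey_id": "B", "semester": "Fall 2004", "answers": {}},
        {"survey_id": "C", "semester": "Spring 2005", "answers": {}},
    ]
    # The faculty member for survey "B" asked that their data be kept.
    remaining = purge_semester(data, "Fall 2004", keep_requests={"B"})
    assert [r["survey_id"] for r in remaining] == ["B", "C"]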
37. Faculty Menu
38. Lessons Learned/Confirmed
- No matter what you do, there will be those who object → you must remain open-minded and flexible
- Practice good software engineering so that the software can be easily modified
- It's really worth it to work with the many power factions to garner support
- Every system needs a champion
- Be prepared to spend a huge amount of time on system support
39. Support, Support, Support
40. Thank You
Jesse M. Heines, Ed.D. and David M. Martin, Ph.D.
Dept. of Computer Science, Univ. of Massachusetts Lowell
heines,dm@cs.uml.edu
http://www.cs.uml.edu/heines,dm