Online Course Evaluations: Lessons Learned - PowerPoint PPT Presentation


Transcript and Presenter's Notes

Title: Online Course Evaluations: Lessons Learned


1
Online Course Evaluations: Lessons Learned
  • With a cast of thousands, including Susan
    Monsen, W. Ken Woo, Carrie Mahan Groce, Wayne
    Miller

2
Online Course Evaluations: Lessons Learned
  • Susan Monsen

3
Yale Law Experience
  • Course Evaluations were run by Student
    Representatives
  • Introduced the first online system in 2001
  • Changed the system twice and introduced incentives
  • For Spring 2005, had a 90% response rate

4
YLS OCE Version 1
  • First online course evaluation (OCE) Fall
    2001-Spring 2003
  • Home-grown web application with 18 questions
  • System did not scale for in-class completion
  • General email reminders sent to all students
  • No incentives
  • Response rate was less than 20%

5
Back to Paper
  • Returned to paper after 3 semesters of use
  • Reasons:
  • Low response rate
  • Wanted an easier-to-use interface for completing
    and viewing results
  • Wanted ability to add incentives

6
OCE Version 2 Design
  • Designed with input from student representatives
    and faculty
  • Modeled after the Yale College system
  • Reduced the number of questions to 8
  • Added a comment question
  • Students with evaluations to complete received a
    weekly email reminder (a rough sketch of such a
    job appears below)
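
The weekly reminder step is easy to underestimate. Below is a minimal
sketch of how such a job might look, assuming a simple SQLite schema
(students, enrollments, responses) and a campus SMTP relay; none of
these names come from the actual YLS system.

```python
# Minimal sketch of a weekly reminder job (hypothetical schema and mail relay).
import sqlite3
import smtplib
from email.message import EmailMessage

DB_PATH = "oce.db"                # hypothetical evaluation database
SMTP_HOST = "smtp.example.edu"    # placeholder campus mail relay

def students_with_pending_evals(conn):
    """Return (email, count) for students with unfinished evaluations."""
    return conn.execute(
        """SELECT s.email, COUNT(*) AS pending
           FROM enrollments e
           JOIN students s ON s.id = e.student_id
           LEFT JOIN responses r
             ON r.student_id = e.student_id AND r.course_id = e.course_id
           WHERE r.id IS NULL
           GROUP BY s.email"""
    ).fetchall()

def send_reminders():
    conn = sqlite3.connect(DB_PATH)
    with smtplib.SMTP(SMTP_HOST) as smtp:
        for email, pending in students_with_pending_evals(conn):
            msg = EmailMessage()
            msg["From"] = "registrar@example.edu"
            msg["To"] = email
            msg["Subject"] = f"Reminder: {pending} course evaluation(s) to complete"
            msg.set_content("Please complete your outstanding course evaluations.")
            smtp.send_message(msg)

if __name__ == "__main__":
    send_reminders()  # run once a week, e.g. from cron
```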

7
Incentives
  • Tested class time for completion
  • Worked for small to midsize classes
  • Response rate was about 90%
  • Load testing indicated the system could handle up
    to 75 simultaneous users (see the sketch below).
  • Introduced Grade Blocking
  • Students see a placeholder instead of a grade for
    those classes not evaluated.
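
The 75-simultaneous-user figure came from load testing. A crude way to
run that kind of test is sketched below with only the Python standard
library; the URL and user count are placeholders, not the actual YLS
deployment.

```python
# Rough load-test sketch: fire N simultaneous requests at a placeholder
# evaluation URL and report errors and the slowest response time.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen
import time

EVAL_URL = "https://evals.example.edu/form"  # placeholder, not the real system
CONCURRENT_USERS = 75

def hit(url):
    """Fetch the form once and return (HTTP status, elapsed seconds)."""
    start = time.time()
    with urlopen(url) as resp:
        resp.read()
        return resp.status, time.time() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(hit, [EVAL_URL] * CONCURRENT_USERS))
    errors = sum(1 for status, _ in results if status != 200)
    slowest = max(elapsed for _, elapsed in results)
    print(f"{len(results)} requests, {errors} errors, slowest {slowest:.2f}s")
```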

8
OCE 2 Results View
9
Response Rate by System
10
OCE 2 Response Rates
11
What did we learn?
  • Don't
  • Too many questions
  • No automated reminders
  • No incentives
  • Do
  • Incentives work!
  • Reminders help
  • Load test the system

12
CTEs Online
  • Presented by
  • Ken Woo
  • Director, Law School Computing
  • Northwestern University School of Law

13
When?
  • 1st Semester Spring 2004
  • 2nd Semester Fall 2005
  • 3rd Semester Spring 2005
  • Only 1.5 years into being online

14
When? (continued)
  • Paper system, Fall 2003: 80%
  • Paper system, Spring 2003: 77%
  • Paper system, Fall 2004: 70%
  • 1st semester online, Spring 2004: N/A
  • 2nd semester online, Fall 2005: 70%
  • 3rd semester online, Spring 2005: 67.8%

15
Why?
  • Wanted to push everything onto the Web.
  • Everyone had some sort of web access
  • Lose the papers and go paperless
  • Centralized storage location
  • On a centralized server
  • No Data Steward available
  • Access by Registrar and Registrar Team only
  • Professors can view their own results

16
Why? (continued)
  • Perceived as easier to manage
  • Changes were easier for Registrar
  • 3 types of forms
  • Standard (19 questions)
  • CLR (23 questions)
  • Clinic (18 questions)
  • Legibility was a small issue

17
Lessons Learned
  • Very similar to paper questions with some added
    questions for clarity
  • Participation rate is falling
  • Some ideas to increase participation
  • Withhold transcripts: no
  • Withhold final grades: no
  • Let students know they get no view of any results
    if they do not participate, starting next semester
    (Fall '06)

18
Q & A
  • CTEs Online Presented by
  • Ken Woo
  • Director, Law School Computing
  • Northwestern University School of Law

19
Online Course Evaluations: Lessons Learned
  • Carrie Mahan Groce

20
University of Denver Sturm COL Experience
  • Why Online Evaluations?
  • The Academic Dean was the instigator. Wanted
    better, more timely access to evaluations,
    particularly comments.
  • Hoped to get more meaningful written comments,
    both good and bad.
  • Our school has a culture of students and search
    committees using written comments.

21
University of Denver Sturm COL Experience
  • Web Manager built homegrown Cold Fusion
    application using current evaluation form and
    procedures as a start.
  • Data is pulled from the administrative (Banner)
    system.
  • Course and student data are stored in one
    database, results in a separate db for anonymity
    (sketched below).
  • Questions are generated dynamically.
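
The split between the roster database and the results database is the
key anonymity decision on this slide. A minimal sketch of the idea
follows, using Python and SQLite purely for illustration; the table and
column names are assumptions, not the structure of the actual Cold
Fusion application.

```python
# Sketch of split storage: completion status lives with the roster,
# while answers are stored with no student identifier at all.
import sqlite3

roster = sqlite3.connect("roster.db")    # courses and enrolled students (Banner extract)
results = sqlite3.connect("results.db")  # anonymous answers only

roster.execute("""CREATE TABLE IF NOT EXISTS completions (
                    student_id TEXT, course_id TEXT,
                    PRIMARY KEY (student_id, course_id))""")
results.execute("""CREATE TABLE IF NOT EXISTS answers (
                     course_id TEXT, question_id INTEGER, answer TEXT)""")

def submit(student_id, course_id, answers):
    """Record *that* a student finished (for reminders and incentives)
    separately from *what* they answered (kept anonymous)."""
    roster.execute("INSERT OR IGNORE INTO completions VALUES (?, ?)",
                   (student_id, course_id))
    for question_id, answer in answers.items():
        results.execute("INSERT INTO answers VALUES (?, ?, ?)",
                        (course_id, question_id, answer))
    roster.commit()
    results.commit()
```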

22
University of Denver Sturm COL Experience
  • Initial concerns taken into account:
  • Faculty: only registered students, one evaluation
    per student, and no evaluation after the exam.
  • Students: retain anonymity, and no faculty access
    to results before grades.
  • Additional student concern:
  • Students complained this format would be too time
    consuming. Not addressed, but later feedback
    suggests students appreciate freeing up class
    time.

23
University of Denver Sturm COL Experience
  • Additional faculty concerns and how they were
    addressed:
  • Lower response rates: a pilot was conducted to get
    a feel for response rates before faculty approval
    of online evals.
  • Comments would be too accessible, leaving less
    popular professors vulnerable: agreed that the
    Academic Dean could remove very negative comments
    from public view.
  • Not all courses followed the standard exam
    schedule: handled case by case.

24
University of Denver Sturm COL Experience
  • The Assoc. Dean wanted data to take to the
    faculty, and came to Ed. Tech.
  • Started with a pilot group in Fall '02: 7 profs
    and 10 courses participated.
  • Spring '03: all adjuncts and a handful of
    appointed faculty, 80 courses in all.
  • Summer '03: all courses participated.

25
University of Denver Sturm COL Experience
  • Evaluation procedures
  • The evaluation goes online 2 weeks prior to
    semester end and is available through the day
    prior to the start of exams. (Originally only the
    last two weeks of class; extended during the 1st
    pilot.) A sketch of this window rule follows.
  • Students receive emails with links to all their
    course evaluations and detailed instructions.
  • Reminder emails are sent every other day or so to
    those who have not completed.
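
The availability window described above (opens two weeks before the end
of classes, closes the day before exams begin) reduces to a simple date
rule. A sketch with placeholder dates is below.

```python
# Sketch of the evaluation window rule; the calendar dates are placeholders.
from datetime import date, timedelta

LAST_DAY_OF_CLASS = date(2005, 4, 29)   # hypothetical semester calendar
FIRST_DAY_OF_EXAMS = date(2005, 5, 4)

WINDOW_OPENS = LAST_DAY_OF_CLASS - timedelta(weeks=2)
WINDOW_CLOSES = FIRST_DAY_OF_EXAMS - timedelta(days=1)

def window_is_open(today=None):
    """True while evaluations may be submitted."""
    today = today or date.today()
    return WINDOW_OPENS <= today <= WINDOW_CLOSES
```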

26
University of Denver Sturm COL Experience
  • Results from the pilots were encouraging. Response
    rates were good (higher than paper), though
    inflated due to incentives and babysitting.
  • Summer was low, but it had a very short evaluation
    period.
  • The Dean took the data to the faculty for approval
    to move all courses online. Approval was given
    beginning Fall 2003.

27
University of Denver Sturm COL Experience
  • Response rates in a real-use setting

28
University of Denver Sturm COL Experience
  • Reasons for the drop in response rates
    (speculation):
  • Change of Academic Dean. The current dean is not
    invested; less hands-on encouragement.
  • Novelty wearing off. This year we had our first
    incoming class who never did a paper evaluation.
    No novelty factor; just another chore.

29
University of Denver Sturm COL Experience
  • What should we do?
  • Nothing? The assessment department is happy with
    70%, and we are getting better rates than other
    divisions.
  • My preference: get the new dean back on board,
    even more reminders, and more advertisement.
  • Better communication to faculty about timing so
    they can tell students what to expect.

30
University of Denver Sturm COL Experience
  • Next steps
  • More sophisticated results generation: advanced
    searching, the ability to compare profs side by
    side, and showing all evals for a professor or a
    course.
  • Streamline the course list interaction. Build
    direct access to the Banner system rather than
    pulling data out of the admin system. Not likely
    to happen.
  • Move from an Access back-end to SQL Server.

31
University of Denver Sturm COL Experience
  • Potholes to watch out for
  • Difficult to know how good the data is. We
    realized late that the person pulling lists didn't
    have permissions to get non-law students enrolled
    in law classes. No way to know that from looking
    at such large amounts of data: 150 courses and
    nearly 5,000 individual evaluations.
  • Different schedules for different courses can
    cause headaches. 1st-year Legal Writing wanted
    complete control over timing. Some courses finish
    early. Hard to keep those in institutional memory.
    Anytime an individual eval has a different
    schedule, response is lower.

32
University of Denver Sturm COL Experience
  • Potholes (cont.)
  • Complete anonymity made a few instances of
    students filling out one evaluation as though it
    were for a different professor tedious to sort
    out. Mostly resolved by adding the professor's
    name throughout the text of the eval, in as many
    places as possible.
  • Students want to retract an evaluation (usually a
    negative one). This semester was the first time we
    heard this request. The Academic Dean turned down
    all requests and shut the door to additional
    requests.

33
University of Denver Sturm COL Experience
  • And a sinkhole
  • A more pervasive problem with any ed tech project:
    once we do something, it becomes ours.
  • Problematic because we don't have the staff to
    take on administrative functions, nor have we been
    given the power to handle issues with those
    functions.

34
University of Denver Sturm COL Experience
  • Remedies?
  • Be proactive: never take too much control of a
    project. Build in as much administrative
    functionality as possible at the beginning.
  • If you've taken on too much, give it back: if it
    was their job before it was online, it should
    still be their job.
  • Easier said than done.

35
University of Denver Sturm COL Experience
  • Final words of wisdom
  • Don't try to reinvent the wheel. We found we had
    better buy-in when we agreed to keep the system as
    close to the original as possible.

36
Contact information
Carrie Mahan Groce, Web Manager, University of
Denver Sturm College of Law
cmgroce@law.du.edu, 303.871.6098
37
Online Course Evaluations: Lessons Learned
  • Wayne Miller

38
The Duke Law Experience
  • Introduced in Summer 2003 without much planning,
    when the scantron equipment failed and a
    replacement was deemed too expensive
  • My motivation was to provide a service to the law
    school that would benefit all: more efficient for
    staff and students, unmediated access for faculty,
    better community access to public information
    (summaries)

39
The Duke Law Experience
  • Homegrown, PHP-based survey software was employed
  • Student Information System provided rosters
  • Local email system provided authentication
    (through LDAP) for both students and faculty
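
Authenticating against the existing directory keeps the survey
application from storing its own passwords. The sketch below shows the
generic LDAP simple-bind pattern using the third-party ldap3 package;
the host name and DN layout are assumptions, and this is not the PHP
code Duke actually used.

```python
# Generic LDAP simple-bind check (assumed host and DN layout).
from ldap3 import Server, Connection

LDAP_HOST = "ldap.example.edu"                          # placeholder directory host
USER_DN = "uid={username},ou=people,dc=example,dc=edu"  # assumed DN layout

def authenticate(username, password):
    """Return True if the directory accepts a simple bind for this user."""
    server = Server(LDAP_HOST, use_ssl=True)
    conn = Connection(server,
                      user=USER_DN.format(username=username),
                      password=password)
    ok = conn.bind()
    conn.unbind()
    return ok
```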

40
Shortsightedness.
  • Paper form was copied without re-evaluation
  • 10 minutes for in-class completion of paper
    evaluations was given back to faculty
  • Incentives for students were not thought through

41
Clicking the radio button is awkward at best
42
Scale changes are very problematic
43
Things we designed right
  • The Registrar has direct control over which
    classes are included, which faculty are associated
    with each class, etc.

44
Things we designed right
  • Students can submit "conditional" evaluations
    when they fail to log in correctly or are not in
    our roster (a sketch of this check follows)
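
The "conditional" path can be as simple as tagging a submission for
registrar review instead of rejecting it. A hypothetical sketch of that
check follows; the names are illustrative, not Duke's actual code.

```python
# Hypothetical check: accept every submission, but flag it for registrar
# review when the student cannot be matched to the course roster.
def classify_submission(authenticated, student_id, roster):
    """Return 'normal' when the student is verified, else 'conditional'."""
    if authenticated and student_id in roster:
        return "normal"
    return "conditional"  # held aside until the registrar confirms enrollment

# Example: an unauthenticated submission is kept but flagged.
print(classify_submission(False, "jd2007", {"jd2005", "jd2006"}))  # conditional
```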

45
Things people want
  • Students want to be able to edit and save, and
    come back to evaluations
  • Registrar and some faculty members want
    individualized time windows for certain classes

46
Student Response Rate
  • A 70% response rate is required to share course
    eval summaries with the community
  • Students need constant cajoling or we need to
    provide a better incentive
  • Some faculty are apprehensive about including
    students who would not have been in attendance on
    the day of paper evaluations, and uneasy about
    cajoled students

47
Student Response Rate
Semester      Total Response Rate                 Class/Instr Making Cutoff
Fall 2003     66% (extended into exam period)     24/82 (29%)
Spring 2004   60%                                 36/119 (30%)
Fall 2004     52%                                 8/93 (8%)
Spring 2005   67% (dropped non-law students)      48/117 (41%)
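
As an illustration of the 70% release rule applied to figures like the
ones in this table, a small calculation is sketched below; the class
names and counts are hypothetical.

```python
# Sketch of the 70% release cutoff: a class's summary is shared only when
# enough enrolled students responded. All numbers below are made up.
def meets_cutoff(responses, enrolled, cutoff=0.70):
    return enrolled > 0 and responses / enrolled >= cutoff

classes = {"Contracts": (62, 80), "Evidence": (41, 75), "Seminar": (9, 12)}
eligible = [name for name, (r, n) in classes.items() if meets_cutoff(r, n)]
share = len(eligible) / len(classes)
print(f"{len(eligible)}/{len(classes)} classes ({share:.0%}) may share summaries")
```
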
48
Student Response Rate
49
Student Response Rate
Time scheduled for evals in large classes
50
Student Response Rate
Automated and person-specific email from
Associate Dean
51
Student Response Rate
Second automatic email from Associate Dean and
cajoling email from Registrar
52
Incentives under consideration
  • Withhold registration for following semester
  • Withhold grades
  • Withhold free printing
  • Withhold firstborn.

53
Issues
  • Security: not discussed much, but was a big part
    of planning
  • Privacy: a deal breaker for some students;
    responses are anonymized before release
  • Accuracy: faculty are suspicious of mix-ups;
    varying scales have confused students
  • Urban legends: stories abound among faculty about
    how Prof X saw everyone's evaluations, etc.

54
Future
  • The evaluation form is being reworked: easier to
    fill out, less confusing
  • Incentives are being considered
  • Scantron online/offline solutions are being
    weighed
  • Support could at any point be withdrawn
  • And probably would have been, were another
    solution easy to implement.

55
Contact information
Wayne Miller, Director of Educational Technologies,
Duke University School of Law
wmiller@law.duke.edu, 919-613-7243
http://edtech.law.duke.edu/