1
Online course evaluations: A software as a service model
  • Louise Lonabocker
  • Julie Salem
  • Boston College
  • Office of Student Services

2
Boston College
  • Enrollment: 14,500 students
  • Five undergraduate schools/colleges
  • Eight graduate and professional schools
  • Online course evaluation surveys: 130,000 per year
  • Current survey: 14 standard-response questions, 4 open-ended questions, and up to 5 custom questions entered by the faculty member

3
Project Phase I
  • Business case
  • Provost initiative: paper to online
  • Agreement from academic committees
  • Student PEPs
  • Results at end of term
  • Faculty-customized questions
  • Support for team-taught and cross-listed courses
  • Secure archival storage
  • Elimination of paper for surveys and reporting

4
Project Phase I
  • Project team formed: vice provost, associate dean, student services, academic technology, ITS, faculty
  • Vendor selection
  • ASP selected because the long-term strategy was uncertain
  • Business requirements, reference checks, contract negotiations, security, business continuity
  • Vendor: Pearson eCollege

5
Project Phase I
  • Phase I pilot: Spring 2006
  • Online optional: 60 percent opted for online
  • Survey
  • Maintain current survey, including 6 standard and 3 open-ended questions plus optional customized questions

6
Project Phase I
  • Communication
  • Provost letter to faculty
  • BC website portal announcements
  • Student posters and flyers: "Be heard," "We're listening"
  • Campus newspapers
  • Emails to faculty and students in advance, at
    launch, and during the administration of the
    survey

7
Project Phase I
  • Customized questions: 87 faculty for 123 courses
  • Five questions maximum, must use the agreement scale
  • Survey response rate: 37.3 percent

8
Project Phase I
  • Evaluation: Institutional Research surveys and focus groups
  • Strengths: ease of use, preservation of class time, better-quality responses to open-ended questions, fast reporting
  • Suggestions: customized question options, enhance the user interface, enhance reports, increase the participation rate

9
Project Phase II
  • Phase II: Fall 2006
  • Extend evaluations through final exams
  • Withhold early access to grades until the day after the last final for students who have not completed evaluations (a sketch of this gate follows the list)
  • Default to online evaluation, with a paper option for one more semester
  • Response rate: 88 percent
  • Institutional Research analyzed results by date of submission (before, during, or after finals) to assess the impact on instructor ratings and found neither substantive nor statistically significant differences
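
The grade hold described above is a simple gate. A minimal sketch in Python, assuming a hypothetical records system that knows each student's last final date and count of outstanding evaluations; the deck does not describe BC's actual implementation:

from datetime import date, timedelta

# Illustrative gate for early grade access; the function and its inputs
# are assumptions for this sketch, not BC's or eCollege's actual API.
def grades_visible(today: date, last_final: date, pending_evaluations: int) -> bool:
    grace_date = last_final + timedelta(days=1)  # day after the last final
    if today >= grace_date:
        return True  # the hold expires for everyone once finals are over
    return pending_evaluations == 0  # early access only with all surveys done

# Example: two evaluations outstanding during finals week, so no early access
print(grades_visible(date(2006, 12, 18), date(2006, 12, 21), 2))  # False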

10
Project Phase II
  • Reporting: more instructor detail, mean scores, distribution of responses, and a summary of all classes taught by each instructor (a per-question summary is sketched below)
  • Starting Fall 2007, administrators would have access to standard and open-ended responses, but not customized questions; previously they saw results for standard questions only
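
The report content described above reduces, per question, to a mean score and a distribution of responses. A minimal Python sketch on a 1-to-5 agreement scale; the function and output shape are illustrative, not the vendor's report format:

from collections import Counter

# Illustrative per-question summary: mean score plus the distribution of
# responses on a 1-5 agreement scale. Not the vendor's actual report logic.
def summarize(responses: list[int]) -> dict:
    dist = Counter(responses)
    return {
        "mean": round(sum(responses) / len(responses), 2),
        "distribution": {score: dist.get(score, 0) for score in range(1, 6)},
    }

print(summarize([5, 4, 4, 3, 5, 2]))
# {'mean': 3.83, 'distribution': {1: 0, 2: 1, 3: 1, 4: 2, 5: 2}}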

11
Project Phases III-V
  • Phase III: Spring 2007. All course evaluation surveys now online
  • Phase IV: Fall 2008. New survey questions developed by the University Council on Teaching: 14 standard and 4 open-ended questions
  • Phase V: Fall 2009. Student access to results of standard-response questions only

12
Online Course Evaluation Management
  • Survey Volume
  • Response Rates
  • Customizing Surveys
  • Reporting
  • Administrative Challenges

13
Surveys Deployed
  • Fall: early December, 58,000
  • Spring: early May, 58,000
  • Cornerstone: Dec 4 to Dec 19, 300
  • Fall midterm: mid-October, 800
  • Spring midterm: late February, 800
  • Summer sessions (2): 3,500

14
Response Rate
  • Access to grades
  • Communications: reminders
  • Number of reminders
  • Non-respondents only (the reminder pass is sketched after this list)
  • Direct link to survey
  • First reminder on the survey open date avoids confusion
  • A real-time response rate for faculty would be ideal
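
A minimal sketch of that reminder pass in Python, emailing only non-respondents and embedding a direct link to each survey; the Invitee record, send_email callback, and URL are assumptions for illustration, not eCollege's API:

from dataclasses import dataclass
from typing import Callable

@dataclass
class Invitee:
    email: str
    survey_id: str
    responded: bool

def survey_url(survey_id: str) -> str:
    # A direct link drops the student onto the survey itself,
    # avoiding a confusing login-and-navigate step.
    return f"https://evaluations.example.edu/survey/{survey_id}"

def send_reminders(invitees: list[Invitee],
                   send_email: Callable[[str, str], None]) -> None:
    for person in invitees:
        if person.responded:
            continue  # remind non-respondents only
        send_email(person.email,
                   f"Please complete your evaluation: {survey_url(person.survey_id)}")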

15
Response Rate Drivers
Reminder 1: May 5
Grades available: May 6
Reminder 2: May 9
Reminder 3: May 13
16
(No Transcript)
17
Summer Response Rates
18
Customized Surveys
  • Faculty may add 5 questions (a validation sketch follows this list)
  • Low adoption rate: less than 10% of faculty
  • Agreement scale only
  • No reuse or copy feature
  • No open-ended option
  • Priority for future enhancements
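
The constraints above (five questions at most, agreement scale only, no open-ended items) are straightforward to encode. A minimal Python sketch with illustrative names; the deck does not say how the vendor enforces them:

AGREEMENT_SCALE = ["Strongly disagree", "Disagree", "Neutral",
                   "Agree", "Strongly agree"]
MAX_CUSTOM_QUESTIONS = 5

def validate_custom_questions(questions: list[str]) -> list[dict]:
    # At most five custom questions; every one is forced onto the
    # agreement scale, since there is no open-ended option.
    if len(questions) > MAX_CUSTOM_QUESTIONS:
        raise ValueError(f"at most {MAX_CUSTOM_QUESTIONS} custom questions allowed")
    return [{"text": q, "scale": AGREEMENT_SCALE} for q in questions]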

19
Reporting
  • Biggest Challenge
  • Revisions to provide response distribution
  • Sample Reports
  • Automation of Reporting

20
(No Transcript)
21
(No Transcript)
22
(No Transcript)
23
(No Transcript)
24
(No Transcript)
25
(No Transcript)
26
(No Transcript)
27
Report Automation
  • First implementation expected Fall 2009
  • Automatic kick-off of reports
  • Web tool for manual report generation
  • Reports sent to vendor FTP site (retrieval sketched after this list)
  • Ability to upload to the report hosting tool
  • Email notification when reports are complete
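
A minimal sketch of the retrieval side in Python, polling the vendor FTP site for finished reports and sending a completion notice; the host, directories, and notify() callback are assumptions, since the deck names none of them:

import ftplib
import os

def fetch_new_reports(host: str, user: str, password: str,
                      remote_dir: str, local_dir: str, notify) -> None:
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            local_path = os.path.join(local_dir, name)
            if os.path.exists(local_path):
                continue  # already fetched on an earlier poll
            with open(local_path, "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)
            notify(f"Report ready: {name}")  # e.g. email the administrators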

28
Administrative Challenges
  • Data Transmission
  • Security Management
  • User Support: Responding to Emails
  • Troubleshooting
  • Data Verification

29
Next Steps: Wish List
  • Improved customized questions
  • Ability to automate email reminders
  • Survey-completed confirmation
  • Automated data transmission

30
For more information
  • Boston College Student Services
  • http://www.bc.edu/courseevaluations
  • Louise Lonabocker, louise@bc.edu
  • Julie Salem, salemju@bc.edu