Technology-Mediated Assessment

1
Technology-Mediated Assessment
  • Jack McGourty, Columbia University
  • John Merrill, Ohio State University
  • Mary Besterfield-Sacre & Larry Shuman,
    University of Pittsburgh

Gateway Engineering Education Coalition
2
Technology-Mediated Assessment
  • Introduction
  • Your Expectations
  • Applications
  • Drexel and Columbia's Course Evaluation
  • Ohio State's Activities
  • Team Evaluator
  • Your Experiences
  • Enablers and Barriers (Break-out Groups)
  • Conclusions

3
Introduction
  • Reasons for On-Line Assessment
  • Common Applications
  • Design and Development
  • Things to Think About

4
Reasons for On-Line Assessment
  • Customized development
  • Targeted communication
  • Ease of distribution/no boundaries
  • Automatic data collection and analyses
  • Real time response monitoring
  • Timely feedback

5
Common Applications
  • Attitude Surveys
  • Multisource assessment and feedback
  • Course evaluations
  • Portfolios
  • Technology-mediated interviews
  • Tests

6
Design and Development
  • Item/Question development
  • Adaptive testing/expert systems
  • Multimedia tutorials
  • Dialogue boxes
  • Reporting wizards

7
Things to Think About
  • Confidentiality/Privacy
  • Response rates
  • Reliability/Validity
  • Ease of use
    • Administrators, end users
  • System growth
    • Can it easily be upgraded?
    • Adding modules
  • System flexibility
    • Survey/test construction
    • Data flexibility
    • Item databases
    • Reporting wizards
    • Data storage
  • Platforms
    • Specific vs. combination
  • Reporting
    • Various levels
    • Dissemination mechanisms
    • Real time vs. delayed

8
Technology in Education
Technology-Enabled Assessment: The Wave of the Future
  • Dr. John Merrill
  • The Ohio State University
  • Introduction to Engineering Program

9
Objectives
  • Explanation of web-based assessment tools
  • Uses of assessment tools
  • Virtual run-through of student actions
  • Lessons learned
  • Q&A

10
Web-Based Assessment Tools
  • Course Sorcerer (through WebCT)
  • Online Journal Entries
  • Course Evaluations
  • Team Evaluator
  • Peer Evaluations

11
WebCT
  • WebCT is a commercial web-based tool used for
    course management.
  • IE Program uses/capabilities:
    • Electronic grade book, chat rooms, bulletin
      boards, calendars
  • Provides links to:
    • Course material
    • Course Sorcerer
    • Team Evaluations (Team Evaluator)

12
Course Sorcerer
  • A simple, web-based evaluation tool created by
    Scott Cantor at University Technology Services
  • Technical specifications:
    • Written in ColdFusion
    • Runs on Windows NT with a Netscape Enterprise
      Web Server
    • Uses an MS SQL Server database with 15 tables
    • Server machine: PII-450 with 512 MB of RAM
    • Accesses Sybase running on Solaris 2.6 as a
      warehouse for roster data (a lookup sketch
      follows this list)
  • Used for Journal Entries & Course Evaluations
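
Since the transcript only lists the stack, here is a hedged Perl/DBI fragment illustrating the kind of roster lookup the Sybase warehouse supports. Course Sorcerer itself is written in ColdFusion, and the connection string, table, and column names below are hypothetical, not the system's actual schema.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical roster lookup against the Sybase warehouse;
    # Course Sorcerer itself does this in ColdFusion, not Perl.
    my $dbh = DBI->connect('dbi:Sybase:server=WAREHOUSE',
                           'sorcerer', 'secret', { RaiseError => 1 });

    my $sth = $dbh->prepare(
        'SELECT student_id, last_name FROM roster WHERE course_code = ?');
    $sth->execute('ENG181');

    while (my ($id, $name) = $sth->fetchrow_array) {
        print "$id\t$name\n";
    }
    $dbh->disconnect;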

13
Team Evaluator (Peer Evaluation)
  • Used by team members to provide confidential
    assessments (a minimal handler sketch follows
    this list)
  • System requirements:
    • Operating system: Windows 2000 with ActivePerl,
      or UNIX with Perl 5.004 or higher
    • Perl modules: CGI, DBI (plus SQL drivers), POSIX
    • SQL server: MySQL 3.23 or higher
    • Web server: IIS (Windows) or Apache 1.3 (UNIX)
    • CPU: Pentium II 400 or better recommended
    • Memory: 128 MB or higher recommended
    • Disk space: 100 MB for adequate database space
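
Given the requirements above (Perl with the CGI and DBI modules against MySQL), a minimal handler for recording one confidential peer rating might look like the sketch below. The ratings table, parameter names, and credentials are hypothetical, not Team Evaluator's actual schema.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use DBI;

    my $q = CGI->new;

    # Hypothetical database name, credentials, and schema.
    my $dbh = DBI->connect('dbi:mysql:database=teameval',
                           'eval', 'secret', { RaiseError => 1 });

    # Record one confidential peer rating.
    my $sth = $dbh->prepare(
        'INSERT INTO ratings (rater_id, ratee_id, item_id, score)
         VALUES (?, ?, ?, ?)');
    $sth->execute($q->param('rater'), $q->param('ratee'),
                  $q->param('item'),  $q->param('score'));

    print $q->header('text/html'),
          $q->p('Your confidential rating has been recorded.');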

14
Journal Entries
  • Students complete journal entries online every
    two weeks.
  • Submissions are anonymous.
  • All entries are read and summarized by a staff
    member and shared with the instructional team.
  • Instructional team members share the summaries
    with their classes.

15
Course Evaluations
  • Students in 181 & 182 complete online course
    evaluations at the end of each quarter.
  • Questions are designed to evaluate courses based
    on items a-k of Criterion 3, Program Outcomes
    and Assessment, in the ABET Engineering Criteria
    2000 (a mapping sketch follows this list).
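
One straightforward way to tie evaluation items to Criterion 3's a-k outcomes is a lookup table. The Perl fragment below sketches the idea; the question wording is illustrative, not the actual instrument.

    # Hypothetical mapping of EC2000 Criterion 3 outcomes to items.
    my %ec2000 = (
        a => 'This course improved my ability to apply math, science, and engineering.',
        d => 'This course improved my ability to function on multidisciplinary teams.',
        k => 'This course improved my ability to use modern engineering tools.',
    );

    # Print each selected outcome with its evaluation item.
    for my $outcome (sort keys %ec2000) {
        print "($outcome) $ec2000{$outcome}\n";
    }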

16
Short-Term Uses: Journal Entries & Course
Evaluations
  • Address immediate student concerns/questions
    about class, labs, or projects.
  • Inquire about student problems with specific
    topics and labs.
  • Discover general information from students
    regarding interests, influences, and attitudes.

17
Example: Addressing Immediate Student Concerns
  • How are the figures supposed to be done?
    Strictly isometric or just drawn so you can see
    everything? What pieces need to be labeled?
  • What are we doing in labs 6 & 7? I know it says
    in the syllabus that we are incorporating the
    sorting mechanism, but is that going to take two
    weeks?

18
Long-Term Uses: Journal Entries & Course
Evaluations
  • Improve program content
  • Improve course materials
  • Modify teaching styles
  • Evaluate course based on ABET criteria

19
Example: Improving Course Content
  • Positive:
    - Gained knowledge about circuits in general
    - Learned how to read schematics
    - Learned how to use breadboards
    - Further developed team-working skills
  • Negative:
    - The circuits did not work the first time.
    - Time ran short for both labs, but we did
      finish each circuit.

20
How It Works
  • Start at the WebCT site:
  • http://courses2.telr.ohio-state.edu

21
Completion Tracking
22
Lessons Learned: Journal Entries & Course
Evaluations
  • Students are more likely to complete entries and
    evaluations if given credit.
  • Students are extremely responsive to the
    anonymity of the online survey.
  • Students respond positively when asked for
    suggestions/solutions to problems in the class.

23
Web-Enhanced Course Evaluation at Columbia
University
  • Jack McGourty
  • Columbia University

24
Overview
  • A little history
  • How does course assessment fit into the big
    picture?
  • Why use web technology?
  • How is it being done?
  • Does it work?

25
History
  • Columbia's Fu Foundation School of Engineering
    and Applied Science began using the web for
    course assessment about four years ago, starting
    with a student-administered web site for results
  • Designed and developed a state-of-the-art system
    using student teams
  • Now building on the current infrastructure to
    include on-line tutorials and increased
    flexibility for administration

26
Student Web Site
  • Search by course or faculty
  • Current and past results
  • No comments

27
The Big Picture
  • Why are we assessing courses and programs?
    • Continuous improvement of the education process
    • What are we doing right, and what can we do
      better?
  • Integral part of our ABET EC2000 compliance
    • Develop a process
    • Collect and evaluate data
    • Close the loop
    • Document/archive results
  • Course evaluation is one of several outcome
    assessment measures, such as senior exit surveys,
    enrolled student surveys, and alumni surveys

28
How WCES Fits in
29
Using Technology
  • Pro:
    • Students have the time to consider their
      responses
    • Timely feedback
    • Responses are easily analyzed, archived, and
      distributed
    • Less paper
    • Lower cost/efficient administration
  • Con:
    • You lose the captive audience
    • You can't guarantee a diversity of opinions
      • Motivated/Non-motivated
      • Like course/Dislike course
    • Not necessarily less effort

30
Course Assessment Details
  • 10 core items
    • Course quality
    • Instructor quality
  • Relevant ABET EC2000 items
    • Pre-selected by the faculty member
  • Customized questions for specific course
    objectives (a form-assembly sketch follows
    this list)
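
A hedged sketch of how one course's form might be assembled from the three pools above; the item identifiers and the custom question are hypothetical, not WCES's actual format.

    # Hypothetical assembly of one course's evaluation form.
    my @core   = map { "core_$_" } 1 .. 10;   # the 10 core items
    my @ec2000 = qw(a d g);                   # outcomes pre-selected by the instructor
    my @custom = ('Did the design project support the stated course objectives?');

    # Final form: core items, then EC2000 items, then custom questions.
    my @form = (@core, (map { "ec2000_$_" } @ec2000), @custom);
    print join("\n", @form), "\n";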

31
Selecting EC2000 Questions
32
Monitoring Faculty Usage
One of our culture change metrics is the
percentage of faculty who are capitalizing on the
system and adding custom and EC2000 questions.
Currently it is around 15%.
33
Course Evaluation Results
  • Web page access
    • Current term's assessment
      • Limited time window
      • Limited access
      • Secure site
    • Previous terms' results
      • Open access to numerical results, not comments
  • Email results
    • Individual faculty
    • Aggregate data to department chairs

34
Reporting
35
Promoting Responses
  • Student-driven results website
  • Multiple targeted emails to students and faculty
    from the Dean
  • Announcements in classes
  • Posters all over the school
  • Random prize drawing

36
Closing the Loop
37
Does it Work?
  • Student response rates have steadily increased
    over the past two years, from 72% to 85%
  • More detail in student written comments in course
    assessments
  • Data are available that we never had before
  • Faculty use of the ABET EC2000 and customized
    question features is increasing but still limited
    (15%)

38
Cross-Institutional Assessment with a Customized
Web-Based Survey System
  • Mary Besterfield-Sacre & Larry Shuman
  • University of Pittsburgh

This work is sponsored by two grants: the
Engineering Information Foundation's EiF 98-01,
"Perception versus Performance: The Effects of
Gender and Ethnicity Across Engineering Programs,"
and the National Science Foundation's Action
Agenda EEC-9872498, "Engineering Education:
Assessment Methodologies and Curricula Innovations."
39
Why a Web-Based Survey System for Assessment?
  • Need for a mechanism to routinely:
    • Elicit student self-assessments and evaluations
    • Facilitate both tracking and benchmarking
  • Most engineering schools lack sufficient
    resources to conduct the requisite program
    assessments:
    • Expertise
    • Time
    • Funds
  • Triangulation of multiple measures

40
Pitt On-line Student Survey System (Pitt-OS3)
  • Allows multiple engineering schools to conduct
    routine program evaluations using EC2000-related
    web-based survey instruments
  • Assess and track students at appropriate points
    in their academic careers via questionnaires
  • Survey students throughout their undergraduate
    career:
    • Freshman (pre and post)
    • Sophomore
    • Junior
    • Senior
    • Alumni
  • Freshman orientation expanded to include:
    • Math Placement Examinations
    • Mathematics Inventory Self-Assessment

41
Student-Focused Model
42
System-Focused Model
43
Pitt OS3
  • Conduct routine program evaluation via surveys
    through the web
    • Data collection
    • Report generation (under development)
  • Web versus paper surveys
    • Pros:
      • Ease of administration
      • Minimized obtrusiveness
      • Cleaner data
    • Cons:
      • Lower response rates than paper-and-pencil
        surveys
      • User/technical issues

44
Pitt OS3: System Components
45
Pitt OS3: Local Administrator
  • An individual at the school where the surveys
    are being conducted
  • Responsible for administering the surveys
    through a web interface (a configuration sketch
    follows this list)
  • Controls the appearance of the survey:
    • Selects school colors
    • Uploads school emblem/logo
  • Selects survey beginning and ending dates
  • Composes initial and reminder email letter(s) to
    students
  • Cuts and pastes user login names and email
    addresses
  • Manages surveys in progress
    • Extends surveys beyond original dates
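
The settings above could be captured in something as simple as the hypothetical configuration record sketched here; the field names and the "login,email" paste format are illustrative assumptions, not Pitt OS3's actual interface.

    # Hypothetical survey configuration from a local administrator.
    my %survey = (
        color      => '#003594',      # selected school color
        logo       => 'emblem.gif',   # uploaded school emblem/logo
        start_date => '2001-04-02',   # survey beginning date
        end_date   => '2001-04-16',   # survey ending date
        reminder   => 'Please complete the freshman post survey.',
    );

    # Login names and email addresses pasted in as "login,email" lines.
    my @students;
    while (my $line = <STDIN>) {
        chomp $line;
        push @students, [ split /,/, $line ];   # [ login, email ]
    }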

46
Pitt OS3: Local Administrator
47
Pitt OS3: Local Administrator
48
Pitt OS3: Local Administrator
49
Pitt OS3: Local Administrator
50
Pitt OS3: Student
  • Java applet running in a web browser
  • One question per screen minimizes scroll-bar
    confusion
  • Once a student submits the questionnaire,
    results are compressed and sent to the OS3
    server (a server-side sketch follows this list)
  • Results are stored and the student's password is
    invalidated
  • A confirmation screen thanks the student for
    taking the survey
  • Can accommodate users who do not have email
    accounts
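
Server-side, the flow described above might be handled roughly as in this sketch: inflate the compressed payload, store it, and invalidate the student's one-time password so the questionnaire cannot be submitted twice. The Compress::Zlib call, schema, and parameter names are assumptions; the actual OS3 server is not documented here.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;
    use DBI;
    use Compress::Zlib;   # inflate the applet's compressed payload

    my $q    = CGI->new;
    my $data = uncompress($q->param('answers'));   # hypothetical field

    my $dbh = DBI->connect('dbi:mysql:database=os3',
                           'os3', 'secret', { RaiseError => 1 });

    # Store the results, then invalidate the password so the
    # questionnaire cannot be submitted twice (hypothetical schema).
    $dbh->do('INSERT INTO results (login, payload) VALUES (?, ?)',
             undef, $q->param('login'), $data);
    $dbh->do('UPDATE accounts SET password = NULL WHERE login = ?',
             undef, $q->param('login'));

    print $q->header, 'Thank you for taking the survey.';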

51
Pitt OS3: Sample Student Email
52
Pitt OS3: Student Welcome
53
Pitt OS3: Student Instructions
54
Pitt OS3: Questionnaire
55
Pitt OS3: How It Works
  • Every day, OS3 summarizes all active surveys for
    each school (a nightly-job sketch follows this
    list)
    • Summary reports the number of students who
      have and have not taken the survey
    • Specific students can also be viewed from the
      local administrator's account
  • Upon completion of the survey dates:
    • Email addresses are stripped from the system
    • Only login names remain with the results
  • The only time the OS3 system has student email
    addresses is when the local administrator is
    receiving daily updates about active surveys
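
A hedged sketch of the daily summary and of the post-survey email stripping, again in Perl/DBI; the surveys/accounts schema is hypothetical.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:database=os3',
                           'os3', 'secret', { RaiseError => 1 });

    # For each active survey, count who has and has not responded.
    my $sth = $dbh->prepare(q{
        SELECT s.survey_id,
               SUM(a.completed = 1) AS done,
               SUM(a.completed = 0) AS pending
          FROM surveys s JOIN accounts a ON a.survey_id = s.survey_id
         WHERE s.end_date >= CURDATE()
         GROUP BY s.survey_id
    });
    $sth->execute;
    while (my ($id, $done, $pending) = $sth->fetchrow_array) {
        print "survey $id: $done completed, $pending pending\n";
    }

    # Once a survey's dates have passed, strip email addresses so
    # that only login names remain with the results.
    $dbh->do(q{
        UPDATE accounts a JOIN surveys s ON a.survey_id = s.survey_id
           SET a.email = NULL
         WHERE s.end_date < CURDATE()
    });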

56
Pitt OS3: Sample Daily Report
57
Pitt OS3: Evaluation of the System
  • Piloted at five schools
    • Multiple surveys running concurrently at each
      school
    • Multiple schools at one time
  • Response rates vary (30% to 70% on average)
  • Example: University of Pittsburgh, April 2001
    • One initial email with two reminder emails
      over 2.5 weeks
    • Responses:
      • Freshmen: 70%
      • Sophomores: 48%
      • Juniors: 44%
    • Varied by department
    • Some usernames had

58
Pitt OS3: System Trace of One School
  • Freshman post survey
    • Survey available for two weeks with one
      reminder message
    • 57% overall response rate
    • Increased server traffic 2 to 24 hours after
      each email
  • Design concerns:
    • 63% of students had to log in more than once
      • Multiple logins due to case-sensitive
        passwords
    • 14% never finished (browser problems, or
      didn't want to finish)
    • 10% gave up (just didn't complete the login)

59
Pitt OS3: Issues to Consider
  • Consent for human subjects
    • Discuss with the institution's Institutional
      Review Board
    • Surveys are often exempt
  • Java applets are not supported by very old
    browsers
    • HTML as an alternative
  • Firewalls established by other organizations