5. Online Assessment and Evaluation Practices (PowerPoint Presentation Transcript)
1
5. Online Assessment and Evaluation Practices
  • Dr. Curtis J. Bonk
  • Indiana University and CourseShare.com
  • http://php.indiana.edu/cjbonk
  • cjbonk@indiana.edu

2
Online Student Assessment
3
Assessment Takes Center Stage in Online Learning
(Dan Carnevale, April 13, 2001, Chronicle of Higher Education)
  • One difference between assessment in classrooms
    and in distance education is that
    distance-education programs are largely geared
    toward students who are already in the workforce,
    which often involves learning by doing.

4
Focus of Assessment?
  1. Basic Knowledge, Concepts, Ideas
  2. Higher-Order Thinking Skills, Problem Solving,
    Communication, Teamwork
  3. Both of the Above!!!
  4. Other

5
Assessments Possible
  • Online Portfolios of Work
  • Discussion/Forum Participation
  • Online Mentoring
  • Weekly Reflections
  • Tasks Attempted or Completed, Usage, etc.

6
More Possible Assessments
  • Quizzes and Tests
  • Peer Feedback and Responsiveness
  • Cases and Problems
  • Group Work
  • Web Resource Explorations & Evaluations

7
Sample Portfolio Scoring Dimensions (10 pts each)
(see http://php.indiana.edu/cjbonk/p250syla.htm)
  1. Richness
  2. Coherence
  3. Elaboration
  4. Relevancy
  5. Timeliness
  6. Completeness
  7. Persuasiveness
  8. Originality

  1. Insightful
  2. Clear/Logical
  3. Original
  4. Learning
  5. Feedback/Responsive
  6. Format
  7. Thorough
  8. Reflective
  9. Overall Holistic

8
E-Peer Evaluation Form
  • Peer Evaluation. Name: ____________________
  • Rate on a scale of 1 (low) to 5 (high)
  • ___ 1. Insight: creative, offers analogies/examples, relationships drawn, useful ideas and connections, fosters growth.
  • ___ 2. Helpful/Positive: prompt feedback, encouraging, informative, makes suggestions & advice, finds & shares info.
  • ___ 3. Valuable Team Member: dependable, links group members, there for group, leader, participator, pushes group.
  • ___ Total Recommended Contribution Pts (out of 15)
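A minimal sketch, in Python, of how the three 1-5 ratings on this form roll up into the recommended 15-point contribution total; the criterion names and data layout are assumptions for illustration:

```python
# Hypothetical sketch: tally the three 1-5 peer ratings above
# into the recommended contribution score (out of 15).

CRITERIA = ("insight", "helpful_positive", "valuable_team_member")

def contribution_score(ratings: dict) -> int:
    """Sum one 1-5 rating per criterion; reject out-of-range ratings."""
    total = 0
    for criterion in CRITERIA:
        rating = ratings[criterion]
        if not 1 <= rating <= 5:
            raise ValueError(f"{criterion}: rating {rating} not in 1-5")
        total += rating
    return total  # max 15

# Example: one peer's ratings for a teammate
print(contribution_score({"insight": 4, "helpful_positive": 5,
                          "valuable_team_member": 3}))  # -> 12
```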

9
E-Case Analysis Evaluation
  • Peer Feedback Criteria
  • (1 pt per item; 5 pts per peer feedback)
  • (a) Provides additional points that may have been missed.
  • (b) Corrects a concept, asks for clarification where needed, debates issues, disagrees & explains why.
  • (c) Ties concepts to another situation or refers to the text or coursepack.
  • (d) Offers valuable insight based on personal experience.
  • (e) Overall constructive feedback.

10
Issues to Consider
  1. Bonus pts for participation?
  2. Peer evaluation of work?
  3. Assess improvement?
  4. Is it timed? Allow retakes if the connection is lost? How many retakes?
  5. Give unlimited time to complete?

11
Issues to Consider
  1. Cheating? Is it really that student?
  2. Authenticity?
  3. Negotiating tasks and criteria?
  4. How to measure competency?
  5. How do you demonstrate learning online?

12
Increasing Cheating Online ($7-30/page; http://www.syllabus.com/,
January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)
  • http://www.academictermpapers.com/
  • http://www.termpapers-on-file.com/
  • http://www.nocheaters.com/
  • http://www.cheathouse.com/uk/index.html
  • http://www.realpapers.com/
  • http://www.pinkmonkey.com/
  • (you'll never buy CliffsNotes again)

17
Reducing Cheating Online
  • Ask yourself, why are they cheating?
  • Do they value the assignment?
  • Are tasks relevant and challenging?
  • What happens to the task after it is submitted: reused, woven in, posted?
  • Due at end of term? Real audience?
  • Look at pedagogy before calling the plagiarism police!

18
Reducing Cheating Online
  • Proctored exams
  • Vary items in exam
  • Make course too hard to cheat
  • Try Plagiarism.com ($300)
  • Use mastery learning for some tasks
  • Random selection of items from an item pool
  • Use test passwords, rely on IP screening
  • Assign collaborative tasks
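Two of the tactics above, varying items per exam and drawing from an item pool, reduce to a seeded random draw; a minimal sketch (the pool, student ID, and item count are hypothetical), where seeding by student ID keeps a retake after a dropped connection identical:

```python
import random

def build_exam(item_pool: list, student_id: str, n_items: int = 10) -> list:
    """Draw a per-student exam form from a larger item pool.

    Seeding with the student ID makes the draw reproducible, so a
    student who loses the connection gets the same items on retry.
    """
    rng = random.Random(student_id)        # deterministic per student
    return rng.sample(item_pool, n_items)  # no repeated items

pool = [f"item-{i:03d}" for i in range(100)]  # assumed 100-item pool
print(build_exam(pool, student_id="s12345", n_items=5))
```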

19
Reducing Cheating Online ($7-30/page; http://www.syllabus.com/,
January 2002, Phillip Long, Plagiarism: IT-Enabled Tools for Deceit?)
  • http://www.plagiarism.org/ (resource)
  • http://www.turnitin.com/ (software, $100, free 30-day demo/trial)
  • http://www.canexus.com/ (software: essay verification engine, $19.95)
  • http://www.plagiserve.com/ (free database of 70,000 student term papers & cliff notes)
  • http://www.academicintegrity.org/ (assoc.)
  • http://sja.ucdavis.edu/avoid.htm (guide)
  • http://www.georgetown.edu/honor/plagiarism.html

22
Turnitin Testimonials
  • "Many of my students believe that if they do not
    submit their essays, I will not discover their
    plagiarism. I will often type a paragraph or two
    of their work in myself if I suspect plagiarism.
    Every time, there was a "hit." Many students were
    successful plagiarists in high school. A service
    like this is needed to teach them that such
    practices are no longer acceptable and certainly
    not ethical!
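Matching services such as Turnitin keep their algorithms proprietary, but the kind of verbatim "hit" the instructor describes can be illustrated with word n-gram ("shingle") overlap; a toy sketch, not any product's actual method:

```python
import re

def shingles(text: str, n: int = 5) -> set:
    """Lowercased word n-grams ('shingles') of the text."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's shingles found verbatim in the source."""
    sub = shingles(submission, n)
    return len(sub & shingles(source, n)) / len(sub) if sub else 0.0

# A high overlap score is a "hit" worth a closer look.
print(overlap("the quick brown fox jumps over the lazy dog",
              "he saw the quick brown fox jumps over a fence"))  # -> 0.4
```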

23
Online Testing Tools
26
Test Selection Criteria (Hezel, 1999)
  • Easy to Configure Items and Test
  • Handle Symbols
  • Scheduling of Feedback (immediate?)
  • Provides Clear Input of Dates for Exam
  • Easy to Pick Items for Randomizing
  • Randomize Answers Within a Question
  • Weighting of Answer Options
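The last two criteria, randomizing answers within a question and weighting answer options, are small pieces of logic; a sketch with a hypothetical item (option weights and the per-student seed are invented for illustration):

```python
import random

# Hypothetical item: options carry partial-credit weights
# ("weighting of answer options"), and display order is shuffled
# per student ("randomize answers within a question").
question = {
    "stem": "Which practice best deters online cheating?",
    "options": {"proctored exams": 1.0,
                "longer exams": 0.0,
                "randomized item pools": 0.5,
                "honor pledge": 0.25},
}

def present(question: dict, rng: random.Random) -> list:
    opts = list(question["options"])
    rng.shuffle(opts)          # order differs per student
    return opts

def score(question: dict, chosen: str) -> float:
    return question["options"].get(chosen, 0.0)

rng = random.Random("s12345")  # seed per student for reproducibility
print(present(question, rng))
print(score(question, "randomized item pools"))  # -> 0.5
```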

27
More Test Selection Criteria
  • Recording of Multiple Submissions
  • Timed Tests
  • Comprehensive Statistics
  • Summarize in Portfolio and/or Gradebook
  • Confirmation of Test Submission

28
More Test Selection Criteria (Perry & Colon, 2001)
  • Supports multiple item types: multiple choice, true-false, essay, keyword
  • Can easily modify or delete items
  • Incorporates graphic or audio elements?
  • Control over the number of times students can submit an activity or test
  • Provides feedback for each response

29
More Test Selection Criteria (Perry & Colon, 2001)
  • Flexible scoring: score first, last, or average submission
  • Flexible reporting: by individual or by item, plus cross-tabulations
  • Outputs data for further analysis
  • Provides item analysis statistics (e.g., test item frequency distributions)
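Flexible scoring and item frequency distributions amount to simple aggregation over submission records; a sketch with invented data:

```python
from collections import Counter
from statistics import mean

def grade(submissions: list, policy: str = "average") -> float:
    """Score first, last, or average submission ('flexible scoring')."""
    if policy == "first":
        return submissions[0]
    if policy == "last":
        return submissions[-1]
    return mean(submissions)

# Item analysis: response frequency distribution for one test item.
responses = ["a", "c", "c", "b", "c", "d", "c", "a"]
print(Counter(responses))            # -> Counter({'c': 4, 'a': 2, 'b': 1, 'd': 1})
print(grade([60, 75, 90], "last"))   # -> 90
```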

30
Web Resources on Assessment
  • http://www.indiana.edu/best/
  • http://www.indiana.edu/best/best_suggested_links.shtml
  • http://www.indiana.edu/best/samsung/
  • Rubric for evaluating technology projects:
  • http://www.indiana.edu/tickit/learningcenter/rubric.htm

31
Online Survey Tools for Assessment
32
Sample Survey Tools
  • Zoomerang (http://www.zoomerang.com)
  • IOTA Solutions (http://www.iotasolutions.com)
  • QuestionMark (http://www.questionmark.com/home.html)
  • SurveyShare (http://SurveyShare.com, from Courseshare.com)
  • Survey Solutions from Perseus (http://www.perseusdevelopment.com/fromsurv.htm)
  • Infopoll (http://www.infopoll.com)

33
Web-Based Survey Advantages
  • Faster collection of data
  • Standardized collection format
  • Computer graphics may reduce fatigue
  • Computer-controlled branching and skip sections
  • Easy to answer by clicking
  • Wider distribution of respondents

34
Web-Based Survey Problems: Why Lower Response Rates?
  • Low response rate
  • Lack of time
  • Unclear instructions
  • Too lengthy
  • Too many steps
  • Can't find URL

35
Survey Tool Features
  • Support different types of items (Likert,
    multiple choice, forced ranking, paired
    comparisons, etc.)
  • Maintain email lists and email invitations
  • Conduct polls
  • Adaptive branching and cross-tabulations
  • Modifiable templates & library of past surveys
  • Publish reports
  • Different types of accounts: hosted, corporate, professional, etc.
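Adaptive branching means the next question depends on earlier answers; a minimal sketch of the idea (hypothetical survey, not any listed tool's format):

```python
# Minimal adaptive-branching sketch: each answer maps to the id of
# the next question, so respondents skip sections that don't apply.
SURVEY = {
    "q1": {"text": "Have you taught online?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "Which tools did you use?", "next": {}},
    "q3": {"text": "Are you planning an online course?", "next": {}},
}

def run(survey: dict, answers: dict, start: str = "q1") -> list:
    """Return the path of question ids a respondent would see."""
    path, qid = [], start
    while qid:
        path.append(qid)
        qid = survey[qid]["next"].get(answers.get(qid))
    return path

print(run(SURVEY, {"q1": "yes", "q2": "WebCT"}))  # -> ['q1', 'q2']
```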

36
Web-Based Survey Solutions Some Tips
  • Send second request
  • Make URL link prominent
  • Offer incentives near top of request
  • Shorten survey; make it attractive, easy to read
  • Credible sponsorship, e.g., university
  • Disclose purpose, use, and privacy
  • E-mail cover letters
  • Prenotify of intent to survey

37
Tips on Authentication
  • Check e-mail access against list
  • Use password access
  • Provide keycode, PIN, or ID
  • (Futuristic/Other: palm print, fingerprint, voice recognition, iris scanning, facial scanning, handwriting recognition, picture ID)
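The first three tips combine naturally: check the respondent against an invitation list and require a one-time keycode. A sketch, assuming invited email addresses are known in advance:

```python
import secrets

def issue_keycodes(emails: list) -> dict:
    """Issue a random one-time keycode per invited email address."""
    return {email: secrets.token_hex(4) for email in emails}

def authenticate(keycodes: dict, email: str, code: str) -> bool:
    """Accept only an invited email presenting its own unused code."""
    if keycodes.get(email) == code:
        del keycodes[email]   # one-time: block duplicate submissions
        return True
    return False

codes = issue_keycodes(["pat@example.edu", "lee@example.edu"])
print(authenticate(codes, "pat@example.edu", codes["pat@example.edu"]))  # True
print(authenticate(codes, "pat@example.edu", "deadbeef"))                # False
```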

38
Evaluation
39
Champagne & Wisher (in press)
  • "Simply put, an evaluation is concerned with judging the worth of a program and is essentially conducted to aid in the making of decisions by stakeholders (e.g., does it work as effectively as the standard instructional approach?)."

40
Evaluation Purposes
  • Cost Savings
  • Improved Efficiency/Effectiveness
  • Learner Performance/Competency Improvement/Progress
  • What did they learn?
  • Assessing learning impact
  • How well do learners use what they learned?
  • How much do learners use what they learn?

41
Kirkpatrick's 4 Levels
  • Reaction
  • Learning
  • Behavior
  • Results

43
My Evaluation Plan
44
What to Evaluate?
  1. Student: attitudes, learning, jobs.
  2. Instructor: popularity, survival.
  3. Training: effectiveness, integration.
  4. Task: relevance, interactivity, collaboration.
  5. Tool: usable, learner-centered, friendly, supportive.
  6. Course: interactivity, completion.
  7. Program: growth, model(s), time to build.
  8. University: cost-benefit, policies, vision.

45
Measures of Student Success
(Focus groups, interviews, observations, surveys, exams, records)
  • Positive Feedback, Recommendations
  • Increased Comprehension, Achievement
  • High Retention in Program
  • Completion Rates or Course Attrition
  • Jobs Obtained, Internships
  • Enrollment Trends for Next Semester

46
1. Student: Basic Quantitative
  • Grades, Achievement
  • Number of Posts
  • Participation
  • Computer Log Activity: peak usage, messages/day, time on task or in system
  • Attitude Surveys

47
1. Student: High-End Success
  • Message complexity, depth, interactivity, questioning
  • Collaboration skills
  • Problem finding/solving and critical thinking
  • Challenging and debating others
  • Case-based reasoning, critical thinking measures
  • Portfolios, performances, PBL activities

48
2. Instructor Success
  • Technology training programs
  • Adequate funding
  • Utilize Web to share teaching
  • Positive attitudes, more signing up
  • Course recognized in tenure decisions
  • Understands how to coach

49
3. Training: Outside Support
  • Training (FacultyTraining.net)
  • Courses and Certificates (JIU, e-education)
  • Reports, Newsletter, Pubs (e.g., surveys)
  • Aggregators of Info (CourseShare, Merlot)
  • Global Forums (FacultyOnline.com; GEN: http://www.vu.vlei.com)
  • Resources, Guides/Tips, Link Collections, Online
    Journals, Library Resources (e-global Library)

50
TELEStraining.com
  • Courses
  • Web Training the Trainer: Designing, Developing, and Delivering Web-Based Training ($1,200 Canadian)
  • (8 weeks: technology, design, learning, moderating, assessment, course development)
  • Techniques for Online Teaching and Moderation
  • Writing Multimedia Messages for Training

51
Certified Online Instructor Program
  • Walden Institute: 12-Week Online Certification (Cost: $995)
  • 2 tracks: one for higher ed and one for online corporate trainers
  • Online tools and purpose
  • Instructional design theory & techniques
  • Distance ed evaluation
  • Quality assurance
  • Collaborative learning communities

52
Distance Ed Certificate Program (Univ of
Wisconsin-Madison)
  • 12-18 month self-paced certificate program, 20 CEUs, $2,500-3,185
  • Integrate into practical experiences
  • Combines distance learning formats to cater to
    busy working professionals
  • Open enrollment and self-paced
  • Support services

55
http://www.utexas.edu/world/lecture/
59
3. Training: Inside Support
  • Instructional Consulting
  • Mentoring (strategic planning)
  • Small Pots of Funding
  • Facilities
  • Summer and Year Round Workshops
  • Office of Distributed Learning
  • Colloquiums, Tech Showcases, Guest Speakers
  • Newsletters, guides, active learning grants,
    annual reports, faculty development, brown bags

60
Technology and Professional Development: Ten Tips to Make It Better (Rogers, 2000)
  • 1. Offer training
  • 2. Give technology to take home
  • 3. Provide on-site technical support
  • 4. Encourage collegial collaboration
  • 5. Send to prof development conference
  • 6. Stretch the day
  • 7. Encourage research
  • 8. Provide online resources
  • 9. Lunch bytes, faculty institutes
  • 10. Celebrate success

61
RIDIC5-ULO3US Model of Technology Use
  • 4. Tasks (RIDIC)
  • Relevance
  • Individualization
  • Depth of Discussion
  • Interactivity
  • Collaboration-Control-Choice-Constructivistic-Community

62
RIDIC5-ULO3US Model of Technology Use
  • 5. Tech Tools (ULO3US)
  • Utility/Usable
  • Learner-Centeredness
  • Opportunities with Outsiders Online
  • Ultra Friendly
  • Supportive

63
6. Course Success
  • Few technological glitches/bugs
  • Adequate online support
  • Increasing enrollment trends
  • Course quality (interactivity rating)
  • Monies paid
  • Accepted by other programs

64
7. Online Program or Course Budget
(http://webpages.marshall.edu/morgan16/onlinecosts/, brian.morgan@marshall.edu; asks how to pay, how large the course is, tech fees charged, # of courses, tuition rate, etc.)
  • Indirect Costs: learner disk space, coordination, phone, admin training, creating student criteria, accreditation, integration with existing technology and procedures, library resources, on-site orientation & tech training, faculty training, office space, supplies
  • Direct Costs: courseware, instructor, business manager, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage
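Turning the checklist into a budget is just summing estimates per line item; a toy tally in which every figure is invented for illustration:

```python
# Toy budget tally for the cost categories above; every figure is
# invented for illustration only.
direct = {"courseware": 5000, "instructor": 12000, "help_desk": 3000,
          "server_and_backup": 4000, "bandwidth": 2000}
indirect = {"faculty_training": 2500, "library_resources": 1500,
            "admin_coordination": 3500, "office_and_supplies": 1000}

total = sum(direct.values()) + sum(indirect.values())
per_student = total / 40   # assumed enrollment of 40
print(f"total ${total:,}; per student ${per_student:,.2f}")
# -> total $34,500; per student $862.50
```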

65
7. Program: Online Content Considerations
  • Live mentors?
  • Beyond content dumping?
  • Interactivity? Collaboration?
  • Individual or cohort groups?
  • Lecture or problem-based learning?
  • Record keeping and assessment?

66
8. Institutional Success
  • E-Enrollments from new students, alumni, existing students
  • Additional grants
  • Press, publication, partners, attention
  • Cost-Benefit model
  • Faculty attitudes
  • Acceptable policies

67
8. Increase Accessibility
  • Make Web material ADA compliant (Bobby)
  • Embed interactivity in lessons
  • Determine student learning preferences
  • Conduct usability testing
  • Consider slowest speed systems
  • Orientations, training, support materials
  • e.g., CD-ROM

68
8. Initial Lessons to Learn
  • Start small, be clear, flexible
  • Create standards and policies
  • Consider instructor compensation: online teaching is not the same
  • Look at obstacles and support structures
  • Mixed or blended may dominate

69
8. What steps to get it to work?
  • Institutional support/White Paper
  • Identify goals, policies, assess plans, resources
    (hardware, software, support, people)
  • Faculty qualifications & compensation
  • Audience needs: student or corporate
  • Finding funding & partnering
  • Test software:
  • usability testing, system compatibility, fit with tech plans

70
8. How long to build a program?
  • Year 1: Experimental Stage
  • Year 2: Development Stage
  • Hire people, create marketing materials, assess, etc.
  • Year 3: Revision Stage
  • Year 4: Move-On Stage

71
Final advice: whatever you do...
72
Ok, How and What Do You Assess and Evaluate?