ICT in Assessment and Learning

Transcript and Presenter's Notes

Title: ICT in Assessment and Learning

1
ICT in Assessment and Learning
  • Developments from the Enigma Project
  • Robert Harding, Nick Raikes
  • ITAL Unit: Interactive Technologies in Assessment
    and Learning

2
Introduction
  • ICT-led innovation
  • Exciting image
  • but assessment is Cinderella!
  • Holistic nature of 'The Learning System'
  • Assessment is a goal for teachers and learners
  • but valid assessment must be rooted in learning
  • How will new methods fit with existing practice?
  • The Three-legged race.

3
Outline
  • The Enigma Project trials
  • Scope
  • User interface
  • Evaluation
  • Traditional style questions (conceptual)
  • New interactive style questions (analytical)
  • Administrative issues
  • Developments triggered as a result
  • Brief resumé of some resulting work
  • Closer look at one aspect
  • on-screen marking
  • Conclusion.

4
The Enigma Project- Scope
  • Two years, 32 schools, about 170 / 120 pupils
  • Pilot 1: O-Level Physics examination in paper and
    CBT forms
  • A few graphics manipulation items
  • Pilot 2: selection of O-Level Science in CBT form
  • Conceptual objective items
  • Analytical simulation / free-text answer items
  • Marking
  • Objective / multiple choice: automatic (sketch
    below)
  • Open answers printed out, marked by hand
  • Comparative evaluation of pupils' performance.
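A minimal sketch of the automatic half of this marking split, assuming a hypothetical item bank and answer key (none of this is the Enigma Project's actual code):

```python
# Sketch: automatic marking of objective (multiple-choice) items.
# ANSWER_KEY and the sample responses are invented for illustration.

ANSWER_KEY = {"q1": "B", "q2": "D", "q3": "A"}  # item id -> correct option

def mark_objective(responses: dict) -> tuple:
    """Return (total marks, per-item correctness) for one candidate."""
    detail = {item: responses.get(item) == key  # missing answer -> False
              for item, key in ANSWER_KEY.items()}
    return sum(detail.values()), detail

score, detail = mark_objective({"q1": "B", "q2": "C"})
print(score, detail)  # 1 {'q1': True, 'q2': False, 'q3': False}
```

Open answers, by contrast, were simply collected, printed out and marked by hand.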

5
User Interface
  • Navigation
  • Time count-down
  • Progress indicator
  • Candidate bookmark
  • Audio feature for questions (session state sketched
    below).
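The features above amount to a small amount of per-candidate session state. A sketch of one way to model it, with field and method names that are illustrative assumptions rather than the Enigma software's:

```python
# Sketch: candidate-facing session state behind the interface
# features listed above. All names are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class ExamSession:
    total_questions: int
    seconds_remaining: int                        # time count-down
    current_question: int = 1
    answered: set = field(default_factory=set)    # progress indicator
    bookmarked: set = field(default_factory=set)  # candidate bookmark

    def progress(self) -> float:
        return len(self.answered) / self.total_questions

    def toggle_bookmark(self, q: int) -> None:
        self.bookmarked.symmetric_difference_update({q})

session = ExamSession(total_questions=26, seconds_remaining=90 * 60)
session.answered.add(1)
session.toggle_bookmark(2)
print(f"{session.progress():.0%} done, bookmarked: {sorted(session.bookmarked)}")
```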

6
(No Transcript)
7
User Interface
  • Navigation
  • Time count-down
  • Progress indicator
  • Candidate bookmark
  • Audio feature for questions.

8
Evaluation - traditional
  • Pilot 1: the evaluators concluded
  • For MCQs, no obvious differences between groups
    (3%)
  • For open-ended theory Qs, those on paper did
    better than those using CBT (11%)
  • Pilot 2: poor correlation between pilot and real
    examinations
  • Was computer aptitude responsible?
  • Was it a mismatch of trial vs real examination?

9
Evaluation - analytical
  • Objective - how do students react to ICT-oriented
    question styles?
  • Use of simulations
  • 6 Qs: 2 each in Physics, Chemistry, Biology
  • Panel of examiners
  • Typical example follows; note
  • access to simulation, free-text response boxes
  • ability to record student actions (sketched below).
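One plausible way to provide the action recording mentioned above: log every interaction with the simulation as a timestamped event that examiners can replay later. The event names and JSON format are assumptions, not the trial software's:

```python
# Sketch: recording candidate actions in a simulation item as
# timestamped events. Event names and structure are illustrative only.

import json, time

class ActionLog:
    def __init__(self):
        self.events = []

    def record(self, action: str, **details) -> None:
        self.events.append({"t": time.time(), "action": action, **details})

    def dump(self) -> str:
        return json.dumps(self.events, indent=2)

log = ActionLog()
log.record("set_slider", name="voltage", value=6.0)
log.record("run_simulation")
log.record("free_text_answer", box=1, text="The current doubles.")
print(log.dump())
```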

10
(No Transcript)
11
(No Transcript)
12
Evaluation - analytical
  • Student responses
  • Only 1 in 6 failed to understand what was
    required of them
  • but 40% said there were occasions when they were
    unsure what to do (but how true of most of us in
    many situations!)
  • Poor correlations between this test and real
    examination
  • BUT 70% said
  • computer-based tests are as fair as conventional
    tests
  • Computer literacy cited as most common reason for
    bias.

13
Evaluation - overall
  • 40% of students said the computer slowed them down
  • Interpretation: they are unfamiliar with
    computers in schoolwork
  • Conclude that learning and assessment must be
    integrated
  • 40% said the computer stopped them answering as
    they wanted
  • Most common complaint - not enough space to write
  • Did not like not being allowed to go back and
    change answers
  • 70% thought CBTs to be as fair as paper-based
  • Reasons given why not so fair: typing speed and
    (lack of) literacy.

14
Administrative issues
  • Schools' normal IT systems can interact with
    assessment software
  • Individual system crashes - incremental backup
    essential (see the sketch after this list)
  • A high level of expert attention was needed
  • Is printing needed?
  • Security - visibility of screens
  • At least one candidate observed using email!
  • Issues of accessibility to other software and
    data
  • Interference with normal working must be
    minimised.
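A sketch of the kind of incremental backup the trials showed to be essential: append each answer change to a local journal file as it happens, so a crash loses at most the last action. The file name and record format are illustrative assumptions:

```python
# Sketch: journal-style incremental backup of a candidate's answers.
# The file name and record format are invented for illustration.

import json

JOURNAL = "candidate_1234.journal"  # hypothetical per-candidate file

def save_answer(item_id: str, answer: str) -> None:
    """Append the latest answer to the journal immediately."""
    with open(JOURNAL, "a") as f:
        f.write(json.dumps({"item": item_id, "answer": answer}) + "\n")

def recover() -> dict:
    """Rebuild the latest answers after a crash by replaying the journal."""
    answers = {}
    try:
        with open(JOURNAL) as f:
            for line in f:
                record = json.loads(line)
                answers[record["item"]] = record["answer"]
    except FileNotFoundError:
        pass  # nothing saved yet
    return answers
```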

15
What developed? - Analysis
  • We have not seen a surge of CBT use in schools
  • Material circumstances and resources
  • Holistic nature of changes needed
  • Cast of influences
  • Can you box off summative assessment?
  • Traditional links between assessment and teaching
  • Examiner-teachers set questions, not Boards
  • Feedback loops
  • Learning process geared to passing examinations
  • Examinations are rooted in the educational
    'ambience'.

16
What we did - ambience
  • Teacher support for using ICT - TEEM
  • http://www.teem.org.uk/
  • Syllabus support
  • e-Lists and electronic communities
  • Electronic learning resources, e.g. Enigma
    simulations on the www
  • http://www.cie.org.uk/learning/science_practicals/

17
What we did - assessment
  • CALM / CUE / EQL collaboration
  • MEI A-Level Maths examinations
  • IMS standards and QTI workgroup
  • Question & Test Interoperability (rough
    illustration below)
  • Standards for examination conduct
  • On-screen marking.
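IMS QTI specifies an XML binding for portable questions and tests. As a rough stand-in only (JSON, and not the actual QTI schema), an interoperable item has to carry at least an identifier, the interaction itself, and response-processing rules:

```python
# Sketch: the information a portable question format must capture.
# This JSON shape is an illustration, NOT the IMS QTI XML binding.

import json

item = {
    "identifier": "phys-001",  # hypothetical item id
    "title": "Circuit current",
    "interaction": {
        "type": "choice",
        "prompt": "What happens to the current if the voltage doubles?",
        "options": {"A": "It halves", "B": "It doubles", "C": "Unchanged"},
        "max_choices": 1,
    },
    "response_processing": {"correct": "B", "marks": 1},
}
print(json.dumps(item, indent=2))
```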

18
On-screen marking
  • Scan scripts, or capture electronic scripts
  • What do we want it for?
  • Faster, more flexible marking and management
  • More efficient quality control
  • Better feedback to Centres
  • Transition to on-line assessment.

19
Study - proof of concept
  • Actual, live scripts
  • O-Level Maths (350)
  • A-Level Geography (900), Eng Lit (300)
  • 5 examiners per subject
  • Conventional and screen marked
  • Whole script, marking by question
  • Download scripts via Internet.

20
Features of the software
  • Ticks and crosses
  • Anchored comments (modelled in the sketch below)
  • Display control - e.g. zoom, script scroll
  • View marking guide
  • Navigation between questions or scripts
  • Progress monitor and referral.
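The ticks, crosses and anchored comments suggest a simple underlying model: every annotation is tied to a position on a script page image. The field names below are assumptions, not the trial software's format:

```python
# Sketch: annotations anchored to positions on a scanned script page.
# Field names and sample data are invented for illustration.

from dataclasses import dataclass

@dataclass
class Annotation:
    page: int
    x: float           # horizontal position on the page image, 0..1
    y: float           # vertical position, 0..1
    kind: str          # "tick", "cross" or "comment"
    text: str = ""     # comment body; empty for ticks and crosses
    marks: int = 0     # marks awarded at this anchor, if any

script = [
    Annotation(page=2, x=0.31, y=0.62, kind="tick", marks=1),
    Annotation(page=2, x=0.33, y=0.75, kind="comment",
               text="Method correct, arithmetic slip"),
]
print("Marks so far:", sum(a.marks for a in script))
```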

21
Examiners' impressions
  • Generally managed the downloading
  • Scripts at least as legible as on paper
  • Most felt they gave same marks
  • Exceptions in English Lit & Geography
  • Queries on trial nature, annotation, whole script
  • Maths: points re marking by question
  • All would mark on-screen again.

22
Analysis of marks
  • Maths component - consistency
  • Geography - mostly satisfactory
  • one examiner more severe on screen
  • one consistent on paper but not on screen
  • English
  • 2 more severe on screen
  • All more consistent on paper than on screen
  • Possible link with whole-script judgement? (see the
    illustrative calculation below)
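The severity/consistency comparison behind these findings can be illustrated with a small calculation over one examiner's paper and screen marks for the same scripts; the numbers are invented, not the trial's data:

```python
# Sketch: comparing one examiner's paper vs screen marks on the same
# scripts. The marks below are invented for illustration.

from statistics import mean, stdev

paper  = [14, 18, 11, 16, 20, 9]
screen = [13, 17, 10, 15, 18, 8]

diffs = [s - p for s, p in zip(screen, paper)]
print(f"mean screen-paper difference: {mean(diffs):+.2f}")  # severity
print(f"spread of differences: {stdev(diffs):.2f}")         # consistency
```

A negative mean suggests the examiner is more severe on screen; a small spread suggests the two modes are consistent with each other.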

23
Conclusions
  • Holistic nature of system
  • Three-legged race - or more?
  • Central role of teachers
  • Challenge
  • Integrate, make the technology invisible
  • Way forward - open source?

24
Some URLs and email
  • ITAL Unit
  • http://ital.ucles-red.cam.ac.uk/
  • Teacher support for using ICT - TEEM
  • http://www.teem.org.uk/
  • Enigma simulations on the www
  • http://www.cie.org.uk/learning/science_practicals/
  • Robert Harding <R.D.Harding@ucles-red.cam.ac.uk>
  • Nick Raikes <n.raikes@ucles-red.cam.ac.uk>

25
(No Transcript)
26
(No Transcript)
27
(No Transcript)