Transcript and Presenter's Notes

Title: Progress Testing with SIR


1
Progress Testing with SIR
  • A Case Study
  • Based on the McMaster Undergraduate MD Programme

SIR UK User Group Conference, Aberdeen, UK, 21 June 2002
David Keane, Research Associate (keaned@mcmaster.ca)
Programme for Educational Research and Development
Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada
www.fhs.mcmaster.ca/perd
2
Objectives
  • 1. Introduction to progress testing
  • definition
  • purpose
  • method
  • goals
  • special features
  • basic patterns in performance data
  • 2. Using SIR for progress testing

3
Progress testing \ a definition
  • longitudinal testing of knowledge acquisition

4
Progress testing \ a definition
  • longitudinal testing of knowledge acquisition
  • an objective method for assessing the acquisition
    and retention of knowledge over time relative to
    curriculum-wide goals

5
Progress testing \ definition detail
an objective method for assessing the acquisition
and retention of knowledge over time relative to
curriculum-wide goals
  • objective: use multiple choice questions
  • knowledge: test what learners know
  • over time: test repeatedly at regular intervals
  • curriculum-wide: address end-of-programme learning objectives

6
Progress testing \ the purpose
  • to determine whether the learner is progressing
  • learning enough?
  • retaining enough?
  • doing so quickly enough?

7
Progress testing \ the method
  • progress is relative
  • compare learner to his/her peer group
  • current class or past n classes
  • standardized scores (z-scores; see the z-score sketch below)
  • review performance on multiple tests
  • current and past
  • assessed with one measure
  • percentage correct, whole test
  • adjust for guessing?
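
A minimal sketch of the comparison described on this slide, written in Python rather than the Programme's SIR/PQL: whole-test percentage correct, standardized against the peer group's mean and SD. Names and numbers are illustrative only.

```python
from statistics import mean, stdev

def percent_correct(responses: dict[str, str], key: dict[str, str]) -> float:
    """Whole-test measure: percentage of keyed items answered correctly."""
    correct = sum(1 for item, ans in responses.items() if ans == key.get(item))
    return 100.0 * correct / len(key)

def z_scores(scores: dict[str, float]) -> dict[str, float]:
    """Standardize each examinee's score against the peer group's mean and SD."""
    mu, sd = mean(scores.values()), stdev(scores.values())
    return {examinee: (s - mu) / sd for examinee, s in scores.items()}

# peer group = current class (could equally be the past n classes)
class_scores = {"e001": 42.2, "e002": 55.6, "e003": 48.9}
print(z_scores(class_scores))  # progress is relative: z > 0 means above the class mean
```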

8
Progress testing \ goals
  • help the learner (formative evaluation)
  • constructive feedback
  • about their knowledge base
  • about their ability to self-assess
  • has to be specific/detailed
  • timely feedback
  • reassure those who are progressing
  • alert those who are not
  • do so early enough to facilitate effective
    remediation

9
Progress testing \ goals
  • help the Programme (summative evaluation)
  • provide defensible evidence to support critical
    decisions pertaining to individuals
  • mandated remediation
  • pass / fail / conditional advancement
  • the emphasis..
  • on formative aspects
  • minimize negative impact on learning behaviours
  • tutorial functioning
  • self/peer-assessment

10
Progress testing \ special features
  • the item bank
  • a sample of the knowledge that a good student
    will likely encounter by the time that student..
  • graduates?
  • is six months / a year beyond graduation?
  • content encompasses nearly the 'entire' domain of
    the field in question
  • cf. course/curriculum 'core' knowledge

11
Progress testing \ special features
  • instructions to examinees
  • don't study for this test
  • 180 items, randomly selected from 2,600
  • don't guess
  • test your ability to self-assess
  • penalty for guessing (optional; see the formula-scoring sketch below)
  • attempt only those items for which you have some
    knowledge and are reasonably confident you know
    the best/correct answer
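
The deck does not spell out its penalty formula, but an optional guessing penalty is usually implemented as formula scoring: each wrong answer subtracts 1/(k-1) of a mark, so random guessing gains nothing on average while omitted items cost nothing. A hedged sketch of that idea:

```python
def formula_score(n_correct: int, n_wrong: int, n_options: int = 5) -> float:
    """Correction for guessing: expected gain from blind guessing is zero.
    Omitted items neither add nor subtract, which rewards honest self-assessment."""
    return n_correct - n_wrong / (n_options - 1)

# e.g. 60 right, 20 wrong, 100 omitted on a 180-item test with 5 options per item
print(formula_score(60, 20))  # 55.0
```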

12
Basic patterns in performance data
  • Class means on whole test for..
  • attempted
  • correct
  • not adjusted for assumed guesses
  • look at patterns.. (see the aggregation sketch below)
  • across time
  • week in programme (x of 138)
  • across classes at week x
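
One way to tabulate the patterns listed on this slide, assuming per-test records of (class, week in programme, class mean attempted, class mean correct). A sketch with illustrative numbers, not the Programme's actual report code:

```python
from collections import defaultdict
from statistics import mean

# (class, week in programme at test time, class mean % attempted, class mean % correct)
results = [
    ("2003", 22, 48.0, 35.0),
    ("2004", 22, 46.5, 33.8),
    ("2003", 60, 71.0, 55.2),
    ("2004", 60, 69.4, 54.1),
]

by_week = defaultdict(list)
for cls, week, attempted, correct in results:
    by_week[week].append((attempted, correct))

# means at week x, pooled across classes and tests
for week in sorted(by_week):
    att = mean(a for a, _ in by_week[week])
    cor = mean(c for _, c in by_week[week])
    print(f"week {week:>3}: attempted {att:.1f}%, correct {cor:.1f}%")
```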

13
(Figure: Items Attempted (%))
14
(Figure: Items Correct (%))
15
Basic patterns in performance data
  • attempted, correct
  • patterns are relatively stable across tests and
    classes
  • means at week x are relatively consistent across
    tests and classes
  • examinee performance is relatively consistent
    across tests and classes
  • overall test reliability 0.6 - 0.7 (8 tests)
  • test-retest correlation 0.6 - 0.8 (2 tests; see the reliability sketch below)
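
The quoted figures read as an internal-consistency estimate per test and a correlation between whole-test scores on consecutive tests. A minimal sketch of both computations (Cronbach's alpha and a Pearson test-retest correlation), assuming items are scored 0/1; this is an assumed reading, not code from the Programme:

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal consistency of one test; item_scores is an examinees x items 0/1 matrix."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def test_retest(scores_t1: np.ndarray, scores_t2: np.ndarray) -> float:
    """Pearson correlation between the same examinees' whole-test scores on two tests."""
    return float(np.corrcoef(scores_t1, scores_t2)[0, 1])
```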

16
End of Part I
  • Introduction to Progress Testing
  • Any questions?

17
Objectives
  • Introduce progress testing
  • 2. Using SIR for progress testing
  • what's in an item?
  • data management tasks
  • managing dm tasks
  • software
  • databases and pql
  • SIR \ valued features
  • future enhancements

18
What's in an item? (1)
  • the examinee sees..

nn. An elderly woman has been showing signs of
forgetfulness, poor concentration, and decreased
energy and appetite. On examination her cognitive
functioning seems quite good and her mini-mental
(Folstein) score is 27/30. The most likely
diagnosis is:
  A. Anxiety disorder
  B. Multi-infarct dementia
  C. Alzheimer disease
  D. Personality disorder
  E. Depression
(Stem, then Options)
19
What's in an item? (2)
  • the data manager sees..
  • stem and options (text)
  • and
  • unique item identifier
  • correct response code
  • content codes (6 fields, 1, 2 or 3 sets)
  • item performance data
  • stats on usage, power to discriminate
  • by test, class; across tests, classes
  • and more..
  • date last used, do-not-use flag (see the record sketch below)
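
A sketch of what the data manager's view of one item could look like as a record, using a Python dataclass to mirror the fields listed above. Field names are illustrative; the actual SIR schema is not shown in the deck.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Item:
    item_id: str                                              # unique item identifier
    stem: str                                                 # question text
    options: dict[str, str]                                   # e.g. {"A": "...", "E": "Depression"}
    correct: str                                              # correct response code, e.g. "E"
    content_codes: list[str] = field(default_factory=list)    # 6 fields, 1, 2 or 3 sets
    usage_stats: dict[str, float] = field(default_factory=dict)  # usage / discrimination by test, class
    last_used: date | None = None
    do_not_use: bool = False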

20
Data management tasks
  • 1. store, retrieve, manipulate and print
  • large volumes of textual information
  • pre-test test booklets
  • 180 items, 21-22 pages/booklet
  • post-test performance reports
  • for examinees: 2 reports x 12 pages/report
  • for administrator re. items, tests, classes
    and examinees who are not progressing
  • accommodate special needs re.
  • special characters: Greek letters, math symbols
  • page layout, fonts, typeface style
  • merge data into report templates (see the template sketch below)
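
In this case study the merging is done with WordPerfect merge forms; the same idea in a few lines of Python, with string.Template standing in for a merge form. The field names are made up for illustration.

```python
from string import Template

report = Template(
    "Progress Test Report\n"
    "Examinee: $examinee   Class: $cls   Week: $week\n"
    "Attempted: $attempted%   Correct: $correct%   z-score: $z\n"
)

print(report.substitute(examinee="e001", cls="2004", week=60,
                        attempted=69.4, correct=54.1, z=0.8))
```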

21
Data management tasks \ post-test
  • 2. read examinees' responses
  • 100-item optical mark response sheets
  • tab-delimited ascii records
  • max 2 sheets x approx. 280-420 examinees
  • 3. score examinees' responses
  • requires item, test, class, examinee info
  • compute and retain performance stats
  • key measures: attempted, correct
  • mean and SD re. whole test (and major subdomains?)
  • for each examinee, each class/peer group (see the scanning/scoring sketch below)
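
Tasks 2 and 3 amount to parsing the scanner's tab-delimited ASCII records and scoring them against the key. The real record layout is not given in the deck, so this Python sketch assumes: examinee id, sheet number (1 or 2), then 100 response columns per sheet.

```python
import csv

def read_sheets(path: str) -> dict[str, dict[str, str]]:
    """Read tab-delimited OMR records; two 100-item sheets cover one 180-item test."""
    responses: dict[str, dict[str, str]] = {}
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            examinee, sheet, *answers = row
            offset = (int(sheet) - 1) * 100
            per_examinee = responses.setdefault(examinee, {})
            per_examinee.update({f"q{offset + i + 1}": a
                                 for i, a in enumerate(answers) if a.strip()})
    return responses

def attempted_correct(resp: dict[str, str], key: dict[str, str]) -> tuple[int, int]:
    """Key measures per examinee: items attempted, items correct."""
    attempted = len(resp)
    correct = sum(1 for item, ans in resp.items() if ans == key.get(item))
    return attempted, correct
```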

22
Data management tasks \ post-test
  • 4. compute and retain item performance stats (see the item-stats sketch below)
  • requires item, test, class, examinee info
  • 5. compute/retrieve data needed in standard reports
  • re. examinees, classes, tests, items
  • 6. assemble and print reports
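
Item performance stats of this kind typically mean item difficulty (proportion correct) and discriminating power, e.g. a point-biserial correlation between the item and the rest-of-test score. A sketch under that assumption, in Python rather than PQL:

```python
import numpy as np

def item_stats(scores: np.ndarray) -> list[tuple[float, float]]:
    """scores: examinees x items matrix of 0/1 item scores.
    Returns (difficulty, discrimination) per item; discrimination is the
    point-biserial correlation of the item with the rest-of-test score."""
    stats = []
    total = scores.sum(axis=1)
    for j in range(scores.shape[1]):
        item = scores[:, j]
        rest = total - item                      # exclude the item from its own criterion
        difficulty = float(item.mean())          # proportion answering correctly
        discrimination = float(np.corrcoef(item, rest)[0, 1])
        stats.append((difficulty, discrimination))
    return stats
```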

23
Data management tasks
  • 7. enable support staff to do all of the above with relative ease
  • minimal reliance on the application programmer
    after everything is up and running

24
What's the best tool for the job?
  • SIR is not a word processor
  • SIR is a record management and stats-oriented
    reporting tool
  • allows user to build powerful custom applications
  • vendor provides exceptional support beyond the
    installed Help files
  • prompt, relevant and practical replies

25
Solutions \ the best tool
  • the MD Programme's solution
  • for text-intensive tasks..
  • Corel WordPerfect
  • for numeric data and stats-intensive tasks..
  • SIR

26
Solutions \ Corel WordPerfect
  • a set of merge data files (database)
  • case-based by item id
  • item stems, options and other fixed info
  • export data via csv or fixed-format records
  • import data via csv-format records
  • into merge data files
  • multiple merge forms (report templates)
  • extensive use of merge and macro commands to
    produce pre/post-test reports
  • custom-built merge/macro applications
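
The export/import bullets above describe moving item records between the WordPerfect merge files and SIR as csv. A minimal Python illustration of such a round trip; the field names and values are assumptions, not taken from the actual files.

```python
import csv

fields = ["item_id", "correct", "last_used", "stem"]
items = [{"item_id": "PSY-0042", "correct": "E", "last_used": "2002-05-14",
          "stem": "An elderly woman has been showing signs of forgetfulness..."}]

# export: csv-format records for the other tool to import
with open("items_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerows(items)

# import: add/update records and fields on the receiving side
with open("items_export.csv", newline="") as f:
    for record in csv.DictReader(f):
        print(record["item_id"], record["correct"])
```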

27
Solutions \ SIR ver. 3.2 - 2000
  • 2 databases, case-based
  • ITEMS re. items
  • TEEX re. tests, examinees, classes
  • major reliance on (vb) PQL
  • custom applications
  • csv-format records
  • add/update records/fields (eg, from WP)
  • write records/values (eg, for WP)
  • PQL procedures
  • csv save, tabulate, save table, spss save

28
SIR 2000 \ valued features (1)
  • DBMS
  • case-based option for db type
  • system-maintained
  • easy access to any case's records
  • case id is on all dumped records
  • global variables
  • pass user settings to applications
  • utilities
  • Data \ File Dump, File Input
  • tabfiles and tables
  • create, index, populate, delete tables

29
SIR 2000 \ valued features (2)
  • PQL
  • nested access to cases
  • read csv-format records
  • vb dialog boxes
  • PQL Procedures
  • write csv-format records
  • xtab tables, flexibility re. headers (columns)
  • write SPSS system files

30
Future enhancements
  • upgrade to SIR 2002 (from SIR 2000)
  • update custom applications (to vb pql)
  • add secondary indexes
  • examinees by name, current class
  • web access
  • for examinees' performance reports
  • method?
  • ColdFusion (CF -> SQL -> ODBC driver -> SIR db; see the ODBC sketch below)
  • CGI scripts
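
Either web route ultimately queries the SIR database. If an ODBC driver is exposed as the slide suggests, a script could pull an examinee's report data roughly like this; the DSN, table, and column names below are hypothetical, shown only to illustrate the idea.

```python
import pyodbc  # assumes an ODBC driver/DSN for the SIR database is configured

def fetch_report_rows(examinee_id: str):
    # "SIRDB", "performance", and the column names are hypothetical
    conn = pyodbc.connect("DSN=SIRDB")
    cur = conn.cursor()
    cur.execute(
        "SELECT test_id, attempted, correct, zscore FROM performance WHERE examinee_id = ?",
        examinee_id,
    )
    rows = cur.fetchall()
    conn.close()
    return rows
```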

31
End of Part II
  • Using SIR for progress testing
  • Any questions?