Title: Training the OSCE Examiners
1. Training the OSCE Examiners
- Katharine Boursicot
- Trudie Roberts
2. Programme
- Principles of OSCEs for examiners
- Video marking
- Marking live stations
- Strategies for enhancing examiner participation in training
3. Academic principles of OSCEs
- The basics
- What is an OSCE?
- More academic detail
- Why use OSCEs?
- The role of examiners
- Examiners in OSCEs
4. The basics
- For examiners who don't know about OSCEs
- A brief reminder for those who are familiar with OSCEs
5. What is an OSCE?
- Objective
- Structured
- Clinical
- Examination
6. OSCE test design
[Diagram: circuit of stations]
7. OSCEs - Objective
- All the candidates are presented with the same test
8. OSCEs - Structured
- The marking scheme for each station is structured
- Specific skill modalities are tested at each station:
- History taking
- Explanation
- Clinical examination
- Procedures
9. OSCEs - Clinical Examination
- A test of the performance of clinical skills, not a test of knowledge: the candidates have to demonstrate their skills
10. More academic detail
- Why use OSCEs in clinical assessment?
- Improved reliability
- Fairer test of candidates' clinical abilities
11. Why use OSCEs in clinical assessment?
- Careful specification of content
- Observation of wide sample of activities
- Structured interaction between examiner and student
- Structured marking schedule
- Each student has to perform the same tasks
12. Characteristics of assessment instruments
- Utility
- Reliability
- Validity
- Educational impact
- Acceptability
- Feasibility
- Reference
- Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education 1996; 1: 41-67.
13. Test characteristics
- Reliability of a test/measure
- reproducibility of scores across raters, questions, cases, occasions
- capability of differentiating consistently between good and poor students
14. Sampling
[Diagram: sampling from the domain of interest]
15. Reliability
- Competencies are highly domain-specific
- broad sampling is required to obtain adequate reliability:
- across content, i.e. the range of cases/situations
- across other potential factors that cause error variance, i.e. testing time, examiners, patients, settings, facilities
16. OSCE blueprint

          History        Explanation              Examination     Procedure
CVS       Chest pain     Disch drugs              Cardiac         BP
RS        Haemoptysis    Smoking                  Resp            Peak flow
GIS       Abdo pain      Gastroscopy              Abdo            PR
Repro     Amenorrhoea    Abnormal smear           -               Cx smear
NS        Headache       -                        Eyes            Ophthalmosc
MS        Backache       -                        Hip             -
Generic   Pre-op assess  Consent for post mortem  IV cannulation  Blood trans rea
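A blueprint like this is just a grid of systems against skill modalities, and keeping it as data makes gaps in sampling easy to spot. A minimal sketch (a hypothetical representation using two rows of the blueprint, not part of any OSCE software):

```python
# Hypothetical blueprint grid: system -> {skill modality: station topic}.
# Missing keys reveal where sampling coverage is incomplete.
blueprint = {
    "CVS": {"History": "Chest pain", "Explanation": "Disch drugs",
            "Examination": "Cardiac", "Procedure": "BP"},
    "MS":  {"History": "Backache", "Examination": "Hip"},
}

modalities = ["History", "Explanation", "Examination", "Procedure"]

# List the (system, modality) cells with no station planned.
gaps = [(system, m) for system, row in blueprint.items()
        for m in modalities if m not in row]
print(gaps)  # -> [('MS', 'Explanation'), ('MS', 'Procedure')]
```

Scanning for empty cells this way is how broad sampling across content can be checked before the examination is assembled.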
17. Test characteristics
- Validity of a test/measure
- the test measures the characteristic (e.g. knowledge, skills) that it is intended to measure
18. Model of competence
[Diagram: Miller's pyramid of clinical competence, with professional authenticity increasing towards the apex]
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine 1990; 65 (Suppl): S63-S67.
19. Validity of testing formats
- Professional practice assessment
- Performance assessment: OSCEs, long/short cases, OSLERs, etc.
- Problem-solving assessment: EMQs, SEQs
- Knowledge assessment: MCQs
20. Test characteristics - Educational impact
- Relationship between assessment and learning
[Diagram: curriculum, teacher, student and assessment linked]
21. Test characteristics
- Feasibility
- cost
- human resource
- physical resources
22. Test characteristics
- Acceptability
- tolerable effort
- reasonable cost
- Acceptable to:
- doctors
- licensing bodies
- employers
- patients/consumer groups
- students
- faculty
23. The role of examiners in OSCEs
- General
- Types of stations
- Standard setting
- Practice at marking
24. The role of examiners in OSCEs
- To observe the performance of the student at a particular task
- To score according to the marking schedule
- To contribute to the good conduct of the examination
25. The role of examiners in OSCEs
- It is NOT to:
- Conduct a viva voce
- Re-write the station
- Interfere with the simulated patient's role
- Design their own marking scheme
- Teach
26. Types of OSCE stations
- History taking
- Explanation
- Clinical examination
- Procedures
27. Communication skills
- Stations involving patients, simulated patients or volunteers
- Content vs process, i.e. what the candidate says vs how the candidate says it
28. Clinical skills
- People
- Professional behaviour
- Manikins
- Describe actions to the examiner
29. The examiner's role in standard setting
- Use your clinical expertise to judge the candidate's performance
- Allocate a global judgement on the candidate's performance at that station
- Remember the level of the examination
30. Global scoring
Excellent pass
Very good pass
Clear pass
Borderline
Clear fail
31. Borderline method
[Diagram: examiners complete a station checklist (items 1-7 plus TOTAL) and give a global judgement of Pass, Borderline or Fail (P/B/F) for each candidate; the score distribution of the borderline group is plotted against the overall test score distribution, and the passing score is derived from the borderline candidates' scores]
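The borderline method reduces to a simple calculation: the passing score is taken from the checklist scores of the candidates whose global judgement was Borderline. A minimal sketch in Python, with invented scores and ratings (real data would come from the examiners' mark sheets):

```python
from statistics import mean

# Hypothetical (checklist total, global rating) pairs from one station.
results = [
    (18, "Clear pass"), (12, "Borderline"), (7, "Clear fail"),
    (14, "Borderline"), (20, "Very good pass"), (13, "Borderline"),
]

# The borderline group's checklist scores define the cut-off.
borderline_scores = [score for score, rating in results
                     if rating == "Borderline"]
passing_score = mean(borderline_scores)
print(passing_score)  # mean of 12, 14, 13 -> 13
```

Taking the mean of the borderline group is one common variant of the method; some schools use the median of that group instead.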
32. Regression-based standard
[Diagram: each candidate's checklist score is plotted against the examiner's overall rating (1 Clear fail, 2 Borderline, 3 Clear pass, 4 V good pass, 5 Excellent pass); the regression line through these points gives the passing score X at the Borderline rating]
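The regression-based standard fits a line through the (overall rating, checklist score) points and reads off the predicted score at the Borderline rating. A minimal sketch with invented data (ratings on the 1-5 global scale, where 2 = Borderline):

```python
# Hypothetical data: one point per candidate.
ratings = [1, 2, 2, 3, 3, 4, 5]        # 1 = Clear fail ... 5 = Excellent pass
scores = [5, 9, 11, 14, 16, 19, 24]    # checklist totals

# Ordinary least-squares fit: score = a + b * rating.
n = len(ratings)
mean_x = sum(ratings) / n
mean_y = sum(scores) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
     / sum((x - mean_x) ** 2 for x in ratings))
a = mean_y - b * mean_x

# The predicted checklist score at the Borderline rating (2) is the cut-off.
passing_score = a + b * 2
print(round(passing_score, 1))  # -> 10.0
```

Unlike the borderline-group mean, this uses every candidate's rating, which makes the standard more stable when the borderline group is small.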
33. Practice at marking
- Videos
- Live stations
- Mini-OSCE
35. Strategies for enhancing examiner participation
- CME
- Job plan/ part of contract
- Specific allocation of SIFT
- Experience for post-graduate examinations
- Payment