Title: Faculty of Medicine, University of Newcastle
Faculty of Medicine, University of Newcastle
Newcastle upon Tyne, NE2 4HH
Tel 0191 222 5888
Fax 0191 222 5016
enquiries_at_ltsn-01.ac.uk
www.ltsn-01.ac.uk
Learning and Teaching Support Network for Medicine, Dentistry and Veterinary Medicine
- Providing professional educational support to teachers, students and practitioners in the UK, with subject-based provision for sharing innovation and good practice in learning and teaching
- Practical answers to practical problems
What Is the Learning and Teaching Support Network?
- 24 subject centres
- One stop shop
- Generic Centre
- Funded by UK HE Funding Councils for 3 years
- Broader remit than CTI Centres
Who Benefits From the LTSN?
- Individuals
- Departments
- Faculties and institutions
- Further information on all 24 centres
- www.ltsn.ac.uk
Learning and Teaching Support Network for Medicine, Dentistry and Veterinary Medicine
- Based in the Faculty of Medicine, University of Newcastle
- Partners at
- School of Veterinary Medicine, University of Edinburgh
- Royal College of Physicians, London
- www.ltsn-01.ac.uk
Objective
- To facilitate the development of caring,
knowledgeable, competent and skilful graduates
who broadly understand health and disease and who
are able to benefit from subsequent education and
adapt to future developments in practice.
Using OSCEs
Using OSCEs
- Aim
- To explore the use of OSCEs in assessment in veterinary medicine
- Learning outcomes
- Review the principles of assessment
- Increased understanding of the OSCE, its advantages and disadvantages
- Consider the feasibility and desirability of introducing OSCEs into assessment schedules in veterinary science
Using OSCEs
- Content
- Exercise
- Some thoughts about clinical reasoning
- Review of principles of assessment
- What is an OSCE? - video
- More OSCE theory
- Exercise - design an OSCE station
- Evaluation
Using OSCEs
- An exercise
- Divide into 3s
- Context: an examination
- Roles
- Examinee
- Examiner
- Patient
- Perform the test - 4 minutes only!
Some thoughts about clinical reasoning
Clinical reasoning
- Complex!
- Correlates strongly with knowledge (organisation and access, rather than quantity)
- Qualitative difference between novices and experts
- novices: inductive and deductive processes
- experts: pattern recognition
- Not a generic skill, i.e. is content-specific
Principles of assessment
- "Examinations are formidable even to the best prepared, for the greatest fool may ask more than the wisest man can answer."
- (Charles Colton, 1780-1832)
- Assessment of learning
- The tail that wags the dog
A definition of assessment
- A systematic procedure for measuring a sample of a learner's behaviour in order to make a judgement about the learner
Why assess?
- Measuring academic achievement
- Assuring standards
- Diagnosing problems
- Encouraging appropriate learning
- both content (what?) and process (how?)
- Evaluation of teacher or course effectiveness
- Predicting future performance
Overall aim of assessment
- To attempt to approximate to the real
professional/educational world whilst maintaining
standardised test conditions, at a level
appropriate to the learner
However
- Assessment of professional competence is complex and no single method can assess it fully
- need for triangulation
- Often the wrong test format is used to assess a particular area of learning
- Practice is bedevilled by prejudice, opinions, sentiments, traditions, values, intuitive beliefs and past experiences
- Assessment is often seen as a bolt-on extra rather than an integral part of the learning process
The adverse effects of some assessments used historically include...
- Creation of a hurdle-jump, pass-and-forget mentality
- Restriction of learning to perceived exam content
- Loss of self-esteem, humiliation, even career difficulties
- Engendering competitiveness?
- Wrong decisions
Are we assessing appropriately?
From "Undergraduate examinations - a continuing tyranny", Richard Godfrey, Lancet 1995
- Are all assessments planned in full integration with the rest of the course?
- Does every assessment have clearly defined objectives, relating to purpose, content, method, and effect on student learning?
- Are all assessments of proven reliability, validity and practicality?
- Is regular formative assessment being practised, and are summative assessments being reduced in favour of formative?
- Are methods that require critical reasoning being favoured over rote-recall tests?
- Are students being permitted to take books and other resources into examinations?
- Are all aspects of clinical work assessed, including skills in communication and the recognition of key features?
- Are assessments requiring original investigation being increased, such as in-depth case studies, lit reviews etc?
- Do students keep a running portfolio of their assessment performance, with a defined guide on what constitutes satisfactory progress?
- Have all gradings except satisfactory/unsatisfactory been abandoned?
Concepts
- Summative and formative
- Norm referencing
- Criterion referencing
- Blueprinting
- Standard-setting
Content
- Should relate to intended learning outcomes
- Areas of learning to be tested may include
- knowledge
- skills
- attitudes
- Need a representative sample of things to be
tested
Utility of an assessment instrument
- Utility = V x R x A x E x C
- V = validity
- R = reliability
- A = acceptability
- E = educational impact
- C = cost
Validity
- Does it measure what it is supposed to measure?
- Ideally, to ensure validity the instrument should
- appear to measure the item(s) of interest (face validity)
- contain a representative sample of items to be assessed, and measure items of interest enough times (content validity)
- be capable of measuring the item(s) of interest (construct validity)
- be able to predict future change in behaviour (predictive validity)
Reliability
- The extent to which the instrument consistently measures what it is supposed to measure. Thus
- different examiners assessing the same work should award the same scores
- examiners should award the same score on another occasion
- students should get the same score when it is administered at different times
- It takes a lot of cases/items/raters to generate a reliable score
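The need for many cases/items/raters can be sketched with the Spearman-Brown prophecy formula, a standard result from classical test theory (not shown in the slides; the per-station reliability of 0.2 below is a purely illustrative assumption):

```python
# Spearman-Brown prophecy: predicted reliability when a test is
# lengthened to n parallel items, given single-item reliability r.
def spearman_brown(r: float, n: int) -> float:
    return n * r / (1 + (n - 1) * r)

# Illustrative only: assume a single OSCE station has reliability 0.2
# (a hypothetical figure, not from the slides).
for n in (1, 5, 10, 20):
    print(n, round(spearman_brown(0.2, n), 2))
```

Even with 10 stations the predicted reliability only reaches about 0.71 under this assumption, which is why OSCE circuits sample many stations.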
Acceptability
- Staff and students are influenced by a wide range of values and beliefs
- Research and evidence have had relatively little impact
- However, assessment that is not acceptable to staff and students will not engender confidence (or worse, will cause hostility)
- Provision of information, and compromise, are important
Educational impact
- The tail that wags the dog
- Assessment drives learning through
- content
- format
- programming
- Educators can exploit this principle effectively
Cost
- Resource implications are important
- However, consider the relative costs of teaching and assessment
- Value for money
Other points
- Fairness/transparency - assessment should obviously be seen to be fair
- Assessment needs to be managed, in the same way that the rest of the curriculum does
Utility of an assessment instrument
- Utility = V x R x A x E x C
- V = validity
- R = reliability
- A = acceptability
- E = educational impact
- C = cost
Utility equation applied to True/False MCQs
- Intended purpose
- to test knowledge
- Drawbacks
- trivialising knowledge (?)
- authoring
- Utility
- Reliability Validity
- Educational Impact +/- Acceptability
- Cost --
Utility equation applied to the viva voce
- Intended purpose
- to test clinical reasoning and problem solving, or depth/breadth of knowledge
- Drawbacks
- case-specificity - sampling
- potential problems of bias
- Utility
- Reliability - Validity
- Educational Impact Acceptability
- Cost
Selection of appropriate assessment method(s)
- All methods have advantages and disadvantages (trade-off between items in the utility equation)
- Choice should be based on the learning you need to assess, recognising the practical constraints within which you work
- Don't compromise validity in return for ease of administration
- Assessment blueprinting is extremely useful
Blueprinting
- Blueprinting
- ensures that assessment reflects the curriculum (both content and areas of learning)
- helps planning
- enables quality assurance and evaluation
- makes the process transparent
- However it is rarely feasible to assess everything in a curriculum, hence the need for sampling
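The blueprinting idea can be sketched as a small cross-tabulation of content areas against domains of learning (the content areas and item counts below are hypothetical, purely for illustration):

```python
# A minimal assessment blueprint sketch (hypothetical content, not from
# the slides): cross-tabulate content areas against domains of learning,
# then total the planned items per domain to check the sample is
# representative of the curriculum.

blueprint = {
    # (content area, domain of learning): number of planned items
    ("cardiology", "knowledge"): 10,
    ("cardiology", "skills"): 2,
    ("respiratory", "knowledge"): 8,
    ("respiratory", "skills"): 2,
    ("communication", "attitudes"): 3,
}

by_domain = {}
for (_area, domain), n in blueprint.items():
    by_domain[domain] = by_domain.get(domain, 0) + n

print(by_domain)  # totals per domain reveal gaps in the sample
```

A grid like this makes under-sampled cells (here, attitudes) visible at a glance, which is the quality-assurance point the slide makes.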
What is an OSCE?
More OSCE theory
OSCEs
- First introduced in the 1970s in undergraduate medicine
- Harden et al, Dundee
- Now in almost universal use in medicine, increasingly also in dentistry, nursing and PAMs
- Subject to huge amounts of research (cf other assessment instruments)
- Not really an assessment format per se, more an administrative format
OSCEs
- Different kinds of stations
- observed
- non-observed
- linked
- rest stations
- critical
Utility equation applied to OSCEs
- Intended purpose
- to test clinical skills, especially communication and physical examination
- Drawbacks
- case-specificity - sampling
- skills are abstracted
- may encourage a checklist approach to preparation
- Utility
- Reliability Validity
- Educational Impact Acceptability
- Cost
Advantages of OSCE
- Each student (in theory) undergoes the same assessment, under identical conditions
- Versatile format
- Potentially provides both student and school with detailed analysis of performance
- Potential for objective feedback
Disadvantages of OSCE
- Resource intensive
- Time pressures
- Hard work for all concerned
- Experienced examiners may not like the format
- Experts generally do worse than novices!
Other issues
- Main influence on reliability is content
- all other factors, such as examiner and patient variability, are less of a problem
- Stations need piloting
- Patients need training
- Examiners need training
- Ideally, standard setting required to set pass marks
- Test security
- NEED FOR PLANNING
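One common way to set OSCE pass marks is the Angoff method; the slides mention standard setting but name no specific method, so this is an illustrative sketch with hypothetical judge ratings:

```python
# Angoff standard setting, sketched (an assumed method for illustration;
# the slides do not specify one). Each judge estimates, per station, the
# probability that a "borderline" candidate would succeed; the pass mark
# is the mean of those estimates.

judge_ratings = {  # hypothetical ratings for four OSCE stations
    "judge_A": [0.6, 0.5, 0.7, 0.4],
    "judge_B": [0.5, 0.6, 0.6, 0.5],
    "judge_C": [0.7, 0.6, 0.8, 0.4],
}

# mean rating per station across judges
per_station = [sum(col) / len(col) for col in zip(*judge_ratings.values())]
# overall cut score as a fraction of the maximum mark
cut_score = sum(per_station) / len(per_station)
print(f"pass mark: {cut_score * 100:.1f}%")
```

Training the judges, like training the patients and examiners above, is what makes such estimates defensible.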
Content of recent OSCE circuit at Newcastle (3rd year students)
- Examination of legs (? DVT) - PhD student
- Explanation about diabetes - trained role player
- Cardio-pulmonary resuscitation - mannequin
- History of abdominal pain (concealed alcohol problem) - trained role player
- Examination of heart (systolic murmur) - real patient
- Examination of chest (COPD) - real patient
- History of abdominal pain (IBS) - PhD student
- Insertion of i/v cannula - mannequin
- History of headache (space occupying lesion) - PhD student
- Explanation about taking the Pill - PhD student
From the recent literature
- Clinical experience, learning styles and OSCE performance
- Does studying for an OSCE make a difference?
- Knowledge-of-skills testing as an adjunct to OSCE
- OSVE - Objective Structured Video Examination
- Effects of examiner fatigue
Real and simulated patients
- Spectrum
- real, spontaneous ------ standardized, trained
- Horses for courses
- Briefing and training
- The patient as assessor
Design an OSCE station
Design an OSCE station
- In 3s
- Need
- a scenario
- instructions for candidate and examiner
- instructions for role player if appropriate
- resources required