1
Where are we with assessment and where are we going?
Cees van der Vleuten, University of Maastricht
This presentation can be found at www.fdg.unimaas.nl/educ/cees/amee
2
Overview of presentation
  • Where is education going?
  • Where are we with assessment?
  • Where are we going with assessment?
  • Conclusions

3
Where is education going?
  • School-based learning
  • Discipline-based curricula
  • (Systems) integrated curricula
  • Problem-based curricula
  • Outcome/competency-based curricula

4
Where is education going?
  • Underlying educational principles
  • Continuous learning of, or practicing with,
    authentic tasks (in steps of complexity with
    constant attention to transfer)
  • Integration of cognitive, behavioural and
    affective skills
  • Active, self-directed learning in collaboration
    with others
  • Fostering domain-independent skills and competencies
    (e.g. team work, communication, presentation,
    science orientation, leadership and professional
    behaviour).

5
Where is education going?
  • Underlying educational principles
  • Continuous learning of, or practicing with,
    authentic tasks (in steps of complexity with
    constant attention to transfer)
  • Integration of cognitive, behavioural and
    affective skills
  • Active, self-directed learning in collaboration
    with others
  • Fostering domain-independent skills and competencies
    (e.g. team work, communication, presentation,
    science orientation, leadership and professional
    behaviour).

Constructivism
Cognitive psychology
Collaborative learning theory
Cognitive load theory
Empirical evidence
6
Where is education going?
  • Work-based learning
  • Practice, practice, practice.
  • Optimising learning by
  • More reflective practice
  • More structure in the haphazard learning process
  • More feedback, monitoring, guiding, reflection,
    role modelling
  • Fostering of learning culture or climate
  • Fostering of domain-independent skills
    (professional behaviour, team skills, etc.).

7
Where is education going?
  • Work-based learning
  • Practice, practice, practice.
  • Optimising learning by
  • More reflective practice
  • More structure in the haphazard learning process
  • More feedback, monitoring, guiding, reflection,
    role modelling
  • Fostering of learning culture or climate
  • Fostering of domain-independent skills
    (professional behaviour, team skills, etc.).

Deliberate practice
Emerging work-based learning theories
Empirical evidence
8
Where is education going?
  • Educational reform is on the agenda everywhere
  • Education is professionalizing rapidly
  • A lot of educational technology is available
  • How about assessment?

9
Overview of presentation
  • Where is education going?
  • Where are we with assessment?
  • Where are we going with assessment?
  • Conclusions

10
Expanding our toolbox...
(Miller's pyramid: Knows → Knows how → Shows how → Does)
11
Expanding our toolbox...
(Miller's pyramid: Knows → Knows how → Shows how → Does)
12
Expanding our toolbox...
(Miller's pyramid: Knows → Knows how → Shows how → Does)
13
Expanding our toolbox...
(Miller's pyramid: Knows → Knows how → Shows how → Does)
Domain-specific skills
14
What have we learned?
  • Competence is specific, not generic

15
Reliability as a function of testing time

Testing time in hours            1     2     4     8
MCQ [1]                          0.62  0.76  0.93  0.93
PMP [1]                          0.36  0.53  0.69  0.82
Case-based short essay [2]       0.68  0.73  0.84  0.82
Oral exam [3]                    0.50  0.69  0.82  0.90
Long case [4]                    0.60  0.75  0.86  0.90
OSCE [5]                         0.47  0.64  0.78  0.88
Mini-CEX [6]                     0.73  0.84  0.92  0.96
Practice video assessment [7]    0.62  0.76  0.93  0.93
Incognito SPs [8]                0.61  0.76  0.92  0.93

[1] Norcini et al., 1985; [2] Stalenhoef-Halling et al., 1990;
[3] Swanson, 1987; [4] Wass et al., 2001; [5] Petrusa, 2002;
[6] Norcini et al., 1999; [7] Ram et al., 1999; [8] Gorter, 2002
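
For context, reliability is commonly related to testing time through the Spearman-Brown formula, with rho_1 the reliability of a one-hour test and n the factor by which testing time is extended (an illustrative aside; the cited studies report their own empirical estimates). Extrapolating the one-hour MCQ value to eight hours, for example, gives roughly the tabled figure:

\[
\rho_n = \frac{n\,\rho_1}{1 + (n-1)\,\rho_1},
\qquad
\rho_8^{\mathrm{MCQ}} = \frac{8 \times 0.62}{1 + 7 \times 0.62} \approx 0.93
\]
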
16
What have we learned?
  • Competence is specific, not generic
  • Any single point measure is flawed
  • One measure is no measure
  • No method is inherently superior
  • Subjectivity/unstandardised conditions are not
    something to be afraid of.

17
What have we learned?
  • Competence is specific, not generic
  • One method can't do it all

18
Magic expectations...
(Miller's pyramid: Knows → Knows how → Shows how → Does)
19
What have we learned?
  • Competence is specific, not generic
  • One method can't do it all
  • One measure is no measure
  • We need a mixture of methods to cover the entire
    pyramid
  • We can choose from a rich toolbox!

20
What have we learned?
  • Competence is specific, not generic
  • One method can't do it all
  • Assessment drives learning

21
Assessment and learning
  • "The in-training assessment programme was
    perceived to be of benefit in making goals and
    objectives clear and in structuring training and
    learning. In addition, and not surprisingly, this
    study demonstrated that assessment fosters
    teaching and learning."
  • (Govaerts et al., 2004, p. 774)

22
Assessment and learning
  • "Feedback generally inconsistent with and lower
    than self-perceptions elicited negative emotions.
    They were often strong, pervasive and
    long-lasting."
  • (Sargeant et al., under editorial review)

23
Assessment and learning
  • "You just try and cram - try and get as many of
    those facts into your head just that you can pass
    the exam and it involves, sadly it involves, very
    little understanding because when they come to
    the test, when they come to the exam, they're not
    testing your understanding of the concept. They
    test whether you can recall ten facts in this
    way?" (Student quote from Cilliers et al., in
    preparation)

24
The continuous struggle
Curriculum
Assessment
  • Content
  • Format
  • Programming/scheduling
  • Regulations
  • Standards
  • Examiners

Learner
25
What do we know?
  • Competence is specific, not generic
  • One method can't do it all
  • Assessment drives learning
  • Verify the consequences
  • Use the effect strategically
  • Educational reforms are only as good as the
    assessment allows them to be.

26
What do we know?
  • Competence is specific, not generic
  • One method can't do it all
  • Assessment drives learning
  • Verify the consequences
  • Use the effect strategically
  • Educational reforms are only as good as the
    assessment allows them to be.

27
Overview of presentation
  • Where is education going?
  • Where are we with assessment?
  • Where are we going with assessment?
  • Conclusions

28
My assumptions
  • Innovation in education programmes can only be as
    successful as the assessment programme is
  • Assessment should reinforce the direction in
    which education is going
  • Future directions should use our existing
    evidence on what matters in assessment.

29
The Big Challenge
  • Established assessment technologies have been
    developed in the conventional psychometric
    tradition of standardisation, objectification
    and structuring
  • Emerging technologies are in vivo and by nature
    less standardized, unstructured, noisy,
    heterogeneous, subjective
  • Finding an assessment answer beyond the classic
    psychometric solutions is The Big Challenge for
    the future.

30
Design requirements for future assessment
  • Dealing with real life
  • In vivo assessment cannot and should not be
    (fully) standardized, structured and objectified
  • Includes quantitative AND qualitative information
  • Professional and expert judgement play a central
    role.

31
Design requirements for future assessment
  • Dealing with learning
  • All assessment should be meaningful to learning,
    thus information rich
  • Assessment should be connected to learning
    (framework of the curriculum and the assessment
    are identical)
  • Assessment is embedded in learning (equals the
    "in vivo" of educational practice and adds
    significantly to the complexity).

32
Design requirements for future assessment
  • Dealing with sampling
  • Assessment is programmatic
  • Comprehensive, includes domain-specific and
    domain-independent skills
  • Combines sampling across many information
    sources, methods, examiners/judges/occasions...
  • Is planned, coordinated, implemented, evaluated,
    revised (just like a curriculum design).

33
Challenges we face
  • Dealing with real life
  • How to use professional judgement? Do we
    understand judgement?
  • How to elicit, structure and record qualitative
    information?
  • How to use (flexible) standards?
  • What strategies for sampling should we use? When
    is enough enough?
  • How to demonstrate rigour? What (psychometric,
    statistical, qualitative) models are appropriate?

34
Challenges we face
  • Dealing with learning
  • What are methodologies for embedding assessment
    (e.g. Wilson & Sloane, 2000)?
  • How to deal with the confounding of the teacher
    and assessor roles?
  • How to combine formative and summative
    assessment?
  • How to involve stakeholders?
  • How to educate stakeholders?

35
Challenges we face
  • Dealing with sampling at the programme level
  • What strategies are useful in designing a
    sampling plan or structure of an assessment
    programme?
  • How to combine qualitative and quantitative
    information?
  • How to use professional judgement in decision
    making on aggregated information?
  • How to longitudinally monitor competence
    development?
  • What are (new) strategies for demonstrating
    rigour in decision making? What formal models are
    helpful?

36
Contrasting views in approach
Conventional assessment → Programmatic embedded assessment
  • Assessment separate from learning → Assessment as part of
    learning
  • Method-centred → Programme-centred (based on an overarching
    cohesive structure)
  • Context free → Context matters (dynamic relation between an
    ability, a task and a context in which the task occurs;
    Epstein & Hundert, 2002)

37
Contrasting views in approach
Conventional assessment → Programmatic embedded assessment
  • Separation of formative and summative assessment → Combined
    formative and summative assessment
  • Traits (inferred dispositions) → States (directly meaningful
    entities, situational)
  • Hard competencies → Hard and soft competencies

38
Contrasting views in approach
Conventional assessment → Programmatic embedded assessment
  • Standardized and structured → Real-life circumstances
  • Decision driven (pass/fail) → Feedback driven (what needs
    improvement)
  • Reductionistic (ticking boxes, scoring, grading, qualifying) →
    Information rich (including narrative, descriptive, qualitative
    information)
  • Fixed standards → Flexible standards

39
Contrasting views in approach
Conventional assessment → Programmatic embedded assessment
  • Ownership lies with external administrative bodies → Ownership
    lies with teachers and learners (within a master plan)
  • Analytical scoring, restricted human judgement → Holistic
    appraisal relying on professional judgement (both at the
    individual situation level and at the programme level)

40
Contrasting views in approach
Conventional assessment → Programmatic embedded assessment
  • Point assessment → Longitudinal, developmental, continuous
  • Credit points in a database → Thorough documentation of
    progress
  • One method per skill → Multimodal

41
Contrasting approaches in research
Conventional assessment → Programmatic embedded assessment
  • Rigour defined in direct (statistical) outcome measures →
    Rigour defined by evidence on the trustworthiness or
    credibility of the assessment process
  • Reliability/validity → Saturation of information, triangulation
  • Benchmarking → Accounting

42
Contrasting approaches in research
Conventional assessment vs. programmatic embedded assessment
  • Psychometric
  • Edumetric/educational
  • Evidence to predict future performance
  • Evidence of being exposed to the right training
  • Naturalistic experimentation
  • Controlled experimentation

43
Contrasting approaches in research
Conventional assessment → Programmatic embedded assessment
  • Instrument improvement → System or programme improvement
  • Instrument utility: reliability, validity → Instrument utility
    depends on place and function in the assessment programme

44
Contrasting views in approach
Conventional assessment vs. programmatic embedded assessment
45
Overview of presentation
  • Where is education going?
  • Where are we with assessment?
  • Where are we going with assessment?
  • Conclusions

46
Conclusions
  • Assessment has made tremendous progress
  • Good assessment practices based on established
    technology are implemented widely
  • Sharing of high quality assessment material has
    begun (IDEAL, UMAP, Dutch consortium)

47
Conclusions
  • We are facing a major next step in assessment
  • We have to deal with the real world
  • The real world is not only the work-based setting
    but also the educational training setting

48
Conclusions
  • To make that step
  • We need to think out of the box
  • New methodologies to support assessment
    strategies
  • New methodologies to validate the assessment

49
Conclusions
  • There is a lot at stake
  • Educational reform depends on it

50
Conclusions
  • Let's join forces to make that next step!

51
This presentation can be found at www.fdg.unimaas.nl/educ/cees/amee