Transcript and Presenter's Notes

Title: John Carpenter

1
John Carpenter
  • Evaluating the Outcomes of Education
  • towards a comprehensive and rigorous approach

2
Last year
  • I asked: Why should we evaluate the outcomes of
    social work education?
  • I suggested:
  • What we might mean by outcomes.
  • How they could be measured.
  • What kinds of research design might be
    appropriate.

3
SIESWE and SCIE
  • Jointly published
  • Evaluating the Outcomes of Social Work Education
    (Jan 2005).
  • Available at the conference, or as a pdf via
  • www.scie.org.uk
  • New SCIE resources

4
Question 1 (affective or attitudinal)
  • I am interested in methods of evaluating the
    outcomes of SWE
  • Strongly disagree
  • Disagree
  • Don't know
  • Agree
  • Strongly agree

5
Question 2 (declarative knowledge)
  • What is a randomised controlled trial?
  • Participants are allocated to control and
    comparison groups and T1 and T2 scores are
    compared.
  • Allocated randomly to one or more experimental
    groups and a control group; T1 vs. T2 scores
    compared (sketched below).
  • Frustrated researcher tells uncontrolled,
    randomised participants to get a grip.
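
(As an aside on the second option, which describes a randomised controlled
trial: randomisation simply means that chance alone decides who receives the
intervention. A minimal sketch in Python; the participant IDs and group sizes
are hypothetical, purely for illustration.)

    import random

    # Hypothetical participant IDs
    participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

    random.shuffle(participants)            # random order, so allocation is by chance alone
    midpoint = len(participants) // 2
    experimental = participants[:midpoint]  # receive the teaching intervention
    control = participants[midpoint:]       # do not (or receive it later)

    print("Experimental:", experimental)
    print("Control:", control)
    # Both groups are then measured at T1 (before) and T2 (after),
    # and the change in scores is compared between groups.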

6
Question 3 (self-efficacy)
  • I am confident that I know how to design a
    methodologically sound evaluation of the outcomes
    of an aspect of a SWE course.
  • Strongly disagree
  • Disagree
  • Don't know
  • Agree
  • Strongly agree

7
Question 4 (motivational)
  • I am determined to carry out a controlled or
    comparative evaluation of our course/module next
    year.
  • Strongly disagree
  • Disagree
  • Don't know
  • Agree
  • Strongly agree

8
  • What about the outcomes?
  • Did my (educational) interventions have any
    effect?
  • If you were in Glasgow last year.

9
Question 5 (behavioural implementation)
  • Have you carried out a controlled or comparative
    evaluation of your course/module in the last
    year?
  • Oops! Sorry, I forgot.
  • THEY wouldn't let me.
  • I tried, but it didn't work.
  • Yes!

10
Response to the paper
  • 4 extremely positive emails, e.g.
  • "Fascinating material - much of which I didn't
    know about/hadn't thought about but realise I
    need to know! Moreover you make the case for a
    more sophisticated and fit-for-purpose approach
    to evaluation of teaching and learning very
    convincingly."
  • One email exchange and one conversation to clarify
    that I am not a hard-nosed positivist who hates
    qualitative research.
  • 6 positive (but less coherent) verbal responses.

11
A pretty poor research design
  • No systematic sampling.
  • Unknown but probably hopeless response rate.
  • No control or comparison group.
  • No attempt to collect the data.
  • Useless qualitative data.
  • Biased researcher.

12
In my defence
  • Some attempt to measure a range of outcomes
  • Attitudes
  • Knowledge
  • Behaviour.
  • There were 3 time points
  • Beginning of the presentation
  • End
  • Long term follow-up.
  • Multiple baseline would have been better.

13
Towards Better Evaluations
  • Comprehensive approach to identifying outcomes,
    including user-defined outcomes.
  • Selecting measures which work.
  • Collecting qualitative and quantitative
    information about the processes as well as the
    outcomes.
  • Using rigorous designs.
  • Build in comparison and control groups, including
    waiting list controls and repeated measures (see
    the sketch below).
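
(A rough illustration of the last point: a waiting-list design with repeated
measures, in which one group is trained between T1 and T2 while the other
waits and is trained between T2 and T3. The group labels and schedule below
are hypothetical.)

    # Hypothetical waiting-list control schedule with repeated measures.
    # Everyone is measured at T1, T2 and T3; only the timing of training differs.
    schedule = {
        "Group A (immediate)":    {"T1": "measure", "T1-T2": "training", "T2": "measure", "T2-T3": "-", "T3": "measure"},
        "Group B (waiting list)": {"T1": "measure", "T1-T2": "-", "T2": "measure", "T2-T3": "training", "T3": "measure"},
    }

    for group, plan in schedule.items():
        print(group)
        for phase, activity in plan.items():
            print(f"  {phase}: {activity}")

    # Comparing T1-T2 change across the two groups estimates the training effect;
    # Group B's later T2-T3 change offers a built-in replication.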

14
A Case Study
  • The Birmingham University Programme in Community
    Mental Health
  • Post-qualification postgraduate level (assessed)
  • Health and social work staff seconded from MH
    teams in the region
  • Part-time (1 day per week) for 2 years (plus 1
    for Masters dissertation)
  • Run by Depts. of Social Work and Psychology with
    input to course planning and management from
    service agencies and users.

15
Curriculum
  • Principles and values (user-centred)
  • Assessment and care planning
  • Mental Health Law
  • Cognitive Behaviour Therapy
  • Family Intervention
  • Interagency and team working
  • Research methods (for MA)

16
External evaluation (Uni. of Durham: John
Carpenter, Di Barnes and Claire Dickinson)
  • Commissioned by NHS
  • Independent (Not carried out by the Programme
    trainers)
  • Longitudinal: 5 years, following 3 cohorts
  • Comprehensive: many levels of outcomes.
  • Focused on outcomes following through learning to
    practice
  • Concerned about process
  • Comparative
  • Formative AND summative.

17
Outcomes: the Kirkpatrick/Barr Framework
18
Summary of methods and measures
19
Key characteristics of students
  • 58% aged between 31-40
  • 69% were women
  • Between 9 and 22 students from Black and
    minority ethnic communities
  • Tended to be well-established professionals
  • Majority had been in their current job for more
    than 5 years, 22% for more than 10 years.

20
Profession of students starting the Programme
Cohort 1: n = 45
Cohort 2: n = 51
Cohort 3: n = 45
Cohort 4: n = 47
Cohort 5: n = 37
21
Learners' reactions to the Programme
22
A stressful experience?
23
Changing attitudes
  • Began course with positive attitudes towards
    principles and values underpinning community
    mental health care
  • No differences between professions
  • No change over time
  • Valued opportunity to reflect on their values

24
  • Reported increased awareness of social
    perspectives and service users' views.
  • "The course has changed my perspective. For me
    it has been a tremendous change as we worked very
    much in the medical model. It gave me some
    answers, for example, the social disability
    model. The user involvement aspect has shifted
    my thinking - obviously so. It helped that as
    part of the course we talked about what we
    brought to our work." (CPN, Interview 3)

25
Professional Identification
26
Stereotypes
  • Psychiatrists and psychologists: high for
    academic rigour and leadership; low for
    communication, practical skills and breadth of
    life experience.
  • Social workers: high interpersonal skills;
    moderate in academic rigour and practical skills;
    poor for leadership.
  • CPNs: high on interpersonal skills and practical
    skills; lower for leadership and academic rigour.

27
Changing stereotypes
  • Little effect after 9 months and 21 months. Why?
  • The stereotypes were so strongly held and
    reinforced by day-to-day contact with colleagues
    that the effect of the teaching and learning
    experience was counteracted.
  • The conditions necessary to change stereotypes
    were not sufficiently present on the Programme.

28
Typical of their profession?
  • The OTs are typical generally we are more open
    minded...
  • But The nurses are atypical. On the course the
    nurses are more open minded, willing to move
    forward. They are not as defensive of their
    profession - at the edge of crawling out of their
    hole Those who are not inclined will not go on
    the course. It is self-selective. (OT Interview
    4)
  • Participants are not willing to generalise

29
Necessary conditions
  • Opportunities for productive inter-group learning
    were missed (group membership and mixing)
  • Similarities rather than differences in roles
    were emphasised. Consequently, the difficulties
    in interprofessional working were infrequently
    exposed and explored.
  • Feedback and programme re-design, with a positive
    response.

30
Do they learn? (1) Partnership working with users
Students' ratings of importance, and self-assessments
of knowledge and skills at T1 and T3.
Scale: 0 = not at all, 5 = intermediate, 10 = very high/expert.
[Chart: ratings of importance, knowledge and skills at T1 and T3]
31
Do they learn? (2) Multidisciplinary team working
Students' ratings of importance, and self-assessments
of knowledge and skills at T1 and T3.
Scale: 0 = not at all, 5 = intermediate, 10 = very high/expert.
[Chart: ratings of importance, knowledge and skills at T1 and T3]
32
Do they learn? (3) Psychosocial Interventions
Students' ratings of importance, and self-assessments
of knowledge and skills at T1 and T3.
Scale: 0 = not at all, 5 = intermediate, 10 = very high/expert.
[Chart: ratings of importance, knowledge and skills at T1 and T3]
33
Barriers to implementation of learning
34
Implementation role conflict
35
Climate for Innovation
36
Change in organisational practice
  • Teams only moderately open to innovation
  • Difficult to introduce changes in practice and to
    cascade learning to others.
  • Time and energy act as deterrents to swimming
    against the tide.
  • But, numerous (corroborated) reports of success.

37
Outcomes: service users' views on trainees
  • "She makes one feel that what a person thinks
    matters."
  • "My worker understands me because she is trained
    to understand. She understands me because she
    cares about me."
  • "She treats me as how I am, as an individual and
    not an illness."

38
Outcomes for users
  • Statistically significant improvements (T1-T2)
  • General social functioning (GAS)
  • Reduction in mental health and social problems
    (HoNOS)
  • Decrease in psychiatric symptoms (BPRS)
  • Improvement in life skills (LSP).
  • But, can this be attributed to the training?
  • Comparison groups also tended to improve except
    for life skills (a rough sketch of this kind of
    comparison is given below).
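
(A rough sketch of how such a T1-T2 comparison might be tested; this is not
the study's actual analysis, the scores below are invented, and the scipy
library is assumed to be available.)

    from scipy import stats

    # Invented life-skills-style scores for service users of trained staff
    t1_trained = [18, 22, 25, 20, 19, 24, 21, 23]
    t2_trained = [22, 25, 27, 24, 20, 28, 24, 26]

    # Invented scores for service users of a comparison (untrained) team
    t1_comparison = [19, 21, 24, 20, 22, 23, 18, 25]
    t2_comparison = [20, 21, 25, 19, 23, 24, 18, 26]

    # Paired t-tests: did each group change significantly from T1 to T2?
    print(stats.ttest_rel(t2_trained, t1_trained))
    print(stats.ttest_rel(t2_comparison, t1_comparison))

    # Comparing the change scores between groups guards against attributing
    # improvement that happens anyway over time to the training itself.
    change_trained = [b - a for a, b in zip(t1_trained, t2_trained)]
    change_comparison = [b - a for a, b in zip(t1_comparison, t2_comparison)]
    print(stats.ttest_ind(change_trained, change_comparison))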

39
Change in Life Skills
40
So, it's good news
  • The students did learn.
  • They did put their learning into practice.
  • It did make a difference to their teams.
  • It did improve outcomes for users.
  • Carpenter, J. et al. (in press) The outcomes of
    IPE for community mental health. Journal of
    Interprofessional Care.

41
What about SWE?
  • It's desirable and essential to engage students
    in the systematic evaluation of their own
    learning. (Test data: MCQs, concept mapping,
    ratings.)
  • Engage service user trainers and consultants.
  • Engage educators in the systematic evaluation of
    learning outcomes.
  • Compare and contrast collaboration between
    programmes.

42
  • Enthusiasts wanted to join a project to
  • Test the feasibility of outcome measures and
    research designs
  • Generate high quality evidence about the
    effectiveness of methods of SWE
  • Build capacity and capability amongst academics
    and trainers, including users and carers, in the
    evaluation of SWE.

43
Learning sets
  • Group of enthusiasts who will learn together over
    3 years
  • Face-to-face meetings (4 in Year 1) plus a
    discussion board.
  • Focus on real problems, provide for group
    reflection and learning, establish personal
    responsibility, action based.
  • Collaborate, e.g. on cross-programme evaluations.

44
  • Social work educators
  • Lecturers
  • Service user and carer educators working on a
    programme
  • Practice teachers.

45
  • Facilitation and consultation by John Carpenter
    and Hilary Burgess (Bristol SWAP) supported by
    SCIE and SIESWE.
  • Programme support from SWAP (£4k per programme).

46
What's needed
  • 3-year Programme-wide commitment, plus Head of
    Department
  • curriculum development committee comprising
    staff, user and carer consultants and students
  • lead local evaluator with time and eagerness to
    collaborate
  • willing and able to negotiate adjustments to
    timetable and curriculum
  • user/carer educator/evaluator
  • commitment to disseminate and publish
  • willing to host meetings.

47
Plus
  • Agree to participate in a qualitative study of
    the process of the project.
  • Have some ideas for a project but not a
    proposal at this stage.
  • Passing the exam

48
Process
  • End July: invitations via SWAP, SCIE, SIESWE, to
    be received by end September.
  • Mid October: selection of programmes
  • Mid November: 1st meeting of learning set
  • Meetings January-March-June

49
The Exam Preview
  • Define the following
  • Counterbalancing. (p.32)
  • Automaticity. (p.13)
  • A standardised adolescent (p.29).

50
Further information
  • See handout
  • Email: j.s.w.carpenter@bristol.ac.uk