Title: Developing a Framework to Evaluate Training Programs Provided by WHO
1 Developing a Framework to Evaluate Training Programs Provided by WHO
- The Feasibility of Incorporating Social Justice, Cultural Competency and Return on Investment
- A Work in Progress
3 Thank you to Athabasca University for funding this project
4 Background
- Training requires increased financial and human resources
- No comprehensive framework to evaluate return on investment (ROI)
- Request from the Office of Nursing and Midwifery, Department of Human Resources for Health, WHO
5 Method
- Participatory Action Approach
- WHO personnel selected by the Senior Scientist, Office of Nursing and Midwifery
- Representation from WHO Priority Areas and from Headquarters, Regional and Country Levels
6 Participation of WHO
- Interview and survey questions reviewed by key personnel interested in evaluation
- Participants selected from:
  - Reproductive Health and Research
  - Making Pregnancy Safer
  - Expanded Program Pandemic Flu
  - Gender and Women's Health
7 Survey
- Questions focused on:
  - Program Planning and Design
  - Program Delivery/Teaching Methods
  - Evaluation/Monitoring
8 Interview Questions
- Evaluation methods currently used
- How were they selected?
- Strengths of current methods
- What is missing?
- Whether indicators of social justice, cultural competency and measures of ROI were included
- What ought to be included in a framework?
9 Challenges
- Timing: Senior Scientist off work
- Difficult to contact participants: busy or traveling
- Lack of time for some participants: interviews and/or survey not completed
- Discovery of other evaluation work being undertaken by WHO/UN
10 Interviews
- WHO personnel from Infectious Diseases, Gender, HIV/AIDS, Making Pregnancy Safer (9)
- Country representatives (Jamaica and the Philippines) (3)
11 Surveys: Headquarters
- Reproductive Health and Research (RHR) (FCH/STI and FCH/TCC): Controlling Sexually Transmitted and Reproductive Tract Infections
- Making Pregnancy Safer (FCH/MPS): Essential Newborn Care Training
- Integrating Gender into Public Health: Gender and Health Learning Program
- Expanded Program on Immunization / Pandemic Flu / Bio-risk Reduction (CDS/EPR): EPI Training on Immunization in the African Region
12 Surveys: Region and Country
- AFRO: EPI Training on Immunizations in the African Region
- SEARO (Thailand): ToT (training of trainers), Nursing Management of HIV/AIDS Prevention, Care and Support
- Country (Philippines): ToT, Promotion of Healthy Lifestyles
13 Social Justice Guiding Principles (Canadian Nurses Association)
- Equity
- Human Rights
- Democracy and Civil Rights
- Capacity Building
- Just Institutions
- Enabling Environments
- Poverty Reduction
- Ethical Practice
- Advocacy
- Partnerships
14 Cultural Competency
- A culturally competent professional is one who is actively in the process of becoming aware of his or her own assumptions about human behaviour, values, biases, preconceived notions, personal limitations and so forth. Second, a culturally competent professional is one who actively attempts to understand the world view of culturally diverse populations. Third, a culturally competent professional is one who is in the process of actively developing and practicing appropriate, relevant and sensitive intervention strategies and skills in working with his or her culturally different students (adapted from Sue & Sue, 1990).
15 Assessing Cultural Competency
- May be assessed using indicators adapted from the National Standards for Culturally and Linguistically Appropriate Services (Putsch et al., 2003, p. 10), Sue & Sue's (1990) attributes of a culturally competent professional (awareness, knowledge and skills), Culhane-Pera's (1997) five levels of cultural competency in medicine, or others.
16 Cultural Competency
- "A lot of the people that come out of medical school, less from nursing school although even nursing school, have no information or have never heard of this topic, although less and less. So trying to get it into the under-graduate curricula... some of these attitudinal and cultural competencies; not just having them in your post-graduate courses, but really getting them into the early trainings, so then you can really improve on it rather than having to start from scratch with people that are already practicing."
17 Evaluation
- Generally defined as "the systematic acquisition and assessment of information to provide useful feedback about some object" (Michael Zinovieff quoting Bill Trochim, Cornell University, 2006).
18 Evaluation
- Donald Kirkpatrick (1959, 1998): measuring changes in behavior that occur as a result of training programs
- Developed four levels of training evaluation: reaction, learning, behavior and results
19 Types of Evaluation
- Formative
- Summative
- Confirmative
- Meta
20 Formative
- Focus on process
- Improves the quality of the training during the design, development, and implementation stages
- Carries out a subject matter expert review, a user review, or a pilot test
- When the design of the training program is near completion, both subject matter experts and users provide feedback to further refine the training
21 Summative
- Focus on final product
- Determines the impact on individual and organizational performance during and after the training
- Uses direct observation, surveys of training stakeholders, measurement of performance indicators (quality, productivity, satisfaction, etc.) and/or measurement of the institutional outcome
22 Confirmative Evaluation
- Future-oriented
- Focuses on verification of the continuous quality improvement of training programs
- Looks at enduring, long-term effects or results over the life cycle of an instructional performance intervention: changes that can be identified after the passage of time and are directly linked to participation in training
- Level four of the Kirkpatrick evaluation model is confirmative evaluation
- Contains elements of outcome and impact evaluation
23 Outcome and Impact Evaluation
- Outcome evaluation: a type of program evaluation that uses valued and objective person-referenced outcomes to analyze a program's effectiveness, impact, or cost-benefit
- Impact evaluation: looks at negative or positive program-based changes in performance and focuses on whether the program has made a difference compared to either no program or an alternate program
24 Meta Evaluation
- Quality control process that is applied to the processes, products and results of formative, summative, and confirmative evaluation
- Evaluating the evaluation (the evaluator tries to figure out how the evaluation was conducted)
- Purpose is to validate the evaluation inputs, process, outputs, and outcomes
- Serves as a learning process for the evaluator and makes evaluators accountable
25 Evaluation Models
- Kirkpatrick's Four Levels
- Phillips' Return on Investment
- Context, Input, Process and Product (CIPP)
27 Levels of Evaluation

Level        | Measurement Focus          | Questions Addressed
1 - Reaction | Trainees' Perception       | What did they think of the training?
2 - Learning | Knowledge/Skills Gained    | Was there an increase in K/S?
3 - Behavior | Worksite Implementation    | Is new K/S being used on the job?
4 - Results  | Impact on the Organization | What effect did training have on the organization?
28 Use of Levels
- Level 1: most commonly used (the "smile sheet"); easy to administer and evaluate
- Level 2: used by academic centers, the public sector and WHO; most reliable when pre- and post-tests are used (see the sketch below)
- Level 3: difficult to measure human behavior and show evidence of it
- Level 4: tied to measurable information related to the bottom line
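To make the pre/post-test comparison behind Level 2 concrete, here is a minimal sketch in Python; the trainee names and scores are hypothetical, for illustration only.

```python
# Minimal sketch of a Level 2 (learning) evaluation via pre/post-tests.
# Trainee names and scores are hypothetical.
from statistics import mean

pre_scores = {"trainee_a": 55, "trainee_b": 60, "trainee_c": 72}
post_scores = {"trainee_a": 70, "trainee_b": 68, "trainee_c": 90}

# Knowledge/skills (K/S) gain per trainee: post-test minus pre-test.
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}

print("Individual gains:", gains)
print(f"Mean gain: {mean(gains.values()):.1f} points")
```

A positive mean gain is evidence of learning, but as the following slides note, it says nothing about whether the new knowledge is used on the job (Level 3) or affects the organization (Level 4).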
29 Types and Levels of Evaluation
- Levels 1 (Reaction) and 2 (K/S) are part of formative evaluation
- Can lead to a false sense of security
- There may be no relationship between feelings about training and improved performance
30 Types and Levels of Evaluation
- Levels 3 and 4 are associated with summative evaluation
- Level 4 will determine whether the training has value
- Level 3 can be used to refine training
31 Level 5: ROI
- Justification of the costs of training based on the return on investment and organizational impact
- Requires collecting level 4 data
- Converting results to monetary values
- Comparing results to the cost of training (see the sketch below)
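As a worked illustration of the steps above, here is a minimal sketch of the standard Phillips-style ROI arithmetic (net benefits over costs); all monetary figures are hypothetical.

```python
# Minimal sketch of a Level 5 (ROI) calculation in the Phillips style.
# All monetary figures are hypothetical.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR = program benefits / program costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """ROI (%) = (net program benefits / program costs) x 100."""
    return (benefits - costs) / costs * 100

# Level 4 results converted to monetary values, compared to the
# full cost of designing and delivering the training.
benefits = 150_000.0  # e.g. estimated value of improved service delivery
costs = 100_000.0     # design, delivery, travel, participants' time

print(f"BCR: {benefit_cost_ratio(benefits, costs):.2f}")  # 1.50
print(f"ROI: {roi_percent(benefits, costs):.0f}%")        # 50%
```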
32 Return on Investment
- "Measuring return on investment is becoming a truly global issue. Organizations from all over the world are concerned about the accountability of training and are exploring ways and techniques to measure the results of training" (Jack Phillips, 1997, p. 4).
33 Context, Input, Process and Product (CIPP)
- Impact evaluation: assesses the reach to the target audience
- Effectiveness evaluation: assesses the quality and significance of outcomes
- Sustainability evaluation: assesses the extent to which contributions are successfully institutionalized and continued over time
- Transportability evaluation: assesses the extent to which the program or training has been adapted or applied elsewhere
34 Types of Evaluation Methods Used at WHO
- Mainly level one and level two evaluations
- Some examples of level three
- No return on investment
- Daily, mid-way, and upon completion
- Upon return to worksite (sporadically)
35 Satisfaction with Current Evaluation
- Easy to Do
- Low Cost
- Immediate Feedback
36 Themes/Concerns
- Need to define training
- Make training part of an educational process: include in pre-service and ongoing education
- Better selection of trainees
- Need for follow-up
- Need more mentorship and support to use new knowledge, attitude and skills
- Quality of educators/trainers
- No capacity to do extensive evaluation
37 Need to Define Training
- "If you look at a project proposal and you look at what anyone is doing, the first thing you have is development of guidelines, training programs, training modules, training curricula. And we need to revisit the whole issue of what we mean by training. Training is not a one-off entity and I think we have to acknowledge that. And we have to acknowledge that training requires follow-up and supervision; it's a holistic vision of what training can add to enable a person to perform better."
- "I would like for us to first look at it as education, not training. And I would like us to see education as a component of programmatic management. And that we actually have to characterize it in a manner that it builds capacity over time. So, it is not just a one-off event. And that we should actually be innovative in the way we look at it and try to be more inclusive."
38 Need to Define Training
- "Do we mean in-service training, pre-service training, both? How much? In my area of work, for example, we are working a lot with in-service training, but this is probably not the most convenient approach. Countries requiring in-service training may be under the illusion that it is a faster way of implementing recommendations. The most sustainable approach is probably pre-service, and this seems to be more complex. But it would be important to evaluate both, and how training can be linked with the career path of health care providers."
39 Make Training Part of an Educational Process: Include in Pre-service and Ongoing Education
- "We are taking short-cuts and what we should be looking at is an education process. We are missing certain factors like personal growth, motivation. We are missing other factors like empowerment to apply. We are missing other factors like how does this link in with all the other training that's going on."
40 Better Selection of Trainees
- "...another big question I have is who's getting trained and how much, because what I see is there is a certain body of people who get trained and then there is this huge gap."
- "I think we just evaluate the input and not the output. The output is the bigger picture. The output could be that you have 15 people trained. Well, but within the context of how many people actually need to be trained to do what and how?"
41 Need More Mentorship and Support to Use New Knowledge, Attitude and Skills
- "How do we make this an educational experience in which we provide mentorship and follow-up afterwards, so that there is a continuum so that people can be helped or empowered to apply that knowledge in practice? I see this as our biggest gap. We keep training people in a vacuum without looking at the environment and the infrastructure in which we wish them to work. What we need to do is enable them to be able to apply that knowledge in practice, and also to build on their knowledge and experience."
42 Need for Follow-up
- "...we are not looking at the longer-term picture of how the knowledge is being applied in practice to adequately measure the uptake and application of skills."
43 Quality of Educators
- "...a major area of concern of mine for many years, and this particularly applies to nursing but it applies to many other fields as well, is the quality of the educators. And we don't put enough energy into ensuring that the educators are qualified and enthusiastic in the way that they teach."
- "...if we could include into some of that evaluation some of the more challenging issues that we need to address, like the quality of the trainers: what is out there? And what is being missed? Not from the point of view of what is wrong, but what do they need to improve or what to follow up?"
44 Evaluation Needs to Be Practical
- "We have to make evaluation incredibly practical and cost effective, because it's a problem fitting it in, and it's got to be something that you could actually use as a tool for ongoing planning. If you could get that into people's heads... I think people see evaluation as being the end of the road and not the beginning of it. And I think we've got to change the paradigm on this, maybe the wording."
45 Social Justice
- "We have a training program within the department (MPS) that is actually focused on a human rights-based approach to reproductive health. It used a rights-based approach, and that one actually has an evaluation framework which would actually include some of those indicators."
46 Cultural Competency
- "...it would be interesting to look at it with a critical lens, because we import many training programs. So it is the importation of the training and the training process that may not be culturally explicit. You see this, for example, in programs we've established for community health workers. When I go back to the times I was working with community health workers... where you lifted people out of their environment, took them to another, trained them and then didn't follow them up. Whether there are examples where you actually take people in their environment, you've selected them based on the fact that you've actually assured their continued existence within that community, and trained them within the framework of that community, basically. So, if you are looking at it from that kind of cultural context, then it is in-country where you really need to go, not at this level. But you also could have a look and see what people like us are recommending, because that also influences... and in actual fact, when you are looking at the training, you should look at what they are recommending, because that influences training."
47 Social Justice and Cultural Competency
- "Actually, if you want to add both social justice and cultural issues, look at what is being done at the moment in the world of community health workers and use it as an example. Because you've got HIV now, which is just focusing on community-based health workers. But you've also got malaria, TB, and family planning: what are we actually doing, and where are the drivers for this? Is it the countries, or is it where the donors are pushing? And the time frames for training. I mean, a lot of the times you are constrained when you put forward an idea for training or education, because people say, 'there is no way we can spend that much time on training'. But if you need that much to be able to produce a workforce, surely it's cost-effective."
48 Learnings
- When choosing a model it is essential to first identify the questions the evaluation needs to address
- Evaluation needs to be practical
- Need to account for the impact of intervening variables such as motivation to learn, trainability, job attitudes, personal characteristics, and transfer-of-training conditions
- WHO needs to decide whether it is prepared to allocate the financial resources to carry out evaluation beyond levels one and two
49 Learnings (Cont'd)
- Need a long-term vs short-term approach
- Need to have a plan and inform people about how/if results will be used
- Need to develop capacity: competent, trained people to deal with evaluation
50 Evaluate Within Program Context
- "I see this as some basic evaluation principles, in the way that we should actually support the vision and promote the vision of... you know, if you want to build capacity, well, if you are going to evaluate that, it needs to be evaluated within the context of the program which is being provided."
51 Social Justice and Cultural Competency
- "Gender, Women and Health considers culture, but what advice does one give to Tanzania versus Sudan versus... I mean, it's so relative that it's a little difficult to say... I think it's a difficult construct when you are working at that global scale, and people are working in a very culturally relative and strict context. So it's best to stick to some of the social justice principles that are a bit more universal, and even that, say for example, the advice that we might give on violence in a Middle East setting will be very adapted to the cultural context, compared to, say, our countries in PAHO and the Latin American Region."
52 Evaluation Method Must Be Appropriate for the Program
- "I'm not a big fan of the gold standard evaluation designs, just because I think it's a lot of resources, which for social interventions are not necessarily appropriate."
- "...for us, time is at a premium. I mean, everything has to be done yesterday, so time is at a premium, and so what is convenient but yet can tell us what we need to know, that is often what dictates what we choose as a strategy for evaluation."
53 Conclusions
- The ultimate goal of evaluation is the promotion of best practice
- The importance of using multiple means (approaches) and multiple methods (tools) for evaluation cannot be overstated
- Main sources of information that can be used to assess performance include:
  - Learner assessments of their own learning
  - Service user/community assessments of quality of service
  - Trainer assessments of acquisition of knowledge and skills
  - Proxies for health outcomes derived from routine service delivery statistics
- For WHO it is imperative to ensure all stakeholders are included in the evaluation process before a particular type or method is chosen
54 Final Note
- Even the most reliable and valid instrument will not be useful if the process for using the tool is too burdensome or too costly... the feasibility of using the tool must be considered as a component of the selection effort.