Evaluating Disaster Mental Health Programs for Children and Families

1
CHILD AND FAMILY DISASTER RESEARCH TRAINING AND
EDUCATION
2
Federal Sponsors
  • NIMH - National Institute of Mental Health
  • NINR - National Institute of Nursing Research
  • SAMHSA - Substance Abuse and Mental Health
    Services Administration

3
Principal Investigators
  • Betty Pfefferbaum, MD, JD, University of Oklahoma
    Health Sciences Center
  • Alan M. Steinberg, PhD, University of California,
    Los Angeles
  • Robert S. Pynoos, MD, MPH, University of
    California, Los Angeles
  • John Fairbank, PhD, Duke University

4
Evaluating Disaster Mental Health Programs, Part
II: From Theory to Practice
Clark Johnson, Ph.D. Adapted/modified from
materials prepared by Fran Norris, Ph.D., Craig
Rosen, Ph.D., and Helena Young, Ph.D., National
Center for PTSD
5
We will start the next session in about 10
minutes and will begin with a discussion of the
following text:
Disaster research is different from most other
fields in that much of the work is motivated by a
sense of urgency and concern. Disaster research
has both benefited and suffered from this. It
has benefited because the cadre of researchers is
fluid, and new ideas are accepted and welcomed.
It has benefited also because the result has been
an impressively diverse database that includes
samples from all different regions of the United
States.... However, disaster research has also
suffered from this situation. Scholarship is not
always the best because studies often are
undertaken under conditions where there simply is
not time to absorb a literature that is scattered
across a variety of journals and is mixed in
quality. Concerns about experimental designs and
scientific rigor must often take a back seat to
provider beliefs, consumer demands, and clinical
necessities. Most of the research is
atheoretical and little of it is programmatic.
On the basis of this review, we will state our
opinion unequivocally that we do not need more
research that establishes only that severely
exposed disaster victims develop psychological
disorders or, worse, that barely exposed disaster
victims do not. We need carefully conceived and
theory-driven studies of basic process that are
longitudinal in design. ... We need more
research that addresses the needs of diverse
populations. We need more complex studies of
family systems and community-level processes. We
need to identify and investigate novel approaches
to community intervention, where the intervention
itself has been designed to produce collective
rather than individual improvements.
  • Source:
  • Norris, Friedman, & Watson (2002). 60,000
    Disaster Victims Speak: Part II. Summary and
    Implications of the Disaster Mental Health
    Research. Psychiatry, 65(3), 240-260.

6
Disaster research is motivated by a sense of
urgency and concern.
  • It has benefited from this because
  • The cadre of researchers is fluid, and new ideas
    are accepted and welcomed.
  • The result has been an impressively diverse
    database that includes samples from all different
    regions of the United States

7
Disaster research is motivated by a sense of
urgency and concern.
  • It has suffered from this situation because
  • Scholarship is not always the best because
    studies often are undertaken under conditions
    where there simply is not time to absorb a
    literature that is scattered across a variety of
    journals and is mixed in quality.
  • Concerns about experimental designs and
    scientific rigor must often take a back seat to
    provider beliefs, consumer demands, and clinical
    necessities.
  • Most of the research is atheoretical and little
    of it is programmatic.

8
On the basis of this review, we will state our
opinion unequivocally that
  • We do not need
  • more research that establishes only that
    severely exposed disaster victims develop
    psychological disorders or, worse, that barely
    exposed disaster victims do not.
  • We do need
  • carefully conceived and theory-driven studies of
    basic process that are longitudinal in design.
  • more research that addresses the needs of diverse
    populations.
  • more complex studies of family systems and
    community-level processes.
  • to identify and investigate novel approaches to
    community intervention, where the intervention
    itself has been designed to produce collective
    rather than individual improvements.

9
Let's pretend we're starting today
  • When disaster strikes, the psychosocial health of
    the children and families in our community will
    be adversely impacted.
  • Our goal is to implement a program that is
    designed to minimize this impact.
  • The evaluation of this program must be designed
    to:
  • Guide development (Proactive / Clarificative)
  • Monitor implementation (Interactive)
  • Summarize outcomes and results (Impact)
  • How should we proceed?

10
Psychological First Aid (PFA)
http://www.ncptsd.va.gov/ncmain/ncdocs/manuals/nc_manual_psyfirstaid.html
  • PFA is an evidence-informed modular approach for
    assisting people in the immediate aftermath of
    disaster and terrorism to reduce initial
    distress, and to foster short- and long-term
    adaptive functioning.
  • It is for use by mental health specialists
    including first responders, incident command
    systems, primary and emergency health care
    providers, school crisis response teams,
    faith-based organizations, disaster relief
    organizations, Community Emergency Response
    Teams, Medical Reserve Corps, and the Citizens
    Corps in diverse settings.

11
Ex Ante Evaluation
Ex Ante Evaluation: A practical guide for
preparing proposals for expenditure programmes.
Available at
http://ec.europa.eu/budget/evaluation/pdf/ex_ante_guide_en.pdf
  • Overview of key elements
  • Lessons from the past
  • Problem analysis and needs assessment
  • Objective setting
  • Alternative delivery mechanisms and risk
    assessment
  • Added value of this activity
  • Planning future monitoring and evaluation
  • Helping to achieve cost-effectiveness

12
PubMed search: "Psychological first aid"
May 28, 2007
13
PubMed search: "Psychological first aid"
  • 23 hits
  • General overview
  • 1 - Technology
  • 6 - Disaster Planning / Policy
  • 7 - How-tos
  • 1 - Personal Experiences w/
  • 8 - N/A
  • One article that might help us along
  • Macy et al. (2004). Community-based, acute
    posttraumatic stress management: A description
    and evaluation of a psychosocial-intervention
    continuum. Harvard Review of Psychiatry,
    12, 217-228.

14
Focus on Macy et al. (2004)
  • Helps us avoid re-inventing the wheel
  • Literature: paucity of evidence that CISD is
    effective (p. 218)
  • Conceptual and practice framework for assessing
    and intervening with children, youths, families,
    and their various types of adult caregivers
    (p. 219)
  • Template for intervention process and practice
    protocols (pp. 221-222)
  • An evaluation study that can be used as a
    preliminary template (p. 223)
  • Stakeholders study
  • Case records study
  • Study of training
  • Results (p. 226): Program effective because
  • It helped communities handle crises
  • Trained a network of local people to lead or
    assist with the interventions
  • Identifies study's limitations (pp. 226-227)
  • No comparison group
  • No use of standardized / validated instruments
  • No analysis of quantitative client-outcome data
  • Long-term effectiveness unclear

15
Evaluation Study
  • The study was conducted over a five-month period,
    between June and October 2003. The design was
    essentially that of a case study structured to
    capture PTSM's essential elements and to enable
    an assessment of program effectiveness,
    specifically through a three-component design:
  • (1) a study of stakeholders in order to assess
    their views of the program, its impact on
    individuals and communities, and its quality
  • (2) a study of case records of interventions
    with individuals and community groups
    experiencing traumatic events in order to assess
    the breadth and depth of the interventions, the
    manpower and time required, and the effectiveness
    of the interventions; and
  • (3) an assessment of the effectiveness of the
    training that was designed to create a cadre of
    people to assist with community interventions.

16
Generate an initial intervention plan
  • At this point the plan is just a rough sketch
    of your ideas for:
  • Training
  • Process
  • ??
  • Documents include:
  • Training
  • Program logic
  • ??

17
Ex Ante Evaluation
  • Lessons from the past
  • Problem analysis and needs assessment
  • Problem Analysis
  • What is the problem to be solved?
  • What are the main factors and actors involved?
  • Needs Assessment
  • What is the concrete target group?
  • What are the needs and/or interests of this
    group?
  • Objective setting
  • Alternative delivery mechanisms and risk
    assessment
  • Added value of this activity
  • Planning future monitoring and evaluation
  • Helping to achieve cost-effectiveness

18
Problem analysis
  • Roadmap
  • Define the key aspects of the situation to be
    addressed by the program
  • Identify factors that are likely to influence the
    key problem
  • Identify the main groups of actors that influence
    or that are being influenced by the situation
  • Analyze the cause-effect relations between the
    factors identified and the interests and
    motivation of the actors
  • Construct a visual presentation of these
    relationships
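
The visual presentation described in the last step can start from something as simple as an influence map. The sketch below is a hypothetical Python illustration; the factors, actors, and relations are invented for the example and are not drawn from the source materials.

# Hypothetical sketch of the cause-effect map described above: factors and
# actors as nodes, directed "influences" relations as edges. All entries
# are illustrative assumptions.
influences = {
    "school closures": ["family routine disruption", "child distress"],
    "family routine disruption": ["child distress"],
    "local media coverage": ["community awareness", "child distress"],
    "school crisis response teams": ["child distress"],  # actor that can reduce the problem
}

def downstream(node, graph, seen=None):
    """Return everything a factor or actor ultimately influences."""
    seen = set() if seen is None else seen
    for target in graph.get(node, []):
        if target not in seen:
            seen.add(target)
            downstream(target, graph, seen)
    return seen

print(downstream("school closures", influences))
# e.g. {'family routine disruption', 'child distress'} (order may vary)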

19
Needs assessment
  • Roadmap
  • Identify the target population and the most
    important subgroups within it
  • Investigate the situation, motivations and
    interests of these groups
  • Make sure that the identified needs actually
    correspond with social, economic and
    environmental objectives of the community

20
Ex Ante Evaluation
  • Lessons from the past
  • Problem analysis and needs assessment
  • Objective setting
  • Define general, specific and operational
    objectives
  • Define indicators that measure inputs, outputs,
    results and impacts
  • Alternative delivery mechanisms and risk
    assessment
  • Added value of this activity
  • Planning future monitoring and evaluation
  • Helping to achieve cost-effectiveness

21
Key questions
To generate...           Ask...
General objectives       What goal are we working towards?
Specific objectives      What will have changed when we achieve it?
Operative objectives     What will be delivered to achieve the goal?
Progress indicators      How will we know if we are on course?
Success criteria         How can we judge if the action has been successful?
Outcome indicators       How do we know if the desired change has been effected?
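
One way to keep these questions and their answers together during planning is a simple record, as in the hypothetical Python sketch below. The field names and example entries are assumptions made for illustration, not part of the source materials.

# Hypothetical sketch: pairing each key question with the objectives and
# indicators it generates. All names and example values are illustrative.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    general_objective: str                                     # What goal are we working towards?
    specific_objectives: list = field(default_factory=list)    # What will have changed?
    operative_objectives: list = field(default_factory=list)   # What will be delivered?
    progress_indicators: list = field(default_factory=list)    # Are we on course?
    success_criteria: list = field(default_factory=list)       # Was the action successful?
    outcome_indicators: list = field(default_factory=list)     # Was the desired change effected?

plan = EvaluationPlan(
    general_objective="Minimize the psychosocial impact of the disaster on children and families",
    specific_objectives=["Reduced distress among children receiving crisis counseling"],
    operative_objectives=["PFA-informed outreach and crisis counseling delivered community-wide"],
    progress_indicators=["Number of outreach visits completed per week"],
    success_criteria=["Participant survey feedback meets pre-agreed thresholds"],
    outcome_indicators=["Change in self-reported distress between intake and follow-up"],
)
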
22
Inputs: Resources Available for Achieving Goals
  • Some inputs are tangible resources, such as
    funding, program staff, office space, supplies,
    and transportation
  • Others are less concrete, such as the skills of
    staff and relationships among staff and with
    local community leaders
  • Lack of these resources can greatly limit an
    organization's ability to deliver services
  • Given the unexpected nature of disasters,
    programs often are initiated before all of the
    necessary inputs are in place, creating
    challenges for both the program and its evaluation

23
Outputs: The Measurable Units of Products from
Program Processes
  • Evaluations often focus on the outputs of the
    service delivery process, such as
  • Number of outreach visits concluded
  • Number of children receiving counseling
  • Number of people reached in public education
  • Number of individuals screened and referred for
    more extensive treatment
  • In some cases, evaluations conclude with outputs,
    which are used as a proxy for outcomes
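
As an illustration only, the sketch below shows how output counts like those listed above might be tallied from simple service records. The record format and field names are hypothetical assumptions, not the actual program data tools.

# Hypothetical sketch: tallying output indicators from simple service
# records. Record fields and example data are illustrative assumptions.
from collections import Counter

service_records = [
    {"type": "outreach_visit"},
    {"type": "child_counseling"},
    {"type": "public_education", "people_reached": 40},
    {"type": "screening", "referred": True},
]

outputs = Counter()
for record in service_records:
    outputs[record["type"]] += 1               # e.g. number of outreach visits completed
    n = record.get("people_reached")
    if n:
        outputs["people_reached"] += n         # people reached in public education
    if record.get("referred"):
        outputs["referrals"] += 1              # individuals referred for more extensive treatment

print(dict(outputs))
# {'outreach_visit': 1, 'child_counseling': 1, 'public_education': 1,
#  'people_reached': 40, 'screening': 1, 'referrals': 1}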

24
Indicators Of Outputs And Outcomes
  • Indicators are the observable measures or
    standards used to monitor or evaluate program
    success or outcome (e.g., number of clients
    receiving services, changes in consumer
    self-reported symptoms or behaviors, or changes
    in community conditions)
  • It is the job of the evaluator to ensure that
    these criteria are defensible
  • For indicators of success to be meaningful, they
    must exhibit good construct validity (measure
    what they claim to be measuring)

25
Processes: Activities or Means to Bring About
Program Objectives
  • Such processes might include:
  • outreach to affected people in the community
  • providing classes or community education on
    normal responses to trauma
  • public relations efforts to increase community
    awareness of the agency's services
  • training secondary helpers in how to provide
    reassurance and support to facilitate recovery
  • providing brief individual or group counseling or
    more extensive intervention
  • arranging treatment referrals for individuals
    with more severe mental health needs

26
Ready-to-use Data Collection, Data Management,
And Reporting Tools
  • If possible, locate and use tools that someone
    else has developed and validated (!)

27
  • Individual Encounter Log
  • Used to document interactions with individuals or
    families lasting 15 minutes and involving
    participant disclosure.
  • Captures encounter characteristics, risk
    categories, participant characteristics,
    referrals.
  • Completed by the crisis counselor immediately
    after the encounter is over.
  • Training considerations: Eliciting personal
    information through active listening without
    asking directly.
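
A minimal sketch of what an electronic version of such an encounter log record might look like follows. The field names mirror the categories named above but are assumptions for illustration, not the actual form.

# Hypothetical sketch of an Individual Encounter Log record, mirroring the
# categories named above. Field names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class EncounterLogEntry:
    counselor_id: str
    encounter_type: str                    # e.g. "individual" or "family"
    duration_minutes: int
    risk_categories: list = field(default_factory=list)
    participant_characteristics: dict = field(default_factory=dict)
    referrals: list = field(default_factory=list)
    completed_at: datetime = field(default_factory=datetime.now)  # filled in right after the encounter

entry = EncounterLogEntry(
    counselor_id="CC-012",
    encounter_type="family",
    duration_minutes=25,
    risk_categories=["displaced from home"],
    participant_characteristics={"adults": 2, "children": 3},
    referrals=["community mental health center"],
)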

28
  • Participant Survey (1)
  • Used to obtain feedback about the program.
  • In one selected week each quarter, all adults
    receiving individual or group crisis counseling
    are given a packet containing a cover letter,
    survey, pen, and stamped return envelope.
  • Survey provides some data about immediate
    outcomes of crisis counseling, such as learning
    about common reactions to disasters,
    normalization of feelings and help-seeking, and
    finding ways to take care of one's self and family.

29
  • Participant Survey (2)
  • Data on disaster experiences (p. 1) and distress
    (p. 2) provide information about participant
    needs.
  • Distress measure is the SPRINT-E.
  • Training considerations: The counselor must be
    convinced that the survey is the recipient's
    opportunity to tell the program about community
    needs and how well the program is meeting those
    needs.

30
  • Provider Survey (1)
  • Used to capture crisis counselors' opinions about
    training, resources, services provided, and
    overall quality of the CCP.
  • The provider survey is collected anonymously from
    crisis counselors and their supervisors at 6 and
    12 months post-disaster.
  • A packet containing a cover letter, survey, and
    pen is given to each crisis counselor together
    with a stamped return envelope, addressed to an
    external evaluator (presently NCPTSD).

31
  • Provider Survey (2)
  • The survey also measures worker stress (p. 2).
  • Respondents' identity is protected by lack of
    identifying information, return of the survey to
    an external evaluator, and aggregation of
    results.
  • Training considerations: Conveying the reasons
    for, and importance of, the survey; explaining
    why a high response rate matters.
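
To show why aggregation and the response rate matter, here is a small hypothetical sketch. The scores, counts, and reporting choices are invented for illustration and are not the actual survey's content.

# Hypothetical sketch: reporting anonymous provider surveys only in
# aggregate. The stress scores and counts are illustrative assumptions.
from statistics import mean

surveys_distributed = 40
returned_stress_scores = [2.1, 3.4, 2.8, 1.9, 3.0]   # one overall worker-stress score per returned survey

response_rate = len(returned_stress_scores) / surveys_distributed
print(f"Response rate: {response_rate:.0%}")                        # a low return rate makes aggregates shaky
print(f"Mean worker stress: {mean(returned_stress_scores):.2f}")    # only summary statistics are reported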

32
Resources
  • For further information about:
  • Tools
  • Databases
  • Evaluation manual
  • Contact SAMHSA's Disaster Technical Assistance
    Center (DTAC): http://mentalhealth.samhsa.gov/dtac/
  • Email: http://nmhicstore.samhsa.gov/emails/contact.aspx
  • Phone: 1-800-308-3515

33
Ex Ante Evaluation
  • Lessons from the past
  • Problem analysis and needs assessment
  • Objective setting
  • Alternative delivery mechanisms and risk
    assessment
  • Added value of this activity
  • Planning future monitoring and evaluation
  • What types of evaluations are needed and when
    should they be carried out?
  • Are the proposed methods of collecting, storing
    and processing the follow-up data sound?
  • Is the monitoring system fully operational
    already from the outset of the program
    implementation?
  • Helping to achieve cost-effectiveness

34
Ex Ante Evaluation
  • Lessons from the past
  • Problem analysis and needs assessment
  • Objective setting
  • Alternative delivery mechanisms and risk
    assessment
  • Added value of this activity
  • Planning future monitoring and evaluation
  • Helping to achieve cost-effectiveness

35
Where are the experts and resources?
  • Let your exploration identify the experts and
    resources that are available
  • For our hypothetical example we have the
    following leads
  • Fran Norris - fran.norris@dartmouth.edu
  • Robert Macy - rdmacy@verizon.net
  • SAMHSA's Disaster Technical Assistance Center
    (DTAC): http://mentalhealth.samhsa.gov/dtac/
  • Email: http://nmhicstore.samhsa.gov/emails/contact.aspx
  • Phone: 1-800-308-3515

36
Blank
37
Barriers and Challenges to Conducting Program
Evaluation
  • Conducting program evaluation in the aftermath of
    disasters poses special challenges

38
Crisis And Chaos
  • In the immediate aftermath of disasters,
    decisions need to be made quickly on the basis of
    limited information. The prejudice is towards
    action, not deliberation.
  • During the crisis, there may be little interest
    in collecting systematic information on how the
    program is working. This shortcoming makes it
    difficult to monitor program progress and
    provides few data with which to later evaluate
    program achievements
  • In this context, evaluation may be viewed as
    arbitrary and burdensome, imposed by outsiders
    without a stake in serving survivors

39
Evolving, Adapting Services
  • The nature of the services may evolve over time
    as the needs of survivors change
  • Program models often must be adapted to the
    community, and providers in the field have a
    sense of learning as they go
  • Evaluations cannot assume that services are being
    delivered based on a pre-ordained model. It is
    essential to continually document program
    services and delivery strategies in order to be
    able to evaluate what the program is actually
    doing at different points in time

40
Evolving Community Context
  • Evaluation results are influenced by the
    community context, which also evolves over time.
  • For example, client satisfaction results may be
    higher during early phases of recovery than
    during later stages, when disillusionment sets
    in.
  • Outreach programs may discover problems that
    existed prior to the disaster.
  • Evaluations must be careful to differentiate new
    mental health problems from pre-existing problems

41
Factors That Boost Capacity For Program Evaluation
  • How do we establish an evaluation infrastructure
    that will allow us to maximize learning in future
    disasters?

42
Advance Political Support
  • Building evaluation capacity in disaster mental
    health requires creating an evaluation planning
    component in State Emergency Disaster
    Preparedness programs
  • Embedded in this ethos would be a respect for
    quality management informed by empirical
    feedback, and the expectation of accountability
  • A dialogue among key stakeholders involved in
    post-disaster recovery -- at federal, state, and
    local levels -- to set evaluation policy is
    needed to ensure that the evaluation mandate is
    feasible, relevant to real-world concerns, and
    not unduly burdensome

43
Outcomes: The Societal Benefits
  • While outputs assess how much was done,
    outcomes focus on how much good was done. They
    are the least well-specified arena in disaster
    mental health
  • Outcomes differ over time
  • Immediate outcomes can be observed directly after
    completing an activity
  • Intermediate outcomes derive from immediate
    outcomes such as alleviation of psychiatric
    symptoms, reduced substance use, or improved role
    functioning
  • Long-term outcomes are program benefits such as
    community cohesion or disaster preparedness
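
A minimal sketch, with invented example indicators, of how outcome indicators might be organized by the time horizons just described; the groupings follow the slide, but the specific entries are illustrative assumptions.

# Hypothetical sketch: grouping outcome indicators by time horizon.
# The example indicators are illustrative assumptions.
outcome_indicators = {
    "immediate":    ["participant reports learning common reactions to disasters"],
    "intermediate": ["reduction in self-reported distress scores",
                     "improved role functioning"],
    "long_term":    ["community cohesion measures",
                     "household disaster-preparedness levels"],
}

for horizon, indicators in outcome_indicators.items():
    print(f"{horizon}: " + "; ".join(indicators))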