EVALUATION: GETTING STARTED - First Steps Toward Finishing Strong

Transcript and Presenter's Notes
1
EVALUATION: GETTING STARTED - First Steps Toward
Finishing Strong
  • Candace J. Chitty, RN, MBA, CPHQ, LHRM
  • President

Where Results Begin
2
Never doubt that a small group of thoughtful,
committed people can change the world. Indeed
it is the only thing that ever has.
-Margaret Mead
3
Why Evaluate?
  • Collecting information about how things are done,
    and about the results, helps us understand how
    community initiatives develop and lets us
    disseminate lessons other groups can profit from.
  • Providing ongoing feedback can improve community
    work by encouraging continuous adjustments of
    programs, policies and activities.
  • By involving community members, people who
    haven't had a voice may gain the opportunity to
    better understand and contribute to program
    efforts.
  • Finally, evaluation can help hold groups
    accountable to the community and to the grant
    makers who provide funding.

4
Institute of Medicine's 2000 Report
  • Described our healthcare safety net as "intact,
    but endangered." One of the key recommendations
    in the report focused on the need for data
    systems and measures to evaluate the performance
    of the safety net and the health outcomes of
    vulnerable populations.
  • Source: Institute of Medicine, America's Health
    Care Safety Net: Intact but Endangered (2000)

5
HCAP Legislation: Grantee Program Monitoring and
Evaluation Expectations
  • Grantees will:
  • Describe a plan for evaluation of the activities
    carried out under the grant, including
    measurement of progress toward the goals and
    objectives of the program and the use of
    evaluation findings to improve program
    performance (quality improvement).

6
What does the Legislation Require?
  • HCAP Grantees are required to report data at
    least every six months that will show the extent
    to which HCAP activities have
  • Improved the effectiveness, efficiency, and
    coordination of services for uninsured and
    underinsured individuals in the communities or
    geographic areas served by each grantee


7
What does the Legislation Require?
  • And whether these activities resulted in:
  • The provision of better quality health care for
    such individuals, and
  • The provision of health care to such individuals
    at a lower cost than would be possible in the
    absence of the activities conducted under HCAP.

8
Key Evaluation Principles
  • First Steps
  • Realize that evaluation is a critical and ongoing
    task of successful programs.
  • Get started early.
  • Implement your evaluation at the same time as
    your initiatives.
  • Support collaborative evaluation planning.
  • Use evaluation as a catalyst for community change
    and sustainability - finishing strong!

9
(No Transcript)
10
Evaluation Framework: Six Steps
  • Step 1: Engage stakeholders
  • The evaluation cycle begins by engaging
    stakeholders (i.e., persons or organizations
    having an investment in what will be learned from
    the evaluation and what will be done with the
    knowledge)
  • Helps increase chances that the evaluation will
    be useful
  • Can improve the evaluation's credibility
  • Clarifies evaluation roles and responsibilities

11
Evaluation Framework: Six Steps
  • Step 2: Describe the program
  • Program descriptions convey the mission and
    objectives of the program being evaluated. They
    should be sufficiently detailed to ensure
    understanding of goals and strategies.
  • Improves the evaluation's fairness and accuracy
  • Permits a balanced assessment of strengths and
    weaknesses
  • Helps stakeholders understand how program
    features fit together and relate to a larger
    context

12
Why Use the Logic Model? How Will It Help You?
  • Helps to clarify what is appropriate to evaluate,
    and when, so that evaluation resources are used
    wisely.
  • Summarizes complex programs to communicate with
    stakeholders, funders, and audiences.
  • Enables effective competition for resources.
    (Many funders request logic models in their grant
    applications.)
13
Sample Logic Model with Evaluation Questions
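The original slide contains a diagram that does not carry over into this transcript. As a purely illustrative stand-in, a logic model with attached evaluation questions can be written out as a simple structure; the program, activities, and questions below are hypothetical, sketched in Python:

    # Hypothetical logic model; the program, activities, and questions are
    # illustrative only, not taken from the original slide.
    logic_model = {
        "inputs": ["HCAP grant funds", "clinic staff", "shared data system"],
        "activities": ["enroll uninsured adults in a primary care home",
                       "coordinate specialty referrals"],
        "outputs": ["adults enrolled", "referrals completed"],
        "outcomes": ["fewer avoidable ER visits", "better access to primary care"],
    }

    evaluation_questions = {
        "activities": "Are enrollment and referral coordination delivered as planned?",
        "outputs": "How many people were reached, and who was missed?",
        "outcomes": "Did avoidable ER visits decline from baseline?",
    }

    for stage, question in evaluation_questions.items():
        print(f"{stage}: {', '.join(logic_model[stage])}")
        print(f"  evaluation question: {question}")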
14
Evaluation Framework: Six Steps
  • Step 3: Focus the evaluation design
  • The evaluation must be focused to assess the
    issues of greatest concern to stakeholders while
    using time and resources as efficiently as
    possible.
  • Provides investment in quality
  • Understanding how evaluation results are to be
    used
  • Describes practical methods for sampling, data
    collection, data analysis, interpretation and
    judgment.
  • Provides a written protocol or agreement that
    summarizes evaluation procedures with clear roles
    and responsibilities of stakeholders
  • Revisions of parts or all of the plan when
    critical circumstances change

15
Evaluation Framework: Six Steps
  • Step 4: Gather credible evidence
  • An evaluation should strive to collect
    information that will convey a well-rounded
    picture of the program so that the information is
    seen as credible by primary users.
  • Improves the evaluation's fairness and accuracy
  • Permits a balanced assessment of strengths and
    weaknesses
  • Helps stakeholders understand how program
    features fit together and relate to a larger
    context

16
Evaluation Framework: Six Steps
  • Step 5: Justify conclusions
  • Evaluation conclusions are justified when they
    are linked to the evidence gathered and judged
    against agreed-upon values or standards set by
    the stakeholders.
  • Reinforces conclusions central to the
    evaluation's utility and accuracy
  • Involves analysis and synthesis of qualitative
    and quantitative data, systematic interpretation,
    and appropriate comparison against relevant
    standards

17
Evaluation Framework: Six Steps
  • Step 6: Ensure use and share lessons learned
  • Preparing for use involves strategic thinking and
    continued vigilance, both of which begin in the
    earliest stages of stakeholder engagement and
    continue throughout the evaluation process.
  • Ensures that the evaluation achieves its primary
    purpose: being useful

18
Standards for Effective Evaluation - Joint
Committee on Standards for Educational Evaluation
  • Standards help to avoid creating an unbalanced
    evaluation (e.g., one that is accurate and
    feasible but not useful, or one that would be
    useful and accurate but is not feasible).
  • Standards are grouped into four categories and
    include a total of 30 specific standards.
  • Source: www.eval.org/EvaluationDocuments/progeval.html

19
Standards for Effective Evaluation
  • Category One: Utility Standards
  • Ensures that information needs of evaluation
    users are satisfied.
  • Stakeholder identification
  • Evaluator credibility
  • Information scope and selection
  • Values identification
  • Report clarity
  • Report timeliness and dissemination
  • Evaluation impact

20
Standards for Effective Evaluation
  • Category Two: Feasibility Standards
  • Ensures the evaluation is viable and pragmatic.
  • Practical procedures
  • Political viability
  • Cost effectiveness

21
Standards for Effective Evaluation
  • Category Three: Propriety Standards
  • Ensures the evaluation is ethical.
  • Service orientation
  • Formal agreements
  • Rights of population served
  • Human interactions
  • Complete and fair assessments
  • Disclosure of findings
  • Conflict of interest
  • Fiscal responsibility

22
Standards for Effective Evaluation
  • Category Four: Accuracy Standards
  • Ensures the evaluation conveys technically
    adequate information about the features that
    determine the merit of the program.
  • Program documentation
  • Context analysis
  • Described purposes and procedures
  • Defensible information sources
  • Valid information
  • Reliable information

23
Standards for Effective Evaluation
  • Category Four: Accuracy Standards (continued)
  • Systematic information
  • Analysis of quantitative information
  • Analysis of qualitative information
  • Justified conclusions
  • Impartial reporting
  • Metaevaluation

24
Defensible Information Sources
  • Reliable.
  • Valid.
  • Operational definitions.
  • Defined populations.
  • Measurements over time.

25
Reliability
  • Reliability is concerned with the consistency of
    the measurement instrument or procedure.
  • A reliable measure produces stable results over
    time.
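One common way to check that stability in practice is a test-retest comparison: administer the same instrument twice under similar conditions and see how strongly the two rounds agree. A minimal sketch in Python, using made-up scores:

    from statistics import correlation  # available in Python 3.10+

    # Hypothetical scores from the same six respondents, measured twice.
    round_1 = [12, 15, 9, 20, 14, 11]
    round_2 = [13, 14, 10, 19, 15, 11]

    # Test-retest reliability: Pearson correlation between the two rounds.
    # Values near 1.0 suggest the instrument gives stable results over time.
    r = correlation(round_1, round_2)
    print(f"test-retest correlation: {r:.2f}")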

26
Validity
  • Validity is concerned with whether a measure
    succeeds at measuring what it is intended to
    measure.
  • It is also concerned with how a measure or
    procedure appears:
  • Does it seem like a reasonable way to gain the
    information the program is attempting to obtain?
  • Does the measure seem well designed?
  • Does the measure seem as though it will work
    reliably?
  • Are the results generalizable?

27
Operational Definitions
  • Maps the data collection process to ensure
    consistency.
  • Measurement specifications (what, when, how)
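For example, an operational definition can be recorded as an explicit specification of what is counted, when it is collected, and how it is calculated. A minimal sketch; the measure, data source, and values below are hypothetical:

    # Hypothetical operational definition for a single performance measure.
    measure_spec = {
        "name": "Avoidable ER visits per 1,000 enrolled members",
        "what": {
            "numerator": "ER visits coded as primary-care treatable",
            "denominator": "members enrolled for the full quarter",
        },
        "when": "collected quarterly, 30 days after quarter close",
        "how": "numerator / denominator * 1000, from claims data",
    }

    def rate_per_1000(numerator: int, denominator: int) -> float:
        """Apply the 'how' part of the specification."""
        return numerator / denominator * 1000

    print(measure_spec["name"], "=", rate_per_1000(42, 3500))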

28
Clearly Defined Populations (IMPORTANT)
  • Underinsured
  • Uninsured
  • Age category
  • FPL (Federal Poverty Level) served
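Encoding these definitions as explicit rules helps every site classify people the same way. A minimal sketch; the cut-offs and categories are illustrative assumptions, not HCAP requirements:

    # Hypothetical classification rules; real cut-offs come from your program's
    # operational definitions, not from this sketch.
    def insurance_status(has_coverage: bool, coverage_meets_needs: bool) -> str:
        if not has_coverage:
            return "uninsured"
        return "insured" if coverage_meets_needs else "underinsured"

    def age_category(age: int) -> str:
        if age < 18:
            return "0-17"
        if age < 65:
            return "18-64"
        return "65+"

    def fpl_band(income: float, poverty_guideline: float) -> str:
        pct = income / poverty_guideline * 100
        if pct <= 100:
            return "<=100% FPL"
        if pct <= 200:
            return "101-200% FPL"
        return ">200% FPL"

    print(insurance_status(True, False), age_category(42), fpl_band(24000, 20000))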
29
Measurements Over Time
  • Baseline
  • Re-measurement
  • Trends
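As a simple illustration of how the three relate, each re-measurement is compared back to the baseline and the direction of change is tracked over time; the numbers below are made up:

    # Hypothetical quarterly values for one measure; index 0 is the baseline.
    measurements = [38.0, 35.5, 33.0, 31.2]  # e.g., avoidable ER visits per 1,000

    baseline = measurements[0]
    for quarter, value in enumerate(measurements[1:], start=1):
        change = value - baseline
        print(f"Q{quarter}: {value} (change from baseline: {change:+.1f})")

    # A crude trend: average change per re-measurement period.
    trend = (measurements[-1] - baseline) / (len(measurements) - 1)
    print(f"average change per period: {trend:+.1f}")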
30
Comprehensive Analysis
  • Performance Goals and/or Benchmarks.
  • Data Analysis and Synthesis

31
Performance Goals and/or Benchmarking
  • If you don't know your destination, any path
    will do.

32
Data Analysis and Synthesis
  • Data are assessed in order to determine:
  • What were the findings?
  • What outcomes went well?
  • What are the potential causes contributing to
    less than desirable outcomes?
  • How do these causes affect the outcomes?
  • What could have been done better?
  • What would we change and do differently?
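A first pass at these questions can be as simple as lining each measure's result up against its performance goal and flagging shortfalls for root-cause discussion. A minimal sketch with hypothetical measures, goals, and results:

    # Hypothetical results vs. goals; "higher_is_better" controls the comparison.
    results = [
        {"measure": "Adults with a primary care home (%)", "goal": 80, "actual": 86, "higher_is_better": True},
        {"measure": "Avoidable ER visits per 1,000", "goal": 30, "actual": 34, "higher_is_better": False},
    ]

    for r in results:
        met = r["actual"] >= r["goal"] if r["higher_is_better"] else r["actual"] <= r["goal"]
        status = "met goal" if met else "review causes"
        print(f'{r["measure"]}: goal {r["goal"]}, actual {r["actual"]} -> {status}')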

33
Who Should Conduct Your Evaluation?
  • Evaluation is a team effort that works to
  • Determine the focus and design of an evaluation
  • Develop the evaluation plan, performance
    indicators, and data collection instruments
  • Collect, analyze, and interpret the data
  • Prepare reports on evaluation findings

34
Types of Evaluation Teams
  • Option 1: Hiring an outside evaluator
  • Option 2: Using an in-house evaluation team
    supported by an outside consultant and program
    staff
  • Option 3: Using an in-house evaluation team
    supported by program staff

35
Which Evaluation Team Option is Best?
  • Does your HCAP program have funds designated for
    evaluation purposes?
  • Have you successfully conducted previous
    evaluations of similar programs, components, or
    services?
  • Are there existing measures or performance
    indicators in place?
  • Can you collect evaluation information as part of
    your regular program activities?
  • Are there program staff who have training and
    experience in evaluation-related tasks?

36
Hiring an Outside Evaluator
  • Before you begin finding and hiring an outside
    evaluator, develop a statement of work (SOW)
    that details the general and specific
    requirements for the evaluator.
  • In creating the SOW you will need to know the
    type of evaluation activities you want the
    evaluator to perform.

37
Hiring an Outside Evaluator
  • Consider the following
  • What level of experience does the evaluator have
    in the area of program evaluation?
  • Does the evaluator have experience conducting
    evaluations in the area of healthcare access
    programs?
  • Can the evaluator offer assistance in the full
    range of program evaluation activities?

38
Hiring an Outside Evaluator
  • Consider the following
  • Will the evaluator present information in a way
    that is useful to you?
  • How much will it cost?
  • Is the evaluator willing to work closely with you?

39
A Good Evaluator
  • Is willing to work collaboratively to develop an
    evaluation plan that meets your needs.
  • Is able to communicate in simple, practical terms
  • Has experience evaluating similar programs and
    working with similar levels of resources
  • Has experience with statistical methods

40
A Good Evaluator
  • Has the time available to do the evaluation
  • Has experience developing data collection forms
    or using standardized instruments
  • Is willing to work with a national evaluation
    team (if there is one)
  • Will treat data confidentially

41
Managing Evaluation by an Outside Evaluator
  • One mechanism for effectively managing the
    evaluation is to prepare a written contract. The
    contract should include
  • Who will perform evaluation tasks
  • Who owns the evaluation data
  • Your expectations about contacts with the
    evaluator and the program

42
Finishing Strong: Using Evaluation in
Sustainability, Policy Change, and Investment
Influencers
Where Results Begin
43
  • In business, consumer and supplier buying
    decisions are heavily influenced by a seller's
    past performance. The seller must demonstrate
    proven and effective delivery of products and/or
    services.

44
Ideally, in healthcare, policymakers and
community stakeholders should be able to make
policy and/or investment decisions based on
available, understandable, and reliable outcome
performance measures.
45
  • The ideal may be difficult to realize in today's
    healthcare system because:
  • Proven and effective outcome measures remain few.
  • Data for developing outcome measures are often
    hard to come by.
  • The populations served are often transient.

46
Data Influences Policy
  • Outcome performance measures derived from sound
    methodology and presented in a clear and
    understandable way cannot ensure the desired
    outcome, but they have proven to be a compelling
    strategy for getting the attention of key policy
    and decision-makers.
  • Source: Agency for Healthcare Research and Quality

47
  • Reliable outcome data allows key policy and
    decision-makers to
  • Assess the effects of health care program and
    policy choices.
  • Guide future health care policy making.
  • Accurately measure outcomes, community access to
    care, utilization, and costs.

48
  • One of the keys to influencing a policy or
    community change is to clearly deliver the
    message. This means getting the message straight
    by presenting information that is understandable,
    concise, and factually correct.
  • Source: Lawrence and Lewin, Presenting Information
    to Decision-makers: A Guide for Policy Analysts

49
Using Evaluation to Deliver a Clear Message
  • Tell a good story.
  • Incorporate information from performance
    measurement into the story.

50
Value of Evaluation Data in Telling The Story
  • Data are generally used for one of these purposes:
  • To persuade (that is, you are giving your
    opinions about what you think should be done,
    given the facts you present), or
  • To inform about trends ("just the facts, ma'am")
    so your reader can decide what they think should
    be done given the facts you present.

51
  • A good story tells us what we didn't know before,
    and after reading it, we realize we really,
    really wanted to know it.

52
Elements of A Good Story
  • Enticing: "Hey, this story looks interesting."
  • Captivating: "I can't put this down." Focuses on
    things people will remember after hearing your
    story.
  • Persuasive: "This is for real." Data-driven
    performance based on sound methodology.
  • Stimulating: The WOW factor.

53
To Sum It Up.
  • Evaluation is an early and central part of your
    HCAP program.
  • You have taken your first steps toward finishing
    strong by understanding key evaluation
    principles, frameworks, and standards, and by
    applying them as guidelines in your evaluation
    plan.
  • Good luck.

54
(No Transcript)