Slide 1: Preparing for Cross-Site Evaluation
Presented by Olivia Silber Ashley, Dr.P.H., Jennifer Gard, M.P.H., and Kellie M. Loomis, M.Ed.
Presented to the Office of Adolescent Pregnancy Programs Care Grantee Conference, February 1-2, 2007, New Orleans, Louisiana

3040 Cornwallis Road, P.O. Box 12194
Research Triangle Park, NC 27709
Phone: 919-541-6427
Fax: 919-485-5555
E-mail: osilber@rti.org

RTI International is a trade name of Research Triangle Institute.
Slide 2: Acknowledgements
- This project was conducted by RTI for the Office of Population Affairs, Department of Health and Human Services, under Contract No. 233-02-0090, Task Order 26
- Barbara Cohen, OPA Project Officer
- AFL Project Officers and other OPA staff
- RTI: Linda Bailey-Stone, Karl Bauman, Jennifer Gard, Sonya Green, Kellie Loomis, Adrienne Rooks, Ellen Wilson
- Project Staff and Client Committee
- Project Expert Work Group
Slide 3: Overview
- Core evaluation instruments
- Cross-site evaluation
- Preparation
Slide 4: Background on Core Evaluation Instruments
- Office of Management and Budget (OMB) recently examined the AFL program using its Program Assessment Rating Tool (PART)
- Identified program strengths:
  - Program purpose
  - Design
  - Management
- Identified areas for improvement:
  - Strategic planning
  - Program results/accountability
- In response, OPA:
  - Developed baseline and follow-up core evaluation instruments
  - Developed performance measures to track demonstration project effectiveness
Slide 5: Staff and Client Advisory Committee
- Anne Badgley
- Leisa Bishop
- Doreen Brown
- Carl Christopher
- Cheri Christopher
- Audra Cummings
- Christina Diaz
- Amy Lewin
- David MacPhee
- Janet Mapp
- Ruben Martinez
- Mary Lou McCloud
- Charnese McPherson
- Alice Skenandore
- Jared Stangenberg
- Cherie Wooden
Slide 6: Capacity Assessment Methods
- Review of grant applications, annual reports, and other information from the 28 most recently funded programs
- Qualitative assessment involving program directors, evaluators, and staff in:
  - 14 Title XX Prevention programs
  - 14 Title XX Care programs
- Telephone interviews
- Site visits
- Observations of data collection activities
- Document review
- Conducted between January 26, 2006, and March 16, 2006
- 31 interviews involving 73 interviewees across 28 programs
- 100% response rate
Slide 7: Selected Title XX Prevention and Care Programs
- Baptist Children's Home Ministries
- Boston Medical Center
- Emory University
- Freedom Foundation of New Jersey, Inc.
- Heritage Community Services
- Ingham County Health Department
- James Madison University
- Kings Community Action
- National Organization of Concerned Black Men
- Our Lady of Lourdes
- Red Cliff Band of Chippewas
- St. Vincent Mercy Medical Center
- Switchboard of Miami, Inc.
- Youth Opportunities Unlimited
- Children's Home Society of Washington
- Children's Hospital
- Choctaw Nation of Oklahoma
- Congreso de Latinos Unidos
- Hidalgo Medical Services
- Illinois Department of Human Services
- Metro Atlanta Youth for Christ
- Roca, Inc.
- Rosalie Manor Community Family Services
- San Mateo County Health Services Agency
- Truman Medical Services
- University of Utah
- Youth and Family Alliance/Lifeworks
- YWCA of Rochester and Monroe
Slide 8: Capacity Assessment Research Questions
- What is the data collection capacity of AFL Prevention and Care demonstration projects?
- How and to what extent have AFL projects used the core evaluation instruments? What problems have AFL projects encountered with the instruments?
- What data collection systems and evaluation designs are appropriate for the AFL program?
- What are the potential barriers to projects participating in electronic data collection and/or a cross-site evaluation?
Slide 9: Difficulties with Core Evaluation Instruments among Care Programs
Slide 10: Difficulties with Core Evaluation Instruments among Prevention Programs
Slide 11: Expert Work Group
- Elaine Borawski
- Claire Brindis
- Meredith Kelsey
- Doug Kirby
- Lisa Lieberman
- Dennis McBride
- Jeff Tanner
- Lynne Tingle
- Amy Tsui
- Gina Wingood
Slide 12: Draft Revision of Core Evaluation Instruments
- Confidentiality statement
- 5th-grade reading level
- Instructions for adolescent respondents
- Re-ordering of questions
- Improved formatting
- Sensitivity to diverse family structures
- Consistency in response options
- Improved fidelity to original source items
- Eliminated birth control question for pregnant adolescents
- Modified birth control question for parenting adolescents
- Clarified reference child
- Separated questions about counseling/testing and treatment for STDs
- Modified living situation question
- Improved race question
- Added pneumococcal vaccine (PCV) item
Slide 13: Future Activities
- Create crosswalk from original instrument items to revised items
- Translate instruments and consent/assent forms into Spanish
- Pilot test
- Develop database structure
- Seek OMB clearance
- Individual-level data collection
- Possible additional revisions to the core evaluation instruments
- Provide technical assistance and training
Slide 14: Purpose of Cross-Site Evaluation
- Improve OPA's PART rating
- Provide evaluation data about the AFL program as a whole
- Inform resource allocation decisions
- Determine the activities and impacts of AFL demonstration project efforts
- Inform policy decisions about program:
  - Support
  - Expansion
  - Improvement
Slide 15: Conceptual Model
[Four-stage flow diagram (Stages 1-4), flattened; activities listed in order:]
- Assess capacity of AFL grantees
- Convene expert work group
- Develop core evaluation instruments:
  - Revise
  - Convert to Teleform
  - Pilot test
  - Translate to Spanish
  - Pilot test
- Train prevention grantees on using new instruments
- Train care grantees on using new instruments and provide technical assistance
- Obtain initial OMB clearance
- Obtain OMB clearance for new instruments and individual-level data collection
- Develop cross-site evaluation plan
- Develop analysis plan
- Conduct cross-site evaluation
Slide 16: Capacity Assessment Findings
- 13 of 14 programs use a comparison group
- 4 use random assignment
- Comparison groups smaller than treatment groups
- Paper-and-pencil surveys
- Home-, hospital-, or clinic-based individual data collection
- Program staff collecting data
- Rolling-intake baseline data collection
- Follow-up at different time periods
- Respondent ID numbers and names
- Follow-up for non-responders
- Most clients have very limited access to the Web
- Wish for dataset structure
- Access to cross-site evaluation data
- No major barriers to meta-analysis
- Open to training and documentation about standardized data collection procedures
Slide 17: Draft Evaluation Design
- Implementation evaluation
- Outcome evaluation
- Consider program characteristics
- Inclusion/exclusion criteria
- Two analytic strategies used for the meta-analysis:
  - Treating each project as a unit of analysis, with the effect sizes of the projects as the focus
  - Including all adolescents within projects in the project-level study together as a unit of analysis, with program exposure as a predictor variable on performance measures
- Provide assistance with tracking non-responders
- Address missing data:
  - Multiple imputation
  - Maximum likelihood modeling
- Mediation and moderation analysis
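The first analytic strategy treats each project's effect size as the unit of analysis. As a rough sketch of what that computation might look like (the deck does not specify an effect-size metric; Cohen's d combined with a fixed-effect inverse-variance average is one common choice, and all data and function names below are hypothetical):

```python
import math
from statistics import mean, stdev

def cohens_d(treatment, comparison):
    """Standardized mean difference (Cohen's d) for one project."""
    n1, n2 = len(treatment), len(comparison)
    # Pooled standard deviation across the two groups.
    pooled_sd = math.sqrt(((n1 - 1) * stdev(treatment) ** 2 +
                           (n2 - 1) * stdev(comparison) ** 2) / (n1 + n2 - 2))
    return (mean(treatment) - mean(comparison)) / pooled_sd

def d_variance(d, n1, n2):
    """Approximate sampling variance of Cohen's d."""
    return (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))

def fixed_effect_summary(effects):
    """Inverse-variance weighted average of (d, variance) pairs."""
    weights = [1 / v for _, v in effects]
    return sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)

# Hypothetical outcome scores for three demonstration projects
# (treatment group, comparison group).
projects = [
    ([5, 6, 7, 8, 6], [4, 5, 5, 6, 4]),
    ([7, 8, 9, 7, 8], [6, 6, 7, 7, 5]),
    ([6, 6, 7, 5, 6], [5, 6, 5, 6, 4]),
]
effects = []
for treat, comp in projects:
    d = cohens_d(treat, comp)
    effects.append((d, d_variance(d, len(treat), len(comp))))
summary = fixed_effect_summary(effects)
print(round(summary, 3))
```

Weighting by inverse variance gives larger, more precise projects more influence on the pooled estimate, which matters here because comparison groups were reported to be smaller than treatment groups.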
Slide 18: Draft Timeline
Slide 19: Preparing for Cross-Site Evaluation
- Maintain and improve sampling strategy
- Maintain comparison groups
- Consider randomization
- Standardize and improve outcome data collection
- Avoid bias
- Improve confidentiality
- Minimize attrition
- Minimize threats to validity
Slide 20: Sampling Strategy and Evaluation Design
- Large, representative sample
- Control or comparison group
- Appropriate to answer evaluation research questions
- Random assignment is the gold standard to answer research questions about program effectiveness
- Units for study (such as individuals, schools, clinics, or geographical areas) are randomly allocated to groups exposed to different treatment conditions
- Begin with the most rigorous design possible
- Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.
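Random allocation of study units to conditions is mechanically simple. A minimal sketch, assuming clinic-level assignment with a fixed seed for a reproducible allocation (the unit names and function are hypothetical):

```python
import random

def randomize(units, seed=None):
    """Randomly allocate study units to treatment and comparison arms."""
    rng = random.Random(seed)
    shuffled = list(units)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical clinic identifiers standing in for study units.
clinics = [f"clinic_{i}" for i in range(10)]
treatment, comparison = randomize(clinics, seed=2007)
print(treatment)
print(comparison)
```

Recording the seed documents the allocation so it can be audited and reproduced later.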
Slide 21: Why Standardize Data Collection Procedures?
- Grantees voiced a need
- Collect quality data uniformly
- Allow for generalization of findings across sites
- Comply with Federal regulations
Slide 22: Data Collection
- Quality of measurement procedures
- Strong evaluations collect data using unbiased procedures
- Participant subject data are anonymous or confidential
- Ensure that data are coded and stored to protect individual identities
- Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.
Slide 23: Principles Guiding Human Subjects Research
- Respect for persons (let people make their own decisions)
- Beneficence (do no harm)
- Justice (include all types of people in research)
Source: Ryan, K.J., Brady, J.V., Cooke, R.E., Height, D.I., Jonsen, A.R., King, P., et al. (1979). The Belmont Report: Ethical principles and guidelines for the protection of human subjects of research. Washington, DC: The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, Office of the Secretary, U.S. Department of Health, Education, and Welfare.
Slide 24: Shared Responsibility between RTI and AFL
- All RTI research involving human subjects is governed by the Code of Federal Regulations, 45 CFR 46
- RTI bears full responsibility for ensuring that human subjects research is conducted in accordance with the Federal regulations
- RTI's Institutional Review Board (IRB) must review and approve all research involving human subjects
- Both RTI project staff and AFL project staff are responsible for:
  - Protecting the rights and welfare of human subjects
  - Complying with Federal regulations
Slide 25: Draft Confidentiality Guidelines
- Improve perceptions of confidentiality among adolescents
- Increase disclosure
- Avoid social desirability bias
- ID numbers with no identifying information
- Sealed envelope
- Staff confidentiality agreement
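One way to implement ID numbers that carry no identifying information is to draw random IDs and keep the name-to-ID crosswalk in a file stored apart from the survey data. A minimal sketch (names, IDs, and the function are all hypothetical):

```python
import random

def assign_ids(names, seed=None):
    """Map participant names to random IDs containing no identifying
    information. The crosswalk must be stored separately from survey data."""
    rng = random.Random(seed)
    # Sampling without replacement guarantees every ID is unique.
    numbers = rng.sample(range(10000, 100000), len(names))
    return {name: f"R{n}" for name, n in zip(names, numbers)}

# Hypothetical roster; only the IDs would ever appear on instruments.
crosswalk = assign_ids(["Ana", "Ben", "Cara"], seed=1)
print(crosswalk)
```

Because the IDs are random rather than derived from names, birth dates, or enrollment order, an instrument alone cannot be linked back to a participant without the separately stored crosswalk.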
Slide 26: Survey Administration
- Read questions aloud if necessary
- Avoid interpreting questions or providing help beyond reading questions aloud
- A staff person knowledgeable about the instrument and study should be available to answer questions about the study if needed
- Use sealed envelope
- After completion, check with adolescents to see whether they have questions or want to discuss feelings or issues
- Ensure time
- Provide privacy
Slide 27: Data Storage and Shipping
- Store signed consent/assent forms separately from completed instruments
- Ship signed consent/assent forms separately from completed instruments:
  - Separate packages
  - Different days
  - Federal Express versus mail
- Notify recipient:
  - When shipment sent
  - Tracking number
- If shipments do not arrive as scheduled, the intended recipient will immediately initiate tracing through Federal Express
- Monitor, provide feedback, and provide re-training if needed
Slide 28: Attrition
- Number of participants lost over the course of a program evaluation
- Some participant loss is inevitable due to transitions among program recipients
- Extraordinary attrition rates generally lower the degree of confidence reviewers are able to place on outcome findings
- Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.
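Attrition is straightforward to quantify per study arm, and comparing arms flags differential attrition, which can bias outcome comparisons even when the overall rate looks acceptable. A small illustration with hypothetical enrollment and completion counts:

```python
def attrition_rate(enrolled, completed):
    """Share of participants lost between baseline and follow-up."""
    return (enrolled - completed) / enrolled

# Hypothetical counts for the treatment and comparison groups.
treat_rate = attrition_rate(120, 96)   # 24 of 120 lost -> 0.2
comp_rate = attrition_rate(100, 70)    # 30 of 100 lost -> 0.3
print(treat_rate, comp_rate)
```

Here the comparison arm loses a noticeably larger share than the treatment arm, the kind of imbalance a cross-site evaluation would want to monitor and report.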
Slide 29: Threats to Validity
- Evaluation design must establish a causal link between the program and its presumed outcomes
- Must be able to rule out other factors that could explain outcomes, such as:
  - Competing programs
  - Concurrent media campaigns
  - Effects of maturation among evaluation participants
- Source: USDHHS. (2002). Science-based prevention programs and principles, 2002. Rockville, MD: Author.
Slide 30: Next Steps
- RTI IRB approval
- OPA review
- Staff and client committee review
- Pilot test standardized data collection procedures
- Debrief with pilot sites to receive feedback
- Incorporate comments, revise, improve
- Provide training and technical assistance
- RTI and AFL staff possibly conduct initial data collection for cross-site evaluation together