Title: National Evaluation, Healthy Communities Access Program
1. National Evaluation, Healthy Communities Access Program
- Charles Daly
- Amisha Pandya
- Division of Clinical Quality
- Data Branch/BPHC/HRSA
2. Key Points
- Opportunity to demonstrate
  - that funding community coalitions is a sound alternative to a silo funding approach
  - the importance of effective services integration and coordination to the health care safety net
- Success depends in large part on your voluntary cooperation in providing information
- An exceptional evaluation team is working on very short timelines to help you tell your story
3. Background
- National evaluation of HCAP required by P.L. 107-251
  - Report to Congress Sept. 2005
  - Final report Sept. 2006
- Report to describe success of funded projects in
  - Improving effectiveness, efficiency, and coordination of services for uninsured and underserved
  - Providing better quality health care
  - At lower cost than in the absence of projects
4. Development
- Coordinated with program and grantees in developing evaluation design
  - Evaluators meetings in June and Dec. 2003
  - Progress described at All Grantees meeting, January 2004
- Highly structured approach
  - HCAP objectives
  - Reported grantee activities
  - Performance measures, outputs, and outcomes
  - Tools used by grantees
- Information used to design hypotheses and questions in scope of work
5. Evaluation Design - HCAP Objectives
- Coordination/collaboration
- Effectiveness, e.g., integrated services activities to improve access
- Efficiency, e.g., services expansion
- Quality, e.g., disease management, provider and patient satisfaction
6. Evaluation Design - HCAP Objectives, cont.
- Potential cost savings from health system improvements
- Sustainability
  - Leveraging funds
  - Financial/in-kind support (partners, outside sources)
- Increase public awareness of safety net
7. Design hypotheses and questions
8. Design hypotheses and questions
9. The Study
- Major features of the evaluation
  - Two phases
  - Multi-faceted (quantitative and qualitative)
  - Minimizes new primary data collection; sensitive to grantee reporting burden
  - Recognizes diversity of grantee characteristics
10. Evaluation Design - Key Considerations
- Keep major objectives at forefront
- Unique mechanism to fund consortia
- Integrate information from all sources
- Focus on uninsured/underserved
  - Enrollment in public health insurance and plans
  - Improved access to care
11. Evaluation Design - Key Considerations, cont.
- Capture all major results
  - e.g., changes in legislation or regulation, improved patient outcomes
- Demonstrate systems and quality of care accomplishments and savings
12. Implementation
- Related activities
  - Data validation and editing (JSI)
  - National evaluation (NORC)
- Moving on fast track to meet timelines
  - JSI contacted grantees in October to help ensure accurate data
  - NORC evaluation instruments to be fielded starting in January 2005
13. Multi-faceted Approach to Data
- Data sources
  - 6-month reports (all cohorts, 9/00 through 9/03 grantees; legacy reports, with revised reports for recent grantees)
  - Applications and baseline data
  - Closeout reports
  - Provider and project director surveys, leader discussions, focus groups, case studies
  - Administrative safety net data (e.g., HCUP, ambulatory sensitive conditions, ER use)
14. What we hope to learn from data collection
- Survey of Providers
  - Members' perspectives on:
    - Role in consortium
    - Changes in service delivery as part of consortium
    - Long-run impact on community
- Project Director Survey
  - Views on program structure and funding
  - Assessment of effectiveness of collaborative approach
  - Usefulness of funding the safety net infrastructure
  - Potential changes/improvements
15. Focus of data collection, cont.
- Discussions with Consortia Leaders
  - What are key factors in success and major obstacles?
  - Expand on content from surveys
    - Evolution of consortium, role of prior funds
    - Role of organizational structure
    - Keys to sustainability
  - Focus on lessons learned in specific activity areas
  - Select consortia based on key activities
  - Individually tailored protocol
16. Evaluation Design - Goals, measures, data
17. Contractor Contacts with HCAP Grantees
- Data validation and editing
- Provider survey
  - To all provider types for all consortia
  - Mail survey with telephone follow-up
- Project Director questions
  - Web-based
- Key informant discussions
- Client focus groups
18. Timelines
- Activity and planned dates
  - Discussions with 25-30 consortia leaders: 1/15/05 - 4/29/05
  - 3-5 client focus groups: 2/2/05 - 4/29/05
  - Project Director questions: 5/03/05 - 5/28/05
  - Safety net provider survey: 5/03/05 - 8/26/05
- Due dates
  - Phase 1 Report to Congress: Sept. 16, 2005
  - National evaluation report: Aug. 25, 2006
19. Timelines - Key Items
- OMB Approval required for provider survey
- Provider survey fielded after approval
- Congressional report due date
- Flexibility to complete the evaluation
20. NORC - Our Evaluation Contractor
- Principal Investigator
  - Claudia Schur, Ph.D.
  - 7500 Old Georgetown Road, Suite 620
  - Bethesda, MD 20814
  - 301-951-5072 (voice)
  - 301-951-5082 (fax)
  - schur-claudia@norc.org
- Team Resources
  - Survey Research (CODA)
  - Safety Net Studies (NYU/Rutgers)
  - Dissemination (Project Hope)
21. Bureau Staff Contacts
- Charles Daly, CDaly2@hrsa.gov, 301-594-5110
- Amisha Pandya, APandya@hrsa.gov, 301-594-3724
- Diana Koorkanian-Sauders, DDerKoorkanian@hrsa.gov, 301-594-4113
- Sheri Downing-Futrell, SDowningFutrell@hrsa.gov, 301-594-4468