Planning Council Institute ASSESSMENT OF THE ADMINISTRATIVE MECHANISM: A New Collaborative Approach

Transcript and Presenter's Notes

Title: Planning Council Institute ASSESSMENT OF THE ADMINISTRATIVE MECHANISM: A New Collaborative Approach


1
Planning Council Institute
ASSESSMENT OF THE ADMINISTRATIVE MECHANISM
A New Collaborative Approach
  • RWCA Training and Technical Assistance Grantee Meeting

Washington, DC August 30, 2006
2
INTRODUCTION and BACKGROUND
MODERATOR: Karen Ingvolstad, Project Officer
Division of Service Systems, HIV/AIDS Bureau (HAB)
Health Resources and Services Administration
3
Legislative Intent
  • Legislative Mandate: Planning Council Duties
  • CARE Act, Section 2602(b)(4)(E): ". . . assess
    the efficiency of the administrative mechanism in
    rapidly allocating funds to the areas of greatest
    need within the eligible area, and at the
    discretion of the planning council, assess the
    effectiveness, either directly or through
    contractual arrangements, of the services offered
    in meeting the identified needs"
  • Congressional Intent: Emergency Services Grant

4
Assess the Efficiency of the Administrative
Mechanism
  • . . . for the rapid allocation of funds to areas
    of greatest need.
  • Usually includes answering such questions as:
  • How quickly are contracts for service providers
    signed?
  • Are providers paid on time?
  • Is the grantee funding the Planning Council's
    priorities? Is funding consistent with directives?
  • Is actual procurement consistent with the
    Planning Council's allocations? (A sketch of such a
    check follows this list.)
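
To make the last question concrete, a minimal sketch in Python follows; the category names, dollar figures, and 5 percent tolerance are hypothetical, not taken from any EMA.

    # Hypothetical example (names and dollar figures are invented): compare the
    # Planning Council's allocations with the grantee's actual procurement by
    # service category and flag large deviations for review.

    allocations = {                      # PC allocations by priority category
        "Primary Medical Care": 2_000_000,
        "Medical Case Management": 1_200_000,
        "Oral Health": 400_000,
    }

    procurement = {                      # amounts actually put under contract
        "Primary Medical Care": 1_950_000,
        "Medical Case Management": 1_000_000,
        "Oral Health": 420_000,
    }

    TOLERANCE = 0.05                     # flag deviations greater than 5%

    for category, allocated in allocations.items():
        procured = procurement.get(category, 0)
        deviation = (procured - allocated) / allocated
        flag = "REVIEW" if abs(deviation) > TOLERANCE else "ok"
        print(f"{category}: allocated ${allocated:,}, procured ${procured:,}, "
              f"deviation {deviation:+.1%} [{flag}]")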

5
Evaluate the Effectiveness of Care (Optional
Activity)
  • . . . at the discretion of the planning council,
    assess the effectiveness . . . of the services
    offered in meeting the identified needs.
  • Usually includes answering such questions as:
  • Have we achieved the expected service outcomes
    for consumers (e.g., clinical)?
  • Are supportive services contributing to entry and
    retention in primary care?
  • How well do services or the service mix funded by
    Title I meet community needs identified by the
    PC?
  • Are findings used to improve the overall system of
    care?

6
HAB Expectations
  • AAM: Annual Requirement
  • Full/modified (may also include Optional
    Activities)
  • Scope limited
  • Anything broader must be mutually agreed
  • Information to be collected must be agreed upon in
    advance (no surprises)
  • When appropriate, should result in
    recommendations to the AA
  • AA must respond to PC recommendations
  • Title I grant application requires a brief
    description of the AAM process and a summary of
    findings and recommendations

7
Workshop Outline
  • Three EMAs will describe:
  • EMA, PC and AA
  • AAM process
  • Information collected
  • What worked, what didn't
  • Critical elements essential for a successful AAM
  • Conclusions, discussion, Q&A

8
Workshop Presenters
  • Marvin Krieger, Coordinator, Hudson County
    HIV/AIDS Services Planning Council
  • Cyd Lacanienta, MSW, Chairwoman and CEO,
    InterGroup Services, Inc., Greater Baltimore HIV
    Health Services Planning Council
  • Craig Vincent-Jones, Executive Director, Los
    Angeles County Commission on HIV

9
HUDSON COUNTY/JERSEY CITY
ASSESSMENT OF THE ADMINISTRATIVE MECHANISM
Marvin Krieger, Coordinator
Hudson County HIV/AIDS Services Planning Council
10
Hudson County/Jersey City EMA
  • Single county: 12 municipalities, 47 sq. mi.
  • 4,340 HIV/AIDS prevalence (2005)
  • $5,000,000 in Title I funds annually
  • 20 care/treatment providers
  • 3 primary medical Case Management sites

11
Administrative Mechanism Partners
  • Chief Elected Official: County Executive
  • Grantee: Department of Health and Human
    Services/Board of Chosen Freeholders
  • Administrative Agency: Department of Health and
    Human Services (DHHS)
  • Planning Council: Stand-Alone Entity

12
Administrative Relationships
  • Board of Chosen Freeholders approves all funds
    awarded
  • DHHS reports to Freeholders and County Executive
  • DHHS issues RFPs and awards contracts
    to providers
  • DHHS monitors contracts

13
Process
  • Planning Council staff and non-conflicted members
    of Executive Committee conduct AAM
  • Timeline developed with grantee outlining
    expectations and reporting responsibilities
  • Process begins at time of award announcement

14
Data Collection
  • Examine services by priority categories
  • Core service providers established
  • Tier II and III services: sampling drawn
  • Previous AAMs reviewed
  • Finalize list of service providers to be
    examined
  • Consider reports of dissatisfaction from
    consumers and providers
  • Finalize provider surveys and interview
    questions

15
Analysis
  • Contract execution: from receipt of grant award
    notice to contract execution
  • Timeliness of awarding contract letter of intent
  • Timeliness of execution of contract
  • Freeholder approval of contract funds
  • Timeliness of agency returning signed contract

16
Program Monitoring
  • Provider contract award conference
  • Monitoring administrative/utilization data
    reporting
  • Incorporation of PC directives and consumer
    satisfaction findings
  • Notification of Corrective Action
  • Availability of TA
  • Corrective Action taken/Probation placement

17
Administrative Monitoring
  • Planned vs. actual utilization
  • Projected utilization: service and funding
  • Council Action (CA)/Planning Council (PC)
  • Notification of under/over utilization (see the
    sketch after this list)
  • CA/PC recommendation for redistribution
  • Funding redistribution
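
A minimal sketch of the under/over-utilization check referenced above; the service names, unit counts, and the 80 to 120 percent acceptable band are hypothetical.

    # Hypothetical example: compare planned vs. actual service units at a
    # mid-year review point and flag candidates for funding redistribution.

    planned_units = {"Medical Case Management": 1_200, "Transportation": 800}
    actual_units = {"Medical Case Management": 540, "Transportation": 860}

    ELAPSED = 0.50            # fraction of the fiscal year elapsed
    LOW, HIGH = 0.80, 1.20    # acceptable band around the prorated plan

    for service, planned in planned_units.items():
        expected = planned * ELAPSED
        ratio = actual_units[service] / expected
        if ratio < LOW:
            status = "under-utilized: candidate for reallocation"
        elif ratio > HIGH:
            status = "over-utilized: may need additional funds"
        else:
            status = "on track"
        print(f"{service}: {actual_units[service]}/{expected:.0f} expected units "
              f"({ratio:.0%}) - {status}")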

18
Qualitative Analysis
  • Contract process reporting
  • Reported/actual timeline
  • TA availability
  • Monitoring process assessment
  • Quality evaluation
  • Internal and external tools and outcomes

19
What Worked, What Didn't
  • Successes
  • Development of MOU
  • Trust building between PC/Grantee/Providers/Consumers
  • Open monitoring system
  • Accessibility of utilization data
  • Challenges
  • Timeliness of RFPs for new program designs
  • Reallocation of funds within current fiscal year
  • Standardization of agency internal evaluation and
    consumer satisfaction surveys

20
Next Steps
  • Rework PC self-evaluation survey
  • Standardize agency satisfaction surveys

21
BALTIMORE
ASSESSMENT OF THE ADMINISTRATIVE MECHANISM
Cyd Lacanienta, MSW, Chairwoman and CEO, InterGroup Services, Inc.
Greater Baltimore HIV Health Services Planning Council
22
Baltimore EMA
  • 7 jurisdictions: Baltimore City and the counties
    of Anne Arundel, Baltimore, Carroll, Harford,
    Howard, and Queen Anne's. (Baltimore County and
    Baltimore City are wholly independent of each
    other.)
  • $20 million in Title I funding.
  • 28,000 people estimated to be living with
    HIV/AIDS.
  • Over 60 providers and 200 contracts.

23
Administrative Mechanism Partners
  • Chief Elected Official: Mayor of Baltimore City.
  • Grantee (appointed by CEO): Baltimore City Health
    Department (BCHD).
  • Administrative Agency: Associated Black
    Charities (contracted by BCHD).
  • Planning Council: Appointed by the CEO.
  • Planning Council Support: InterGroup Services,
    Inc. (contracted by BCHD).

24
Administrative Relationships
25
Administrative Relationships (cont.)
  • Since 2002, BCHD has contracted out the
    administrative agency services.
  • Since 1992, BCHD has contracted out the planning
    council support function. Since 2002, the contract
    has been held by InterGroup Services, Inc., a
    management consulting and project management
    company.

26
Process
  • Evaluation Committee (EC) of the PC is
    responsible for AAM.
  • Each year, AAM begins in April and concludes by
    the end of August.
  • The PC develops, and directs IGS to produce, two
    items resulting from the assessment:
  • Recommendation of findings to be included in the
    letter of assurance by the PC chair, with or
    without corrective action
  • Narrative of assessment process in the
    application.

27
Process (cont.)
  • EC develops/revises the assessment tool,
    data-gathering surveys, and questionnaires
    with input from grantee and AA.
  • Data from all providers, grantee and AA are
    solicited, gathered by a consultant hired by IGS
    and compiled for review by the committee.
  • Consultant presents data findings to EC.

28
Process (cont.)
  • EC holds a series of meetings to deliberate
    and determine scores based on predetermined
    criteria and measures.
  • EC provides recommendation to the PC as part
    of the letter of assurance and application
    contribution.

29
Data Collection
  • Quantitative data that are easily measured (e.g.,
    when award letter was received, when signed
    contract was received).
  • Qualitative data that include feedback from
    providers.

30
Analysis
  • EC does the analysis as a group.
  • EC scores AAM, using its AAM tool and based on
    information gathered by the IGS-hired consultant.
  • EC assesses the administrative mechanism based on
    65 variables or objectives. Each variable has a
    predetermined point value; the point values sum to
    100 possible points.
  • Scoring of each variable is based on whether the
    grantee/AA has met its objective (full
    achievement, partial achievement, non-achievement,
    or not applicable); see the sketch after this
    list.
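
A minimal sketch of the scoring arithmetic described above; the variables shown, the half credit for partial achievement, and the removal of not-applicable items before rescaling to 100 points are illustrative assumptions, not the EC's actual rules.

    # Hypothetical sketch of the EC scoring scheme. In the real tool, 65
    # variables carry predetermined point values summing to 100; here only a
    # few invented variables are shown. Partial credit of 50% and dropping
    # "n/a" items from the denominator are assumptions for illustration.

    CREDIT = {"full": 1.0, "partial": 0.5, "none": 0.0}

    ratings = [  # (variable, point value, rating)
        ("Award letters issued promptly",          4, "full"),
        ("Contracts executed within 90 days",      6, "partial"),
        ("Providers paid within 30 days",          5, "full"),
        ("Reallocation completed in fiscal year",  5, "none"),
        ("Carry-over below target",                4, "n/a"),
    ]

    applicable = [(v, pts, r) for v, pts, r in ratings if r != "n/a"]
    possible = sum(pts for _, pts, _ in applicable)
    earned = sum(pts * CREDIT[r] for _, pts, r in applicable)

    score = 100 * earned / possible     # rescale to a 100-point score
    print(f"Earned {earned} of {possible} applicable points -> {score:.1f}/100")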

31
Qualitative Analysis
  • Rating categories
  • Procurement process.
  • Fiscal monitoring.
  • Program monitoring.
  • Tracking systems.
  • Contract development.
  • Allocation, priorities, reallocation, carry-over.
  • Communication and reporting.
  • Barriers and concerns.
  • Timeliness.
  • Flexibility.

32
What Worked, What Didn't
  • Successes
  • Soliciting input from grantee and AA on what will
    be collected and measured.
  • Determining the objectives to be measured and
    the documentation that supports those objectives.
  • Use of a consultant to independently gather data
    from providers, grantee, and AA.
  • Common definition of terminology.

33
What Worked, What Didn't (cont.)
  • Challenges
  • Development of qualitative analysis of
    administrative mechanism.
  • Training of PC volunteers on assessment
    process.

34
Successes
  • A marked decrease of some two-thirds
    in carry-over funds since 2002. In FY 2002,
    Baltimore had a cumulative carry-over amount of
    $1.4 million. By FY 2004, the cumulative
    carry-over amount had decreased to approximately
    $500,000, a reduction of roughly 64 percent.
  • The administrative mechanism provides payment to
    vendors within 30 days of receipt of an invoice and
    performance report.

35
Successes (cont.)
  • Reallocation within a fiscal year has resulted in
    over 95 percent of EMA funds being expended
    since FY 2001.

36
Successes (cont.)
37
Next Steps
  • EC to develop qualitative portion of the
    assessment.

38
LOS ANGELES COUNTY
ASSESSMENT OF THE ADMINISTRATIVE MECHANISM
Craig Vincent-Jones, Executive Director
Los Angeles County Commission on HIV
39
Los Angeles County (LAC) EMA
  • Single county: 4,000 square miles
  • 50,000 HIV/AIDS prevalence
  • $37,000,000 in Title I/II funds annually
  • 66 care/treatment providers, over 200
    contracts

40
Administrative Mechanism Partners
  • Chief Elected Official: Board of Supervisors
  • Grantee: Department of Public Health (DPH)
  • Administrative Agency: Office of AIDS
    Programs and Policy (OAPP)
  • Planning Council: Commission on HIV

41
Administrative Relationships
  • OAPP is a unit of DPH
  • DPH and Commission each report to CEO
  • Commission used to be part of OAPP;
    separated two years ago
  • OAPP contracts with providers
  • OAPP administratively monitors contracts
  • DPH fiscally monitors contracts

42
Purpose
  • HRSA Planning Council Manual, 2003
  • . . . assess the efficiency of the administering
    agency in rapidly allocating funds to the areas
    of greatest need.
  • The planning council may also, at their
    discretion, assess how well services that are
    funded by the grantee address the planning
    council's priorities, allocations, and
    instructions for addressing these priorities . . .

43
Purpose (cont.)
  • HRSA Planning Council Manual, 2003 (cont.)
  • . . . not an evaluation of the grantee or
    individual service providers, which is a grantee
    responsibility.
  • . . . how efficiently providers are selected and
    paid . . .
  • . . . how well (providers') contracts are
    monitored . . .
  • . . . effectiveness of the services offered in
    meeting identified need . . .

44
Process
  • Commission conducts AAM
  • Contracts with consultant
  • AAM entails studying the Commission along with
    other partners
  • Avoids conflict of interest
  • Consultant hired for consecutive AAMs
  • Begins in March for prior year
  • Lasts four to five months

45
Data Collection
  • Quantitative
  • Separating contracts into bands by service
    category, size, and contract renewal type
  • Stratified sampling (see the sketch after this
    list)
  • Qualitative
  • Literature review
  • Other EMAs
  • Past LAC AAMs
  • Key informant interviews
  • Provider surveys
  • Matched to stratified sampling
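
A minimal sketch of the banding and stratified sampling step; the contract records and sampling rate are hypothetical.

    # Hypothetical example: separate contracts into bands (strata) by service
    # category, size, and renewal type, then sample from each band.
    import random
    from collections import defaultdict

    contracts = [
        {"id": 1, "category": "Primary Care",  "size": "large", "renewal": "renewal"},
        {"id": 2, "category": "Primary Care",  "size": "small", "renewal": "new"},
        {"id": 3, "category": "Housing",       "size": "small", "renewal": "renewal"},
        {"id": 4, "category": "Housing",       "size": "small", "renewal": "renewal"},
        {"id": 5, "category": "Mental Health", "size": "large", "renewal": "new"},
    ]   # the real EMA has over 200 contracts

    SAMPLE_RATE = 0.5   # illustrative; the consultant sets the actual rate

    bands = defaultdict(list)
    for c in contracts:
        bands[(c["category"], c["size"], c["renewal"])].append(c)

    sample = []
    for members in bands.values():
        k = max(1, round(len(members) * SAMPLE_RATE))   # at least one per band
        sample.extend(random.sample(members, k))

    print(f"Sampled {len(sample)} of {len(contracts)} contracts "
          f"from {len(bands)} bands")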

46
Quantitative Analysis
  • Contract execution events
  • Time lapse from project approval to executed
    contract (see the sketch after this list)
  • Key milestones
  • Recommended award to bidder
  • Filing to DPH Contracts and Grants (C&G)
  • Board of Supervisors approval
  • Contract back from C&G to OAPP
  • Agency for signature
  • Signed contract back from agency
  • To C&G for full execution
  • Fully executed contract
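
A minimal sketch of the time-lapse computation across the milestones listed above; all dates are invented for illustration.

    # Hypothetical example: elapsed days between consecutive contract execution
    # milestones, plus the total lapse from approved award to full execution.
    from datetime import date

    milestones = [   # dates are invented; real values come from contract files
        ("Recommended award to bidder",          date(2005, 3, 1)),
        ("Filed with DPH Contracts and Grants",  date(2005, 3, 20)),
        ("Board of Supervisors approval",        date(2005, 5, 10)),
        ("Contract back from C&G to OAPP",       date(2005, 5, 24)),
        ("Sent to agency for signature",         date(2005, 6, 1)),
        ("Signed contract back from agency",     date(2005, 6, 20)),
        ("Sent to C&G for full execution",       date(2005, 6, 25)),
        ("Fully executed contract",              date(2005, 7, 15)),
    ]

    for (prev_name, prev_d), (name, d) in zip(milestones, milestones[1:]):
        print(f"{prev_name} -> {name}: {(d - prev_d).days} days")

    total_days = (milestones[-1][1] - milestones[0][1]).days
    print(f"Total time lapse: {total_days} days")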

47
Quantitative Analysis: Care Services RFPs
48
Quantitative Analysis: Program Monitoring
  • Program monitoring events
  • Time lapse from monitoring notification to
    POCA
  • Key milestones
  • Engagement letter
  • Entrance conference
  • Monitoring visit
  • Exit conference
  • Review report issued
  • Plan of Corrective Action (POCA) due
  • POCA submission and satisfactory status
  • POCA approved by OAPP

49
Quantitative Analysis: Program Monitoring (cont.)
50
Quantitative Analysis: Administrative Monitoring
  • Administrative monitoring events
  • Time lapse from desk audit to issuance of
    compliance letter
  • Key milestones
  • Executed contract
  • Desk audit
  • Findings issued
  • Due date of response
  • First, second reminder and do not pay letters
  • All items received
  • Compliance letter

51
Quantitative Analysis: Administrative Monitoring (cont.)
52
Quantitative Analysis: Fiscal Review
  • Fiscal review events
  • Time lapse between audit and POCA
  • Key milestones
  • Engagement letter
  • Entrance conference
  • Draft report issued
  • Exit conference
  • Second draft report
  • POCA submission and satisfactory status
  • POCA approved by OAPP

53
Qualitative Analysis
  • Satisfaction
  • Timeliness
  • Clarity
  • Accuracy
  • Availability
  • Quality
  • Relationships

54
Qualitative Analysis: Satisfaction
55
Qualitative Analysis: Timeliness
56
Qualitative Analysis: Clarity of Information
57
Qualitative Analysis: Accuracy of Information
58
Qualitative Analysis: Availability of Assistance/Tools
59
Qualitative Analysis: Quality of Assistance/Tools
60
Qualitative Analysis: Relationships
61
Findings and Observations
62
Recommendations/Follow-Up
63
What Worked, What Didn't
  • Successes
  • Benchmarking
  • Quantitative data collection
  • Qualitative analysis
  • Recommendations and follow-up process
  • Challenges
  • Expense
  • Consecutive-year AAMs
  • Literature review
  • Provider surveys

64
Next Steps
  • Select targets for further analysis in
    subsequent years
  • POCAs
  • Coordinate AAM methodology with
    other EMAs
  • Benchmark various timing events
  • Develop comparison baseline nationally

65
CRITICAL ELEMENTS ESSENTIAL FOR AAM SUCCESS
66
Critical Elements
  • Collaboration between partners
  • Early agreement about AAM scope and boundaries
  • Define relationships between process contributors
  • Defined time spans
  • Follow-up mechanism and established benchmarks

67
Collaboration Between Partners
  • Jersey City
  • Baltimore
  • Los Angeles County

68
Early Agreement About AAM Scope and Boundaries
  • Jersey City
  • Baltimore
  • Los Angeles County

69
Define Relationships Between Process
Contributors
  • Jersey City
  • Baltimore
  • Los Angeles County

70
Defined Time Spans
  • Jersey City
  • Baltimore
  • Los Angeles County

71
Follow-Up Mechanism and Established Benchmarks
  • Jersey City
  • Baltimore
  • Los Angeles County

72
CONCLUSIONS/DISCUSSION/QUESTIONS/ANSWERS