ACCE CONFERENCE - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: ACCE CONFERENCE


1
Accountability in Career Development and
College Preparation (CDCP)
  • ACCE CONFERENCE
  • LONG BEACH, CA
  • FEBRUARY 27, 2009

2
Willard Hom
  • Director/Dean of Research and Planning,
    Chancellor's Office, California Community
    Colleges
  • In Research and Planning at the Chancellor's
    Office since 1999
  • Researcher at California Dept. of Health Services
    and Calif. Employment Development Dept. beginning
    in 1981
  • MBA; B.A. (Political Science)

3
Session Goals
  • Help college officials and staff to see the
    state-level administrative and policy dimensions
    of CDCP data reporting.
  • Help colleges to see the importance of their data
    to the performance indicators.
  • Help the Chancellor's Office in Sacramento to see
    the effects of this reporting and areas of
    possible help.

4
Accountability in Context
  • ARCC (through AB 1417): a planned concept,
    top-down in history
  • CDCP (through SB 361): an ad hoc concept,
    bottom-up in history

5
Multiple Functions in CDCP
  • Career Development: Career Technical Ed/Voc-Ed
  • College Preparation: Basic Skills/Transfer Awards

6
ARCC Technical Advisory Group (TAG)
  • ACCE representation
  • Chancellor's Office Academic Affairs Division (4
    representatives)
  • College researchers, CEO, State Academic Senate,
    DOF, LAO, college MIS, and matriculation
  • Input to the CDCP reporting process outside of
    ARCC TAG

7
Budget Aspects
  • CDCP has a very small piece of the community
    college budget (a good and a bad thing).
  • Unlike ARCC, CDCP had no new funding for research
    or analysis at the Chancellor's Office.
  • The cost of demonstrating the effect of CDCP in
    comparison to its cost may be prohibitive.
  • The burden of proof lies with program defenders
    and not with program critics/opponents.

8
Some Budget Hypotheses
  • Current government and public attitudes toward
    new expenditures create more strings attached
    than before.
  • All programs may have benefits but those with
    more convincing evidence of their relative public
    value can compete more successfully for scarce
    tax dollars (ignoring political clout of program
    supporters).
  • As documented program costs rise, some decision
    makers may decide a program is not affordable.

9
A Case of WYSIWYG
  • CDCP indicators match the legislation.
  • The Chancellor's Office has little control over
    the CDCP performance indicators.
  • The Chancellor's Office has no funding to go much
    beyond the current indicators (esp. for the ARCC
    report).
  • The Chancellor's Office can share its data with
    local researchers to promote local analysis.
  • DOF and LAO already expect very low success
    rates. Improvement is the key.

10
Goals of CDCP Analysis
  • Provide oversight bodies with basic (quick and
    dirty) means of detecting state-level benefit
    produced for the legislation's intended target
    population.
  • Inform state-level policy decisions in the near
    future.
  • Inform local boards about CDCP.
  • Spur attention to data quality.
  • Other desired uses of CDCP data are
    coincidental.

11
College Comparisons
  • No peer grouping (data issues and staffing
    resources)
  • State policy decision not focused on low
    performers
  • Rankings avoided

12
Two Reports
  • ARCC: January draft at
    http://www.cccco.edu/SystemOffice/Divisions/TechResearchInfo/ResearchandPlanning/ARCC/tabid/292/Default.aspx
  • CDCP Supplemental Report: Official report at
    http://www.cccco.edu/Portals/4/TRIS/research/reports/cdcp_report_june_08.pdf
  • Explicit descriptions of the CDCP performance
    indicators appear in both documents.

13
Local Access to CDCP Data
  • In the works is a data-on-demand service for
    CDCP data to facilitate local analysis and/or
    local derivation of custom performance
    indicators.
  • MIS Dean Myrna Huffman and her staff have
    committed to this additional service.
  • College institutional research and MIS staff can
    help (to explain results to trustees, improve
    data quality, and to foster improvement of the
    local program).

14
Data Issues
  • Most CDCP outcomes specified in the legislation
    take time to materialize.
  • Curriculum Reporting for the Community Colleges
    (CRCC) of October 2007 and related work had an
    effect.
  • Data quality in noncredit programs may limit the
    analysis (especially regarding SSNs for tracking
    across time and colleges; no SSN, no wage gain).
  • State MIS lacks measures of student
    preparation.
  • Absence of grades in noncredit forces the use of
    other measures of success (e.g., term-to-term
    persistence).
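The persistence measure mentioned above can be sketched in a few lines. This is a minimal illustration only; the enrollment records, term codes, and field layout are hypothetical, not the actual Chancellor's Office MIS schema.

```python
# Sketch of term-to-term persistence as a success measure when
# grades are unavailable (noncredit). Data are hypothetical.
enrollments = [
    ("S1", "2008FA"), ("S1", "2009SP"),  # S1 re-enrolls: persists
    ("S2", "2008FA"),                    # S2 does not return
    ("S3", "2008FA"), ("S3", "2009SP"),  # S3 persists
]

def persistence_rate(enrollments, first_term, next_term):
    """Share of first-term students who re-enroll the next term."""
    cohort = {s for s, t in enrollments if t == first_term}
    returned = {s for s, t in enrollments
                if t == next_term and s in cohort}
    return len(returned) / len(cohort) if cohort else 0.0

print(persistence_rate(enrollments, "2008FA", "2009SP"))  # 2 of 3 persist
```

Note that the measure depends on a stable student ID across terms, which anticipates the missing-ID problem on the next slide.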

15
Effect of Missing Student IDs
  • Probable underestimation of success where success
    requires tracking students across time and place
    (different colleges). With some 40% of noncredit
    students missing SSNs, this effect could be
    substantial.
  • If a student receives multiple student ID numbers
    from one college (in lieu of the SSN), this
    scenario also can lead to underestimation of
    success within a college (distinct from the
    first bullet).
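A toy sketch of the undercount, assuming a success indicator requires linking a completion record to a later wage-gain record via SSN; the records and field names are illustrative only.

```python
# Hypothetical sketch: a linked success is lost whenever the linking
# ID (here, SSN) is missing, so measured rates understate true
# performance. Records and fields are illustrative only.
records = [
    {"ssn": "111", "outcome": "completed"},
    {"ssn": "111", "outcome": "wage_gain"},  # same student, linkable
    {"ssn": None,  "outcome": "completed"},
    {"ssn": None,  "outcome": "wage_gain"},  # same student, unlinkable
]

def linked_successes(records):
    """Count wage gains that can be matched back to a completion."""
    completed = {r["ssn"] for r in records
                 if r["outcome"] == "completed" and r["ssn"] is not None}
    return sum(1 for r in records
               if r["outcome"] == "wage_gain" and r["ssn"] in completed)

print(linked_successes(records))  # counts 1, though the true number is 2
```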

16
The Sequence Issue
  • Oversight bodies interpreted the CDCP sequence as
    a time-ordered, required path that students must
    take to succeed. (Nonlinear student behavior
    conflicts with that assumption.)
  • There may be an equity issue because some
    colleges may achieve outcomes for very short
    course sequences while other colleges can only
    achieve outcomes for much longer sequences.

17
The Denominator Problem
  • Rates of success use this proportion: (positive
    outcomes within target population / target
    population).
  • What is the target population of CDCP?
  • Is there a behavioral measure like the one for
    the SPAR in ARCC?
  • Is there a usable student-reported flag?
  • Do colleges effectively channel students?

18
CDCP's Current Denominator
  • Misses some beneficiaries of CDCP, understating
    CDCP's total benefit
  • But focuses on the legislation's specified target
    population
  • Probably captures the bulk of the CDCP effect
  • Expanding the population in the denominator may
    produce much lower rates of performance
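The denominator effect described on these two slides is simple arithmetic; a sketch with hypothetical numbers:

```python
# Sketch of the denominator choice: the same numerator divided by a
# wider target population yields a lower rate. Numbers are hypothetical.
def success_rate(positive_outcomes, target_population):
    return positive_outcomes / target_population

narrow = success_rate(300, 1000)  # legislation's specified population
wide = success_rate(300, 3000)    # expanded denominator, same numerator
print(narrow, wide)  # 0.3 vs 0.1: the rate falls as the denominator grows
```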

19
Policy Dilemmas in Noncredit
  • Customization/adaptation to fit student
  • Local flexibility to respond to local needs
    (market for noncredit education/training)
  • Standardization for economy of scale
  • Standardization for evaluation research
  • Standardization for accountability

20
Research Dilemmas
  • Ongoing programs usually pose the most difficult
    challenge in terms of program evaluation.
  • Lack of data and funding impedes the conduct of
    evaluation by unbiased third parties.
  • We have a scarcity of published studies on this
    topic.
  • Pressure to obtain positive findings may limit
    the realization of rigorous evaluation.

21
The Most Convincing Evidence
  • Have an independent party conduct a true
    experiment with random assignment.
  • This is not a likely scenario in our current
    environment (cost and policy issues).
  • Next best option is a rigorous statistical
    analysis that passes a peer review process (or
    receives some other signal of quality). So
    on-campus researchers involved in CDCP analysis
    can be a critical factor.

22
Contact Information
  • whom@cccco.edu
  • (916) 327-5887
  • ARCC inquiries go to LeAnn Fong-Batkin, (916)
    327-5886, or arcc@cccco.edu

23
  • Questions
  • Comments

24
Thank You