Title: What Do We Know and How Do We Know It?
1. What Do We Know and How Do We Know It?
- Patrick Perry, Vice Chancellor of Technology, Research, and IS
- California Community Colleges Chancellor's Office
2. Who is this guy? Why should we listen to you?
- Brad Pitt-like looks
- Vin Diesel physique
- I analyze and measure stuff for a living.
- I have all the data.
- Information Management, Institutional Research
- IM... therefore IR.
3. My Credo
- I realize that I will not succeed in answering all of your questions. Indeed, I will not answer any of them completely. The answers I provide will only serve to raise a whole new set of questions that lead to more problems, some of which you weren't aware of in the first place. When my work is complete, you will be as confused as ever, but hopefully, you will be confused on a higher level and about more important things.
4. Why collect data and report?
- Collecting and reporting is a hassle.
- But it's necessary.
- Man cannot rely on anecdote alone.
- We have to show progress and program/system efficacy somehow.
- It's a great help in knowing whether what you're doing actually works.
- Start now; don't wait for a crisis, because then it's too late.
5. The Continuum (of stuff we know)
- Macro things we know
- Nationwide metrics (total number of students in higher ed)
- Statewide metrics (participation rate)
- Systemwide metrics (annual volume of transfers)
- Micro things we know
- District/college level metrics (annual volume of degrees awarded)
- Departmental level metrics (success rates)
- Course level metrics (SLOs)
- Student level metrics (GPA, total units earned)
6. Where it's all used
- National studies (IPEDS)
- Statewide reporting
- System/District accountability
- Local planning, self-evaluation
- Improvement of student success
- Accreditation
- Lobbying, telling our story
- Monetary prioritization/decision making
7. What Things Do We Know?
- We know some things very well
- And have been successful in conveying some of them
- Other things we don't know so well
- Of course, everyone wants to know all about these
- Data availability plays a role
- We actually have the best MIS collection
- Research capability plays a role
- Tons of data, no bodies to mine it
8. Some things we know well
- Student body: large, diverse, mobile
- 1.4 million students in Fall
- 2.5 million annual unduplicated
- Staff and faculty composition
- 17,000 tenured faculty
- 35,000 academic temporary
9. Some things we know well
- What awards students earn
- 75,000 AA/AS
- 43,000 Certificates
- Success, retention, persistence
- Success rate, all courses: 67.3%
- Retention rate, all courses: 82.9%
10. Some things we know well
- Financial aid
- Over $1 billion in financial aid provided last year
- What student services students use
- Matriculation, Assessment, EOPS, DSPS, CalWORKs
11. Some things we know pretty well
- Transfer patterns
- Annual volumes to CSU (52,000) and UC (12,000)
- Cohort transfer rates to all segments
- Of students behaviorally inclined to transfer, 40% actually transfer within 6 years
- How they do at UC/CSU after transfer
- Just as well as those institutions' native students
12. Things we know a little bit about
- Student intent
- What they say on their application: not so great
- Student behavior (what they do): a better indicator
- Critical for creating outcome rates: you only want to count those who desire that outcome (see the sketch below)
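A minimal sketch of that denominator logic in Python, with hypothetical students and an assumed behavioral definition of transfer intent (none of these field names or thresholds are official):

```python
# Hypothetical records: (id, attempted transfer-level course, earned 12+ units, transferred)
students = [
    ("s1", True,  True,  True),
    ("s2", True,  True,  False),
    ("s3", False, False, False),  # no behavioral signal of transfer intent
    ("s4", True,  False, False),
]

# Assumed behavioral definition of "transfer-intended": attempted
# transfer-level work AND accumulated meaningful units.
intended = [s for s in students if s[1] and s[2]]

# Count the outcome only among those who sought it.
transfer_rate = sum(1 for s in intended if s[3]) / len(intended)
print(f"Transfer rate (intent-based denominator): {transfer_rate:.1%}")  # 50.0%

# Dividing by all four students instead would report 25.0% and understate
# performance for the students who actually wanted the outcome.
```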
13. Some things we don't know very well
- Level of preparation upon entry
- Income data/socioeconomic status
- 1st Generation status
- We end up using poor proxies for these instead
14. Some things we don't know very well
- Annual volume of transfers to privates/independents/out-of-state institutions
- Student engagement/satisfaction
- Economic impact of education (value added)
15. Some things we don't know very well
- True market demand
- Impact of special programs/interventions
- Facilities usage
- Basic skills progression
16. Measuring Community College Outcomes
- Community colleges are the hardest to measure.
- Multiple missions
- Variety of student intentions
- Differing academic levels
17. Accountability
- Focus too often is on institutional comparison, not institutional improvement or student success
- Somebody must be good and somebody bad
- Comparisons are never made to a national norm or standard (is one even possible?)
- Burden of creating metrics is great
18. Accountability
- 2 kinds of metrics: volumes and rates (illustrated below)
- Volumes show you how many
- Rates adjust for size
- Both are tricky to evaluate
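A toy illustration of the two metric types. The transfer volumes match the figures on the next slides; the intent-based denominators are hypothetical, back-solved to agree with the rates quoted there:

```python
# Volume vs. rate for two colleges. Volumes come from the slides that follow;
# the "transfer_intended" denominators are invented to match the quoted rates.
colleges = {
    "A": {"transfers": 2200, "transfer_intended": 5189},
    "B": {"transfers": 25,   "transfer_intended": 79},
}

for name, c in colleges.items():
    volume = c["transfers"]                         # volume: how many
    rate = c["transfers"] / c["transfer_intended"]  # rate: adjusted for size
    print(f"College {name}: volume={volume:>5}, rate={rate:.1%}")
# College A: volume= 2200, rate=42.4%
# College B: volume=   25, rate=31.6%
```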
19. Accountability
- College A: 2,200 annual transfers to CSU/UC
- College B: 25 annual transfers to CSU/UC
- Who did the better job?
20. Accountability
- College A: transfer rate of 42.4% of transfer-intended students
- College B: transfer rate of 31.6%
- Who did the better job?
21. Accountability
- College A
- 30,000 students
- Less than 1 mile from a large CSU campus
- Almost all transfers under age 25
- 40% of students need remediation
- 16% on financial aid
- Local unemployment rate: 5.5%
- Average county household income level: high
22. Accountability
- College B
- 2,000 students
- 215 miles from the nearest CSU campus
- 1/3 of transfers over age 25
- 88% of students need remediation
- 67% on financial aid
- Local unemployment rate: 11.3%
- Average county household income level: $15k less than College A's
- Who did the better job?
23. Accountability
- Perfectly equitable metrics for community college outcomes are almost impossible to create and implement
- Create rates to adjust for size
- Isolate the denominator to only those with intent
- Seek out and account for factors outside institutional control
- Do about a million regressions (sketched below)
- With that, you can maybe hope to explain about half the variance of what really went on.
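A hedged sketch of that adjustment idea using ordinary least squares in NumPy. Colleges A and B carry the context figures from the slides above; the four peer colleges and all of the fitted transfer rates besides A's and B's are invented purely so the regression has something to fit:

```python
import numpy as np

# Context factors outside institutional control:
# [% needing remediation, % on financial aid, local unemployment %]
X = np.array([
    [40.0, 16.0,  5.5],   # College A (figures from the slides)
    [88.0, 67.0, 11.3],   # College B
    [55.0, 30.0,  7.0],   # hypothetical peer colleges, invented
    [70.0, 50.0,  9.0],
    [45.0, 20.0,  6.0],
    [60.0, 40.0,  8.0],
])
y = np.array([42.4, 31.6, 39.0, 34.5, 41.0, 36.0])  # transfer rates (%)

# Least-squares fit: rate ~ intercept + context factors.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# A college's residual is its performance net of these context factors;
# positive means it outperformed what its circumstances would predict.
residuals = y - A @ coef
for name, r in zip("ABCDEF", residuals):
    print(f"College {name}: {r:+.2f} points vs. expectation")
```

Even a much richer model of this kind typically leaves a large share of the variance unexplained, which is the slide's point.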
24. Partnership for Excellence
- Had 5 metrics, 4 of which were annual volumes (one was a rate)
- Metrics tied to things outside of our control
- Transfer volume correlates with the receiving institutions' budgets
- Awards, course completions, and basic skills improvements are tied to our own budget
- Looks great while budgets are good!
25. AB1417: Accountability
- Pacheco bill that required the System Office to create a district-level reporting framework
- Nov. '04 to Mar. '05: creation, then approval from Consultation, BOG, Finance, the Leg Analyst, and hopefully the Legislature
- Senate presence throughout
26. AB1417
- Create a district-level reporting framework that
- Had useful metrics
- Measured pertinent things of interest
- Minimized institutional burden
- Told our story
- Didn't rely on institutional comparison
- Satisfied accountability-driven parties
27. AB1417
- The Model
- Measures 4 areas:
- Student Progress & Achievement: Degree/Certificate/Transfer
- Student Progress & Achievement: Vocational/Occupational/Workforce Dev.
- Pre-collegiate improvement/basic skills/ESL
- Participation
28. AB1417
- The Model
- Has two levels of reporting
- 6 District Core Indicators (micro)
- 7 System Indicators (macro)
- We went well into the barrel of what we know
with this one
29. Student Progress & Achievement: Degree/Cert/Xfer
- District Core:
- Student Progress and Achievement Rate
- 1st-year to 2nd-year persistence rate (see the sketch after this slide)
- System:
- Annual volume of transfers
- Transfer rate for a 6-year cohort of first-time freshmen (FTFs)
- Annual number of BA/BS grads at CSU/UC who attended a CCC
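The persistence indicator, for example, reduces to simple cohort arithmetic. A minimal sketch with hypothetical enrollment records; the cohort definition here is an assumption, not the official specification:

```python
# First-time freshmen in fall of year 1 (hypothetical student IDs).
fall_2003_cohort = {"s1", "s2", "s3", "s4", "s5"}
# Students from that cohort enrolled again in fall of year 2.
fall_2004_enrolled = {"s1", "s3", "s5", "s9"}

persisters = fall_2003_cohort & fall_2004_enrolled
persistence_rate = len(persisters) / len(fall_2003_cohort)
print(f"1st-to-2nd-year persistence rate: {persistence_rate:.0%}")  # 60%
```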
30. Student Progress & Achievement: Voc/Occ/Workforce Dev
- District Core:
- Successful course completion rate, vocational courses
- System:
- Annual volume of degrees/certificates by program
- Increase in total personal income as a result of receiving a degree/certificate
31. Pre-collegiate Improvement/Basic Skills/ESL
- District Core:
- Successful course completion rate, basic skills courses
- ESL Improvement Rate
- Basic Skills Improvement Rate (sketched below)
- System:
- Annual volume of basic skills improvements
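A rough sketch of how an improvement rate like these might be computed; the records and the definition of "improvement" (completing a higher-level course after starting in basic skills) are assumptions for illustration:

```python
# Hypothetical records: (student, starting level, later completed a higher-level course)
records = [
    ("s1", "basic_skills", True),
    ("s2", "basic_skills", False),
    ("s3", "basic_skills", True),
    ("s4", "transfer",     True),   # not part of the basic skills cohort
]

cohort = [r for r in records if r[1] == "basic_skills"]
improvement_rate = sum(1 for r in cohort if r[2]) / len(cohort)
print(f"Basic skills improvement rate: {improvement_rate:.0%}")  # 67%
```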
32. Participation
- District Core
- None.
- System
- Statewide Participation Rate (by demographic)
33. AB1417: The mechanics
- The System Office will produce the reports with MIS data.
- The draft report will be given back to districts for a 60-day review.
- Districts will provide a brief analysis of the data, which will be included in the final report.
- A district profile will also be included.
34. AB1417: Performance Evaluation
- Districts will not be ranked.
- District performance will be measured against that district's past performance.
- Some peer-grouping summaries will be available, but anonymous.
- Continual decline may trigger further review or interaction.
- No carrots or sticks (yet).
- Most important thing: use the data for self-evaluation.
35. AB1417
- Will require legislation or budget act language to implement.
- Will require System Office staff augmentation.
- Will take many months to create the first set of reports.
- So hang on to your hats.
- But it will help advance our knowledge, and our message.
36. Where are we going?
- Ultimately, data-driven decision making is superior.
- The System has operated without a comprehensive strategic plan for some time.
- The Chancellor and BOG have formed a Strategic Planning effort.
- Grant from the Irvine Foundation to FCCC
37. Strategic Plan
- To be completed by January 2006.
- Facilitated by a project consultant and a Steering Committee
- ASCCC Rep: Ian Walton
- Ten regional Planning Meetings to occur in the next few months
- Key system performance data will be presented and discussed as an integral part of this Strategic Planning process
38. So... in conclusion
- Data collectors, institutional researchers, faculty, program support, and students all need each other.
- Make friends with your institutional researcher... ask him/her to help you design ways to prove your successes.
39. So... in conclusion
- Take ownership of outcomes data: use it to diagnose and improve.
- Don't fear accountability; hold accountability accountable.
- Use what we know to confuse you on a higher level.