Title: From Infancy to Adolescence: Growing Assessment from the Grassroots
1. From Infancy to Adolescence: Growing Assessment from the Grassroots
- Larry Worster, Director of Student Services Technology and Assessment
- Derrick Haynes, Director of Student Academic Success Center
- Metropolitan State College of Denver
2. Goals of this Workshop
- The student (participant) will understand Metro State's process for combating assessment resistance.
- The student will understand the teaching model for creating change.
- The student will be able to layer activities to create small wins.
- The student will create rubrics for evaluating assessment activities.
3. Infancy: The Discussion
- Change in institutional leadership (2003)
- New Board of Trustees
- New President
- New Interim VP of Student Services
- Imposed Mandate for Student Services (2004-2005)
- Director of Research and Development
- Meetings with directors
- Discussions
- Lists of possible outcomes created by departments
4. Infancy: The Framework
- Development of Framework (2005-2006)
- New VPSS
- New Assistant Vice President for assessment
- Division-wide Assessment Committee
- Learning Reconsidered 1 and 2
- Outline of MSCD Assessment Handbook
- Assessment Reports generated (2006)
- Conceptual Framework
8. 2006 Report
9. 2006 Report Excerpt
- Actual Results (Career Services)
  - The post-2006 student satisfaction survey did not adequately measure information/skills questions to reflect learning outcomes. It did measure whether students were pleased with services.
- Improvements Based on Results
  - Improve evaluation form questions to assess whether students are learning career management skills.
- Resources Needed
  - Better questions to assess accurately how well students are learning career management skills.
10. Adolescence: Model for Change (2006-2008)
- Director of Student Services Technology (Oct 2006)
- VPSS and AVP leave the college (Oct 2006)
- Interim VPSS appointed (Oct 2006)
- Director of Technology becomes Director of Student Services Assessment and Technology
- He STUDIES!
  - Learning Reconsidered 1 and 2
  - He reads everything!
11. Small Wins
- "The massive scale on which social problems are conceived precludes innovative action because dysfunctional levels of arousal are induced."
- "Reformulation of social issues as mere problems allows for a strategy of small wins wherein a series of concrete, complete outcomes of moderate importance build a pattern that attracts allies and deters opponents."
- Ironically, people often can't solve problems unless they think they aren't problems.
- Weick, Karl E., "Small Wins: Redefining the Scale of Social Problems" (American Psychologist, 1984, 39/1)
12. StudentVoice
- StudentVoice creates surveys for administration through web and/or PDA.
- Data collection and tabulation are automatic.
- StudentVoice provides models for survey development.
- StudentVoice consultants provide feedback and suggestions for improvement of surveys.
- Analysis by filtering, cross-tabulation, and question selection is easy.
- Exporting data and graphs is a feature of the StudentVoice dashboard.
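The same filtering and cross-tabulation the dashboard provides can also be reproduced offline on exported data. A minimal sketch using pandas, assuming a hypothetical export with `class_level` and `learned_skill` columns (these names are illustrative, not the actual StudentVoice schema):

```python
# Sketch: filtering and cross-tabulating exported survey responses locally.
# The column names and answer choices below are hypothetical examples.
import pandas as pd

# Hypothetical export: one row per respondent.
responses = pd.DataFrame({
    "class_level": ["Freshman", "Senior", "Freshman", "Junior", "Senior"],
    "learned_skill": ["Agree", "Strongly Agree", "Neutral", "Agree", "Agree"],
})

# Filtering: keep only upper-division respondents.
upper = responses[responses["class_level"].isin(["Junior", "Senior"])]

# Cross-tabulation: class level against the learning-outcome question.
table = pd.crosstab(responses["class_level"], responses["learned_skill"])
print(table)
```

The resulting table counts each Likert response per class level, which is the same view the dashboard's cross-tab feature produces interactively.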
13. Model for Change
- Saturation
- Organic Growth
- Change
- Practicality
- Simplicity
14. Simplicity
- Moving from conceptual activities to simple, achievable tasks
- Write a learning objective
- Work individually with directors to develop assessment knowledge
- Explain assessment using the relationship metaphor
- Use StudentVoice to teach directors how to create quality surveys
- Use StudentVoice expertise to adjust question language and Likert scales
15. Practicality
- Use Learning Reconsidered to categorize student learning objectives
- Host small workshops with directors to write student learning objectives
- Use actual survey projects as case studies to demonstrate how to assess
- Share objectives, plans, and results on a private website
16. Organic Growth
- Carefully select and use the division assessment committee for peer review
- Use champions to model assessment activities and showcase the assessment process
- Highlight the growth of assessment activities within the division in meetings
- Provide visibility for activities, using different voices to advocate for assessment
- Integrate assessment plans and reports into the annual report
17. Saturation
- Work to saturate the division with leaders who participate in and value assessment activities
- Make assessment part of the job, not an additional burden
- Teach the most resistant staff how to use StudentVoice on their own to create surveys
18. Change Methodology: The Teaching Model
- Teach professionals about assessment simply and practically, creating organic growth and moving toward saturation
  - Concepts
  - Activities
  - Interpretations
  - Impacts
- Use small wins to create incremental progress through the layering of activities
19. Layering of Activities: Learning Objectives
- Spring 2007
- Construct learning objectives with Learning Reconsidered categories and sub-categories
  - One-on-one sessions
  - Small workshops
    - Voluntary
    - Announced by email
    - Maximum of six participants
    - Use one person as a case study
    - All participate
- Principle: The student will . . . do or know . . . something
20. Annual Report: Assessment Report
- June 2007
- Department reports on the status of assessment activities included as part of the annual report
- Include a minimum of one learning objective
  - What methodology was or will be used to measure?
  - What did analysis show?
  - How have the results been shared with professionals and students?
  - What changes or adjustments in program activities or assessment methodologies have occurred as a result of the analysis?
- Principle: Report something, no matter how small. Don't invent!
21. Peer Review of Reports
- Summer 2007
- Peer review by the Student Services Assessment Committee
- Reports and peer-review documents kept on a password-protected assessment committee website to create a non-threatening environment
- Reports also reviewed by the Institutional Assessment Committee, with teams of student services and academic affairs staff and faculty
- Principle: Use peers to suggest improvements to learning objectives, new learning objectives, ways of analyzing, etc.
22. Website
23. Large Survey Administration
- Fall 2007-Spring 2008
- StudentVoice benchmark surveys: Campus Recreation, Orientation, Career Services, Profile of the American College Student
- NSSE
- Admissions, Financial Aid, Registrar, Career Services
- Principle: Send each student no more than one unsolicited, general survey. Collect large data samples as a starting point.
24. Small Survey Administration
- Spring-Fall 2007
- Use one-on-one sessions to develop survey questions
- Use StudentVoice assessment counselors to improve questions and Likert scales
- Administer surveys at point of service
  - Open web administration, PDA, or paper surveys transcribed into the StudentVoice interface
- Administer surveys by email distribution
- Principle: Create surveys for each department that address at least one learning objective
25. Assessment Plans
- January 2008
- Require assessment plans
  - Student learning objective or objectives
  - How will the department track student activity?
  - How will the objective(s) be assessed?
  - How will the assessment be analyzed?
- Principle: The expectation is that creating objectives and surveys will lead to understanding the role of planning in assessment development
26. Assessment Plan Rubrics
- March 2008
- Develop a set of rubrics by examining Student Services assessment plans for good practices
- Student Services Assessment Committee
- Three questions:
  - What are the criteria that may be evaluated for an aspect of the plan (learning objectives, measure, analytical method, plan for improvement)?
  - What is the gold standard for each performance indicator?
  - What is the coal standard for each performance indicator?
- Rubrics will be used to evaluate plans in 2008 reports
- Principle: Rubrics clearly communicate administrative expectations
27. Assessment Workshop
- Workshop required for one member of each department
- Administer a division-wide assessment first
- Design curriculum to address divisional weaknesses
- Develop website tutorials to support workshop activities
- Assess the workshop
- Write the Director of Assessment's assessment report (modeling)
28. Weaknesses
- Using Bloom's Taxonomy for constructing learning objectives
- Survey development skills
  - Writing good questions
  - Fighting ambiguity
  - Spotting confounded questions
- StudentVoice skills
  - Filtering of responses
  - Cross-tabulation
  - Creating views
  - Creating graphs
  - Exporting data
29. Too Many Weaknesses for This Year
- Goals for 2008-2009 workshops: How to Build Proposals for Program Change or Emphasis
  - Tying proposals for program change to assessments
  - Building effective rationales into program changes
  - Building assessment into program changes
30. Bloom's Taxonomy
- Bloom's verbs for use in constructing student learning objectives:

1.00 Knowledge: Define, Repeat, Record, List, Recall, Name, Relate, Underline
2.00 Comprehension: Translate, Restate, Discuss, Describe, Recognize, Explain, Express, Identify, Locate, Report, Tell, Review
3.00 Application: Interpret, Apply, Employ, Use, Demonstrate, Dramatize, Practice, Illustrate, Operate, Schedule, Shop, Sketch
4.00 Analysis: Distinguish, Analyze, Differentiate, Appraise, Calculate, Experiment, Test, Compare, Contrast, Criticize, Diagram, Inspect, Question, Relate, Debate, Inventory, Solve, Examine, Categorize
5.00 Synthesis: Compose, Plan, Propose, Design, Formulate, Arrange, Assemble, Collect, Construct, Create, Set Up, Organize, Manage, Prepare
6.00 Evaluation: Judge, Appraise, Rate, Compare, Value, Revise, Score, Select, Choose, Assess, Estimate, Measure
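The verb table lends itself to a simple automated check. A minimal sketch (not part of the workshop materials) that flags which Bloom level a drafted objective targets, using a subset of the verbs above:

```python
# Sketch: look up which Bloom's Taxonomy levels a learning objective's
# verbs fall into. Only a subset of the full verb table is included here.
BLOOM_VERBS = {
    "Knowledge": {"define", "repeat", "record", "list", "recall", "name"},
    "Comprehension": {"translate", "restate", "describe", "explain", "identify"},
    "Application": {"interpret", "apply", "employ", "demonstrate", "illustrate"},
    "Analysis": {"distinguish", "analyze", "differentiate", "compare", "examine"},
    "Synthesis": {"compose", "plan", "propose", "design", "construct"},
    "Evaluation": {"judge", "rate", "value", "score", "select", "assess"},
}

def bloom_levels(objective: str) -> list[str]:
    """Return the Bloom levels whose verbs appear in a learning objective."""
    words = {w.strip(".,").lower() for w in objective.split()}
    return [level for level, verbs in BLOOM_VERBS.items() if words & verbs]

print(bloom_levels("The student will identify and apply career management skills."))
```

An objective that matches no level, or only "Knowledge"-level verbs, is a candidate for rewriting toward a higher cognitive level.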
31. Assessment of Assessment Objectives
32. Website Tutorials
- Bloom's verbs for use in constructing student learning objectives
33. Report Rubrics
- Rubrics for the assessment report announced
- Assessment plan rubrics will be used for 2008 plans
- Assessment plan and report rubrics will be used for 2009 plans and reports
- Rubrics will be adjusted by the assessment committee after each cycle
34. Workshop Outcomes
35. Rubrics
- Student Learning Objectives: Scope
  - 5: Department has learning objectives developed for all aspects of its program.
  - 4: Department has learning objectives developed for two or more aspects of its program.
  - 3: Department has defined a learning objective for one aspect of its program.
  - 2: Department has an objective stated; however, it is not stated as a learning objective.
  - 1: Department has no true learning objectives defined.
- Comments
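The five-point scale above is essentially a lookup from score to performance indicator. A hypothetical sketch of that structure (the function and names are illustrative, not part of the MSCD materials), condensing the indicator text from the Scope rubric:

```python
# Sketch: the "Scope" rubric as a data structure, so a reviewer's score
# can be validated and mapped back to its performance indicator.
SCOPE_RUBRIC = {
    5: "Learning objectives developed for all aspects of the program.",
    4: "Learning objectives developed for two or more aspects of the program.",
    3: "A learning objective defined for one aspect of the program.",
    2: "An objective stated, but not stated as a learning objective.",
    1: "No true learning objectives defined.",
}

def score_report(rubric: dict[int, str], score: int) -> str:
    """Validate a reviewer's score and return the matching indicator text."""
    if score not in rubric:
        raise ValueError(f"Score must be one of {sorted(rubric)}")
    return rubric[score]

print(score_report(SCOPE_RUBRIC, 4))
```

Keeping the gold standard (5) and the coal standard (1) explicit in one place mirrors the committee's three-question process for building each rubric.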
36. Rubric Workshop Exercise
- Pick an aspect of the report:
  - Student learning objectives
  - Measures
  - Analysis
  - Proposals for program change
- Create criteria
- Create performance indicator sentences
- Create the gold standard
- Create the coal standard
- Fill in to create a five-point scale
37. Thank You!
- Larry Worster, Director of Student Services Technology and Assessment
- Derrick Haynes, Director of Student Academic Success Center
- Metropolitan State College of Denver
- All materials available at http://www.mscd.edu/ssac
- worster@mscd.edu
- haynesd@mscd.edu