Title: Some principles of assessment audits
1. Some principles of assessment audits
- Cliff Adelman, Institute for Higher Education Policy, Feb. 29, 2008
2. What's this about?
- Documentation
- Design
- Communication
- Validities
- Data Mapping
- Accountability
3. If you are designing an assessment program, you will be doing one or more of the following:
- Using off-the-shelf products
- Modifying or reweighting available products
- Developing your own products / procedures
- Drawing on unobtrusive data / information
4. The integrity of what you do
- requires a vigilant audit structure and process.
5. Documentation right from the start
- Record keeping of everything you did
- A single location for records, e.g. the institutional research (IR) office
- Scheduled review of records
- Assessment data library as part of records
- Why? Whether the subject of assessment is the student, the program, or the institution, you're on sensitive territory. Documentation is a form of protective anticipation.
6. Audit components: design
- What do you want to know?
- About whom or what?
- Process options in light of answers to those questions.
- Virtues / limitations of each option considered.
7. Audit components: content before populations
- Critical content, critical skills
- Weighting of domains: says who?
- Rationale documentation
- Level(s) of difficulty / challenge: says who?
- Distribution of difficulty / challenge
- Sufficiency of prompts
8. Audit components: populations
- Full census? Why?
- Samples? Of whom?
- If sample, sufficiency of subgroups (a quick check is sketched below)
- Nature and adequacy of population background information
- Time and population
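Whether a sample is "sufficient" for subgroup reporting is checkable arithmetic before anyone is tested. A minimal sketch in Python, assuming a simple random sample and a proportion-style result (e.g., percent passing); the function name and the 95%-confidence defaults are illustrative assumptions, not part of any cited standard.

```python
import math

def required_n(margin=0.05, z=1.96, p=0.5):
    """Smallest simple random sample that estimates a proportion
    within +/- margin at ~95% confidence (z = 1.96), using the
    worst-case variance assumption p = 0.5."""
    return math.ceil((z ** 2) * p * (1 - p) / margin ** 2)

# Apply the check per reported subgroup, not just to the whole sample:
print(required_n(margin=0.05))  # ~385 per subgroup
print(required_n(margin=0.10))  # ~97 if a wider margin is acceptable
```

If a subgroup cannot plausibly reach the required count, that is an audit finding: collapse the category or refrain from reporting it.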
9. Relationship of instrument / method to judgment
- Conditions for ensuring maximum reliability
- Differences between restricted / unrestricted response
- Conditions for performance assessments
- Training for reliability (unrestricted and performance)
10. Directions and presentation
- Visual form: fonts, screens, layout
- Clarity of directions in relation to the type of assessment
- Completeness of directions
- All of this pre-tested with representative focus groups
- Document revisions!
11. Indeed, pre-testing of everything
- With records of administration, time, reliability, item-analysis (where appropriate), etc. (a minimal item-analysis sketch follows)
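As one example of what item-analysis records might contain, here is a minimal sketch of a classical item analysis, assuming scored 0/1 responses in an examinees-by-items matrix; the function name is illustrative.

```python
import numpy as np

def item_analysis(scores):
    """Classical item statistics for a 0/1 (examinees x items) matrix:
    difficulty = proportion answering correctly; discrimination =
    correlation of the item with the total score on the other items."""
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    stats = []
    for j in range(scores.shape[1]):
        difficulty = scores[:, j].mean()
        rest = total - scores[:, j]  # exclude the item from its own criterion
        discrimination = np.corrcoef(scores[:, j], rest)[0, 1]
        stats.append({"item": j, "difficulty": difficulty,
                      "discrimination": discrimination})
    return stats
```

Filing a table like this for every pre-test administration is exactly the kind of record an auditor will ask to see.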
12. And if it's a test, documented reviews of . . .
- Adequacy of fit of item-response models to data (a simple check is sketched after this list)
- Test characteristics compared with the test's psychometric specifications
- Test editions, to ensure content representativeness
- Test forms, to assure comparability of scoring
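For the first bullet above, one transparent fit review is to compare observed proportions correct with what the item-response model predicts across ability groups. A minimal sketch, assuming a Rasch (one-parameter logistic) model and ability / difficulty estimates already produced by whatever calibration software you use; all names here are illustrative.

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model: probability of a correct response at ability theta
    for an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def item_fit_table(responses, theta, b, n_groups=5):
    """For one item, compare observed vs. model-expected proportion
    correct within ability-ordered groups of examinees. Large
    observed-minus-expected residuals flag misfit worth documenting."""
    order = np.argsort(theta)
    rows = []
    for g in np.array_split(order, n_groups):
        observed = responses[g].mean()
        expected = rasch_prob(theta[g], b).mean()
        rows.append((theta[g].mean(), observed, expected, observed - expected))
    return rows
```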
13. Prior information for prospective examinees, i.e. do they know . . . ?
- Purpose of the assessment
- What the assessment will be like, through samples of prompts
- Experience relevant to assessment performance
- Administration procedures
- The best documentation is a sample information
bulletin.
14. Reliability of any assessment
- Document sources of variation (content, judges, time interval between assessment and judgment, etc.)
- Document methods used to determine reliability and the rationale for using them
- Document results of reliability analysis, e.g. reliability coefficient, standard error of measurement, degree of agreement between independent judgments (all three are sketched below)
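The three results named in the last bullet can each be computed in a few lines. A minimal sketch, assuming an examinees-by-items matrix of item scores and, for judged work, two independent raters using the same category set; the function names are illustrative.

```python
import numpy as np

def cronbach_alpha(items):
    """Internal-consistency reliability (Cronbach's alpha) for an
    (examinees x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return (k / (k - 1)) * (1.0 - items.var(axis=0, ddof=1).sum()
                            / items.sum(axis=1).var(ddof=1))

def sem(total_scores, reliability):
    """Standard error of measurement: score SD scaled by sqrt(1 - reliability)."""
    return np.std(total_scores, ddof=1) * np.sqrt(1.0 - reliability)

def cohen_kappa(r1, r2, categories):
    """Cohen's kappa: agreement between two independent judges,
    corrected for the agreement expected by chance."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_obs = np.mean(r1 == r2)
    p_chance = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (p_obs - p_chance) / (1.0 - p_chance)
```

Each number belongs in the assessment data library with the method and rationale recorded beside it, per the first two bullets.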
15. Post-Assessment Communication
- Score / results interpretation guides appropriate for each category of user, e.g. instructor, institution, agency, legislature, media
- Recommend only those interpretations for which supporting information is available
- Strong statement of valid uses of results; equally strong statement of invalid uses of results
- Provide score / results recipients with an appropriate frame of reference for evaluating the performance represented in the scores / results
- Be consistent!
16. We could cover a lot more
- But this is a reasonable beginning on the process and content of assessment audits
- Much of it derives from the internal technical manuals of ETS and ACT
- You certainly could elaborate based on your experience, but . . .
- Always think like an auditor!