Improving Annual Data Collection: an interactive poster session
Linda Miller, Research Assessment Services (RAS), Cornell University Library
In 2005, Research Assessment Services (RAS) took on responsibility for Cornell University Library's (CUL's) annual data collection, and focused on making the 2004/05 data collection easier and more transparent for data providers and library managers. It:
- Developed a table of the measures included in the most recent annual statistical report and recurring national surveys (ordered by functional area) to help familiarize staff with all measures and ensure that all core data was collected at one time
- Developed an expanded definitions file to promote consistency and to support data coordinators
- Created Excel files to facilitate data input and management, and to allow for percentage change comparisons with 2003/2004 data
- Encouraged and made it easier to include more notes
- Collected a large part of the centrally-provided data before the unit collection so that individual units had more time to review figures provided for them
- Expanded instructions and provided training sessions
- Made it more explicit to whom the data was being reported
- Involved the reporting units in data verification and analysis.
In 2004/05, RAS also started to think about how to update tables to mainstream e-resource statistics and make the presentation of data more useful to a wider variety of audiences. Finally, RAS requested feedback from library managers in various forums.
In 2006, RAS is building upon this earlier work. To ensure that current and future data collection efforts are as meaningful as possible, RAS asked each functional area's executive committee (e.g., public services, collection development, etc.) to take ownership of tables representing their areas, including setting and defining measures to be collected, and assisting in data review and analysis. We envision that this ongoing, cyclical process, involving staff throughout the library, will allow us to create a solid (and more easily gathered and shared) set of repurposable data to support a full assessment program, one that will incorporate both quantitative and qualitative metrics into future strategic planning efforts. In this poster session, Linda Miller will outline the CUL annual data collection and related processes, and she welcomes discussion of other conference attendees' data gathering efforts. She will share insights gained on ARL-ASSESS.
The number of reference transactions has
generally been dropping for over a decade. But
do those figures alone tell the whole story? As
directed by the Public Services Executive Group
(PSEC), the Reference Outreach committee
developed a standardized reference statistics
form that documents (by category or actual minutes) the time devoted to each reference transaction; how frequently technology issues arise; the mode in which each question is received; and whether a question is answered off-desk.
Decisions were based on the work of a
subcommittee that took into consideration
national reporting needs, and consulted with all
unit staff and a campus statistician. A group of
students in Computer Science 501 developed an
online data collection and reporting system for
Olin staff, which was then adapted for use across
the Library. It encourages standardization and
community building, but allows for individual
unit needs. PSEC also specified that reference
transactions handled by non-reference staff
should be included. Although the system may be used year round if desired, units are only required to record statistics for 12 randomly sampled weeks each year. For 1 of those
12 weeks, all units will also record the
questions they receive for analysis.
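A minimal sketch of how the 12 reporting weeks might be drawn, in Python; the week numbering, the shared seed, and the choice of the full-question week are illustrative assumptions, not CUL's documented sampling procedure.

    import random

    # Assumes weeks are numbered 1-52 and that all units share one seed so
    # they sample the same weeks (an assumption, not documented practice).
    random.seed(2005)

    sampled_weeks = sorted(random.sample(range(1, 53), k=12))
    full_question_week = random.choice(sampled_weeks)  # the 1 week of full question recording

    print("Weeks to record statistics:", sampled_weeks)
    print("Week to also record question text:", full_question_week)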
1. Prepare for the annual data collection based
on Library assessment needs.
With the administration:
- Determine what the Library's current assessment needs are.
- Determine the work timetable.
Working with the functional area communities and data providers:
- Keep staff up to date on national assessment efforts and best practices.
- Review the latest annual statistical report to determine which measures need to be added (or dropped) to ensure staff have the core data they need to manage, evaluate and describe their work, and to report to third parties.
- Investigate/clarify existing measures identified as needing attention in the previous data collection.
- Determine how new measures can be gathered, and write definitions. Appoint small committees as needed.
- Make all appropriate staff aware of, and prepare them to gather, new measures.
- Where possible, work towards automating processes.
- Consider assessment needs as new services and systems are developed.
Functional area committees are asked to take
ownership of tables that represent their work.
Ownership includes setting and defining
measures to be collected, and assisting in data
review and analysis.
The latest review of the annual statistics report
showed a lack of tables related to, among others,
collection development work and e-resource
description and use. Two committees have agreed
to discuss these issues. Since the cataloging
backlog has been reduced to a working backlog,
that measure was dropped.
One general data collection form is created to include all measures to be collected from one or more units. (The form is broken into functional area worksheets, by report table, for easier distribution within units.) Copies are made and customized for each unit. Notes boxes are included after each table's data to allow units to record special practices or considerations for their individual work.
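As an illustration of that structure, a minimal sketch of generating customized per-unit copies with functional area worksheets and a notes row, assuming openpyxl is available; the unit and area names are hypothetical.

    from openpyxl import Workbook

    # Hypothetical unit and functional area names, for illustration only.
    units = ["Unit A", "Unit B"]
    areas = ["Public Services", "Collection Development"]

    for unit in units:
        wb = Workbook()
        wb.remove(wb.active)  # drop the default empty sheet
        for area in areas:
            ws = wb.create_sheet(title=area)
            ws.append(["Measure", "2004/05 figure"])
            # ... one row per measure in this area's report tables ...
            ws.append(["Notes", ""])  # notes box after each table's data
        wb.save(f"{unit.replace(' ', '_')}_input_form.xlsx")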
2. Announce schedule and provide documentation.
Support data providers in their work.
- Alert data providers to deadlines in advance.
- Finalize and make available from a staff web page: instructions; definitions; data input forms; report tables (populated with the centrally supplied data already available); supporting data; the data collection and review schedule.
- Provide individual orientation and group support sessions, and answer questions about definitions and forms.
- Identify needed definition clarifications, and possible new measures/data gathering methods.
- Put unit data providers in contact with staff with more specialized knowledge as needed (e.g., accounting), and supply additional supporting data as needed. Collect any other feedback.
- Provide reminders of due dates to data providers.
Data provided by this unit in the past is coded yellow.
Centrally collected data (usually system provided) is coded gray.
Putting unit data providers in contact with staff with more specialized knowledge builds trust as well as each individual's knowledge.
Experimental data or reminders are coded blue.
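A sketch of applying this color coding when forms are generated, again assuming openpyxl; the hex values standing in for yellow, gray, and blue are assumptions.

    from openpyxl import Workbook
    from openpyxl.styles import PatternFill

    # Assumed hex values for the three-color scheme described above.
    YELLOW = PatternFill("solid", start_color="FFFF00")  # past unit-provided data
    GRAY = PatternFill("solid", start_color="D9D9D9")    # centrally collected data
    BLUE = PatternFill("solid", start_color="BDD7EE")    # experimental data/reminders

    wb = Workbook()
    ws = wb.active
    ws["A1"] = "Reference transactions (prior year)"
    ws["A1"].fill = YELLOW
    ws["A2"] = "Circulation (system provided)"
    ws["A2"].fill = GRAY
    ws["A3"] = "Reminder: include off-desk questions"
    ws["A3"].fill = BLUE
    wb.save("color_coding_example.xlsx")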
Definitions are compiled by annual statistical
report table and coded with measure number. They
indicate what to include and exclude; expand on national definitions as needed; and indicate where CUL measures differ from national measures, and which measures CUL cannot or will not provide.
Web page presentation of resources adds the extra
benefit of documenting files for future
reference.
Earlier years' electronic forms are easily emailed/consulted to inform current work.
3. Data processing and unit review.
- Populate tables centrally with unit input files.
- Email data providers for clarification as needed.
- Percentage change information from the previous year facilitates data checking, resulting in corrections or explanations where appropriate.
A notes box is included after each table's measures to encourage units to share notes explaining large changes, and to ask questions.
Input files are referenced by the central Excel file to populate report tables.
Percentage change columns from the previous year facilitate data checking.
Percentage change data identifies input errors.
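A minimal sketch of the percentage change check, assuming unit figures have been consolidated into one table with pandas; the measure names and numbers are hypothetical, and the 15% flag threshold is an assumption.

    import pandas as pd

    # Hypothetical consolidated figures; in practice these come from the
    # unit input files referenced by the central Excel file.
    df = pd.DataFrame({
        "measure": ["Reference transactions", "Items circulated", "Titles cataloged"],
        "fy2003_04": [52000, 310000, 41000],
        "fy2004_05": [48500, 298000, 61000],
    })

    # Year-over-year percentage change, the figure used for data checking.
    df["pct_change"] = (df["fy2004_05"] - df["fy2003_04"]) / df["fy2003_04"] * 100

    # Flag swings beyond the assumed threshold for follow-up with the unit
    # (a correction, or an explanatory note in the table's notes box).
    flagged = df[df["pct_change"].abs() > 15]
    print(flagged[["measure", "pct_change"]].to_string(index=False))

A large swing is not necessarily an error; the notes boxes let units explain legitimate changes, so a flag leads to either a correction or an explanation.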
4. Vetting and analysis.
- Allow time for units to review their data in the context of other units' data.
- Provide functional area community groups with the data collected from their areas, and report on problems and suggestions identified/raised by data providers for discussion and resolution. Outline possible trends and explanations you have identified, and ask groups to add to or correct them. Seek group agreement.
- Consider whether data from other assessment efforts can be drawn in for triangulation.
- Ask groups to OK the data collected. Resolve outstanding questions with appropriate data providers, and seek additional expertise if warranted.
- Ask groups which trends and highlights they think are most important to include in the one-page summary, and which measures are missing that would help them better manage and measure the success of their work for future reports. Ask what presentation changes would make the data more valuable.
- Allow yourself time to make changes as new data is brought to light, and new ideas are suggested. Provide additional historical trend data as needed.
Work ahead of time to ensure you have collected the resources necessary to make each group's work as easy as possible, and let them know you are doing so.
5. Writing, formatting, seeking the administration's final approval for, and publishing reports.
- Complete the report in the formats required by audiences identified in discussion with the administration.
- Improve readability and data presentation where needed.
- Include a summary and present data to draw the administration's attention to important facts/trends for possible action.
- Publish!
(Next steps: develop a listserv to ease communication; tie data to success measures and suggest areas for attention/change; consider creating a repurposable repository.)
For 2004/2005, provided a PDF as well as a printed summary version.
Decided to include total figures, and figures minus Medical and Law, which report separately to ARL and others.