Title: Electronic Statistics: Practical Approaches
1 Electronic Statistics: Practical Approaches
- Charles R. McClure <cmcclure@lis.fsu.edu>
- Francis Eppes Professor and Director
- Information Use Management and Policy Institute
- School of Information Studies, FSU
- Concord, New Hampshire
- July 27, 2005
2 Importance of Evaluating Networked Services
- Determine the degree to which service objectives are accomplished
- Assess how well services support larger organizational goals
- Monitor the status of services and operations to produce trend data that assesses the service over time
- Justify and show accountability of services
3 Importance of Evaluating Networked Services (cont.)
- Compare the relative costs and benefits of one service versus other or similar services
- Provide input for planning and development of future services
- Assist the organization in staff training
- Inform governing boards as to the nature and success of network services
4 Importance of Evaluating Networked Services (cont.)
- Identify those aspects of network services that are successful and those that need to be refined
- Educate the staff as to costs, benefits, impacts, and problems
- Force the organization to think in terms of outcomes: what are the outcomes of these services, and are they what you want?
5 Importance of Evaluating Networked Services: COSTS
- Organizations have grossly under-resourced IT-related services
- IT is no longer a luxury -- IT is the cornerstone affecting ALL services
- Hardware and software upgrades will occur regularly
- Network services are only in their infancy in terms of IT support
6 Contingencies Affecting the Success of Network Evaluation Efforts
- Amount of time and resources available
- Staff knowledge, skills, and experience in evaluation methods and data collection
- Administrative support to conduct evaluation
- Complexity of the services being provided (one library versus a consortium)
- Organizational culture toward evaluation
7 Current Context
8 Current Context
- Qualitative notion that we are providing increased digital services, busier than ever, to more users, especially for online/web-based services, BUT:
  - Not showing up in what we traditionally count
  - No comprehensive national data
  - Difficulty in justifying new expenditures
9 (No Transcript)
10 What We've Learned So Far
- Librarians point to problems associated with data collection, especially vendor statistics
- Lack of consistent, comparable, and detailed data; problems with interpreting and summarizing data; lack of technology and personnel support; inability to link data to peer comparison, quality of service, and outcomes
- Little organizational infrastructure to support data collection, analysis, and reporting
11 What We've Learned So Far (continued)
- Libraries are collecting some data, often statistics related to patron-accessible resources and cost of electronic databases
- For use statistics, libraries depend almost solely on vendor reports
- Libraries have little information about the users of networked services
- Data are most frequently used for immediate decision making: licensing contracts, budget requests
- Less frequently used to paint a bigger picture of information usage patterns
12 Manuals and Resources
- Statistics and Performance Measures for Public Library Networked Services, ALA, 2000 (ISBN 0-8389-0796-2)
  - Basic statistics and how to use them
  - How to select which statistics to collect and why
13 Manuals and Resources (continued)
- ARL E-Metrics Project
- Data Collection Manual for Academic and Research Library Network Statistics and Performance Measures, ARL Libraries
  - 19 statistics and measures described in a data collection manual
  - PowerPoint presentations on using electronic statistics
14 Statistics/Measures Described in the Manual
- Patron Accessible Electronic Resources
  - R1 Number of electronic full-text journals
  - R2 Number of electronic reference sources
  - R3 Number of electronic books
- Use of Networked Resources and Services
  - U1 Number of electronic reference transactions
  - U2 Number of logins (sessions) to electronic databases
  - U3 Number of queries (searches) in electronic databases
  - U4 Items requested in electronic databases
  - U5 Virtual visits to library's website and catalog
15 Statistics/Measures Described in the Manual
- Expenditures for Networked Resources and Infrastructure
  - C1 Cost of electronic full-text journals
  - C2 Cost of electronic reference sources
  - C3 Cost of electronic books
  - C4 Library expenditures for bibliographic utilities, networks, and consortia
  - C5 External expenditures for bibliographic utilities, networks, and consortia
16 Statistics/Measures Described in the Manual
- Library Digitization Activities
  - D1 Size of library digital collection
  - D2 Use of library digital collection
  - D3 Cost of digital collection construction and management
- Performance Measures
  - P1 Percentage of electronic reference transactions of total reference
  - P2 Percentage of virtual library visits of all library visits
  - P3 Percentage of electronic books to all monographs
- Total of 19 statistics and measures described with data collection procedures
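The three performance measures are simple ratios over the underlying counts. A minimal sketch of the arithmetic, using invented sample figures (none of these numbers come from the manual):

```python
# Sketch: computing performance measures P1-P3 from raw counts.
# All variable names and sample figures are illustrative, not from the manual.

def percentage(part, whole):
    """Return part as a percentage of whole (0.0 if whole is zero)."""
    return 100.0 * part / whole if whole else 0.0

# Hypothetical annual counts for one library
electronic_reference = 4_200      # U1-style count
total_reference = 12_000          # electronic + in-person transactions
virtual_visits = 150_000          # U5-style count
total_visits = 400_000            # virtual + physical visits
electronic_books = 8_000          # R3-style count
all_monographs = 500_000

p1 = percentage(electronic_reference, total_reference)  # % electronic reference
p2 = percentage(virtual_visits, total_visits)           # % virtual visits
p3 = percentage(electronic_books, all_monographs)       # % e-books of monographs

print(f"P1 = {p1:.1f}%  P2 = {p2:.1f}%  P3 = {p3:.1f}%")
# → P1 = 35.0%  P2 = 37.5%  P3 = 1.6%
```

The point of keeping the raw counts (not just the ratios) is that the same counts feed both the statistics (R, U, C, D) and the composite performance measures (P).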
17 Importance of the E-Metrics Project
- 19 new statistics and measures have been developed and field tested to describe use, users, and uses of networked resources
- ARL members have a better understanding of the resources and organizational concerns needed to conduct effective evaluation efforts
- We have better knowledge of the role of academic libraries in institutional outcomes
- The project provides a springboard for additional research in measurement
18 Manuals and Resources (continued)
- Assessing Quality in Digital Reference (funded by OCLC, DLF, LC, and consortia)
- http://quartz.syr.edu/quality/
- Statistics, Measures and Quality Standards for Assessing Digital Reference Library Services: Guidelines and Procedures
19 Other Resources
- Florida State University, Information Institute: Clearinghouse for Networked Statistics; includes a national training effort, EMIS (funded by U.S. IMLS)
- NISO/ISO, EQUINOX, COUNTER, ICOLC, eValued Project, and others (see EMIS Resources)
20 Assessing Network Services
- Know what to measure
- Know how to measure
- Know why you are measuring
- Know that data and measures are accurate and timely
- Know to whom the data will be reported and why
21 Assessing Network Services: Approaches
- Counts and statistics
- Comparisons to other libraries
- "If it weren't for this service..."
- Return on investment
- Costs and savings
- Public goods (services) provided
22 Evaluation Assessment Criteria
- Extensiveness: How much of a service the network provides (e.g., number of transactions per week, number of remote sessions per week)
- Efficiency: The use of resources (usually time or money) in providing or accessing networked information services (e.g., average cost per reference transaction)
- Effectiveness: How well the information service met specific objectives of the provider or the user (e.g., success rate of identifying and accessing the information needed by the user)
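Each of these three criteria reduces to a simple computation over transaction records. A sketch with invented sample data (the per-transaction costs and outcomes are purely illustrative):

```python
# Sketch: turning extensiveness, efficiency, and effectiveness into numbers
# for a digital reference service. The sample data is invented for illustration.

transactions = [
    # (cost_in_dollars, answered_successfully)
    (3.50, True), (4.00, False), (2.75, True), (5.25, True), (3.00, True),
]
weeks_observed = 1

# Extensiveness: how much service is provided (transactions per week)
extensiveness = len(transactions) / weeks_observed

# Efficiency: resources used per unit of service (average cost per transaction)
efficiency = sum(cost for cost, _ in transactions) / len(transactions)

# Effectiveness: how often the user's objective was met (success rate)
effectiveness = sum(ok for _, ok in transactions) / len(transactions)

print(f"{extensiveness:.0f} transactions/week, "
      f"${efficiency:.2f} avg cost, {effectiveness:.0%} success rate")
# → 5 transactions/week, $3.70 avg cost, 80% success rate
```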
23 Evaluation Assessment Criteria (cont.)
- Service Quality: How well the service or activity is provided (e.g., does the service meet organizational or user expectations?)
- Impact: How a service made a difference in some other activity or situation (e.g., the degree to which users enhanced their ability to save time, resolve a problem, or identify new and innovative applications/services)
- Usefulness: The degree to which the services are useful or appropriate for individual users (e.g., did the service assist or solve problems for different types of user audiences?)
24 Methodologies
- Traditional qualitative and quantitative
  - Focus groups, surveys, interviews, logs
- Adapted qualitative and quantitative
  - Pop-up Web surveys
- New methodologies
  - Web log and URL analysis
25 Methodologies (cont'd)
- New methodologies: Web usage analysis
- Determine the overall Web site's traffic, including the:
  - Location of users
  - Portions of the site that are accessed
  - Number of document downloads
  - Errors encountered by users
26 Types of Web Log Files
- Four standard text-based log files
- Access Log
  - Provides such Web user data as IP/domain name, time and date of access, server pages accessed, frequency of document downloads
- Agent Log
  - Provides type and version of browser (e.g., Netscape) and browser platform (e.g., Windows, Mac, Unix) data
27 Types of Web Log Files (cont'd)
- Error Log
  - Provides data on any user-received error while accessing the site. Includes "file not found" errors and user-halted page hits (i.e., lengthy page loading due to large graphic images)
- Referrer Log
  - Provides Web administrators with data concerning other Web sites that link to their site
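The access-log fields above (IP/domain, timestamp, page requested, status) can be extracted with a few lines of code. A minimal sketch for logs in Apache's Common Log Format; real server configurations vary, and the sample lines and the ".pdf = download" heuristic are assumptions for illustration:

```python
# Sketch: extracting page views, document downloads, and errors from a Web
# server access log in Common Log Format. Sample lines are invented.
import re
from collections import Counter

LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

sample_lines = [
    '192.0.2.10 - - [27/Jul/2005:10:15:32 -0500] "GET /catalog.html HTTP/1.0" 200 5120',
    '192.0.2.10 - - [27/Jul/2005:10:15:40 -0500] "GET /report.pdf HTTP/1.0" 200 80432',
    '198.51.100.7 - - [27/Jul/2005:11:02:01 -0500] "GET /missing.html HTTP/1.0" 404 512',
]

pages, errors, downloads = Counter(), 0, 0
for line in sample_lines:
    m = LOG_PATTERN.match(line)
    if not m:
        continue                                # skip malformed lines
    pages[m.group("path")] += 1                 # portions of the site accessed
    if m.group("status").startswith("4"):       # "file not found" and similar
        errors += 1
    if m.group("path").endswith(".pdf"):        # crude document-download heuristic
        downloads += 1

print(f"{len(pages)} distinct pages, {downloads} downloads, {errors} errors")
# → 3 distinct pages, 1 downloads, 1 errors
```

In practice a library would read lines from the server's log file rather than a list, and map the host field to user locations via reverse DNS or a geolocation database.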
28 So What Is Digital Reference?
- Digital Reference (DR) refers to a network of expertise, intermediation, and resources put at the disposal of a user seeking answers in an online/networked environment
- The DR mechanism may be via email, live chat, interactive video, or other means
- It can operate 24/7 without regard to location of resources or people
29 Comparing Traditional Reference to Digital Reference
- Accuracy, cost, training, benefits, time, transactions
- Library organizational structures, including state, regional, and national consortia
- Role of technology
- Knowledge and skill set of staff
30 Key Digital Reference Evaluation Questions
- What are the demographics of DR users?
- What frequency and type of use is made of DR?
- What are the costs for DR and how do they track over time?
- How has provision of DR affected the library organization and other staff?
31 Key Digital Reference Evaluation Questions (cont.)
- Is the library organized successfully to deploy DR?
- What type of IT support is necessary and is it available for DR?
- With what aspects of DR services are users satisfied?
- What constitutes quality DR services and how does your service compare to others?
32 Composite Statistics and Performance Measures
- An approach to link traditional and networked statistics and performance measures
- Must be used carefully
- Can help educate the local community about uses, users, and use of networked services and resources
33 Next Steps: Quality Standards
- Quality standards are specific, measurable expectations of the desired quality of some service or activity
- They define the level of quality or performance that an organization is willing to accept as representing quality for that particular service
34 Importance of Quality Standards
- Encourages staff to discuss and agree upon what constitutes quality for a specific service
- Provides clear guidance as to the quality targets that are acceptable
- Recognizes that there may be differing acceptable levels of quality for different types of services
- Provides a basis for rewards and demonstrating accountability
35 Performance Measures versus Quality Standards
- Performance Measures
  - Accuracy of digital reference answers
  - User satisfaction
- Quality Standard
  - 90% of all quick fact and bibliographic questions will be answered correctly within 6 hours of receipt and be assessed at least at a 3.5 on a scale of satisfaction (1 = not satisfied, 5 = very satisfied)
36 Next Steps: Describing and Measuring Institutional Outcomes
- Definition
  - Outcomes are clearly identified results or end products that occur as a consequence of individual or combined activities from units at the institution. These outcomes are a preferred or desired state and clarify specific expectations of what the products from the institution should be
37 Putting the Pieces Together
38 Rethinking Statistics and Measures
- Some traditional statistics will translate to the networked environment: web transactions or visits, cost per transaction, correct answer fill rate, user satisfaction, etc.
- Some will not: reference transactions per capita
- Some will be new: remote versus onsite questions, full-text downloads
39 The Need for New Statistics and Measures
- We have inadequately considered the revision and updating of the traditional statistics being used to reflect new networked services such as Digital Reference, database use and access, etc.
  - Underestimated the extent of services
  - Can't demonstrate costs and impacts
  - Unable to define quality services
  - Poor planning and development for services
40 Remember: Multiple Approaches Are Available to Assess Networked Services
- The inputs-outputs approach (as used in this project) for statistics and performance measures
  - Quantitative
  - Qualitative
- LibQual
- Service Quality
- Quality Standards
- Educational and Institutional Outcomes
- And others...
41 Internal Library Issues
- Linking statistics and measures to library planning
- Selecting WHAT measures and quality standards are right for YOUR library
- Organizing the evaluation effort: who does what when? Training?
- Organizing the evaluation data in a management information system
42 Other Issues
- Need to integrate traditional statistics with statistics that describe networked services and resources
- Lowest common denominator?
- Determining what sells at the local versus the national level
- Convincing libraries that impacts/benefits assessment is worth the effort
43 Other Issues (cont'd)
- New models for data collection and reporting
- Statistics development/longevity -- 3-5 years at most
- Capturing data
- IT configurations
- Comparability
- Data collection and dissemination
- Traditional reporting structures may not work
- Alliances needed to get the whole picture, since some key data are beyond the library's control, e.g., vendors, ISPs, Web hosting agencies
44 Resources
- McClure's homepage: http://slis-two.lis.fsu.edu/cmcclure/
- IMLS and other Information Institute projects: http://www.ii.fsu.edu
- ICOLC: http://www.library.yale.edu/consortia/webstats.htm
- Public Library Association: http://www.pla.org/electronicstats.htm
45 Resources (continued)
- Bib: http://vrd.org/resources.html
- ARL: http://www.arl.org/stats/newmeas/e-usage.html
- WebTrends: http://www.webtrends.com
- OCLC, DLF, et al., Assessing Quality in Digital Reference: http://quartz.syr.edu/quality/
46 Parting Shots
- Rethinking planning and evaluation WITHIN the library
- What is the library trying to accomplish with electronic statistics?
- Some years may be needed to design, test, and refine needed statistics and measures
- The changing IT environment and evolving user behaviors will require ongoing refinement of measurement techniques
47 Parting Shots (continued)
- Collaboration with vendors, local networking, and others is essential
- Statistics and measurements are likely to be estimates that are "good enough"
- Recognize the role of politics in assessment, e.g., Florida and Hawaii examples
- There is much learning yet to be accomplished and we need to start NOW!
48 Parting Shots (continued)
- Many libraries have no culture of assessment
- Inadequate organizational resources and staff time are given to evaluation in general and networked services in particular
- There is a significant need for staff training and education in evaluating networked services and resources
49 (No Transcript)
50 GET OVER IT!
- Need to rethink what networked services are and how to measure them
- Need to keep on experimenting with IT and services
- Need to develop, test, and refine new statistics and measures for networked services, including Digital Reference
- Need to learn new assessment skills
- Need to play in the political playground
51 Questions and Comments?
- Chuck McClure, Francis Eppes Professor and Director, Information Use Management and Policy Institute
- School of Information Studies, FSU
- <cmcclure@lis.fsu.edu>
- http://slis-two.lis.fsu.edu/cmcclure/
- http://www.ii.fsu.edu/