
1
Benchmarking Of Library Web Sites
  • Brian Kelly
  • UK Web Focus
  • UKOLN
  • University of Bath
  • Email: B.Kelly@ukoln.ac.uk

Penny Garrod, Public Library Networking Focus,
UKOLN, University of Bath
Email: P.Garrod@ukoln.ac.uk
UKOLN is supported by
2
Contents
  • Introduction
  • Background to Benchmarking at UKOLN
  • Benchmarking UK Public Library Web Sites for
    Accessibility and Usability
  • Survey Methodologies
  • Limitations of Approach
  • Where to from here?

BK
3
UKOLN
  • UKOLN
  • National focus of expertise in digital
    information management
  • Based at University of Bath
  • Funded by JISC (HE and FE sectors) and Resource:
    The Council for Museums, Archives and Libraries,
    together with project funding (e.g. EU and JISC)
  • About 27 FTEs
  • Carries out applied research (e.g. in metadata)
    and software development, and provides policy and
    advisory services

BK
4
UK Web Focus
  • UK Web Focus
  • Funded by JISC to provide advice on Web
    developments to UK Higher and Further Education
  • Public Library Networking Focus
  • Funded by Resource and JISC to provide advice on
    networking issues to UK Public Library Sector
  • Synergies
  • The two Focus posts will increasingly work
    together to maximise benefits to the two sectors
    and to support the development of community
    working across them

BK
5
WebWatch Project
  • WebWatch project
  • Initially funded for one year in 1997 by BLRIC
    to develop and use automated robot software to
    analyse Web developments across various UK
    communities (a minimal robot is sketched below)
  • Once the funding finished, the work continued,
    but using (mainly) freely available Web services
    to analyse various features of Web site
    communities
  • Supports community-building work across UK HE/FE
    Web managers (sharing, not flaming)
  • See <http://www.ukoln.ac.uk/web-focus/webwatch/>
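To make the approach concrete, here is a minimal sketch in Python of the kind of robot WebWatch used: it visits each community entry point and records a few easily automated facts (HTTP status, server software, page size). The URLs and the selection of facts are assumptions for illustration; this is not the actual WebWatch software.

```python
# Minimal WebWatch-style robot sketch: fetch each entry point and
# record basic, automatable facts. All URLs are placeholders.
import urllib.request

ENTRY_POINTS = [
    "http://www.example-library.gov.uk/",   # hypothetical entry points
    "http://www.another-library.gov.uk/",
]

for url in ENTRY_POINTS:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            body = response.read()
            server = response.headers.get("Server", "unknown")
            print(url, response.status, server, len(body), "bytes")
    except OSError as err:            # covers URLError and timeouts
        print(url, "failed:", err)
```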

BK
6
WebWatch Surveys
  • Search Engines Used To Index UK HE Web Sites
  • ht://Dig is the most popular and is growing in
    popularity, followed by a Microsoft solution
  • Interest in the licensed Ultraseek/Inktomi
    solution
  • Interest in externally-hosted indexers (e.g.
    Google)
  • A surprising number of institutions have no
    search facility
  • See <http://www.ukoln.ac.uk/web-focus/surveys/uk-he-search-engines/>
  • Nos. of Links
  • Cambridge has the most (231,000 links to all
    servers)
  • Sheffield has the most to a single server
    (46,000)
  • See <http://www.ariadne.ac.uk/issue23/web-watch/>
  • Nos. of Web Servers
  • Cambridge has the most (200)
  • See <http://www.ariadne.ac.uk/issue25/web-watch/>

BK
7
Update On Search Engines
  Search engine       Sept 1999   Jan 2002
  ht://Dig                   25         48
  Microsoft                  12         17
  Ultraseek/Inktomi           7         12
  Google                      -         11
  Excite                     19          5
  Harvest                     8          -
  SWISH                       5          -
  Webinator                   -          5
  Other(s)                   23         22
  None                       59         29

NOTE
The growth in popularity of ht://Dig, the
unexpected appearance of the externally-hosted
Google service, and the move away from SWISH and
Harvest would not have been noticed without the
snapshots. The discussion of the surveys informed
decision-making.
BK
8
Benchmarking
  • The WebWatch approach of monitoring UK HE Web
    sites can be extended into a benchmarking
    exercise:
  • Making comparisons with peers
  • Checking compliance with standards
  • Checking compliance with community or funders'
    guidelines (e.g. e-Government guidelines)
  • This has advantages for organisations:
  • Observing best practices and learning from them
  • Ditto for bad practices
  • Community building
  • and some potential disadvantages:
  • Establishment of league tables
  • Inappropriate comparisons
  • Penalty clauses for failure to comply with
    standards

BK
9
Benchmarking Library Web Sites
  • The WebWatch approach has been applied to a small
    number of UK Public Library Web sites
  • A small selection was chosen in order to:
  • Keep resource requirements to a minimum
  • Validate methodology
  • Gauge interest in this approach
  • Survey sample
  • Focus on Public Library Web sites
  • Survey undertaken in February 2002

Details of the survey are available from
<http://www.ukoln.ac.uk/web-focus/events/conferences/ili-2002/benchmarking/>
PG
10
Benchmarking Public Library Web Sites
  • Choosing the sample
  • Web sites nominated for the EARL "Best on the
    Web" Awards competition, 1999
  • 16 Public Library websites nominated from across
    the UK
  • Judging criteria for the award are available from
    the Wayback Machine <http://web.archive.org/>
  • Criteria include good web site design and
    planning, information content, interactive
    features and Internet resources

EARL ceased to operate in Sept 2001
PG
11
Survey Methodology
  • Analysis of domain names
  • Analysis of 404 error pages
  • WAVE analysis (accessibility tool)
  • BOBBY (accessibility tool)
  • Analysis of search facilities
  • Small-scale survey to compare the accessibility
    of home pages plus the existence of basic
    usability functions

PG
12
1. Domain Names
  • Findings
  • The survey looks at entry points, i.e. the domain
    names
  • The survey notes that the majority of Public
    Libraries currently use a .gov.uk domain
  • Discussion
  • Do the domains have a short, memorable URL?
  • Are a variety of top-level domains used, which
    may confuse the end user? (A sketch of this
    analysis follows the note below)

Note on naming conventions: local authorities may
generally use the format area.gov.uk unless
there is the possibility of confusion with
another authority (e.g. city and county).
From the Modernising Government framework for
information age government websites at
<http://www.e-envoy.gov.uk/publications/guidelines/webguidelines>
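As a small illustration of the domain-name analysis, the sketch below groups entry points by their domain suffix to show how many top-level domains are in use; the URLs are invented, and the real survey's method may have differed.

```python
# Sketch: count the domain suffixes used by survey entry points.
from collections import Counter
from urllib.parse import urlparse

entry_points = [                            # invented examples
    "http://www.somecity.gov.uk/library/",
    "http://www.somecounty.org.uk/libraries/",
    "http://www.anothercity.gov.uk/",
]

suffix_counts = Counter(
    ".".join(urlparse(url).hostname.split(".")[-2:])
    for url in entry_points
)
print(suffix_counts)  # Counter({'gov.uk': 2, 'org.uk': 1})
```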
13
2. 404 Error Page
  • Information on the 404 error page will be
    provided
  • Findings
  • How many sites use a default 404 error message?
  • How many sites use a lightly branded error
    message?
  • How many sites provide rich functionality?
  • Issues
  • The 404 error page is (sadly) likely to be widely
    accessed
  • It is desirable that it (see the sketch after
    this list):
  • Reflects the Web site's look-and-feel
  • Provides functionality to assist a user who is
    lost
  • Provides access to a search facility / site map
  • Provides contact details
  • The 404 page can also be context-sensitive (e.g.
    different pages for users following a local link
    / remote link / no link)
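The sketch below illustrates the kind of 404 page described above: branded, with links to the home page, a search facility, a site map and contact details. It uses Python's standard http.server purely for demonstration (a production site would configure its own web server to serve such a page); all names and addresses are placeholders.

```python
# Sketch: a branded 404 page served by Python's standard http.server.
from http.server import BaseHTTPRequestHandler, HTTPServer

NOT_FOUND_PAGE = b"""<html><head><title>Page not found</title></head>
<body>
  <h1>Sorry, we can't find that page</h1>
  <p><a href="/">Home</a> | <a href="/search">Search</a> |
     <a href="/sitemap">Site map</a> |
     <a href="mailto:webmaster@example-library.gov.uk">Contact us</a></p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Demo only: every request gets the branded 404 page. A real
        # server would reach this branch only for missing pages.
        self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(NOT_FOUND_PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```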

PG
14
3. Accessibility
  • Entry points were examined for compliance with
    W3C WAI (Web Accessibility Initiative)
    Accessibility Guidelines
  • Web-based tools used
  • 1. The WAVE 2.01
  • <http://www.temple.edu/inst_disabilities/piat/wave/>
  • Pennsylvania's Initiative on Assistive Technology
    (PIAT)
  • Does not tell you if a page is accessible - no
    tool does this
  • Adds icons and text to the page to help you judge
    if it's accessible - use the downloadable
    tutorial
  • Requires the exercise of judgment and provides
    information to help you make that judgment (a
    crude automated check is sketched below)
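To give a flavour of what such tools automate, here is a crude sketch that flags img elements with no alt attribute (WAI checkpoint 1.1). It is not how WAVE (or Bobby) works internally; like them, it can only surface candidates for the human judgment mentioned above.

```python
# Sketch: flag <img> elements that lack an alt attribute.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []              # srcs of images with no alt

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

checker = AltChecker()
checker.feed('<p><img src="logo.gif"><img src="map.png" alt="Map"></p>')
print("Images missing alt text:", checker.missing)  # ['logo.gif']
```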

PG
15
4. Accessibility continued
  • Web-based tools used
  • 2. Bobby <http://www.cast.org/bobby/>
  • You need to select the guidelines to use:
  • The Web Accessibility Initiative (WAI): the World
    Wide Web Consortium's (W3C) Web Content
    Accessibility Guidelines
  • The Section 508 guidelines developed by the U.S.
    Federal Government
  • Select option 1 (the WAI guidelines)

PG
16
5. Search Facility
  • Information on search facilities will be
    provided
  • Findings
  • Number of sites with a search facility: 68% of
    the sample
  • Is the search facility working? 2 were very slow
    (so we gave up); 1 was not available at the time
  • Issues
  • User expectations: many users head straight for
    the search facility as they know what they're
    looking for
  • It can take less than 30 minutes (and little
    technical expertise) to make an externally hosted
    search engine available - suitable for simple,
    static Web sites (not many people know this; see
    the sketch below)
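One way to see the "externally hosted search" point: a public engine can be restricted to a single site with a site: term in the query, so a simple static site needs only a form (or link) that builds such a query. The sketch below constructs the query URL; the library domain is a placeholder.

```python
# Sketch: build a site-restricted query URL for a public search engine.
from urllib.parse import urlencode

def site_search_url(query, site="www.example-library.gov.uk"):
    # Restrict the search to one site using the site: operator.
    return "http://www.google.com/search?" + urlencode(
        {"q": f"{query} site:{site}"}
    )

print(site_search_url("opening hours"))
```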

PG
17
Evaluating The Results
  • Accessibility issues
  • How many sites have nil WAI Priority 1 errors?
  • Are WAVE and Bobby results consistent - are there
    glaring differences?
  • Issues
  • Compliance with accessibility standards is
    important for ensuring access to resources for
    people with a range of disabilities (e.g.
    dyslexia)
  • Compliance with accessibility standards may be an
    organisational requirement and a legal
    requirement (Disability Discrimination Act 1995;
    Human Rights Act 1998)
  • Compliance benefits everyone - not just those
    with disabilities - it improves general
    usability
  • Meeting the UK Government agenda: delivering
    e-government, social inclusion, lifelong
    learning, etc.

PG
18
PG
19
Limitations Of Survey
  • Limitations of this type of benchmarking approach
    include
  • Lack of standards
  • Limitations of the tools
  • Resources needed to carry out surveys
  • Scoping of library sites and invalid comparisons
  • The automated approach fails to address content
    issues, which require a manual approach
  • Results of automated tools (e.g. Bobby/WAVE)
    often require interpretation by humans

BK
20
Limitations - Standards
  • There is a lack of standards to support
    benchmarking work (or there are conflicting
    standards). For example:
  • Size of a page
  • How do you measure the size of the library's
    entry point? You need this in order to make
    comparisons and if, say, you have guidelines on
    the maximum file size (one possible measure is
    sketched after the note below)
  • Problems
  • What do you measure (the HTML file, inline
    images, external CSS and JavaScript files, ...)?
  • Changes in file content (e.g. user-agent
    negotiation, news content, frames and refresh
    elements, etc.)
  • How do you handle the robot exclusion protocol
    (REP)?

NOTE: Bobby and NetMechanic work differently: the
former only measures HTML and images; the latter
obeys the REP
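The sketch below shows one possible definition of page size, roughly in the spirit of Bobby's HTML-plus-images measure: fetch the HTML and add the sizes of inline images, ignoring external CSS/JavaScript, frames and the REP. The point is that this is a definitional choice, so a different tool will legitimately report a different number.

```python
# Sketch: "page size" as HTML plus inline images (one of several
# plausible definitions). Naive regex extraction; a real tool
# would parse the HTML properly.
import re
import urllib.request
from urllib.parse import urljoin

def page_size(url):
    html = urllib.request.urlopen(url, timeout=10).read()
    total = len(html)
    for src in re.findall(rb'<img[^>]+src="([^"]+)"', html):
        image_url = urljoin(url, src.decode("ascii", "ignore"))
        try:
            total += len(urllib.request.urlopen(image_url,
                                                timeout=10).read())
        except OSError:
            pass   # broken image link: skip it
    return total

# print(page_size("http://www.example-library.gov.uk/"))  # placeholder
```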
BK
21
Limitations - Tools
  • Definitions
  • Auditing tools tend to make implicit definitions
    (e.g. when measuring page size). Different
    results may be obtained if different tools are
    used (or if a vendor changes its definition)
  • Use of Web-based auditing services
  • This talk has described the use of (mainly free)
    Web-based services
  • The providers may change their policy
  • Use of the URL interface to pass parameters
    (rather than direct use of the form on the Web
    page) may not be allowed (see the sketch below)
  • Use of desktop auditing tools
  • Using desktop tools avoids the problems of change
    control of Web-based services
  • It may be difficult for others to reproduce
    findings
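As an illustration of the URL-interface point, the sketch below builds request URLs for the W3C HTML validator's check?uri= interface instead of using its Web form. Whether a given service permits this kind of scripted use is exactly the policy question raised above.

```python
# Sketch: drive a Web-based auditing service via its URL interface.
from urllib.parse import urlencode

def validator_url(page):
    return "http://validator.w3.org/check?" + urlencode({"uri": page})

for site in ["http://www.example-library.gov.uk/"]:   # placeholder
    print(validator_url(site))
```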

BK
22
Limitations - Resources
  • It can be time-consuming to
  • Maintain the URLs of entry points to library Web
    sites (need to have close links with the provider
    of a central portal)
  • Manage the input to the variety of Web-based
    services
  • Process the output from the Web-based services
    (currently one needs to initiate an inquiry, wait
    for the results and manually copy and paste them)

BK
23
Limitations Scope of Web Site
  • Scope
  • What is a Library Web site?
  • What is not part of a Library Web site?
  • It can be difficult to answer these questions.
  • There are no standard ways to define a Web site
    other than by use of domain names and directory
    structures (a scoping sketch follows this list)
  • Even directory structures can be inadequate if
    they are not used correctly
  • Comparisons
  • It may not be sensible to make comparisons
    between libraries of different types and sizes
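A sketch of scoping by domain name plus directory prefix, the only definitions noted above; the scope rules here are invented for illustration.

```python
# Sketch: decide whether a URL is "part of the library Web site"
# using a hypothetical domain + directory-prefix rule.
from urllib.parse import urlparse

SCOPE = {"www.somecity.gov.uk": "/library/"}   # invented rule

def in_scope(url):
    parts = urlparse(url)
    prefix = SCOPE.get(parts.hostname)
    return prefix is not None and parts.path.startswith(prefix)

print(in_scope("http://www.somecity.gov.uk/library/opening.html"))  # True
print(in_scope("http://www.somecity.gov.uk/counciltax.html"))       # False
```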

BK
24
Limitations Automated Only
  • Use of an automated approach
  • Would not (easily) address content issues
  • Has been supplemented with manual observations
    (e.g. home page, 404 page, search engine page)
  • However
  • An automated approach can be more objective and
    reproducible
  • An automated approach should be less
    resource-intensive (once software has been set up
    to maintain links to resources, survey sites and
    process results)
  • An automated approach could be used in
    conjunction with a manual survey (of a
    representative sample set of resources)

BK
25
Beyond A Pilot
  • Despite the limitations which have been
    described, would a comprehensive and systematic
    benchmark of, say, UK Library Web sites be of
    benefit?
  • Can we address the resource issues?
  • Is the lack of standards being addressed?
  • Can we find someone to do the work?
  • Should the focus be developmental?
  • Can the work be extended to provide notification
    of problems (e.g. search engine not working)?

What may happen if we don't do this? Might we
find that funders set up inappropriate or flawed
performance indicators?
BK
26
A Model For Implementation
  • The benchmarking process could be made less
    time-consuming if a more flexible model for
    managing the data were used

At present we seem to have an HTML page with links
to library Web sites. Unfortunately HTML pages are
difficult to repurpose.
A better model is to store the links in a neutral
database, and to generate pages for viewing by end
users and for input into benchmarking Web services.
The database could also be reused for other
purposes, e.g. checking links and sending email
notifications of problems.
(Diagram: the database generates a page for input
to Web services and a page for viewing.)
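A minimal sketch of this model, with a CSV string standing in for the neutral database: the same data generates both the page for viewing and the plain URL list for input to the benchmarking Web services. The field names and data are invented.

```python
# Sketch: one neutral store of links, two generated outputs.
import csv
import io

DATA = (
    "name,url\n"
    "Somecity Libraries,http://www.somecity.gov.uk/library/\n"
)
rows = list(csv.DictReader(io.StringIO(DATA)))

# Output 1: a page for viewing by end users.
html = "<ul>" + "".join(
    f'<li><a href="{row["url"]}">{row["name"]}</a></li>' for row in rows
) + "</ul>"

# Output 2: one URL per line, for input into benchmarking services.
url_list = "\n".join(row["url"] for row in rows)

print(html)
print(url_list)
```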
BK
27
Towards Web Services
  • Background
  • The Web was initially implemented for the
    provision of information
  • CGI allowed users to input data and provided
    integration with backend applications
  • The techniques described use a URL as input to
    an auditing service. However, this provides
    limited functionality and is susceptible to the
    vagaries of the marketplace
  • Future
  • Web Services will support machine integration
    by providing a standard messaging infrastructure
    which uses the HTTP protocol
  • XML output (e.g. EARL, the Evaluation and Report
    Language) will provide a neutral format for
    benchmarking output, and can describe the
    benchmarking environment (EARL is RDF); see the
    sketch below
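As a hedged illustration, the sketch below prints one benchmarking observation as EARL-style RDF/XML. The class and property names follow the W3C EARL vocabulary in outline, but an actual exchange format would depend on which EARL draft the services adopt; the subject URL is a placeholder.

```python
# Sketch: one benchmarking result serialised as EARL-style RDF/XML.
assertion = """<rdf:RDF
  xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
  xmlns:earl="http://www.w3.org/ns/earl#">
  <earl:Assertion>
    <earl:subject rdf:resource="http://www.example-library.gov.uk/"/>
    <earl:test rdf:resource="http://www.w3.org/TR/WCAG10/#priority-1"/>
    <earl:result>
      <earl:TestResult>
        <earl:outcome rdf:resource="http://www.w3.org/ns/earl#passed"/>
      </earl:TestResult>
    </earl:result>
  </earl:Assertion>
</rdf:RDF>"""
print(assertion)
```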

BK
28
Need For Standard Definitions
  • Need For Standard Definitions
  • There is a need for standard definitions of
    terminology such as "Web page", "visit", "unique
    visit", "session", etc. in order to ensure that
    meaningful and objective comparisons can be made
  • The marketplace is addressing current
    deficiencies within the Web Advertising and Web
    Auditing communities (and there are financial
    incentives for this to be solved)
  • With the growth in e-government internationally,
    governments are setting targets (e.g. X% of
    government work to be carried out electronically
    by 2005)

BK
29
Doing The Work
  • If there is further interest, who should do the
    work?

Who?
  • Project partners
  • Researcher
  • Funding body
  • Student project
  • Auditing body
  • Single Regional Agency
  • Other(s)

Why?
  • Current/new remit
  • Research interest
  • Benefits community
  • Best Value Performance Indicators (e.g. BV157 -
    electronic interactions)

What?
  • Benchmarking work
  • Dissemination
  • Maintain central database
  • Software development
  • Producing reports

PG
30
What Next?
  • To summarise:
  • An approach to the automated benchmarking of a
    small set of Public Library Web sites has been
    shown
  • Implications of the findings have been discussed
  • There are limitations of the methodology
  • It is suggested that:
  • Despite the limitations of benchmarking, the
    approach can aid:
  • Community building
  • Learning from successes and mistakes
  • Performance Measurement/Best Value Review
  • Are there advantages in carrying out this work on
    a regional/local basis or with existing partners?

PG
31
Questions
  • Any questions?

PG
32
Useful resources
  • How People with Disabilities Use the Web, W3C
    Working Draft, 4 January 2001 (Human Computer
    Interaction)
    <http://www.w3.org/WAI/EO/Drafts/PWD-Use-Web/20010104.html>
  • Bobby <http://www.cast.org/bobby/>
  • WAVE <http://www.temple.edu/inst_disabilities/piat/wave/>

PG