Library Assessment on a Budget: Using Effect Size Meta-Analysis to Get the Most Out of the Library-Related Survey Data


1
Library Assessment on a Budget: Using Effect Size
Meta-Analysis to Get the Most Out of the
Library-Related Survey Data
  • Library Assessment Conference
  • University of Virginia, Sept. 25-27, 2006
  • Eric Ackermann
  • Reference/Instruction Assessment Librarian
  • McConnell Library, Radford University
  • egackerma@radford.edu

2
The Opportunity
  • Library data (can) exist in non-library survey
    results
  • Where?
  • Local questions in national surveys
  • NSSE (up to 20 local questions)
  • HERI (up to 21 local questions)
  • Library questions in local RU surveys
  • University 100 Student Satisfaction Survey (1 library question)
  • Undergraduate Exit Survey (4 library questions)

3
The Challenge
  • How to compare results (sample means) from
    disparate data sources?
  • Incompatible scales
  • RU surveys and NSSE (4-pt Likert)
  • HERI (no scale)
  • Incompatible outcomes reporting
  • RU surveys (means, frequencies, percents)
  • NSSE (means, statistical significance, d, frequency distributions, percents)
  • HERI (percents)

4
The Solution
  • Effect size meta-analysis
  • Quantitative method (statistically defensible)
  • Research synthesis
  • Developed in the social sciences
  • Data comparisons across disparate studies

5
Effect Size
  • Magnitude (amount) of phenomena in population
    under study
  • Avoids over-reliance on statistical significance
  • About 40 effect size metrics exist; two are recommended
  • Cohen's d
  • Standardized mean difference
  • U3DIFF
  • Degree of overlap between the distributions of
    the two sample means under comparison
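The slide names Cohen's d as the standardized mean difference but gives no formula. A minimal sketch of the usual pooled-SD computation, assuming two independent groups (the function name and all numbers are hypothetical, not from the deck):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference between two groups,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    )
    return (mean1 - mean2) / pooled_sd

# Made-up survey means on a 4-pt Likert scale:
d = cohens_d(3.1, 0.8, 120, 2.8, 0.7, 150)  # roughly 0.40 here
```

With these invented inputs, d comes out near the 0.40 used as the worked example on the next slide.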

6
Cohen's d and U3DIFF
  • Popular, widely used metrics
  • Have magnitude and direction (positive/negative)
  • No sophisticated statistical background needed
  • Relatively easy to understand
  • Cohen's d
  • For example, if d = .40, then 4/10 of a standard
    deviation separates the two means under
    comparison
  • U3DIFF
  • Reports d as a percent advantage. For example, if
    d = 0.4, then U3DIFF = 15.5% advantage for the
    group with the larger mean
  • Relatively easy to communicate to non-specialists
    (colleagues, administrators, legislators)
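The deck converts d = 0.4 into a 15.5% advantage but does not show how. Assuming U3DIFF is Cohen's U3 (the standard normal CDF evaluated at d) minus 50%, which reproduces the slide's number, a sketch using only the standard library:

```python
import math

def u3diff(d):
    """Convert Cohen's d into a percent advantage:
    the share of the lower-meaned group falling below the
    higher group's mean (Cohen's U3), minus the 50% baseline."""
    phi = 0.5 * (1 + math.erf(d / math.sqrt(2)))  # standard normal CDF at d
    return 100 * (phi - 0.5)

print(round(u3diff(0.4), 1))  # about 15.5, matching the slide's example
```

At d = 0 the two distributions coincide and the advantage is 0%, which is a quick sanity check on the formula.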

7
Meta-analysis
  • Statistical process
  • Metrics standardized
  • Weighted by sample size
  • Then averaged
  • Advantages over individual studies
  • Explicit
  • Increased accuracy
  • Greater statistical power
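The three bullets under "Statistical process" can be sketched directly: each survey contributes a standardized effect size, which is weighted by its sample size and then averaged. The function name and the (d, n) pairs below are invented for illustration; this is the simplest sample-size weighting, not an inverse-variance model:

```python
def meta_average(effects):
    """Sample-size-weighted mean effect size across studies.
    `effects` is a list of (d, n) pairs, one per survey."""
    total_n = sum(n for _, n in effects)
    return sum(d * n for d, n in effects) / total_n

# Hypothetical effect sizes from three disparate surveys:
studies = [(0.40, 270), (0.25, 410), (0.55, 130)]
pooled_d = meta_average(studies)
```

Weighting by n means larger surveys pull the pooled estimate toward their result, which is where the "increased accuracy" and "greater statistical power" claims come from.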

8
Visualizing Effect Size Meta-Analysis
Example from ESCI, Exploratory Software for
Confidence Intervals: http://www.latrobe.edu.au/psy/esci/mathnk89.htm
9
Effect Size Meta-Analysis: Practical and Sustainable
  • Practical
  • The data already exist
  • No sophisticated statistical procedures required
  • No specialized software required
  • Freeware
  • Excel
  • Sustainable
  • Surveys regularly scheduled and administered on campus
  • Once set up, the ongoing meta-analysis is relatively
    easy to maintain

10
Effect Size Meta-Analysis: Cost Effective
  • No cost to the library for
  • The assessment instrument
  • Management/administration of the instrument
  • Data gathering
  • Preliminary analysis
  • Only costs to the library
  • Negotiating inclusion of library questions
  • Retrieving the data
  • Data clean-up
  • Additional data analysis

11
Suggested Readings
  • McNamara, J. (1994). Surveys and Experiments in
    Education Research. Lancaster, PA: Technomic
    Publishing Co., Inc.
  • Cooper, H. (1998). Synthesizing Research: A Guide
    for Literature Reviews (3rd ed.). Thousand Oaks,
    CA: Sage Publications.
  • Kline, R. (2005). Beyond Significance Testing:
    Reforming Data Analysis Methods in Behavioral
    Research. Washington, DC: American Psychological
    Association.
  • Grissom, R.J. and Kim, J.J. (2005). Effect Sizes
    for Research: A Broad Practical Approach. Mahwah,
    NJ: Lawrence Erlbaum Associates, Inc.

12
Freeware
  • Decoster, J. and Leistico, A. (2005). d to U3
    statistical calculator for Windows.
    <http://www.stat-help.com>
  • Devilly, G.J. (2005). ClinTools Software for
    Windows: Version 3.5 (computer program). Psytek
    Ltd. <http://www.clintools.com>

13
Appendix A: The Study
  • Data sources
  • LibQUAL+ 2005 (undergraduate data)
  • RU Undergraduate Exit Survey 2005
  • Constructs for comparison
  • Access
  • Analysis
  • Collections
  • Retrieval
  • Hours
  • Staff

14
Appendix B: Preliminary Findings (look for hidden constructs)
15
Appendix C: Final Findings