Practice Makes Perfect: applying and adapting best practices in information literacy - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Practice Makes Perfect: applying and adapting
best practices in information literacy
  • Sheril Hook, Instruction Coordinator
  • Esther Atkinson, Liaison Librarian
  • Andrew Nicholson, GIS/Data Librarian
  • University of Toronto Mississauga

WILU Conference, May 18, 2007
2
Agenda
  • IL Program Development (Sheril)
  • Category 5 articulation with the curriculum
  • Examples of BP Category 5 (Andrew)
  • research-based learning
  • IL learning outcomes
  • IL Program Development (Sheril)
  • Category 10 Assessment/Evaluation
  • Examples of BP Category 10 (Esther)
  • data and its impact on instruction and planning

3
ALA/ACRL Characteristics of Programs of
Information Literacy that Illustrate Best
Practices
  • Category 5 Articulation with the Curriculum
  • Articulation with the curriculum for an
    information literacy program
  • is formalized and widely disseminated
  • emphasizes student-centered learning
  • uses local governance structures to ensure
    institution-wide integration into academic or
    vocational programs
  • identifies the scope (i.e., depth and complexity)
    of competencies to be acquired on a disciplinary
    level as well as at the course level
  • sequences and integrates competencies throughout
    a student's academic career, progressing in
    sophistication, and
  • specifies programs and courses charged with
    implementation.
  • http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm

4
IL Program Development Planning: Part 1
  • ACRL Best Practices Document
  • environmental scan
  • internal scan / internal development
  • external scan / external development
  • current state / next steps
  • Shared Philosophical Framework
  • training and development
  • informing our pedagogical practices
  • developing expertise as shared responsibility
  • use of IL Standards and terminology

5
Environmental Scan
  • Core curricula
  • (horizontal/vertical integration in Part 2)
  • Departmental goals
  • Required courses for baseline expectations
  • Representation on curriculum committees
  • Movements in teaching/learning
  • student engagement

6
Environmental Scan
  • Student Engagement
  • NSSE: http://nsse.iub.edu/
  • Peer learning, aka peer assisted learning,
    supplemental instruction
  • http://www.peerlearning.ac.uk/
  • http://www.umkc.edu/cad/SI/index.htm
  • Re-invention Center: http://www.sunysb.edu/Reinventioncenter/
  • Inquiry-based, discovery, problem-based, or
    research-based learning

7
(No Transcript)
8
http://www.reinventioncenter.miami.edu/BoyerSurvey/index.html
9
http://www.reinventioncenter.miami.edu/pdfs/2001BoyerSurvey.pdf
10
Student Engagement
  • research-based learning
  • problem-based learning
  • inquiry-based learning
  • discovery learning
  • knowledge building

Scardamalia, M., & Bereiter, C. (2003).
11
Shared Philosophical Framework
  • information literacy as concept
  • tool-based vs. concept-based teaching
  • other literacies, e.g., technology, media,
    spatial, data
  • inventory of current practices and outreach
    activities
  • articles and workshops that help develop framework
  • Learning theory
  • Bloom's taxonomy
  • SOLO Taxonomy (Biggs)
  • development and use of assessment tools

12
What is embedded IL?
  • Embedded
  • Assignment(s) collaboratively developed with the
    instructor. IL learning outcomes are stated in
    the instructor's course materials. A session by
    the librarian may or may not have been delivered
    during class time (e.g., series of walk-in
    workshops).
  • Integrated
  • Session content tailored to course assignment in
    consultation with instructor. Session may or may
    not have been delivered during class time (e.g.,
    series of open workshops available to students).
    Session may or may not have been optional.
  • Supplemental
  • Generic information literacy instruction is not
    tied directly to course outcomes or an
    assignment. Session may or may not have been
    optional for students. Session may or may not
    have been delivered during class time.

ANZILL Framework, 2004, p. 6; ACRL, 2007; Learning
Commons, University of Guelph, n.d.
13
IL Standards
  • Standard One
  • The information literate student determines the
    nature and extent of the information needed.
  • Performance Indicator
  • 2. The information literate student identifies a
    variety of types and formats of potential sources
    for information.
  • Outcomes include
  • Knows how information is formally and informally
    produced, organized, and disseminated
  • Recognizes that knowledge can be organized into
    disciplines that influence the way information is
    accessed
  • Identifies the value and differences of potential
    resources in a variety of formats (e.g.,
    multimedia, database, website, data set,
    audio/visual, book)
  • Differentiates between primary and secondary
    sources, recognizing how their use and importance
    vary with each discipline
  • Realizes that information may need to be
    constructed with raw data from primary sources

"Information Literacy Competency Standards for
Higher Education." American Library Association.
2006. http://www.ala.org/acrl/ilcomstan.html
(Accessed 15 May, 2007)
14
Examples of IL Standards tailored and embedded
into course curricula
15
U of T Mississauga Library
  • When we collaborate with our instructors on
    designing a class assignment, we emphasize
  • the Library Vision - Leading for Learning
  • the availability of thousands of Research and
    Information Resources through the U of T
    Libraries
  • as of May 15, 2007
  • 395,184 e-holdings including e-books, journals,
    newspapers, etc.
  • the key role of these resources in enhancing
    student engagement with their learning.

16
U of T Mississauga Library
  • We also stress to instructors that our
    electronic resources can be utilized
  • to enhance their instructional content.
  • to foster an active learning environment in the
    course. Students will begin to think both
    conceptually and critically about the material.
  • to develop information literacy competencies
    among the students, such as retrieving and
    critically evaluating information in any format.

More details about information literacy can be
found at the Association of College and Research
Libraries (ACRL) website:
http://www.ala.org/ala/acrl/acrlstandards/informationliteracycompetency.htm
Many disciplines are now releasing their own
information literacy standards, based on the ACRL
model.
17
Examples from
  • Social Sciences
  • Sciences
  • Humanities

18
Assignment Changes in Canadian Society
  • Outcomes
  • identify and locate statistics needed
  • evaluate statistics for use (do they cover the
    correct geography, time period, etc.?)
  • analyze statistics
  • communicate the results in term paper and
    presentation
  • acknowledge the use of information

Social Sciences
1.
19
Research Question
  • By examining census data related to occupation,
    how have women's working lives changed over a
    100-year period?

2.
Social Sciences
20
  • Outcomes
  • identify and locate statistics needed.
  • Students recognize that the Census collects
    statistics on occupation

Social Sciences
3.
21
  • Outcomes
  • evaluate statistics for use.
  • Students differentiate between census years and
    census geographies available.
  • Students identify value and differences of
    resources in a variety of formats.

Social Sciences
4.
22
  • Outcomes
  • analyze statistics
  • Students recognize the occupation categories
    being used

5.
Social Sciences
23
  • Outcomes
  • analyze statistics
  • Students create a cross tabulation table between
    Occupation and Sex

1901 Census of Canada Occupation by Sex
6.
Social Sciences
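The cross tabulation the students build can be sketched in plain Python; the occupation categories and records below are hypothetical stand-ins, not actual PUMF variables or values.

```python
from collections import Counter

# Hypothetical microdata rows in the shape of census PUMF records:
# (occupation, sex). Real PUMF variable names and codes differ.
records = [
    ("Clerical", "F"), ("Farming", "M"), ("Clerical", "F"),
    ("Manufacturing", "M"), ("Farming", "M"),
]

# Cross-tabulate occupation by sex by counting each pair.
table = Counter(records)
occupations = sorted({occ for occ, _ in records})

for occ in occupations:
    row = {sex: table[(occ, sex)] for sex in ("F", "M")}
    print(occ, row)
```

The same counting logic applies to both census years; only the category lists change, which is why the students must reconcile 1901 and 2001 occupation codes before comparing tables.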
24
  • Outcomes
  • analyze statistics

2001 Census of Canada Occupation by Sex
  • Students next identify and locate the 2001 Census
    variables relating to occupation and sex.
  • On the next slide, a 2001 Census cross tabulation
    is then compared with the 1901 Census cross
    tabulation.
  • Students will recognize that occupation categories
    will have changed in the 100-year time span.
  • Students realize that the data can be extrapolated
    into multiple categories.

Social Sciences
7.
25
1901 Census of Canada Occupation by Sex
2001 Census of Canada Occupation by Sex
  • Outcomes
  • analyze statistics

8.
26
  • Outcomes
  • communicate the results in term paper and
    presentation
  • Students add tables to term paper and also to a
    class slideshow presentation.
  • acknowledge the use of information

1901 Census of Canada Bibliographic
Entry: Canada. Statistics Canada. Census of
Canada, 1901: public use microdata file:
individuals file [computer file]. Victoria, B.C.:
University of Victoria Canadian Families Project
[producer and distributor]. January 2002.
<http://myaccess.library.utoronto.ca/login?url=http://r1.chass.utoronto.ca/sdaweb/html/canpumf.htm>
2001 Census of Canada Bibliographic
Entry: Canada. Statistics Canada. Census of
Canada, 2001: public use microdata file -
individuals file [computer file]. Revision 2.
Ottawa, Ont.: Statistics Canada [producer];
Statistics Canada. Data Liberation Initiative
[distributor], 2006/04/26. (STC 95M0016XCB)
<http://myaccess.library.utoronto.ca/login?url=http://r1.chass.utoronto.ca/sdaweb/html/canpumf.htm>

Social Sciences
9.
27
Examples from
  • Social Sciences
  • Sciences
  • Humanities

28
Assignment Cited Reference Searching in the
Sciences
  • Outcomes
  • evaluate available resources to see if their
    scope will include citation tracking statistics
    and journal impact factor
  • locate and interpret the citation information

Sciences
1.
29
Research Question
  • Wyttenbach, R. and Hoy, R. "Demonstration of the
    precedence effect in an insect." Journal of the
    Acoustical Society of America 94(2): 777-784,
    Aug 1993.
  • Before including this reference in a paper, check
    how reputable both the article and the journal
    are in the discipline. Should it be included?

2.
Sciences
30
  • Outcomes
  • Evaluate available resources to see if their
    scope includes citation tracking.
  • Students recognize that journal articles have
    value in a particular discipline and that they
    can be measured in a variety of ways, including
    specialized citation indexes.

Sciences
3.
31
  • Outcomes
  • Evaluate available resources.
  • Students recognize the ability to perform cited
    reference searching in a variety of ways.

Sciences
4.
32
  • Outcomes
  • locate and interpret the citation information.
  • Students locate the citation and realize that
    the authors consulted a variety of sources
    (Cited References) and, more importantly, that
    this citation has been cited frequently (Times
    Cited) in the years since publication.

Sciences
5.
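A "Times Cited" figure is just a tally over a citation index: count how many later papers list the article in their references. A minimal sketch with hypothetical records (real counts would come from a citation index such as Web of Science):

```python
from collections import Counter

# Hypothetical citation records: (citing_paper, cited_paper).
# These are illustrative; a real index supplies these pairs.
citations = [
    ("PaperA", "Wyttenbach1993"),
    ("PaperB", "Wyttenbach1993"),
    ("PaperC", "Wyttenbach1993"),
    ("PaperB", "OtherPaper"),
]

# Times Cited = number of citing papers per cited article.
times_cited = Counter(cited for _, cited in citations)
print(times_cited["Wyttenbach1993"])  # 3
```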
33
  • Outcomes
  • interpret the citation information.
  • Students can review the cited references from the
    article and examine the origins of the research.

Sciences
6.
34
  • Outcomes
  • interpret the citation information.
  • By checking the Times Cited, students gain
    insight into the impact of the article in the
    discipline.

Sciences
7.
35
  • Outcomes
  • interpret the citation information.
  • Students also access the JCR to check the Impact
    Factor

Sciences
8.
36
  • Outcomes
  • interpret the citation information.
  • Students can also rank other journals in the
    discipline by impact factor.

Sciences
9.
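The JCR impact factor behind those rankings is a simple ratio: citations in a given year to items the journal published in the two preceding years, divided by the number of citable items from those years. A minimal sketch (the figures are illustrative, not real JCR data):

```python
# Two-year journal impact factor for year Y:
#   citations in Y to items published in Y-1 and Y-2,
#   divided by citable items published in Y-1 and Y-2.
def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Return the two-year impact factor."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Illustrative figures: 450 citations in 2006 to articles from
# 2004-2005, which together contained 300 citable items.
print(impact_factor(450, 300))  # 1.5
```

Ranking journals in a discipline then amounts to sorting them by this ratio, which is what students do in the JCR.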
37
Examples from
  • Social Sciences
  • Sciences
  • Humanities

38
AssignmentMyth over Time
  • Outcomes
  • Explore the dynamism of myth by comparing and
    contrasting a selection of ancient and modern
    primary sources of a myth (at least one literary,
    one material)
  • Identify the most significant changes from
    ancient to modern source and discuss those
    changes in light of the context in which each
    source was created
  • Interpret those changes in terms of how they
    affect the meaning of the myth and how they came
    about in the first place

Humanities
1.
39
Research Question
  • How have myths changed over time?

2.
Humanities
40
  • Outcomes
  • compare and contrast a selection of primary
    sources (art)
  • Students begin by finding primary sources (art
    works, music, scripts, opera) and background
    information on artists

Google has images, but no provenance information.
CAMIO has images, plus provenance and usage
rights information.
Humanities
3.
41
  • Outcomes
  • identify the most significant changes...in light
    of the context in which each source was created.

Students build on the learning acquired by
finding background information on a time
period/place
Humanities
4.
42
  • Outcomes
  • identify the most significant changes...in light
    of the context in which each source was created.

Students place a myth in the cultural context in
which it's being used or re-told
Humanities
5.
43
  • Outcomes
  • compare and contrast a selection of primary
    sources (music)

Students listen to a symphony to identify the
dynamism of the myth and interpret its
significance
Humanities
6.
44
Summary
  • The U of T Mississauga Library provides access to
    thousands of digital and interactive resources
    for a variety of active and concept-based
    learning activities.
  • These resources can be utilized to promote both
    student engagement and the embedding of IL
    standards and outcomes.

45
ALA/ACRL Characteristics of Programs of
Information Literacy that Illustrate Best
Practices
  • Category 10 Assessment/Evaluation
  • Assessment/evaluation of information literacy
    includes program performance and student
    outcomes and
  • for program evaluation
  • establishes the process of ongoing
    planning/improvement of the program
  • measures directly progress toward meeting the
    goals and objectives of the program
  • integrates with course and curriculum assessment
    as well as institutional evaluations and
    regional/professional accreditation initiatives
    and
  • assumes multiple methods and purposes for
    assessment/evaluation: formative and summative,
    short-term and longitudinal
  • http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm

46
ALA/ACRL Characteristics of Programs of
Information Literacy that Illustrate Best
Practices
  • Category 10 Assessment/Evaluation (cont'd)
  • Assessment/evaluation of information literacy
    includes program performance and student
    outcomes and
  • for student outcomes
  • acknowledges differences in learning and teaching
    styles by using a variety of appropriate outcome
    measures, such as portfolio assessment, oral
    defense, quizzes, essays, direct observation,
    anecdotal, peer and self review, and experience
  • focuses on student performance, knowledge
    acquisition, and attitude appraisal
  • assesses both process and product
  • includes student-, peer-, and self-evaluation
  • http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm

47
How are we teaching/Who are we reaching?
  • Reflective teaching practices
  • Teaching portfolios
  • Sharing with colleagues and course instructors
  • Evaluation and assessment
  • Student focus groups
  • Inventory of outreach teaching
  • How are you reaching students? How many?
  • Who are current campus partners?
  • Who are potential campus partners?
  • Who will keep these relationships going?
  • As a group where are you teaching?
  • Horizontally and vertically

48
IL Program Development PlanningPart 2
  • Assessment
  • standardized assessments (ETS, SAILS, JMU)
  • creation, use and reflection of assessments
    (background knowledge probe, muddiest point,
    observation, dialogue)
  • instruction database

49
National standardized tools
  • iSkills (aka Information and Communication
    Technology (ICT) Literacy Assessment), developed
    by the Educational Testing Service. $35.00 US per
    student.
  • http://www.ets.org/
  • Measures all 5 ACRL Standards. Two test options:
    Core and Advanced. Computerized, task-based
    assessment in which students complete several
    tasks of varying length, i.e., not multiple
    choice. Intended for individual and cohort
    testing. 75 minutes to complete.
  • Standardized Assessment of Information Literacy
    Skills (SAILS), developed by Kent State University
    Library and Office of Assessment. It is also
    endorsed by the Association of Research
    Libraries. $3.00 US per student (capped at
    2,000), but we can also administer it ourselves
    for free.
  • https://www.projectsails.org/
  • Measures ACRL Standards 1, 2, 3, 5. Paper or
    computerized, multiple-choice. Intended for
    cohort testing only. 45 questions, 35 minutes to
    complete.
  • Information Literacy Test (ILT), developed by
    James Madison University (developed by JMU
    Libraries and the Center for Assessment and
    Research Studies).
  • http://www.jmu.edu/icba/prodserv/instruments_ilt.htm
  • Measures ACRL Standards 1, 2, 3, 5. Computerized,
    multiple-choice. Intended for cohort and
    individual testing. 60 questions, 50 minutes to
    complete.

NPEC Sourcebook on Assessment:
http://nces.ed.gov/pubs2005/2005832.pdf
50
ETS Advanced Level Access
http://www.ets.org/Media/Products/ICT_Literacy/demo2/index.html
51
ETS Core Level - Manage
http://www.ets.org/Media/Products/ICT_Literacy/demo2/index.html
52
ETS sample score report
  • Access
  • Find and retrieve information from a variety of
    sources.
  • What was I asked to do?
  • Search a store's database in response to a
    customer's inquiry
  • How did I do?
  • You chose the correct store database on your
    first search.
  • You selected the most appropriate category for
    searching.
  • You chose the best search term for the database
    you selected.
  • You selected one inappropriate item for the
    customer in addition to appropriate ones.

http://www.ets.org
53
ETS Pilot at UTM
  • Evaluating the Results
  • The relationship between the Core and Advanced
    score ranges is not clear. Are the two tests on
    a continuous scale (e.g., with Core representing
    100-300 and Advanced 400-700)?
  • The University of Toronto Mississauga norms seem
    to be consistent with the norms from other
    institutions, and they all seem to be clustering
    in the middle.
  • Though students received written feedback on
    their performance within each category, it is
    unclear how this feedback relates to their
    aggregate score and how it is derived from the
    students' performance on the test (e.g., time
    taken to perform each task, number of clicks).
  • It is unclear if students are being tested on the
    same variables within each category across all
    different versions of the test (e.g., the student
    reports suggest that some students were evaluated
    on different criteria in certain categories).
  • The institution does not receive any granular
    statistical data (e.g., by performance within
    each category or by question), and only has
    access to individual student reports and the
    aggregate score for each student.

54
(No Transcript)
55
Learning Outcomes Assessment
  • classroom assessment techniques (CATs)
  • self-awareness inventories
  • in-class pre-/post-assessments
  • class assignments

56
Instruction Database
57
Instruction Database
58
U of T Mississauga Library
  • Information Literacy Program Data
  • Records various characteristics of the
    instruction sessions
  • May 2005 to April 2007
  • Early data reflects what is being done and what
    needs to be addressed

59
U of T Mississauga Library
  • Assessing Our Program
  • Market penetration
  • Reflective of current teaching practices

60
U of T Mississauga
  • 1. Market Penetration
  • Number of students reached
  • Departmental contact
  • Number of instruction sessions given
  • Level of vertical integration
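Market-penetration measures like these can be tallied from a simple session log. The record layout below is a hypothetical sketch, not the library's actual instruction-database schema.

```python
from collections import Counter

# Hypothetical log entries: (department, course level, students reached).
sessions = [
    ("Anthropology", 100, 45),
    ("Anthropology", 200, 30),
    ("Classics", 100, 55),
    ("Biology", 300, 25),
]

# Number of instruction sessions given per department.
sessions_per_dept = Counter(dept for dept, _, _ in sessions)

# Number of students reached per department.
students_per_dept = Counter()
for dept, _, reached in sessions:
    students_per_dept[dept] += reached

print(sessions_per_dept)
print(students_per_dept)
```

Grouping the same log by course level instead of department yields the vertical-integration view shown in the later figures.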

61
U of T Mississauga Library
Table 1 Number of students reached per course
62
Fig. 1 Number of students reached per department
63
Fig. 2 Number of unique instruction sessions
taught per department
64
Fig. 3 Number of instruction sessions per course
level
65
Fig. 4 Number of instruction sessions per course
level per department
66
U of T Mississauga Library
  • What next?
  • How do we gain further access to underserved
    departments?
  • How do we add new departments to our IL program?
  • Would we abandon classes with little impact on
    student experience?
  • Developing stronger vertical integration by
    including more upper year courses

67
U of T Mississauga Library
  • 2. Reflective of current teaching practices
  • Type of session
  • Which ACRL Standards are addressed
  • What tools are covered in the sessions
  • Building a class profile

68
U of T Mississauga Library
Fig. 5 Number of unique instruction sessions
given by type
69
U of T Mississauga Library
Table 2 Number of instruction sessions with
stated ACRL Standards
70
U of T Mississauga Library
Table 3 Number of instruction sessions teaching
specific tools
71
U of T Mississauga Library
Table 4 Number of instruction sessions teaching
specific tools by department
72
U of T Mississauga Library
Table 5 Tools taught in instruction sessions
Department of Anthropology 2005-2007
73
U of T Mississauga Library
  • Reflective of teaching practices
  • Identify strengths and weaknesses
  • Gain an understanding of current teaching
    sessions
  • Develop strategies to address the goals of an
    embedded program across the curriculum

74
U of T Mississauga Library
  • Building a class profile
  • First year Classics course
  • 55 students enrolled
  • Summer session

75
U of T Mississauga Library
Table 6 Number of students enrolled in a first
year Classics course with previous instruction
sessions
76
U of T Mississauga Library
Table 7 Courses with previous instruction taken
by students enrolled in a Classics course
77
U of T Mississauga Library
  • Course profile
  • 50 have already had at least one instruction
    session
  • 10 students have had two or more
  • Questions
  • What were our assumptions?
  • How do we approach this class?
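Counts like these come from intersecting the course roster with prior session attendance recorded in the instruction database. A minimal sketch with hypothetical student IDs (not real enrolment data):

```python
# Hypothetical course roster and prior-session records.
enrolled = {"s01", "s02", "s03", "s04", "s05"}
prior_sessions = {  # student ID -> number of previous IL sessions
    "s01": 1, "s02": 2, "s04": 3, "s99": 1,
}

# Students in this course who have had at least one previous session.
seen_before = {s for s in enrolled if s in prior_sessions}

# Of those, students who have had two or more.
two_or_more = {s for s in seen_before if prior_sessions[s] >= 2}

print(len(seen_before))
print(len(two_or_more))
```

Running the same intersection for each incoming class is what lets the library spot repetition across courses before planning a session.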

78
U of T Mississauga Library
  • Course profile continued
  • No easy answer
  • The data allows us to look closely at our
    sessions
  • Is there repetition across classes? Year after
    year?
  • What were the learning outcomes?
  • What type of session was it?
  • We are now in the process of reflection and
    learning to build in time to work towards an
    embedded program

79
Thank you!
  • Questions?

80
References
  • ACRL, Information Literacy Glossary. Last updated
    March 2007. Online at
    http://www.ala.org/ala/acrl/acrlissues/acrlinfolit/infolitoverview/infolitglossary/infolitglossary.htm
  • Anderson & Krathwohl, 2001.
  • ANZILL, Australia and New Zealand Information
    Literacy Framework. 2nd edition. Adelaide, AU,
    2004. http://www.anziil.org/resources/Info%20lit%202nd%20edition.pdf
  • Biggs, J. (1999). Teaching for quality learning
    at university. Buckingham, U.K.: Society for
    Research into Higher Education (SRHE) and Open
    University Press.
  • Learning Commons, University of Guelph. (n.d.).
    Framework for the design and delivery of learning
    commons programs and services.
  • Scardamalia, M., & Bereiter, C. (2003). Knowledge
    building. In J. W. Guthrie (Ed.), Encyclopedia of
    Education, Second Edition (pp.). New York:
    Macmillan Reference, USA. Retrieved from
    http://ikit.org/fulltext/2003_knowledge_building.pdf
  • Bereiter, C., & Scardamalia, M. (2003). Learning
    to work creatively with knowledge. In E. De
    Corte, L. Verschaffel, N. Entwistle, & J. van
    Merriënboer (Eds.), Unravelling basic components
    and dimensions of powerful learning environments.
    EARLI Advances in Learning and Instruction
    Series. Retrieved from
    http://ikit.org/fulltext/inresslearning.pdf