Title: Evaluating JISC 5/99 Content for Reusability as Learning Objects


1
  • Evaluating JISC 5/99 Content for Reusability as
    Learning Objects
  • Sarah Currier
  • DNERLO Research Fellow / CETIS EC SIG
    Coordinator
  • Centre for Academic Practice
  • University of Strathclyde

2
  • OVERVIEW
  • Aimed to evaluate whether JISC content could be made available for reuse in e-learning across UK FE/HE.
  • Funded by the JISC Learning and Teaching Programme.
  • Project partners: University of Hull, Newark & Sherwood College, and the University of Strathclyde's Centre for Academic Practice and Dept. of Computer and Information Sciences.
  • The CETIS Educational Content SIG gave invaluable help!

3
  • What is a learning object?
  • Many definitions, still subject to debate.
  • We began with a simple overarching one:
  • "A learning object is an entity, digital or
    non-digital, that can be used, reused, or
    referenced during technology supported learning."
  • Koper, R. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML. http://eml.ou.nl/introduction/docs/ped-metamodel.pdf

4
  • Survey and Evaluation of 5/99 Content
  • Map 5/99 content, identifying categories and
    aggregation levels.
  • Identify key issues regarding use and reuse of
    5/99 content as learning objects for each
    category and level.
  • Develop criteria for evaluation of reusability of
    content.
  • Evaluate content by these criteria.

5
  • Methodology
  • Identified 27 content-producing 5/99 projects.
  • Gathered data about the content produced.
  • Developed criteria for evaluating content.
  • Evaluated the content of 18 selected projects.
  • 33% complete, 50% incomplete/pilot, 17% none!
  • Summarised and analysed the data.

6
  • Useful Outcomes
  • Snapshot of 5/99 content as it was in early 2002: data to pass on to JORUM and the rest of X4L for JISC to take forward.
  • Evaluation criteria: the process of defining these, and the criteria themselves, is an important contribution to e-learning.

7
  • But, Please Note
  • Content was not evaluated for educational merit.
  • No criticism of projects is implied: 5/99 began BEFORE these issues became prominent, and the DNER Guidelines were made available after most projects' planning stages.
  • Evaluation criteria and other definitions are not intended to be authoritative or final; we hope they will be useful contributions to ongoing dialogue within e-learning.

8
  • Evaluation Criteria
  • Granularity and Aggregation Level
  • Reusability
  • Vertical Reusability
  • Horizontal Reusability / Subject Specificity
  • Interactivity
  • Metadata (gathered data only; did not evaluate use)
  • VLE reuse (surveyed all 27 projects re pilots and plans)
  • Interoperability (not evaluated; see why on the next slide!)
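Purely as an illustration, a minimal Python sketch of how one project's evaluation against these criteria might be recorded; all class and field names here are hypothetical, not the study's:

    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Reusability(Enum):
        # The study's four-point scale (defined on later slides).
        REUSABLE = "Reusable"
        SOMEWHAT = "Somewhat reusable"
        POTENTIALLY = "Potentially reusable"
        NOT_REUSABLE = "Not reusable"

    @dataclass
    class ProjectEvaluation:
        # One evaluated 5/99 project; field names invented for this sketch.
        project: str
        aggregation_level: int               # IEEE LOM level, 1-4
        reusability: Reusability
        vertical_reusability: str            # "No" / "Potential" / "Yes"
        horizontal_reusability: str          # "Generic" / "Interdisciplinary" / ...
        interactivity: str                   # "No" / "Some" / "Yes"
        metadata_scheme: Optional[str] = None  # e.g. "DC" or "IMS"; data gathered only
        vle_reuse_planned: bool = False        # surveyed re pilots/plans, not evaluated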

9
  • Evaluation Criteria: Interoperability
  • Why not?
  • IEEE defines interoperability as:
  • "The ability of two or more systems or components to exchange information and to use the information that has been exchanged." (IEEE, 1990)
  • Therefore...

10
  • Evaluation Criteria: Interoperability
  • Why not?
  • Interoperability is not a property of a resource; rather, it is a property of the relationship between systems in a particular context. Therefore it is impossible to evaluate the interoperability of JISC content. However, by evaluating and recording the factors outlined in this study it should be possible to gauge a resource's potential to interoperate with virtual learning environments, digital repositories and content management systems.

11
  • Aggregation Levels: IEEE LOM Levels
  • Level 1: The smallest level of aggregation, e.g. raw media data or fragments.
  • Level 2: A collection of level 1 learning objects, e.g. a lesson.
  • Level 3: A collection of level 2 learning objects, e.g. a course.
  • Level 4: The largest level of granularity, e.g. a set of courses that lead to a certificate.
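For concreteness, a minimal Python sketch of where the aggregation level sits in a LOM-style metadata record; element names are simplified from the IEEE LOM binding, and the resource title is invented:

    import xml.etree.ElementTree as ET

    # Illustrative only: a stripped-down LOM fragment. In the binding,
    # aggregationLevel lives in the <general> category and takes a
    # vocabulary value from 1 (raw media) to 4 (a set of courses).
    lom = ET.Element("lom")
    general = ET.SubElement(lom, "general")
    ET.SubElement(general, "title").text = "Cell Division Tutorial"  # invented
    agg = ET.SubElement(general, "aggregationLevel")
    ET.SubElement(agg, "source").text = "LOMv1.0"
    ET.SubElement(agg, "value").text = "2"  # level 2: a lesson-sized aggregation

    print(ET.tostring(lom, encoding="unicode"))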

12
  • Aggregation Levels / Top Level Content Categories
  • Information object
  • Information resource
  • Learning object
  • Unit of study
  • Module
  • Course
  • Collection

13
  • Aggregation Levels / Top Level Content Categories
  • Information object: Information
  • Information resource: Information
  • Learning object: Educational
  • Unit of study: Educational
  • Module: Educational
  • Course: Educational
  • Collection: Information

14
  • Aggregation Levels (1)
  • Information object: A simple object that does not have a specific educational objective and is not situated within an educational scenario, e.g. an image or text file.
  • Information resource: An aggregation of information objects, which does not have a specific educational objective, and which is presented as a cohesive unit, e.g. an online encyclopaedia or e-journal.

15
  • Aggregation Levels (2)
  • Learning object: An object that demonstrates, or focuses on, a specific educational concept, e.g. a learning activity, task or assessment.
  • Unit of study: An aggregation of learning objects and information objects. Sometimes referred to as a lesson.
  • Module: An aggregation of lessons and learning objects.

16
  • Aggregation Levels (3)
  • Course: A large aggregation of lessons, modules and other related resources.
  • Collection: An aggregation of two or more of any of the above types of resources, which does not have a specific educational objective overall, and which is not presented as a cohesive unit, but rather is tied together via a search or browse mechanism such as a catalogue or search engine, e.g. a collection of digitised slides, a database of learning objects.
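A rough Python sketch of how these categories nest; the types and examples are invented for illustration, not drawn from the study:

    from dataclasses import dataclass, field
    from typing import List, Union

    @dataclass
    class InformationObject:
        # A simple asset with no educational objective, e.g. an image file.
        uri: str

    @dataclass
    class LearningObject:
        # Focuses on a specific educational concept; may bundle assets.
        objective: str
        assets: List[InformationObject] = field(default_factory=list)

    @dataclass
    class UnitOfStudy:
        # A "lesson": aggregates learning objects and information objects.
        title: str
        parts: List[Union[LearningObject, InformationObject]] = field(default_factory=list)

    @dataclass
    class Course:
        # A large aggregation of lessons (modules omitted to keep the sketch small).
        title: str
        units: List[UnitOfStudy] = field(default_factory=list)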

17
  • Second Level Content Categories
  • Information objects: Text files; Still images; Still images with text, etc.; 3D images; Animations; Moving images; Sound files; Collections data; Links pages; Glossaries; Bibliographies; Spreadsheets; Promotional material.
  • Learning objects: Text files; Media files; Case studies; Still images with text, etc.; Animations; Moving images; Models; Simulations; Assessments; Worksheets; Exercises; Promotional material.
  • Units of study: How-to guides; Slideshows; Moving images; Themed pathways.

18
  • Reusability: Factors to consider
  • Technical format: Is the resource tied to a single delivery platform or technology?
  • Contextual dependency: Does the content of the resource reference other related, but external, resources? E.g. a resource may refer to a glossary or to the next module in a sequence.
  • Technical dependency: Is the delivery of the content technically dependent on other resources? E.g. HTML pages that are linked in a linear navigation sequence, interactive content that relies on server-side scripts, Java applets with class files residing on remote servers.
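A crude Python sketch of how an HTML resource might be screened for these three factors; the patterns below are invented heuristics for illustration, not the study's procedure:

    import re

    def dependency_flags(html: str) -> dict:
        """Crude, illustrative checks for the three factors above."""
        return {
            # technical format: plug-in-bound content embedded in the page
            "plugin_bound": bool(re.search(r"<(embed|object)\b", html, re.I)),
            # contextual dependency: navigation assuming a fixed sequence
            "sequenced_navigation": bool(re.search(r">\s*(next|previous)\s*<", html, re.I)),
            # technical dependency: server-side scripts or remote applets
            "server_side": bool(re.search(r"\.(cgi|asp|php|jsp)\b", html, re.I)),
            "remote_applet": bool(re.search(r"<applet\b[^>]*codebase\s*=", html, re.I)),
        }

    print(dependency_flags('<a href="lesson2.html">Next</a> <form action="quiz.cgi">'))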

19
  • Evaluation Criteria: Reusability (1)
  • Reusable: May be delivered via a wide variety of platforms or technologies; do not reference related external content; are not technically dependent on external resources.
  • Somewhat reusable: May be restricted to a single delivery technology but are still relatively reusable due to the ubiquitous nature of that technology.

20
  • Evaluation Criteria: Reusability (2)
  • Potentially reusable: Have potential for reuse, i.e. they may be delivered in a standard format, e.g. HTML, but are dependent on related resources.
  • Not reusable: Restricted to a specific delivery platform or technology, and/or highly dependent on related resources.
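Continuing the earlier screening sketch, one possible (invented) decision rule for combining the dependency flags into this four-point scale:

    def classify(flags: dict) -> str:
        # Illustrative only: maps dependency_flags() output (see earlier
        # sketch) onto the study's four-point reusability scale.
        platform_bound = flags.get("plugin_bound", False)
        dependent = any(flags.get(k, False) for k in
                        ("sequenced_navigation", "server_side", "remote_applet"))
        if platform_bound and dependent:
            return "Not reusable"          # platform-specific AND dependent
        if dependent:
            return "Potentially reusable"  # standard format, but tied to other resources
        if platform_bound:
            return "Somewhat reusable"     # single (possibly ubiquitous) delivery technology
        return "Reusable"

    print(classify({"server_side": True}))  # -> Potentially reusable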

21
  • Evaluation Results: Reusability
  • 22% of the 18 projects produced some "Reusable" content.
  • Of these, only 11% (2 projects!) produced primarily "Reusable" content.
  • 61% produced some "Somewhat Reusable" content.
  • 56% produced some "Potentially Reusable" content.
  • Only 17% (3 projects) produced any "Not Reusable" content. (All justified by project objectives.)

22
  • Evaluation Criteria: Vertical Reusability
  • No: Resources that are only appropriate for use at a single specific level of study.
  • Potential: Resources that are not necessarily developed with vertical reusability in mind, but that may be used at different levels of study.
  • Yes: Resources that include specific support for use at different levels of study.

23
  • Evaluation Results: Vertical Reusability
  • 50% included resources classified "Yes".
  • 78% included resources classified "Potential".
  • 11% included resources classified "No"; both of these projects also included resources classified "Potential".

24
  • Evaluation Criteria: Horizontal Reusability / Subject Specificity (1)
  • Generic: Resources that can be used for teaching and learning in any subject field or discipline.
  • Interdisciplinary: Resources whose subject content makes them applicable to teaching and learning in more than one discipline or subject.
  • These two define resources which are horizontally reusable.

25
  • Evaluation Criteria: Horizontal Reusability / Subject Specificity (2)
  • Subject specific: Resources that are designed only for use within a specific subject or discipline.
  • Resource specific: Resources that are designed only for use with a specific resource.
  • These two define resources which are NOT horizontally reusable.

26
  • Evaluation Results: Horizontal Reusability / Subject Specificity
  • 28% included Generic resources.
  • 61% included Interdisciplinary resources.
  • 67% included Subject specific resources.
  • 33% included Resource specific resources.
  • 85% included at least some horizontally reusable resources.

27
  • Evaluation Criteria: Interactivity
  • What is interactivity? (1)
  • "At the most basic level, interaction involves communication and the degree of control a user is afforded over the learning resource. A user acts, the system reacts and the resultant process is termed interaction." (ICONEX web site: http://www.iconex.hull.ac.uk/interact.cfm)

28
  • Evaluation Criteria: Interactivity
  • What is interactivity? (2)
  • For the purposes of the study, an interactive element was defined as an activity that the user of a resource may perform, which may result in more than one potential response from the resource.
  • Examples of such interactive content include
    simulations and multiple-choice assessments.
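By this definition even a small multiple-choice check is interactive: one user action, more than one potential response. A toy Python example with invented content:

    def check_answer(answer: str) -> str:
        # One user action can elicit more than one response from the
        # resource; question and feedback text are invented.
        responses = {
            "a": "Correct!",
            "b": "Not quite; have another look at the material and try again.",
        }
        return responses.get(answer.lower(), "Please answer 'a' or 'b'.")

    print(check_answer("b"))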

29
  • Evaluation Criteria: Interactivity
  • No: No interactive elements within the resource, e.g. text, images, etc. only.
  • Some: There are interactive elements within the resource, e.g. some multi-choice self-assessments.
  • Yes: The entire resource is interactive, or is based around interactivity, e.g. simulations with resources to support their use.

30
  • Evaluation Results: Interactivity
  • 50% produced resources classified "No" (no interactivity).
  • 22% produced resources with "Some" interactivity.
  • 22% produced resources classified "Yes".
  • None of the remaining 9/27 JISC 5/99 projects stated intentions to produce interactive content.

31
  • Survey Results: Metadata
  • Wide range of levels of understanding and stages of planning.
  • DC already implemented: 39% (70% of these are planning IMS)
  • DC planned: 28%
  • IMS implemented: 0%
  • IMS Meta-data planned: 39% (70% of these have already implemented DC)
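For concreteness, a minimal Python sketch of what implementing simple Dublin Core metadata can produce; the described resource is invented, and the namespace URI is the standard DC element set:

    import xml.etree.ElementTree as ET

    DC = "http://purl.org/dc/elements/1.1/"  # standard Dublin Core element set
    ET.register_namespace("dc", DC)

    # Build a tiny DC record describing an invented example resource.
    record = ET.Element("metadata")
    for element, value in [
        ("title", "Cell Division Tutorial"),
        ("creator", "Example 5/99 Project Team"),
        ("subject", "Biology"),
        ("type", "InteractiveResource"),
        ("format", "text/html"),
    ]:
        ET.SubElement(record, f"{{{DC}}}{element}").text = value

    print(ET.tostring(record, encoding="unicode"))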

32
  • Survey Results: VLE reuse
  • About half of all projects said content was
    reusable in a VLE, or that they were working
    towards this end.
  • Another fifth said their content was intended to be used only "as is".
  • The remainder (ca. 30%) either didn't answer the question or didn't appear to understand it.

33
  • Summary
  • Most 5/99 content is Somewhat or Potentially
    reusable, and has Potential for vertical reuse.
  • About half of the projects produced horizontally reusable content; about the same for interactive content.
  • Only 39% had actually implemented metadata.
  • The 2 projects which created Reusable content were those which had intended to do so from the beginning, as a core aim.
  • They both consisted of Collections of Learning
    Objects and/or Units of Study (i.e. educational
    aggregations).

34
  • Some Conclusions
  • Creating reusable content with potential for granularisation and interoperability requires:
  • Understanding of the requirements and issues.
  • Planning.
  • Time.
  • Resources (particularly people with the right expertise: learning technologists, librarians).

35
  • What's Coming from DNERLO
  • Introductory and summary documents.
  • Methodology and evaluation criteria explained.
  • Summary of survey of projects for interoperability, metadata and VLE issues.
  • Summary and analysis of evaluation data.
  • Colour-coded spreadsheet with ALL the data and evaluations.
  • Tables listing projects by aggregation levels, content categories and evaluative criteria applied.

36
  • Other Useful Resources
  • Writing and Using Reusable Educational Materials: A Beginner's Guide, by Mhairi McAlpine and John Casey (CETIS Educational Content SIG)
  • http://www.gla.ac.uk/rcc/staff/mhairi/index.html
  • Pac-Man (JISC project): tutorial on creating reusable learning materials by Boon Low
  • http://www.met.ed.ac.uk/pac-man/tutorial-access.shtml
  • CETIS Educational Content and Metadata SIGs:
  • http://www.cetis.ac.uk/
  • JORUM (Strand B X4L): Digital Materials Repository Development Bay
  • No website yet; they will be in touch!

37
  • Finally
  • Keep an eye on the DNERLO website for much, much more detail and interesting stuff from the study:
  • http://www.strath.ac.uk/Departments/CAP/dnerlo/index.html
  • Availability of public versions of documents will be announced on the CETIS EC SIG mailing list:
  • http://www.jiscmail.ac.uk/lists/CETIS-ECSIG.html
  • Contact: Sarah Currier
  • sarah.currier@strath.ac.uk