1
Quality Matters: Inter-Institutional Quality
Assurance in Online Learning
  • Sponsored by the U.S. Department of Education
    Fund for the Improvement of Postsecondary
    Education (FIPSE)

Instructional Technology Council, April 18, 2006
2
Quality Matters
  • Quality does matter to:
  • students
  • faculty
  • administrators
  • institutions
  • consortia
  • accrediting agencies
  • legislators
  • taxpayers
  • How do we:
  • identify and recognize it?
  • motivate and instill it?
  • assess and measure it?
  • insure it?
  • assure it?

3
FIPSE Is Interested Because
  • Quality assurance of online courses is important
  • Voluntary inter-institutional quality assurance
    has never been done before
  • This can serve as a national model

Quality Matters!
4
Quality Matters: Inter-Institutional Quality
Assurance in Online Learning
  • Grantor: FIPSE
  • Grant period: 9/03 to 8/06
  • Award: $509,177
  • Grantee: MarylandOnline
  • Statewide consortium: 14 community colleges, 5
    senior institutions
  • http://www.QualityMatters.org

5
MarylandOnline
  • Statewide consortium dedicated to the support of
    distance learning in Maryland
  • Partners: 14 community colleges, 5 senior
    institutions
  • Goals:
  • Web gateway for online higher education in
    Maryland
  • Faculty training
  • Facilitate online course and program sharing
  • Facilitate collaborations among member
    institutions
  • Provide statewide leadership in distance
    education

6
Course Peer Review Process
  • (Process diagram) Elements shown: institutions
    (CAOs, ARs); faculty course developers; national
    standards and research literature; the course and
    the rubric; faculty reviewers and their training;
    instructional designers; peer course review;
    feedback
7
For Our Purposes, Quality Is
  • More than average; more than "good enough"
  • An attempt to capture what's expected in an
    effective online course at about an 85% level
  • Based on research and widely accepted standards

8
What this process is NOT
  • Not about an individual instructor
  • (it's about the course design)
  • Not about faculty evaluation
  • (it's about course quality)
  • Not a win/lose, pass/fail test
  • (it's about a continuous improvement process in
    a supportive environment)

9
QM Collegial Review vs. Evaluation
  • A QM review is:
  • Ongoing
  • Focus: design
  • Outcome: course improvement
  • Voluntary, non-threatening
  • Team approach that includes the faculty member
  • Full disclosure to faculty
  • A faculty evaluation is:
  • Single point in time
  • Focus: delivery
  • Outcome: decision on performance for
    promotion/tenure
  • Win/lose situation
  • Confidential/secretive

10
Major Themes
  • develop inter-institutional consensus about the
    criteria and process for online course QA
  • assure and improve course quality
  • positively impact student learning
  • faculty-centered activities
  • promote voluntary participation and adoption
  • ensure institutional autonomy
  • replicable, reliable, and scalable processes
  • foster sharing of materials and expertise
  • create opportunities for professional development

11
Strengths
  • QM is grounded in
  • research literature
  • national standards of best practice
  • instructional design principles

12
What's In It for Institutions
  • Validation by an external process
  • Strengthen reaccreditation package
  • Raise QA as a priority activity
  • Gain access to a sustainable, replicable,
    scalable QA process
  • Inform online course training practices
  • Provide professional development activities
  • Increase course and program sharing (MOL)

13
What's In It for Faculty
  • Improve your online course
  • Gain access to instructional design support
  • QA validation by external peers
  • Expand professional community
  • Review other courses and gain new ideas for your
    own course
  • Useful for annual evaluations, promotion
    applications, professional development
    plan/requirements
  • $150 for each completed peer course review

14
Your Questions?
15
Rubric
  • Based in
  • research literature
  • nationally recognized standards of best practice
  • instructional design principles
  • Used by review teams to
  • assess course quality in 8 key areas (40 review
    elements)
  • provide feedback to faculty course developer
  • provide guidance to instructional design support
    team

16
The Rubric
  • Eight standards
  • Course Overview and Introduction
  • Learning Objectives
  • Assessment and Measurement
  • Resources and Materials
  • Learner Interaction
  • Course Technology
  • Learner Support
  • ADA Compliance

Key components must align.
17
Rubric Features
  • Living document
  • Web-based
  • Automated compiling of team report
  • Annotations
  • Examples

18
Rubric Scoring
  • Team of three reviewers
  • One score per standard based on majority
  • Two criteria to meet quality expectations
    (sketched in code below):
  • "Yes" to all 14 Essential Standards
  • Receive a total of at least 68 points
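
To make the scoring rule concrete, here is a minimal sketch in Python. It is illustrative only: the met/not-met vote structure, point values, and function names are assumptions, not part of the published QM rubric.

    POINT_MINIMUM = 68  # minimum total points to meet expectations

    def element_met(votes):
        """An element is scored 'met' when a majority (2 of 3) of reviewers agree."""
        return sum(votes) >= 2

    def meets_expectations(elements):
        """elements: (point value, is_essential, [three reviewer votes]) per element."""
        total = 0
        for points, essential, votes in elements:
            if element_met(votes):
                total += points
            elif essential:
                return False  # criterion 1: every essential element must be met
        return total >= POINT_MINIMUM  # criterion 2: at least 68 points overall

    # Example: an essential 3-point element met by two of three reviewers,
    # plus 33 hypothetical 2-point elements met unanimously (69 points total).
    print(meets_expectations(
        [(3, True, [True, True, False])] + [(2, False, [True, True, True])] * 33))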

19
Review Teams
  • 3 Faculty Peer Reviewers
  • 1 from home institution, 2 from others
  • 1 from same discipline, 2 from others
    (these two rules are sketched in code below)
  • mix of community colleges and 4-year schools
  • mix of large and small schools
  • mix of public and private schools
  • Faculty Course Developer
  • access to rubric prior to review
  • involved in pre-review discussions
  • consulted during review
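
A minimal validation sketch of the two hard composition rules above, again in Python; the reviewer and course field names are hypothetical, and the school-mix goals (CC/4-year, size, sector) read as preferences rather than rules, so they are not checked here.

    def valid_team(reviewers, course):
        """Check the hard rules for a three-person peer review team."""
        if len(reviewers) != 3:
            return False
        from_home = sum(r["institution"] == course["institution"] for r in reviewers)
        same_discipline = sum(r["discipline"] == course["discipline"] for r in reviewers)
        # exactly 1 reviewer from the home institution and 1 from the same discipline
        return from_home == 1 and same_discipline == 1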

20
Peer Reviewers
  • Selection Factors
  • Prior training to teach online
  • Extent of online teaching experience
  • Currency of online teaching experience
  • Content area
  • Requirements
  • Sign MOU
  • Attend peer reviewer/rubric training

21
Rubric Training
  • Focus on
  • Application of rubric to course review
  • Interpretation of review elements
  • Providing constructive feedback
  • Competency-based

22
Your Questions?
23
QM to Date
  • Overall Participation
  • Individuals and programs from 109 institutions
    across 26 states
  • Course Reviews
  • 79 reviews completed
  • 16 MOL schools, 5 other schools
  • 26 reviews underway in spring 2006
  • 14 MOL schools, 7 other schools
  • Peer Reviewer/Rubric Training
  • over 500 trained

24
Awards - 2005
  • WCET Outstanding Work (WOW) Award
  • USDLA 21st Century Best Practice Award
  • Maryland Distance Learning Association (MDLA)
    Best Program Award

25
External Partners
  • Sloan Consortium
  • Southern Regional Education Board (SREB)
  • Western Cooperative for Educational
    Telecommunications (WCET)
  • Towson University (MD)
  • Kentucky Virtual University
  • Michigan Virtual Community College Consortium
  • Portland Community College (OR)
  • Florida Community College at Jacksonville (FL)
  • Raritan Valley Community College (NJ)

26
Advisory Board
  • Middle States Commission on Higher Education
  • MD Higher Education Commission
  • MD State Department of Education
  • Penn State University
  • US Naval Academy
  • Miami University (OH)
  • South Dakota Electronic University Consortium
  • Minnesota Online
  • Northern Virginia Community College
  • Bucks County Community College (PA)
  • Defense Acquisition University
  • Education Direct
  • Kaplan College

27
Overall Course Review Results
  • Upon initial review:
  • 51% meet expectations
  • 19% do not meet expectations: missing at least
    one essential 3-point element
  • 30% do not meet expectations: missing at least
    one essential 3-point element and falling short
    of the 68-point minimum

28
Overall Course Review Results (chart)
29
Course Reviews Over Time (chart)
30
Common Themes
  • Course reviews revealed 11 common areas for
    course improvement
  • Elements that are missing in 20% or more of the
    courses reviewed
  • These are potential targets for
  • faculty training
  • special attention in the initial course
    development phase

31
Common Areas for Improvement
  • Instructor self-introduction (I.4): 22%
  • Activities that foster interaction (V.2): 22%
  • Technology/skills/pre-req knowledge stated
    (I.6): 24%
  • Links to academic support, student services,
    tutorials/resources (VII.2-VII.4): 24-27%
  • Learning objectives at module/unit level
    (II.5): 27%
  • Netiquette expectations (I.3): 32%
  • Self-check/practice with quick feedback
    (III.5): 38%
  • B/W alternatives to color content (VIII.4): 54%
  • Alternatives to auditory/visual content
    (VIII.2): 59%

32
Your Questions?
33
Lessons Learned - 1
  • QM is part of an ongoing process and continuum
    of activities
  • Must address and minimize faculty anxiety prior
    to review
  • Need for faculty training at individual
    institutions during course design and prior to
    implementing a review process
  • Need for pre-course development checklist tied to
    rubric
  • The QM system has multiple uses and adaptations

34
Reported Uses of QM System
  • Quality assurance of existing courses
  • Checklist/guidelines for initial online course
    development
  • Ongoing faculty professional development
  • Institutional reaccreditation packages
  • Formation of distance learning policies and
    steering committees

35
Lessons Learned - 2
  • Approach to the Rubric and the Review process
    needs to be holistic
  • Alignment concept
  • QM process and tools achieve their intended
    purpose and are rigorous, replicable, reliable,
    and scalable

36
Lessons Learned - 3
  • Participants indicate QM is a valuable activity
  • 97% of trained faculty believe QM will positively
    impact teaching and learning at their institution
  • Trainees report immediate impact
  • Raised awareness of standards
  • Made improvements to their own course
  • Receive rich feedback with specific suggestions
    for course improvement
  • Gain access to instructional design support
  • Able to view other online courses and gain ideas
    for improving their own course

37
Lessons Learned - 4
  • Keys to wide adoption
  • Based in research literature
  • Faculty-centered
  • Collegial review, not evaluation
  • Open collaboration and sharing
  • Focus on professional development
  • Voluntary
  • Institutional autonomy
  • Unexpected and unintended positive consequences
    may arise; embrace them!

38
Looking Ahead - 1
  • Assess the impact of QM on student learning
    through research projects
  • In progress now, spring 2006 completion
  • Described on website
  • Annual update cycle
  • Rubric
  • Research Matrix

39
Looking Ahead - 2
  • Adapt rubric review process to other formats
  • Hybrids/Blended
  • Pilot rubric developed, reviews in spring '06
  • Emphasis on relationship between online and f2f
    components, and appropriate use of each
  • Continuing Education
  • K-12
  • F2F
  • Training Courses

40
Looking Ahead - 3
  • Adapt rubric process for specific institutional
    needs
  • Promote the integration of the QM process within
    institutions
  • Explore the QM Program/Institution concept
  • Diversify training program
  • Sustainability plan
  • Develop partnerships

41
To Participate or for More Information
  • www.QualityMatters.org
  • Project Management Team
  • Chris Sax: csax@umuc.edu
  • Mary Wells: mwells@pgcc.edu
  • Project Coordinator
  • Kay Kane: kkane@pgcc.edu