1
Why & How to Build Better Blackboard Reports
  • John Fritz, UMBC
  • Jeffrey Berman, Drexel University

2
Agenda
  • Intros
  • Why (John)
  • Break
  • How (Jeffrey)
  • Q&A (All, Anytime)

3
Why Did We Do This?
  • Problem
  • A Solution, not THE Solution
  • Other Examples
  • Future Plans
  • More Information
  • Q&A

4
About UMBC
  • Founded in 1966
  • Research Extensive university (Carnegie
    classification)
  • Fall 2008 Stats
  • 12,268 students (9,612 undergrad, 2,656 grad)
  • 1,018 faculty (714 FT, 304 PT)
  • Selected Brags
  • One of 50 Best Colleges for Women
  • 1st in undergrad chemistry degrees awarded to
    African Americans
  • Six-time National College Chess Champions

5
About Blackboard @ UMBC
  • Learning System Version 8.0
  • As of Fall 2008:
  • 11,614 students (94% of all students)
  • 1,014 Bb course sites (includes multi-section
    courses)
  • 808 instructors (79% of all instructors)
  • 356 communities
  • Includes all student, faculty and staff senates
  • Support Staff:
  • 2 FTE (admin support)
  • 1 server admin

6
PROBLEM: HOW DO YOU ANSWER?
  • "So, is Blackboard making a difference?"
  • Former UMBC Provost Art Johnson, 2002

7
Current Reporting in Blackboard
8
Course vs. System Drill-Down View?
9
Questions
  • Functional
  • What is the relationship between Blackboard use
    and teaching and learning?
  • What tools can we give users to shed light on
    (and improve) their own performance within the
    system?
  • Technical
  • How do we query the system without breaking it?
  • How do we scale and maintain the process?
  • Bb Core product?
  • Community Building Blocks?
  • Other?

10
A Modest Proposal
  • Given budget limitations (even outright cuts),
    what if we said IT can only (primarily?) be
    provided where it is determined to be effective
    in helping the largest number of students achieve
    their learning outcomes?
  • How would we know?
  • Who would help us find out?

11
New Tools & Approaches
12
UMBC SOLUTION
13
Recent News
14
Transparency & Self-Help
  • Show faculty what peers are doing through
    publicly available reports of student use.
  • Bb sysadmins shouldn't have the only bird's-eye
    view.
  • Ranking courses by average hits per student
    doesn't favor large courses over smaller ones.
  • Usage alone is no indicator of quality.
  • But activity by students piques faculty
    curiosity.
  • New user tools build on activity as an
    indicator (not a cause) of student success.
  • We are NOT interested in whether Blackboard makes
    good students, but in how good students use
    Blackboard.

15-33
(No transcript: screenshot slides)
34
Bb Reports: GDR of D & F Students
  • Based on voluntary GDRs run by instructors in 72
    courses, students earning a D or F tend to use Bb
    about 35 percent less than students earning a C
    or higher.
  • SP2009 (11 courses: 47 percent less)
  • FA2008 (13 courses: 40 percent less)
  • SU2008 (7 courses: 33 percent less)
  • SP2008 (26 courses: 32 percent less)
  • FA2007 (15 courses: 36 percent less)
  • "What is the institution's ethical obligation of
    knowing?" - John Campbell, Purdue

35-52
(No transcript: screenshot slides)
53
Effective Practice: Show & Tell
54
Effective Practice: Q&A (iTunesU)
55-59
(No transcript: screenshot slides)
60
WHAT DO WE DO WITH THIS?
61
Identifying Effective Practitioners
  • Faculty are sharing their rankings in
    departmental meetings.
  • Summer & Winter Hybrid Course Redesign Workshop
  • Proposed: Faculty have to be in the top 50% of
    their department's Bb ranking for student activity.
  • Fall 2009 hybrid courses?
  • 10 courses with 40-70 students (acute shortage of
    classrooms)
  • Instructors have taught hybrid or rank high for
    student activity in Bb.

62
FA2009 Hybrid Courses (Exp.)
63
FA2009 Hybrid Courses (Time)
64
WHY DOES THIS MATTER?
  • Three Golden Rules for Faculty Development in
    Instructional Tech.

65
How Faculty Learn
  • Most Faculty Learn Best From Other Faculty, Not
    Instructional Technology Staff.
  • Penn State University World Campus study (2008):
  • "faculty preferred some form of one-on-one
    relationship, either with a mentor/colleague (56
    percent rated this highly effective) or
    instructional designer (53 percent rated this
    highly effective)."

66
How Faculty Teach
  • Most Faculty Teach The Way They Were Taught--And
    Most Weren't Taught with Technology.
  • Nobody wants to look silly in front of a class.
  • Training should walk the talk
  • Hybrid Course Re-Design Workshop Assignments
  • Deliverable presentations by faculty jump-start
    Rule 1.
  • Stipends can be used to entice faculty to return
    to being like a student.

67
Faculty Learning Communities
  • Instructional Technology Should Solve A
    Pedagogical Problem or Create A New Student
    Learning Opportunity.
  • Problems or opportunities provide the context
    around which communities form and operate.
  • Faculty learning curve to become efficient
    (effective?) with technology is 3-4 years.
  • Brad Cohen, University of Minnesota, ELI 2007
    Fall Focus Meeting

68
PROBLEM: HOW DO YOU GET STUDENTS TO LISTEN?
69
Educause Center for Applied Research
70
Online Gradebook: A Symbiotic Relationship?
71
Does Technology Make a Difference?
  • Student Success and IT in courses
  • Convenience: "IT makes doing my course activities
    more convenient." (65.6% agree)
  • Learning: "The use of IT in my courses improves
    my learning." (45.7% of respondents agree)
  • Student engagement: "I get more actively involved
    in courses that use IT." (31.8% agree)
  • 2008 Educause Center for Applied Research (ECAR)
    study, Undergraduates and IT (www.educause.edu/ecar)
  • Completed by 27,317 freshmen & seniors at
    approximately 98 public & private institutions.
    UMBC has participated every year.

72
PERSPECTIVES TO CONSIDER
73
Student Perspective
  • Check My Activity Tool Hasn't Been Used As Much
    As Faculty Grade Distribution Report Tool. Why?
  • May be too buried in myUMBC portal for students
    to find it
  • Not yet integrated into Bb under "My Grades"
    (B2?)
  • Leery of "Big Brother" or academic profiling?
  • See "Colleges Mine Student Data to Predict
    Dropouts" (5/30/08, Chronicle of Higher Education)
  • Univ. of Northern Arizona researchers reported
    student concerns with "intrusive advising"
    ("Academic Analytics," Educause Review, Vol. 42,
    No. 4, July/August 2007)
  • Maybe students don't think this is what a CMS is
    supposed to do?

74
FA2008 SCI100 Findings
  • How would you describe the CMA's view of your Bb
    activity compared to your peers?
  • 28%: I was surprised by what it showed me
  • 12%: It confirmed what I already knew
  • 42%: I'd have to use it more to determine its
    usefulness
  • 16%: I haven't used it
  • 2% did not respond to this question

75
Faculty Perspective
  • Concerns about course ranking based on average
    Bb hits per user?
  • Not a problem at UMBC (so far)
  • Cheap feedback to students in terms of faculty
    effort to produce it.
  • Better if they publish grade distribution
    reports in their courses, so students see how
    activity informs success.
  • Can CMS feedback change student motivation? And
    behavior (performance)?

76
FA2008 SCI100 Findings
  • If your instructor published a GDR for past
    assignments, would you be more or less inclined
    to use the CMA before future assignments are due?
  • 54% more inclined
  • 10% less inclined
  • 36% not sure

77
FUTURE PLANS
78
myUMBC Blackboard Flyover
79
What We Want: Building Block or New App
80
Student Questions: Ethics & Usability
  • Is this a direction UMBC should be pursuing?
  • If so, is it easy to understand and use?
  • If not, what do you think would make it better?
  • What concerns or kudos (if any) do you have about
    the project?

81
OTHER EXAMPLES
82
New Tools & Approaches
83
Examples of other CMS Data Mining projects
  • 5/30/08, Chronicle of Higher Education
  • Argosy University
  • Purdue University
  • Slippery Rock University of Pennsylvania
  • South Texas College
  • SUNY Buffalo
  • Tiffin University
  • University of Alabama
  • University of Central Florida
  • University System of Georgia
  • Blackboard Greenhouse Grant - Project ASTRO
  • OSCELOT.org, Advanced System Tracking & Reporting
    tool
  • Hofstra University

84
Colleges Mine Data to Predict Dropouts
  • At the University System of Georgia, researchers
    monitored how frequently students viewed
    discussion posts and content pages on course Web
    sites for three different courses to find
    connections between online engagement and
    academic success. In the graph below, students
    who were "successful" received an A, B, or C in
    the class, and students who were "unsuccessful"
    received a D, F, or an incomplete.
  • - 5/30/08 Chronicle of Higher Ed.

85
Project Astro Building Block
86
Project Astro Building Block @ UMBC
87
Project Astro Team
  • Santo Nucifora (Seneca College):
    santo.nucifora@senecac.on.ca
  • Eric Kunnen (Grand Rapids Community College):
    ekunnen@grcc.edu
  • Project Info: http://projects.oscelot.org/gf/project/astro/
  • BbWorld09 Poster Presentation: 7/14/2009,
    5:00 PM - 7:00 PM, Exhibit Hall

88
DREXEL'S APPROACH TO REPORTING
89
Background
  • Morningstar Reports compiled at Drexel
  • Primary LMS is Blackboard Vista
  • Not a hits per user approach
  • Partially publicly available
  • http://drexel.edu/irt/STAR/index.html

90
Methodology
  • Quantitative vs. Qualitative
  • Looked at available data to try to get standard
    deviations across tool data
  • Found that divergence was so great it was
    meaningless
  • So many different ways to use the system that it
    is very difficult to rank fairly
  • Zeroed in on quartile data as a simple method
    for ranking.
  • Combined different variables into 5 categories

91
Process Data Gathering
  • At the end of each quarter we clone the
    Blackboard Vista database to another system
  • We are then able to run queries against the
    Tracking table to gather
  • Course Information
  • Cross-List Mappings
  • 5 rankings
  • Pedagogical Complexity
  • Organizational Complexity
  • Course Complexity
  • Average Logins Per Student
  • Average Time Per Login

92
Pedagogical Complexity
  • Sum of
  • Number of Assignments
  • Number of Assessments
  • Number of Discussion Topics
  • Measures
  • Amount of Assessable work available for the
    students to complete

93
Organizational Complexity
  • Sum of
  • Organizer Pages (Folders)
  • Learning Modules
  • Media Library Items
  • Measures
  • Number of content holders
  • Depth of the structure of the course

94
Course Complexity
  • Sum of
  • Web Links
  • Content Pages
  • Measures
  • Amount of material available for students

95
Average Logins Per Student
  • Ratio of
  • Number of student logins
  • Number of students
  • Measure of
  • Frequency of use of the course

96
Average Time Per Login
  • Ratio of
  • Amount of time spent by students (Dwell Time)
  • Number of student logins
  • Measure of
  • Duration of course usage
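
The five rankings just defined reduce to simple sums and ratios. A minimal
sketch restating them in PHP, with hypothetical per-course counts (the
variable names are illustrative, not Vista schema fields):

    <?php
    // Hypothetical per-course counts; in practice these come from
    // queries against the cloned Vista tracking tables.
    $assignments = 4; $assessments = 3; $discussionTopics = 8;
    $folders = 6; $learningModules = 2; $mediaItems = 5;
    $webLinks = 10; $contentPages = 12;
    $studentLogins = 900; $students = 30; $dwellMinutes = 13500;

    $pedagogicalComplexity    = $assignments + $assessments + $discussionTopics; // 15
    $organizationalComplexity = $folders + $learningModules + $mediaItems;       // 13
    $courseComplexity         = $webLinks + $contentPages;                       // 22
    $avgLoginsPerStudent      = $studentLogins / $students;                      // 30 logins/student
    $avgTimePerLogin          = $dwellMinutes / $studentLogins;                  // 15 minutes/login
    ?>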

97
Process Star Rank Calculation
  • Quartile cutoffs are calculated for each of the 5
    rankings
  • Each course given a quartile score for each
    ranking
  • Star ranking is determined by the average of the
    5 rankings

98
Example Star Rank Calculation
  • Suppose a course had ranking values of [table not
    transcribed]
  • It would have quartile rankings of [table not
    transcribed]
  • Averaging to a star ranking of 2.4 (see the sketch
    below)
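
Since the slide's tables were not transcribed, here is a minimal sketch of
the calculation with hypothetical cutoffs and metric values, chosen only so
the average works out to 2.4 as on the slide:

    <?php
    // Hypothetical quartile cutoffs (25th/50th/75th percentile) per ranking.
    $cutoffs = array(
        'pedagogical'    => array(3, 7, 12),
        'organizational' => array(4, 8, 15),
        'complexity'     => array(5, 11, 20),
        'avg_logins'     => array(10, 25, 45),
        'time_per_login' => array(5, 12, 22),
    );

    // Hypothetical metric values for one course.
    $metrics = array(
        'pedagogical' => 6, 'organizational' => 5, 'complexity' => 9,
        'avg_logins' => 30, 'time_per_login' => 14,
    );

    // Quartile score: 1 for the bottom quartile up to 4 for the top.
    function quartile_score($value, $cuts) {
        $score = 1;
        foreach ($cuts as $cut) {
            if ($value > $cut) $score++;
        }
        return $score;
    }

    $total = 0;
    foreach ($metrics as $name => $value) {
        $total += quartile_score($value, $cutoffs[$name]);
    }
    $star = $total / count($metrics); // (2 + 2 + 2 + 3 + 3) / 5 = 2.4
    ?>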

99
Process Report Creation
  • 2 reports are generated for each college
  • Courses that fall above the mean Star Ranking
  • Courses that fall below the mean Star Ranking
  • All courses/instructors over a ranking of 3 are
    publicly published
  • Summary data is stored in a pivot table
  • Allows for future reporting needs
  • Allows for term-by-term comparisons of the data

100
Progression of Means
101
Progression of Means
102
Engagement Time by Term
103
References
  • Campbell, J.P., DeBlois, P.B., & Oblinger, D.G.
    (2007, July/August). Academic analytics: A new
    tool for a new era. EDUCAUSE Review, 42(4),
    41-57. Retrieved March 3, 2009 from
    http://connect.educause.edu/Library/EDUCAUSEReview/AcademicAnalyticsANewTool/44594
  • Rampell, C. (2008). Colleges mine data to predict
    dropouts. The Chronicle of Higher Education,
    5/30/08. Retrieved March 6, 2009 from
    http://chronicle.com/weekly/v54/i38/38a00103.htm
    (login required)
  • Young, J. (2009). College 2.0: A wired way to
    rate professors, and connect teachers. The
    Chronicle of Higher Education, January 8, 2009.
    Retrieved April 23, 2009 from
    http://chronicle.com/free/2009/01/9311n.htm

104
BREAK
105
How Did UMBC Do This?
  • Overview
  • How You Can Do This
  • Code Download
  • Video Show & Tell

106
Hey, Jeffrey, what if . . . ?
107
First Method (Production Queries)
Blackboard (Production) -> Queries (PHP Scripts) -> Cached Reports
108
First Method (Production Queries)
  • Pros
  • Contained current semester data
  • As opposed to a 6-month gap with the Stats Db
  • Data was guaranteed to be up-to-date.
  • Cons
  • Large queries could take down Blackboard

109
Second Method (Cloned Database)
Blackboard (Production) -> (replicated every 4 minutes) -> Blackboard (Clone) -> Queries (PHP Scripts) -> Cached Reports
110
Second Method (Cloned Database)
  • Pros
  • Could query current semester data with minimal
    impact on production
  • Data was close to up-to-date (no more than 4
    minute delay)
  • Cons
  • Cloning process inserted extra columns
  • Replication service broke numerous times,
    limiting query accuracy

111
How We Query Bb: Static Replica
Blackboard (Production) -> (complete copy made infrequently, as needed for reports) -> Blackboard (Static Replica) -> Queries (PHP Scripts) -> Cached Reports
112
Technical Issues to Consider
  • Pros
  • Can query current semester data with no impact on
    production
  • Data is up-to-date (at time of the static copy)
  • Cons
  • Requires a manual process to make the static copy
  • Need to know in advance when we want to run
    queries

113
Do You Really Need To Use The Code?
  • All of the queries can be adapted to run
    directly against your database
  • Not the production db
  • Some of the reports are a combination of multiple
    queries
  • Saves time
  • Generates the output

114
But Let's Assume You Really Want To
  • What do you need to get started?
  • What are the different types of files?
  • What does the code do?
  • What do the queries do?
  • What needs to be changed in the queries for my
    institution?
  • How do I publish the reports?

115
What We're Running On
  • Apache 1.3.37
  • PHP 4
  • PDO
  • DBLIB driver
  • Bb Database is on SQL Server 2005

116
Types of Files
  • Cached Reports
  • Run at the end of semesters
  • Generate files
  • Live Admin Reports
  • Run during semesters
  • Generate live output
  • Useful for checking the cached reports
  • Self-Service Reports
  • Used by individuals
  • Up-to-date information

117
Code Walkthrough
  • courseactivity.php
  • Cached Report
  • Generates the top 50 courses by activity
  • Customizable based on
  • Sort criteria
  • Activity type
  • Course level

118
Code Walkthrough
  • Step A: A number of variables are set to NULL
  • Step B: Customizable variables generate strings
    for the query
  • if ($who == "Student")
  • $role = "AND (d.role = 'S')";
  • Step C: Database Connection
  • $dbh = new PDO("dblib:host=$hostname:$port;
    dbname=$dbname", $username, $pw);
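
A minimal runnable sketch of Steps B and C together; the host, port,
database name and credentials below are placeholders, not UMBC's values:

    <?php
    // Step B: a customizable option becomes a SQL fragment.
    $who  = 'Student';
    $role = ($who == 'Student') ? "AND (d.role = 'S')" : '';

    // Step C: connect to the reporting copy of the Bb database
    // through PDO's dblib (SQL Server) driver.
    $hostname = 'bb-replica.example.edu';  // placeholder
    $port     = 1433;                      // placeholder
    $dbname   = 'BBLEARN';                 // placeholder
    $dbh = new PDO("dblib:host=$hostname:$port;dbname=$dbname",
                   'username', 'pw');
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    ?>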

119
Code Walkthrough
  • Step D: Count query ensures that data exists
  • Step E: Real query fetches the data
  • Step F: Filename is set to a local folder
  • $filename = "./courseactivity-{$level}courses/
    {$date}-{$semester}{$pure}-{$who}.html"
  • Step G: Count query is executed
  • Parameters are bound to prevent SQL injection
    attacks
  • Top of the HTML page is constructed and added to
    an output string
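
As an illustration of Step G, a hedged sketch of executing a count query
with the semester bound as a parameter (the query text is abbreviated;
$role is interpolated because it is built internally, never from user input):

    <?php
    $countQuery = "SELECT count(*) AS total
                   FROM activity_accumulator a, course_main c, course_users d
                   WHERE (c.course_id LIKE :semester)
                     AND (a.course_pk1 = c.pk1)
                     AND (d.crsmain_pk1 = c.pk1) $role";
    $stmt = $dbh->prepare($countQuery);
    // Binding keeps the user-supplied value out of the SQL string.
    $stmt->bindValue(':semester', '%SP2009');
    $stmt->execute();
    $total = $stmt->fetchColumn();
    ?>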

120
Code Walkthrough
  • Step H: If count comes back with data, collect
    the data
  • Step I: Data is stored in an associative array
  • In this case we also store course IDs in a string
  • We'll use the course IDs to run a second query to
    fetch instructors
  • Also stored in the associative array
  • Step J: Rows are added to the output string
  • Step K: The output string is written to the file
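
Steps H through K amount to a fetch-and-write loop; a minimal sketch,
assuming $stmt holds the executed "real" query from Step E and $filename
was set in Step F:

    <?php
    $output = "<html><body><table>\n";  // top of the HTML page

    // Steps H-I: collect rows into an associative array keyed by course ID.
    // (A second query, not shown, would attach instructors to each entry.)
    $courses = array();
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $courses[$row['CourseID']] = $row;
    }

    // Step J: append one table row per course to the output string.
    foreach ($courses as $id => $row) {
        $output .= "<tr><td>$id</td><td>{$row['Hits']}</td></tr>\n";
    }
    $output .= "</table></body></html>\n";

    // Step K: write the output string to the cached report file.
    file_put_contents($filename, $output);
    ?>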

121
Code Walkthrough
  • Step L: Tool usage query
  • Tools from the activity accumulator are grouped
    into broader categories
  • Step M: Tool usage data is written to a file
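
The grouping in Step L can be as simple as a lookup table from
activity-accumulator handles to display categories; the handles and
categories below are hypothetical examples, not UMBC's actual mapping:

    <?php
    // Hypothetical handle-to-category map.
    $categories = array(
        'discussion_board_entry' => 'Communication',
        'announcements_entry'    => 'Communication',
        'check_grade'            => 'Assessment',
        'content'                => 'Content',
    );

    // $toolHits (handle => hits) is assumed fetched by the tool usage query.
    $byCategory = array();
    foreach ($toolHits as $handle => $hits) {
        $cat = isset($categories[$handle]) ? $categories[$handle] : 'Other';
        $byCategory[$cat] = (isset($byCategory[$cat]) ? $byCategory[$cat] : 0) + $hits;
    }
    ?>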

122
What About The Other File Types?
  • Live Queries
  • Nearly identical to their cached counterparts
  • Output string is echoed to the screen
  • Instead of being written to a file
  • Self-Service Queries
  • Like live queries for output
  • Queries differ by being scoped to an individual

123
Queries for Cached Reports
  • Course/Community Activity
  • Courses By Department
  • Tool Usage
  • User Activity
  • Grade Distribution

124
Course/Community Activity
    SELECT TOP 50
      count(a.event_type) as Hits,
      count(DISTINCT a.user_pk1) as Users,
      count(a.event_type) / count(DISTINCT a.user_pk1) as HitsPerUser,
      c.course_id as CourseID,
      c.course_name as CourseName
    FROM
      activity_accumulator a,
      course_main c,
      course_users d,
      users u
    WHERE
      (c.course_id LIKE :semester) AND
      (a.event_type = 'COURSE_ACCESS') AND
      (a.course_pk1 = c.pk1) AND
      (a.user_pk1 = u.pk1) AND
      (c.pk1 = d.crsmain_pk1) AND
      (d.users_pk1 = u.pk1)
      $role
    GROUP BY
      c.course_id, c.course_name
    ORDER BY
      $orderby

125
Courses By Discipline
    SELECT
      g.batch_uid as DepartmentID,
      g.title as DepartmentName,
      count(c.pk1) as Courses
    FROM
      course_main c,
      bb_bb60_rpt..gateway_categories g,
      bb_bb60_rpt..gateway_course_categories h
    WHERE
      (c.course_id LIKE :semester) AND
      (c.pk1 = h.crsmain_pk1) AND
      (h.gatewaycat_pk1 = g.pk1)
      $grad
    GROUP BY
      g.batch_uid, g.title
    ORDER BY
      $orderby

126
Tool Usage
    SELECT
      count(*) as Hits, a.internal_handle as Tool
    FROM
      activity_accumulator a, course_main c,
      users u, course_users d
    WHERE
      a.event_type = 'COURSE_ACCESS'
      AND a.course_pk1 = c.pk1
      AND a.user_pk1 = u.pk1
      AND d.crsmain_pk1 = c.pk1
      AND d.users_pk1 = u.pk1
      AND c.course_id LIKE :semester
    GROUP BY
      a.internal_handle
    ORDER BY
      count(*) DESC

127
User Activity
    SELECT
      count(a.event_type) as Hits,
      u.user_id as UserName,
      u.firstname as FirstName,
      u.lastname as LastName,
      c.course_id as CourseID
    FROM
      activity_accumulator a, users u, course_main c,
      course_users d
    WHERE
      (c.course_id LIKE :semester) AND
      (a.event_type = 'COURSE_ACCESS') AND
      (a.user_pk1 = u.pk1) AND
      (a.course_pk1 = c.pk1) AND
      (c.pk1 = d.crsmain_pk1) AND
      (d.users_pk1 = u.pk1)
      $role
      $grad
    GROUP BY u.user_id, u.firstname, u.lastname,
      c.course_id
    ORDER BY count(a.event_type) DESC

128
Grade Distribution
    SELECT
      att.grade as grade,
      count(distinct u.user_id) as users,
      count(a.event_type) as hits
    FROM
      bb_bb60_rpt..gradebook_grade g,
      bb_bb60_rpt..gradebook_main m,
      users u,
      course_users d,
      course_main c,
      bb_bb60_rpt..attempt att,
      activity_accumulator a
    WHERE
      c.course_id LIKE :semester
      AND m.crsmain_pk1 = c.pk1
      AND g.gradebook_main_pk1 = m.pk1
      AND g.course_users_pk1 = d.pk1
      AND d.users_pk1 = u.pk1
      AND m.title = 'GRADE'
      AND att.pk1 = g.last_attempt_pk1
      AND a.user_pk1 = u.pk1
      AND a.course_pk1 = c.pk1
    GROUP BY att.grade

129
Adjusting the Queries
  • UMBC's Course IDs follow a pattern:
  • DisciplineCourse_Section_Semester
  • SCI100_0101_SP2009
  • All semester courses can be found with:
  • c.course_id LIKE '%SP2009'
  • Grad vs. undergrad courses can be found with:
  • ((c.course_id LIKE '%0[0-9][0-9]!_%!_%' ESCAPE '!')
    OR (c.course_id LIKE '%1[0-9][0-9]!_%!_%' ESCAPE '!')
    OR (c.course_id LIKE '%2[0-9][0-9]!_%!_%' ESCAPE '!')
    OR (c.course_id LIKE '%3[0-9][0-9]!_%!_%' ESCAPE '!')
    OR (c.course_id LIKE '%4[0-9][0-9]!_%!_%' ESCAPE '!'))
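
If your course IDs follow a different pattern, only these LIKE fragments
change; a small sketch of building and binding the semester filter in PHP
(table and column names as in the queries above):

    <?php
    $term     = 'SP2009';
    $semester = '%' . $term;  // matches SCI100_0101_SP2009, etc.

    // Bind the pattern rather than interpolating it, as in the
    // cached-report scripts.
    $stmt = $dbh->prepare(
        "SELECT count(*) FROM course_main c WHERE c.course_id LIKE :semester");
    $stmt->bindValue(':semester', $semester);
    $stmt->execute();
    ?>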

130
How To Publish Reports
  • Assuming
  • You have adjusted the files for your institution
  • Including query changes
  • Contact Information
  • Header/Footer text
  • You understand the risks of running these reports
  • We STRONGLY RECOMMEND having a reporting (cloned)
    database.
  • You understand UMBC is in no way responsible for
    supporting your use of our code
  • Then do the following . . .

131
How To Publish Reports
  • In the cache directory, the index.php file
    contains a link to a form to update the cache
  • This form has report sets that can be run for
    specific semesters
  • All of the customizable options are hardcoded
    into update.php
  • We strongly recommend you use the live versions
    of the reports to test your output before trying
    to publish data

132
Code Download & Video Show & Tell
  • Code Download:
  • http://www.umbc.edu/oit/newmedia/blackboard/stats/getthecode.php
  • Video Show & Tell Walkthrough (same URL as above)

133
Thanks!
  • www.umbc.edu/blackboard/reports
  • fritz@umbc.edu
  • jtb77@drexel.edu
  • Questions? Comments?