Title: Why & How to Build Better Blackboard Reports
1Why & How to Build Better Blackboard Reports
- John Fritz, UMBC
- Jeffrey Berman, Drexel University
2Agenda
- Intros
- Why (John)
- Break
- How (Jeffrey)
- Q & A (All, Anytime)
3Why Did We Do This?
- Problem
- A Solution, not THE Solution
- Other Examples
- Future Plans
- More Information
- Q & A
4About UMBC
- Founded in 1966
- Research extensive university (Carnegie classification)
- Fall 2008 Stats
- 12,268 Students
- 9,612 undergrad, 2,656 grad
- 1,018 Faculty
- 714 FT, 304 PT
- Selected Brags
- One of 50 Best Colleges for Women
- 1st in undergrad chemistry degrees awarded to African Americans
- Six-time National College Chess Champions
5About Blackboard @ UMBC
- Learning System Version 8.0
- As of Fall 2008
- 11,614 students (94% of all students)
- 1,014 Bb course sites (includes multi-section courses)
- 808 Instructors (79% of all instructors)
- 356 Communities
- Includes all student, faculty and staff senates
- Support Staff
- 2 FTE (Admin Support)
- 1 Server Admin
6PROBLEM: HOW DO YOU ANSWER?
- So, is Blackboard making a difference?
- Former UMBC Provost Art Johnson in 2002
7Current Reporting in Blackboard
8Course vs. System Drill Down View?
9Questions
- Functional
- What is the relationship between Blackboard use and teaching and learning?
- What tools can we give users to shed light on (and improve) their own performance within the system?
- Technical
- How do we query the system without breaking it?
- How do we scale and maintain the process?
- Bb Core product?
- Community Building Blocks?
- Other?
10A Modest Proposal
- Given budget limitations (even outright cuts), what if we said IT can only (primarily?) be provided where it is determined to be effective in helping the largest number of students achieve their learning outcomes?
- How would we know?
- Who would help us find out?
11New Tools & Approaches
12UMBC SOLUTION
13Recent News
14Transparency & Self Help
- Show faculty what peers are doing through publicly available reports of student use.
- Bb sysadmins shouldn't have the only bird's-eye view.
- Average hits-per-student course rankings don't favor large courses over smaller ones.
- Usage alone is no indicator of quality.
- But activity by students piques faculty curiosity.
- New user tools build on activity as an indicator (not a cause) of student success.
- We are NOT interested in whether Blackboard makes good students, but in how good students use Blackboard.
15(No Transcript)
16(No Transcript)
17(No Transcript)
18(No Transcript)
19(No Transcript)
20(No Transcript)
21bb
22(No Transcript)
23(No Transcript)
24(No Transcript)
25(No Transcript)
26(No Transcript)
27(No Transcript)
28(No Transcript)
29(No Transcript)
30(No Transcript)
31(No Transcript)
32(No Transcript)
33(No Transcript)
34Bb Reports & GDR of D & F students
- Based on voluntary GDRs run by instructors in 72 courses, students earning a D or F tend to use Bb about 35 percent less than students earning a C or higher.
- SP2009 (11 courses, 47 percent less)
- FA2008 (13 courses, 40 percent less)
- SU2008 (7 courses, 33 percent less)
- SP2008 (26 courses, 32 percent less)
- FA2007 (15 courses, 36 percent less)
- "What is the institution's ethical obligation of knowing?" - John Campbell, Purdue
35(No Transcript)
36(No Transcript)
37(No Transcript)
38(No Transcript)
39(No Transcript)
40(No Transcript)
41(No Transcript)
42(No Transcript)
43(No Transcript)
44(No Transcript)
45(No Transcript)
46(No Transcript)
47(No Transcript)
48(No Transcript)
49(No Transcript)
50(No Transcript)
51(No Transcript)
52(No Transcript)
53Effective Practice: Show & Tell
54Effective Practice: Q & A (iTunesU)
55(No Transcript)
56(No Transcript)
57(No Transcript)
58(No Transcript)
59(No Transcript)
60WHAT DO WE DO WITH THIS?
61Identifying Effective Practitioners
- Faculty are sharing their rankings in departmental meetings.
- Summer & Winter Hybrid Course Redesign Workshop
- Proposed: Faculty have to be in the top 50% of their department's Bb ranking for student activity.
- Fall 2009 hybrid courses?
- 10 courses with 40-70 students (acute shortage of classrooms)
- Instructors have taught hybrid or rank high for student activity in Bb.
62FA2009 Hybrid Courses (Exp.)
63FA2009 Hybrid Courses (Time)
64WHY DOES THIS MATTER?
- Three Golden Rules for Faculty Development in
Instructional Tech.
65How Faculty Learn
- Most Faculty Learn Best From Other Faculty, Not Instructional Technology Staff.
- Penn State University World Campus study (2008): faculty preferred some form of one-on-one relationship, either with a mentor/colleague (56 percent rated this highly effective) or an instructional designer (53 percent rated this highly effective).
66How Faculty Teach
- Most Faculty Teach The Way They Were Taught--And Most Weren't Taught with Technology.
- Nobody wants to look silly in front of a class.
- Training should walk the talk.
- Hybrid Course Re-Design Workshop Assignments
- Deliverable presentations by faculty jump-start rule 1.
- Stipends can be used to entice faculty to return to being like a student.
67Faculty Learning Communities
- Instructional Technology Should Solve A Pedagogical Problem or Create A New Student Learning Opportunity.
- Problems or opportunities provide the context around which communities form and operate.
- The faculty learning curve to become efficient (effective?) with technology is 3-4 years.
- Brad Cohen, University of Minnesota, ELI 2007 Fall Focus Meeting
68PROBLEM: HOW DO YOU GET STUDENTS TO LISTEN?
69Educause Center for Applied Research
70Online Gradebook: A Symbiotic Relationship?
71Does Technology Make a Difference?
- Student Success and IT in courses
- Convenience: "IT makes doing my course activities more convenient." (65.6% agree)
- Learning: "The use of IT in my courses improves my learning." (45.7% of respondents agree)
- Student engagement: "I get more actively involved in courses that use IT." (31.8% agree)
- 2008 Educause Center for Applied Research (ECAR) study, Undergraduates and IT (www.educause.edu/ecar)
- Completed by 27,317 freshmen & seniors at approximately 98 public & private institutions. UMBC has participated every year.
72PERSPECTIVES TO CONSIDER
73Student Perspective
- Check My Activity Tool Hasn't Been Used As Much As Faculty Grade Distribution Report Tool. Why?
- May be too buried in the myUMBC portal for students to find it
- Not yet integrated into Bb under My Grades (B2?)
- Leery of Big Brother or Academic Profiling?
- See "Colleges Mine Student Data to Predict Dropouts" (5/30/08, Chronicle of Higher Education)
- Northern Arizona University researchers reported student concerns with intrusive advising ("Academic Analytics," Educause Review, Vol. 42, No. 4, July/August 2007)
- Maybe students don't think this is what a CMS is supposed to do?
74FA2008 SCI100 Findings
- How would you describe the CMA's view of your Bb activity compared to your peers?
- 28% I was surprised by what it showed me
- 12% It confirmed what I already knew
- 42% I'd have to use it more to determine its usefulness
- 16% I haven't used it
- 2% did not respond to this question
75Faculty Perspective
- Concerns about course rankings based on average Bb hits per user?
- Not a problem at UMBC (so far)
- Cheap feedback to students in terms of the faculty effort needed to produce it.
- Better if faculty publish grade distribution reports in their courses, so students see how activity informs success.
- Can CMS feedback change student motivation? And behavior (performance)?
76FA2008 SCI100 Findings
- If your instructor published a GDR for past assignments, would you be more or less inclined to use the CMA before future assignments are due?
- 54% More inclined
- 10% Less inclined
- 36% Not sure
77FUTURE PLANS
78myUMBC Blackboard Flyover
79What We Want: Building Block or New App
80Student Questions: Ethics & Usability
- Is this a direction UMBC should be pursuing?
- If so, is it easy to understand and use?
- If not, what do you think would make it better?
- What concerns or kudos (if any) do you have about the project?
81OTHER EXAMPLES
82New Tools & Approaches
83Examples of other CMS Data Mining projects
- 5/30/08, Chronicle of Higher Education
- Argosy University
- Purdue University
- Slippery Rock University of Pennsylvania
- South Texas College
- SUNY Buffalo
- Tiffin University
- University of Alabama
- University of Central Florida
- University System of Georgia
- Blackboard Greenhouse Grant - Project ASTRO
- OSCELOT.org, Advanced System Tracking Reporting tool
- Hofstra University
84Colleges Mine Data to Predict Dropouts
- At the University System of Georgia, researchers monitored how frequently students viewed discussion posts and content pages on course Web sites for three different courses to find connections between online engagement and academic success. In the graph below, students who were "successful" received an A, B, or C in the class, and students who were "unsuccessful" received a D, F, or an incomplete.
- 5/30/08 Chronicle of Higher Ed.
85Project Astro Building Block
86Project Astro Building Block _at_ UMBC
87Project Astro Team
- Santo Nucifora (Seneca College) santo.nucifora@senecac.on.ca
- Eric Kunnen (Grand Rapids Community College) ekunnen@grcc.edu
- Project Info: http://projects.oscelot.org/gf/project/astro/
- BbWorld09 Poster Presentation: 7/14/2009, 5:00 PM - 7:00 PM, Exhibit Hall
88DREXEL'S APPROACH TO REPORTING
89Background
- Morningstar Reports compiled at Drexel
- Primary LMS is Blackboard Vista
- Not a hits-per-user approach
- Partially publicly available
- http://drexel.edu/irt/STAR/index.html
90Methodology
- Quantitative vs. Qualitative
- Looked at available data to try to get standard deviations across tool data
- Found that the divergence was so great it was meaningless
- So many different ways to use the system that it is very difficult to rank fairly
- Settled on quartile data as a simple method for ranking
- Combined different variables into 5 categories
91Process Data Gathering
- At the end of each quarter we clone the Blackboard Vista database to another system
- We are then able to run queries against the Tracking table to gather (a sketch follows this list):
- Course Information
- Cross-List Mappings
- 5 rankings:
- Pedagogical Complexity
- Organizational Complexity
- Course Complexity
- Average Logins Per Student
- Average Time Per Login
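- A minimal sketch of this step, assuming a PHP/PDO connection like the UMBC scripts later in this deck; the clone host, the tracking table, and its columns are placeholders, since Drexel's actual Vista schema is not shown here.
    <?php
    // Run against the CLONE, never against the production Vista database.
    // Table and column names below are placeholders, not the real schema.
    $clone = new PDO('dblib:host=vista-clone.example.edu;dbname=VISTA_CLONE',
                     'report_user', 'secret');
    $clone->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $sql = "SELECT course_id,
                   COUNT(*) AS student_logins,
                   COUNT(DISTINCT user_id) AS students
            FROM tracking                 -- placeholder table name
            WHERE event_type = 'LOGIN'    -- placeholder event value
            GROUP BY course_id";

    foreach ($clone->query($sql) as $row) {
        // Average Logins Per Student, one of the 5 rankings
        $avgLogins = $row['students'] > 0
            ? $row['student_logins'] / $row['students']
            : 0;
    }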
92Pedagogical Complexity
- Sum of
- Number of Assignments
- Number of Assessments
- Number of Discussion Topics
- Measures
- Amount of Assessable work available for the
students to complete
93Organizational Complexity
- Sum of
- Organizer Pages (Folders)
- Learning Modules
- Media Library Items
- Measures
- Amount of content holders
- Depth of the structure of the course
94Course Complexity
- Sum of
- Web Links
- Content Pages
- Measures
- Amount of material available for students
95Average Logins Per Student
- Ratio of
- Number of student logins
- Number of students
- Measure of
- Frequency of use of the course
96Average Time Per Login
- Ratio of
- Amount of time spent by students (Dwell Time)
- Number of student logins
- Measure of
- Duration of course usage
97Process Star Rank Calculation
- Quartile cutoffs are calculated for each of the 5 rankings
- Each course is given a quartile score (1-4) for each ranking
- The star ranking is the average of the course's 5 quartile scores
98Example Star Rank Calculation
- Suppose a course had ranking values of
- It would have quartile rankings of
- Averaging to a star ranking of 2.4 (see the sketch below)
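- Since the slide's table of values isn't reproduced in this transcript, here is a minimal sketch of the quartile-and-average calculation with made-up numbers; the cutoff values and raw metric values are illustrative only, not Drexel's actual data.
    <?php
    // Illustrative quartile cutoffs [25th, 50th, 75th percentile] per ranking.
    // Real cutoffs are recomputed each term from the actual course data.
    $cutoffs = array(
        'pedagogical'    => array(2, 5, 11),
        'organizational' => array(3, 7, 15),
        'course'         => array(4, 10, 22),
        'avg_logins'     => array(6, 14, 30),
        'avg_time'       => array(5, 9, 16),
    );

    // Map a raw ranking value to a quartile score of 1-4.
    function quartileScore($value, $cuts) {
        if ($value <= $cuts[0]) return 1;
        if ($value <= $cuts[1]) return 2;
        if ($value <= $cuts[2]) return 3;
        return 4;
    }

    // Made-up raw values for one course.
    $course = array(
        'pedagogical'    => 3,   // quartile 2
        'organizational' => 16,  // quartile 4
        'course'         => 8,   // quartile 2
        'avg_logins'     => 10,  // quartile 2
        'avg_time'       => 6,   // quartile 2
    );

    $total = 0;
    foreach ($course as $ranking => $value) {
        $total += quartileScore($value, $cutoffs[$ranking]);
    }
    $starRank = $total / count($course);   // (2+4+2+2+2)/5 = 2.4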
99Process Report Creation
- 2 reports are generated for each college:
- Courses that fall above the mean star ranking
- Courses that fall below the mean star ranking
- All courses/instructors with a ranking over 3 are published publicly
- Summary data is stored in a pivot table
- Allows for future reporting needs
- Allows for term-by-term comparisons of the data
100Progression of Means
101Progression of Means
102Engagement Time by Term
103References
- Campbell, J.P., DeBlois, P.B., & Oblinger, D.G. (2007, July/August). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), pp. 41-57. Retrieved March 3, 2009 from http://connect.educause.edu/Library/EDUCAUSEReview/AcademicAnalyticsANewTool/44594
- Rampell, C. (2008). Colleges mine data to predict dropouts. The Chronicle of Higher Education, 5/30/08. Retrieved March 6, 2009 from http://chronicle.com/weekly/v54/i38/38a00103.htm (login required)
- Young, J. (2009). College 2.0: A wired way to rate professors and connect teachers. The Chronicle of Higher Education, January 8, 2009. Retrieved April 23, 2009 from http://chronicle.com/free/2009/01/9311n.htm
104BREAK
105How Did UMBC Do This?
- Overview
- How You Can Do This
- Code Download
- Video Show & Tell
106Hey, Jeffrey, what if . . . ?
107First Method (Production Queries)
Blackboard (Production)
Queries (PHP Scripts)
Cached Reports
108First Method (Production Queries)
- Pros
- Contained current semester data
- As opposed to 6 month gap with the Stats Db
- Data was guaranteed to be up-to-date.
- Cons
- Large queries could take down Blackboard
109Second Method (Cloned Database)
Blackboard (Clone)
Queries (PHP Scripts)
Updated every 4 minutes
Blackboard (Production)
Cached Reports
110Second Method (Cloned Database)
- Pros
- Could query current semester data with minimal impact on production
- Data was close to up-to-date (no more than a 4-minute delay)
- Cons
- Cloning process inserted extra columns
- Replication service broke numerous times, limiting query accuracy
111How We Query Bb: Static Replica
Blackboard (Static Replica)
Queries (PHP Scripts)
Complete copy of database made infrequently, but
as needed for reports
Blackboard (Production)
Cached Reports
112Technical Issues to Consider
- Pros
- Can query current semester data with no impact on production
- Data is up-to-date (as of the time of the static copy)
- Cons
- Requires a manual process to make the static copy
- Need to know in advance when we want to run queries
113Do You Really Need To Use The Code?
- All of the queries can be adapted to run directly against your database
- Not the production db
- Some of the reports are a combination of multiple queries
- Saves time
- Generates the output
114But Let's Assume You Really Want To
- What do you need to get started?
- What are the different types of files?
- What does the code do?
- What do the queries do?
- What needs to be changed in the queries for my institution?
- How do I publish the reports?
115What We're Running On
- Apache 1.3.37
- PHP 4
- PDO
- DBLIB driver
- Bb Database is on SQL Server 2005
116Types of Files
- Cached Reports
- Run at the end of semesters
- Generate files
- Live Admin Reports
- Run during semesters
- Generates live output
- Useful for checking the cached reports
- Self-Service Reports
- Used by individuals
- Up-to-date information
117Code Walkthrough
- courseactivity.php
- Cached Report
- Generates the top 50 courses by activity
- Customizable based on
- Sort criteria
- Activity type
- Course level
118Code Walkthrough
- Step A: A number of variables are set to NULL
- Step B: Customizable variables generate strings for the query
- if ($who == "Student")
- $role = " AND (d.role = 'S')";
- Step C: Database connection (see the connection sketch below)
- $dbh = new PDO("dblib:host=$hostname:$port;dbname=$dbname", $username, $pw);
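- A self-contained version of the Step C connection as a sketch; the host, port, database name, and credentials are placeholders for your own environment, and the exception error mode is our suggestion rather than something shown on the slide.
    <?php
    // Connection sketch using the same dblib DSN style as courseactivity.php.
    // All values below are placeholders for your own configuration.
    $hostname = 'bb-reports.example.edu';
    $port     = 1433;          // default SQL Server port
    $dbname   = 'BBLEARN';
    $username = 'report_user';
    $pw       = 'secret';

    try {
        $dbh = new PDO("dblib:host=$hostname:$port;dbname=$dbname", $username, $pw);
        // Fail loudly: a broken connection should stop the cached report run.
        $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    } catch (PDOException $e) {
        die('Could not connect to the reporting database: ' . $e->getMessage());
    }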
119Code Walkthrough
- Step D: Count query ensures that data exists
- Step E: Real query fetches the data
- Step F: Filename is set to a local folder
- $filename = "./courseactivity-{$level}courses/{$date}-{$semesterpure}-{$who}.html";
- Step G: Count query is executed
- Parameters are bound to prevent SQL injection attacks (see the sketch below)
- Top of the HTML page is constructed and added to an output string
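- A sketch of Steps D and G using an abbreviated count query, not the full report query; $dbh is the Step C connection, and the :semester placeholder matches the bound-parameter style described on this slide.
    <?php
    // Step D/G sketch: prepare an abbreviated count query and bind the
    // semester pattern so no user-supplied value is concatenated into SQL.
    $semester = '%SP2009';     // matches all Spring 2009 course IDs

    $countSql = "SELECT count(DISTINCT c.pk1) AS courses
                 FROM course_main c
                 WHERE c.course_id LIKE :semester";

    $stmt = $dbh->prepare($countSql);
    $stmt->bindValue(':semester', $semester, PDO::PARAM_STR);
    $stmt->execute();

    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($row && $row['courses'] > 0) {
        // Step H: data exists, so run the real query and build the report.
    }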
120Code Walkthrough
- Step H: If the count comes back with data, collect the data
- Step I: Data is stored in an associative array
- In this case we also store course IDs in a string
- We'll use the course IDs to run a second query to fetch instructors
- Also stored in the associative array
- Step J: Rows are added to the output string
- Step K: The output string is written to the file (sketched below)
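- A sketch of Steps I-K with the instructor lookup omitted; $stmt is the executed "real" query from Step E, and $filename follows the Step F pattern. The HTML fragment is simplified for illustration.
    <?php
    // Step I: collect rows into an associative array keyed by course ID,
    // and keep the course IDs for the follow-up instructor query (not shown).
    $courses   = array();
    $courseIds = array();
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $courses[$row['CourseID']] = $row;      // Hits, Users, HitsPerUser, ...
        $courseIds[] = $row['CourseID'];
    }

    // Step J: add one table row per course to the output string.
    $output = "<html><body><table>\n";
    foreach ($courses as $courseId => $data) {
        $output .= '<tr><td>' . htmlspecialchars($courseId) . '</td><td>'
                 . (int) $data['Hits'] . "</td></tr>\n";
    }
    $output .= "</table></body></html>\n";

    // Step K: write the output string to the cached report file.
    $fh = fopen($filename, 'w');
    fwrite($fh, $output);
    fclose($fh);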
121Code Walkthrough
- Step L: Tool usage query
- Tools from the activity accumulator are grouped into broader categories (see the sketch below)
- Step M: Tool usage data is written to a file
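- A sketch of the Step L grouping idea; the internal_handle values and category names below are illustrative examples, not UMBC's actual mapping.
    <?php
    // Fold raw internal_handle values from the Tool Usage query into
    // broader report categories. Handle names here are examples only.
    $toolCategories = array(
        'announcements_entry' => 'Announcements',
        'check_grade'         => 'My Grades',
        'discussion_board'    => 'Discussion Board',
        'content'             => 'Content',
    );

    $categoryHits = array();
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {   // rows of Hits, Tool
        $group = isset($toolCategories[$row['Tool']])
               ? $toolCategories[$row['Tool']]
               : 'Other';
        if (!isset($categoryHits[$group])) {
            $categoryHits[$group] = 0;
        }
        $categoryHits[$group] += $row['Hits'];
    }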
122What About The Other File Types?
- Live Queries
- Nearly identical to their cached counterparts
- Output string is echoed to the screen
- Instead of being written to a file (see the sketch below)
- Self-Service Queries
- Like live queries for output
- Queries differ in that they are scoped to an individual user
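- The output difference is small enough to sketch in a few lines; $mode is a hypothetical flag used only for illustration, not a variable from the actual scripts.
    <?php
    // Cached reports write $output to a file; live and self-service
    // reports echo the same string straight to the browser.
    if ($mode == 'cached') {
        $fh = fopen($filename, 'w');
        fwrite($fh, $output);
        fclose($fh);
    } else {
        echo $output;
    }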
123Queries for Cached Reports
- Course/Community Activity
- Courses By Department
- Tool Usage
- User Activity
- Grade Distribution
124Course/Community Activity
- SELECT TOP 50
- count(a.event_type) as Hits,
- count(DISTINCT a.user_pk1) as Users,
- count(a.event_type) / count(DISTINCT a.user_pk1) as HitsPerUser,
- c.course_id as CourseID,
- c.course_name as CourseName
- FROM
- activity_accumulator a,
- course_main c,
- course_users d,
- users u
- WHERE
- (c.course_id LIKE :semester) AND
- (a.event_type = 'COURSE_ACCESS') AND
- (a.course_pk1 = c.pk1) AND
- (a.user_pk1 = u.pk1) AND
- (c.pk1 = d.crsmain_pk1) AND
- (d.users_pk1 = u.pk1)
- $role
125Courses By Discipline
- SELECT
- g.batch_uid as DepartmentID,
- g.title as DepartmentName,
- count(c.pk1) as Courses
- FROM
- course_main c,
- bb_bb60_rpt..gateway_categories g,
- bb_bb60_rpt..gateway_course_categories h
- WHERE
- (c.course_id LIKE :semester) AND
- (c.pk1 = h.crsmain_pk1) AND
- (h.gatewaycat_pk1 = g.pk1)
- $grad
- GROUP BY
- g.batch_uid, g.title
- ORDER BY
- $orderby
126Tool Usage
- SELECT
- count(*) as Hits, a.internal_handle as Tool
- FROM
- activity_accumulator a, course_main c, users u, course_users d
- WHERE
- a.event_type = 'COURSE_ACCESS'
- AND a.course_pk1 = c.pk1
- AND a.user_pk1 = u.pk1
- AND d.crsmain_pk1 = c.pk1
- AND d.users_pk1 = u.pk1
- AND c.course_id LIKE :semester
- GROUP BY
- a.internal_handle
- ORDER BY
- count(*) DESC
127User Activity
- SELECT
- count(a.event_type) as Hits,
- u.user_id as UserName,
- u.firstname as FirstName,
- u.lastname as LastName,
- c.course_id as CourseID
- FROM
- activity_accumulator a, users u, course_main c, course_users d
- WHERE
- (c.course_id LIKE :semester) AND
- (a.event_type = 'COURSE_ACCESS') AND
- (a.user_pk1 = u.pk1) AND
- (a.course_pk1 = c.pk1) AND
- (c.pk1 = d.crsmain_pk1) AND
- (d.users_pk1 = u.pk1)
- $role
- $grad
- GROUP BY u.user_id, u.firstname, u.lastname, c.course_id
- ORDER BY count(a.event_type) DESC
128Grade Distribution
- SELECT
- att.grade as grade, count(distinct u.user_id) as users, count(a.event_type) as hits
- FROM
- bb_bb60_rpt..gradebook_grade g,
- bb_bb60_rpt..gradebook_main m,
- users u,
- course_users d,
- course_main c,
- bb_bb60_rpt..attempt att,
- activity_accumulator a
- WHERE
- c.course_id LIKE :semester
- AND m.crsmain_pk1 = c.pk1
- AND g.gradebook_main_pk1 = m.pk1
- AND g.course_users_pk1 = d.pk1
- AND d.users_pk1 = u.pk1
- AND m.title = 'GRADE'
- AND att.pk1 = g.last_attempt_pk1
- AND a.user_pk1 = u.pk1
129Adjusting the Queries
- UMBC's Course IDs follow a pattern:
- DisciplineCourse_Section_Semester
- SCI100_0101_SP2009
- All semester courses can be found with
- c.course_id LIKE '%SP2009'
- Grad vs. undergrad courses can be found with (a sketch of building these filters follows below)
- ((c.course_id LIKE '%0[0-9][0-9]!_%!_%' ESCAPE '!') OR (c.course_id LIKE '%1[0-9][0-9]!_%!_%' ESCAPE '!') OR (c.course_id LIKE '%2[0-9][0-9]!_%!_%' ESCAPE '!') OR (c.course_id LIKE '%3[0-9][0-9]!_%!_%' ESCAPE '!') OR (c.course_id LIKE '%4[0-9][0-9]!_%!_%' ESCAPE '!'))
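- A sketch of how these filters could be assembled in PHP before being dropped into the queries; the LIKE patterns follow the reconstruction above and assume UMBC's DisciplineCourse_Section_Semester format, so adjust them for your own course ID scheme.
    <?php
    // Semester filter: all course IDs ending in the semester code.
    $semester = '%SP2009';

    // Course-level filter from the slide above: course number begins with 0-4.
    // '!' escapes the literal underscores in the course ID pattern.
    $patterns = array();
    for ($i = 0; $i <= 4; $i++) {
        $patterns[] = "(c.course_id LIKE '%{$i}[0-9][0-9]!_%!_%' ESCAPE '!')";
    }
    $grad = ' AND (' . implode(' OR ', $patterns) . ')';

    // $semester is bound as a query parameter; $grad is appended to the WHERE clause.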
130How To Publish Reports
- Assuming
- You have adjusted the files for your institution
- Including query changes
- Contact Information
- Header/Footer text
- You understand the risks of running these reports
- We STRONGLY RECOMMEND having a reporting (cloned) database.
- You understand UMBC is in no way responsible for supporting your use of our code
131How To Publish Reports
- In the cache directory, the index.php file contains a link to a form to update the cache
- This form has report sets that can be run for specific semesters
- All of the customizable options are hardcoded into update.php
- We strongly recommend you use the live versions of the reports to test your output before trying to publish data
132Code Download & Video Show & Tell
- Code Download
- http://www.umbc.edu/oit/newmedia/blackboard/stats/getthecode.php
- Video Show & Tell Walkthrough (same as above)
133Thanks! www.umbc.edu/blackboard/reports
- fritz@umbc.edu
- jtb77@drexel.edu
- Questions? Comments?