Transcript and Presenter's Notes

Title: On-Line Testing Center


1
On-Line Testing Center
  • Database Laboratories
  • Root Questions
  • Automating Homeworks

2
Tools for Increasing the Efficiency of Teaching
  • Laboratories that give immediate, accurate
    feedback for teaching (e.g., SQL).
  • Automated homeworks that simulate the effect of
    carefully graded long-answer homework.
  • Lectures consisting of PowerPoint slides with
    voiceover.

3
Productivity in Education
  • The education industry has a terrible
    productivity-improvement record.
  • Using database courses as a focus, we have
    developed a system, OTC (On-Line Testing Center),
    that automates grading and allows teaching effort
    to be reused.

4
Comparison: Versus Telecom

  Year    Tuition    3-min LD call      Ratio
  1959    $1,200     $3.00                400
  2004    $30,000    $0.15            200,000

In 45 years, high-end college tuition has
gotten 500 times more expensive relative to
a long-distance phone call!
5
But Isn't ... ?
  • The telecom industry is arguably the best example
    of the use of technology to reduce costs.
  • How about the much-maligned US Post Office?

6
Comparison: Versus Post Office

  Year    Tuition    Airmail Stamp     Ratio
  1959    $1,200     $0.08            15,000
  2004    $30,000    $0.37            81,000

In 45 years, high-end college tuition has
gotten 5.4 times more expensive relative to a
stamp!
7
One Thing I'd Like to Do, but Haven't
  • Global, on-line TA system.
  • Students can ask questions about a given course,
    via email.
  • They get fast email response.
  • A TA network is guided by a database of previous
    questions and answers.

8
Lectures
  • We have PowerPoint slides with voiceover for an
    introductory DB course.
  • Intended use: play for 50-60% of the lecture;
    use the rest of the time for discussion.
  • Pace is critical --- stop for class thought after
    each slide.

9
Solution
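  -- Find each beer sold at more than 3 bars, or made by
  -- Pete's, together with its average price: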
  SELECT beer, AVG(price)
  FROM Sells
  GROUP BY beer
  HAVING COUNT(bar) > 3 OR
         beer IN (SELECT name
                  FROM Beers
                  WHERE manf = 'Pete''s');

10
Laboratory Assignments
  • Conventional SQL homework: here is a database;
    write these queries in SQL.
  • TAs look at SQL answers and try to figure out
    whether the queries do what they're supposed to
    do.
  • Rate of regrades tells me this task is too hard
    to get right.

11
OTC Laboratory
  • Students get a description of a schema and an
    English description of several queries to write.
  • Queries are sent to a database system (Oracle or
    MySQL) for testing on a preset, sample database.

12
OTC Laboratory --- (2)
  • Three possible outcomes:
  • Syntactically incorrect --- they get the feedback
    of the DBMS.
  • Semantically incorrect --- they get to see what
    their query did on another sample database and
    what it should have done.
  • Semantically correct --- they get credit for the
    problem.

13
Creating an SQL Lab
  • A stem to describe the schema and queries.
  • CREATE TABLE statements to define the schema.
  • INSERT statements for the test DB.
  • INSERT statements for the sample DB.
  • Reference queries for the correct answers.
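
To make the pieces concrete, here is a minimal sketch of a lab,
using the Sells schema from the Solution slide; all data and the
sample query are invented for illustration:

  -- 1. CREATE TABLE statements defining the schema.
  CREATE TABLE Sells (
    bar   VARCHAR(30),
    beer  VARCHAR(30),
    price DECIMAL(5,2)
  );
  -- 2. INSERT statements loading the (hidden) test DB.
  INSERT INTO Sells VALUES ('Joe''s Bar', 'Bud',    4.00);
  INSERT INTO Sells VALUES ('Joe''s Bar', 'Coors',  6.50);
  -- 3. INSERT statements loading the sample DB that is
  --    shown to students in feedback (different data).
  INSERT INTO Sells VALUES ('Sue''s Bar', 'Miller', 3.50);
  -- 4. A reference query for, say, "find the beers
  --    Joe's Bar sells for less than $5".
  SELECT beer
  FROM Sells
  WHERE bar = 'Joe''s Bar' AND price < 5.00;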

14
Other Labs
  • Recently added similar lab-creation facilities
    for:
  • Relational algebra.
  • JDBC.
  • XQuery.

15
Policy Issues
  • The lab is set up so students may submit a query
    as many times as they like.
  • Once correct, a query can be stored and the next
    one worked on.

16
Problem: Automate Construction of Sample DBs
  • Queries involve particular constants.
  • Changing the constants in your explanation
    doesn't explain anything.
  • Example: find all the bars in Boston.
  • The sample DB had better not change "Boston" in
    tuples, or you'll be explaining: "if the DB
    contains (Joe's Bar, Miami), you need to
    produce Joe's Bar in your answer."

17
A Harder Example
  • Consider the query: find all the beers Joe's
    Bar sells for less than $5.
  • You can't change prices in tuples like (Joe's
    Bar, Bud, 4.00) randomly, or you'll give
    advice like "if the DB contains (Joe's Bar,
    Coors, 6.50), you need to produce Coors."

18
Example --- Continued
  • You need a "less than 5"-preserving
    transformation.
  • Example: p -> 2p - 5. (If p < 5 then 2p - 5 < 5,
    and if p >= 5 then 2p - 5 >= 5, so the answer
    set is unchanged.)
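
Such a transformation can be applied to the sample DB in one
statement; a sketch, assuming the Sells(bar, beer, price) schema
used earlier:

  UPDATE Sells
  SET price = 2 * price - 5;
  -- Prices below 5 stay below 5; prices at or above 5
  -- stay at or above 5.

(A real generator would also need to keep values plausible; for
example, 2p - 5 goes negative when p < 2.50.)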

19
Automating Homework
  • The heart of OTC is a system for automating
    homeworks and exams.
  • Goal 1: Encourage students to work long-answer
    problems for themselves.
  • Goal 2: Inhibit cheating.
  • Goal 3: Eliminate the drudgery of grading.

20
Modeling Long-Answer Questions with
Multiple-Choice
  • Here is a typical long-answer question we might
    ask in a DB course:

Relation R consists of the following tuples, and
relation S has the following tuples. Compute the
join of R and S.
21
Root Questions
  • A root question is a multiple-choice question
    with several right and many wrong answers.
  • Example:

Relation R consists of the following tuples, and
relation S has the following tuples. Which of
these tuples is in the join of R and S?
22
Writing a Root Question
  • The question-designer provides several correct
    answers.
  • In our example, each tuple of the join could be
    one correct answer.
  • Many wrong answers are also provided.
  • Here, any tuple of the correct length that is not
    in the join could be used.
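
A tiny worked instance (relations invented for illustration): if
R(A,B) contains (1,2) and (3,4), and S(B,C) contains (2,5) and
(2,6), the natural join R ⋈ S is {(1,2,5), (1,2,6)}. Those two
tuples form the pool of correct answers, while tuples of the right
length that are not in the join, such as (3,4,5) or (1,2,7), serve
as wrong answers.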

23
Writing Root Questions --- (2)
  • For each wrong answer, write a choice explanation
    that gives the student a hint or explanation of
    why it is wrong.
  • For the question as a whole, write a question
    explanation.

24
Assigning Root Questions
  • The instructor develops an assignment consisting
    of several root questions.
  • 4-6 seems to be the right number --- we'll see
    why.
  • Students take the assignment as many times as
    they like and are encouraged to get a perfect
    score.
  • Only the final score counts.

25
Assigning Root Questions --- (2)
  • Each time the student opens the assignment, they
    are given the same questions, but with a
    different choice of one correct and three
    incorrect answers, in random order.
  • To prevent rapid-fire guessing, the student may
    open an assignment only once per x minutes
    (instructor's choice).

26
Student Responses
  • Each root question suggests a conventional
    long-answer question that the student should
    work.
  • Example: for the join question, they may as well
    compute the entire join.
  • With the join tuples listed on scratch paper,
    they can quickly solve any instance of the root
    question.

27
Automatic Student Help
  • When a student submits work, they immediately get
    the choice explanations for those questions they
    get wrong.
  • After the due date, students can see their
    assignments, with not only the choice
    explanations but the question explanations as
    well.

28
How Many Questions?
  • We recommend 4-6 questions per assignment.
  • Fewer than 4 encourages students to guess; too
    many questions runs the risk that a student will
    miss one through carelessness.
  • When first given at Stanford with no limit, some
    students tried hundreds of times.
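
A back-of-the-envelope check on the guessing risk (our arithmetic,
not the slide's): each question shows one correct answer among four
choices, so a pure guesser gets a question right with probability
1/4, and gets a perfect score on a 5-question assignment with
probability (1/4)^5, about 1 in 1000 per attempt --- consistent
with recommending 4-6 questions plus retry throttling.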

29
Comparison
  • There is a simpler scheme, used in courses like
    physics, where questions are parametrized and
    the correct answer is computed by a formula.

A weight of w kilograms is dropped from height
h. How long does it take the weight to reach
the ground?
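(For reference --- standard physics, not stated on the slide ---
the answer is t = sqrt(2h/g) with g ≈ 9.8 m/s², so w does not
actually affect the result and serves only as a distractor.)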
30
Comparison --- (2)
  • Question is generated by choosing random values
    of the parameters, and the answer checked against
    the result of the formula.
  • Root questions simulate this question type by
    selecting many parameter values and asking for a
    correct pairing of parameters and result.

31
Comparison --- (3)
  • Example:

A weight of w kilograms is dropped from height
h. For which of the following triples (w, h, t)
is t the time it takes the weight to reach the
ground?
32
Comparison --- (4)
  • In the database domain, many kinds of questions
    cannot have their answer computed by an
    arithmetic formula:
  • Which of these functional dependencies follows
    from the given FDs?
  • Which of these schedules is serializable?
  • For which relation sizes is query plan A better
    than plan B?
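
For instance (our illustration, not the slide's): given the FDs
A -> B and B -> C, the dependency A -> C follows by transitivity
while C -> A does not, and deciding which candidates follow takes
a closure computation rather than a numeric formula.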

33
Comparison --- (5)
  • If you are willing to write a program to (say)
    test serializability, you can write a program
    that generates a root question with lots of
    serializable and lots of unserializable
    schedules.
  • The output of this program can be input
    automatically to OTC.

34
OTC Status
  • About 400 root questions, mostly on databases,
    developed.
  • Many have explanations included.
  • Let's face it: writing a root question correctly
    is hard.
  • But once done and debugged, it can be used in
    many courses.

35
OTC History --- Spring, Fall, 2002
  • One assignment in Stanford CS347
    (Transaction-Processing and Distributed
    Databases) supported, Spring 2002.
  • CS145 (Intro. DB course at Stanford) supported in
    Fall, 2002.
  • 2 lab assignments, 11 root-question assignments,
    and a midterm (not root questions).

36
OTC --- Winter, 2003
  • Supported CS245 (DB Implementation, Hector
    Garcia-Molina) at Stanford.
  • Supported a CS145/245-like course at North
    Carolina State (Rada Chirkova).

37
OTC Status --- Spring 2003
  • Supported CS145, CS347, and CS345 (DB Theory) at
    Stanford.
  • Continued support at NC State.
  • Supported CS145-like courses at UC Santa Cruz
    (Arthur Keller) and Univ. of Leipzig (Erhard
    Rahm).
  • Supported a Discrete Math course at NTU Athens
    (Foto Afrati).

38
OTC Status --- Fall 2003
  • New sites: Penn State, U. Chicago, Yale, U.
    Alabama, U. Karlsruhe, York College/CUNY, U.
    Business Econ. (Athens).

39
OTC Development Team
  • The core software was developed by Murty Valiveti
    and his team at Gautami Software.
  • Alan Beck and Ramana Yerneni adapted the OTC core
    for database instruction and implemented a number
    of important features.

40
Content Creators
  • Alan Beck: SQL, JDBC, and XQuery labs.
  • Austin Shoemaker: relational algebra lab.
  • Root-question developers: Foto Afrati, Rada
    Chirkova, Mayur Datar, Prasanna Ganesan, Wang
    Lam, Anand Rajaraman, Jeff Ullman, Jennifer
    Widom, Ramana Yerneni.

41
Find Out More
  • A tutorial for instructors is at
  • www-db.stanford.edu/~ullman/pub/otc.pdf
  • Demo site:
  • chub.stanford.edu:8181/CS145-demo/index.html