1
Item Preknowledge on Test Performance and Item Confidence
Ben-Roy Do and Bradley Brummel
University of Illinois

Department of Psychology, University of
Illinois at Urbana-Champaign
2
Overview of Presentation
  • Test security.
  • Academic testing.
  • Employee selection.
  • Process of cheating.
  • Item memorization.
  • Item preknowledge.
  • Item responses during testing.
  • Laboratory experiment.
  • Results.
  • Conclusion and Discussion.

3
Why Study Test Security?
  • Academic testing
  • 20 test takers reconstructed "a significant
    portion" of the computer-based GRE (Kaplan
    Educational Centers, 1995).
  • Websites offering questions from live exams of
    the computerized GRE (ETS, 2002).
  • Both questions and answers were illegally
    obtained by previous test takers who memorized
    and reconstructed questions to share them with
    other test takers.
  • In 2007, ETS will introduce the Revised GRE,
    with different item types, a switch from an
    adaptive test to a linear test, and fixed
    administration dates.

4
Why Study Test Security?
  • Employee selection
  • Only top applicants get job interviews.
  • Longer test window, no test fees.
  • Less likely to implement new selection tests.
  • Moving towards unproctored Internet testing.
  • SIOP 2005
  • Equivalence across proctored and unproctored
    Internet testing (Do, Shepherd, & Drasgow, 2005).
  • Brain Dump Central (braindumpcentral.com)

5
Why Study Test Security?
6
Why Study Test Security?
  • Will item preknowledge influence test takers'
    performance?
  • If so, is the effect big?
  • Will their performance differ across item
    difficulty levels?
  • Will they be more confident about their item and
    test performance, regardless of item difficulty
    levels?

7
Process of Cheating
  • How rampant is cheating? Estimates range from 3%
    (Karlins, Michaels, & Podlogar, 1988) to 87%
    (McCabe, 1992).
  • The wide range may reflect confusion about what
    counts as cheating (Chapman et al., 2004).
  • Even though 57% of students would look at another
    student's exam during a test (Nonis & Swift,
    1998), only 12% would change their answers.
  • Thus, cheating should be studied not only through
    attitudes and opinions toward cheating, but also
    as a process.

8
Process of Cheating
  • Item memorization
  • Previous test takers (sources) memorize the
    presented items and provide a list to other test
    takers (beneficiaries).
  • Item preknowledge
  • Beneficiaries utilize their item preknowledge
    and take the test during the same test window,
    where items are selected from the same item pool.
  • Item responses during testing
  • Beneficiaries may respond differently during the
    test itself (e.g., answer copying).

9
Issues with Previous Research
  • Item exposure control (Chang & Ansley, 2003)
  • Constrains item selection so that items yielding
    the highest information for a given ability are
    not selected too frequently and compromised too
    soon.
  • Becomes meaningless once an item has already been
    exposed.
  • Increases measurement error for examinees at the
    two ends of the ability distribution.
  • Simulation based on a dichotomous item-preknowledge
    state (McLeod, Lewis, & Thissen, 2003)
  • May be an oversimplification, because sources may
    describe items imperfectly and consequently fail
    to help beneficiaries.
  • This study uses empirical evidence instead.

10
Laboratory Experiment
  • Sample.
  • With item preknowledge: n = 229.
  • Regular (without item preknowledge): n = 157.
  • Measures.
  • 26-item critical reasoning test.
  • Procedure.
  • Participants in the item preknowledge group were
    given a cheat sheet assembled from information
    recalled by previous test takers.
  • Each cheat sheet contained only 20 items: 8 easy,
    8 hard, and 4 moderately difficult.

11
Laboratory Experiment
  • Cheat sheets.
  • Two forms were assembled to counterbalance item
    preknowledge entries against one another. For
    example, if Entry 1 for the first item provided
    correct information on Cheat Sheet 1, Cheat
    Sheet 2 provided incorrect information for that
    item.
  • Within each difficulty level (easy, moderate,
    hard) on each cheat sheet, half of the entries
    provided correct answers and the remaining half
    provided incorrect answers.
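As an illustration, the counterbalancing scheme above could be sketched as follows. This is a minimal sketch, not the study's actual procedure: the item IDs, the grouping of IDs into difficulty levels, and the random seed are all hypothetical placeholders.

```python
import random

# Hypothetical item IDs grouped by difficulty, matching the design
# described above: 8 easy, 4 moderate, 8 hard (20 items total).
ITEMS = {"easy": list(range(1, 9)),
         "moderate": list(range(9, 13)),
         "hard": list(range(13, 21))}

def assemble_cheat_sheets(items, seed=0):
    """Build two counterbalanced forms: within each difficulty level,
    half of the entries give correct answers on Form 1 (and incorrect
    on Form 2), and vice versa for the other half."""
    rng = random.Random(seed)
    form1, form2 = {}, {}
    for level, ids in items.items():
        ids = ids[:]              # copy so the original list is untouched
        rng.shuffle(ids)          # randomize which items get correct entries
        half = len(ids) // 2
        for i, item in enumerate(ids):
            correct_on_form1 = i < half
            form1[item] = "correct" if correct_on_form1 else "incorrect"
            form2[item] = "incorrect" if correct_on_form1 else "correct"
    return form1, form2

f1, f2 = assemble_cheat_sheets(ITEMS)
```

Every item then appears on both forms with opposite preknowledge quality, so differences between the two preknowledge conditions cannot be attributed to particular items.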

12
Sample Item
13
Sample Cheat Sheet Entry
  • Q: Why did a company continue to build planes
    after 3 crashes? A: The FAA issued new guidelines
    (incorrect item preknowledge, Cheat Sheet 2).
  • Q: Why did a company continue to build planes
    after 3 crashes? A: The company claimed it was
    pilot error and the planes were safe (correct
    item preknowledge, Cheat Sheet 1).

14
Results Test Performance
  • Participants with item preknowledge had
    significantly higher or lower test scores than
    regular test takers, depending on the
    preknowledge provided.

15
Results Item Confidence
  • Participants with item preknowledge were more
    confident about their test performance,
    regardless of whether correct or incorrect item
    preknowledge was provided.

16
Conclusion
  • Will item preknowledge influence test takers'
    performance?
  • Yes; test scores were significantly different.
  • Is the effect big?
  • Yes; the effect was moderate to large.
  • Will their performance differ across item
    difficulty levels?
  • No; participants relied heavily on the item
    preknowledge provided, rather than utilizing
    their own ability.
  • Will they be more confident about their item and
    test performance, regardless of item difficulty
    levels?
  • Yes; they were significantly more confident about
    their item and test performance.

17
Discussion Item Writing
  • Item type.
  • Write items that are difficult to recall.
  • Increase the number of words in the item (item
    length).
  • Item format: multiple choice.
  • Allow multiple options to be selected for an item.
  • Two best choices (Revised GRE, 2007).
  • Each option consists of multiple entries (e.g.,
    (A) 1, 2, 3; (B) 1, 2, 4; (C) 2, 4, 6; (D) 1, 3,
    5).

18
Discussion Test Types
  • Test type.
  • Speed test with low difficulty items.
  • Power test to measure mastery of skill.
  • Test administration modes.
  • Linear
  • Fully adaptive
  • Partially adaptive (e.g., use testlets in
    multi-stage testing)

19
Discussion Test Content
  • Greater emphasis on higher cognitive skills.
  • More text-based materials, such as reading
    passages, and less on vocabulary (e.g., Revised
    GRE, 2007).
  • Utilize various cognitive domains.
  • Reasoning, spatial knowledge, problem solving,
    etc.
  • Analogies: use a test that includes verbal,
    numerical, pictorial, and spatial analogies
    (Humphreys, 1979).
  • Use several screening instruments.
  • Personality, biodata, situational judgment, etc.

20
Discussion Morning After
  • Crawling the Web.
  • If you can't beat them, join them.
  • Provide rationales for incorrect item
    preknowledge.
  • Utilize statistical and psychometric methods.
  • Observe statistical changes in item parameters
    and item characteristics.
  • Observe p-values over time (Hambleton, 2006).
  • Use response times to detect aberrant responses
    (van der Linden, 2006).
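The p-value monitoring idea above can be sketched in a few lines: compute each item's classical p-value (proportion of correct responses) in a baseline window and a later window, and flag items whose difficulty drops sharply, a possible sign of compromise. This is a minimal illustration, not Hambleton's actual procedure; the item names, response vectors, and the 0.15 threshold are hypothetical.

```python
def p_value(responses):
    """Classical item p-value: proportion of correct (1) responses."""
    return sum(responses) / len(responses)

def flag_compromised(baseline, current, threshold=0.15):
    """Flag item IDs whose p-value rose by more than `threshold`
    between the baseline window and the current window (an item
    suddenly getting 'easier' may have leaked)."""
    flagged = []
    for item, base_resp in baseline.items():
        drift = p_value(current[item]) - p_value(base_resp)
        if drift > threshold:
            flagged.append(item)
    return flagged

# Hypothetical per-item scored responses (1 = correct, 0 = incorrect)
# from two administration windows.
baseline = {"q1": [1, 0, 1, 0], "q2": [0, 0, 1, 0]}
current = {"q1": [1, 0, 1, 0], "q2": [1, 1, 1, 1]}
flagged = flag_compromised(baseline, current)  # "q2" jumps from 0.25 to 1.0
```

In practice one would use much larger windows and a statistical test on the difference rather than a fixed cutoff, but the monitoring logic is the same.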

21
Thank you
Presentation last revised on May 6, 2006. Title
page photo provided by University of
Illinois. Background picture provided by
Microsoft.