1
A Proposal for VoCATS Field Testing, Scaling, and
Reporting of Results
  • Lorin W. Anderson
  • The Anderson Research Group
  • Columbia, SC

2
Problem Identification
  • CTE Post Assessment scores vary widely from
    course to course
  • Dramatic swings in CTE Post Assessment scores
    from year to year
  • Concerns about the entire field test process.

Source: Letter of March 3, 2006, from Leland
Blankenship to Ted Summey
3
Four Primary Questions
  • How should the item validation process be
    conducted?
  • How can we collect meaningful data from the field
    tests?
  • How should the field test data be reviewed and
    what decisions should be made based on that
    review?
  • How should test scores be scaled to minimize
    course-to-course and year-to-year swings while
    meeting both the grading needs of teachers and
    federal accountability requirements?

4
Item Validation Procedure
  • Because of increased federal concerns for
    objectivity, the item validation process should
    be led by an external consultant or outside
    vendor. The external consultant will provide
    training to the Item Validation Team before the
    team begins its work.

5
Item Validation Procedure
  • All members of an Item Validation Team must have
    (a) taught the course for at least three years
    and (b) been nominated by a CTE Director or his
    or her designee. Because the Item Validation
    Procedure should yield an independent review of
    the items, Course Design Team members should not
    serve as members of the Item Validation Team.

6
Item Validation Procedure
  • The DPI consultants should provide content
    expertise and clarification as needed and ensure
    that the specified procedures are followed. If a
    DPI consultant does not believe he or she
    possesses content expertise for the course under
    review, he or she can designate a member of the
    Course Design Team to serve as an ex officio
    member of the Item Validation Team. In this
    capacity the Course Design Team member operates
    in a facilitative, not directive, manner under
    the supervision of the DPI consultant.

7
Item Validation Procedure
  • The DPI consultants will be responsible for
    identifying potential anchor items from
    existing non-RBT courses. These anchor items,
    approximately 30 to 40, must be aligned with
    objectives included in the RBT-developed Course
    Blueprints. Both concerns for content and
    cognitive level must be addressed in selecting
    anchor items. The purpose of the anchor items is
    to help us understand the changes in test scores
    that are likely to occur when old courses are
    replaced by new courses. Once identified the
    anchor items become part of the total pool of
    items reviewed by the Item Validation Team. They
    are NOT identified as anchor items.

8
Item Validation Procedure
  • In preparing items for the item validation
    process, all item numbers and correct answers
    should be masked. The items should be arranged
    in a random order, not in the order in which the
    objectives they assess appear in the Course
    Blueprint.

9
Item Validation Procedure
  • Following training, the item validation process
    consists of two phases. In the first phase, each
    Item Validation Team member completes an
    independent review of each item based on his or
    her experience and expertise. In the second
    phase, the Item Validation Team meets as a
    committee-of-the-whole to compare the individual
    reviews, identify areas of disagreement, and work
    toward consensus. During this second phase, the
    Curriculum Guides, textbooks, and other
    curriculum resources should be used to help
    resolve disagreements.

10
Item Validation Response Form
  • Directions: Individually, read the item above and
    indicate your answer to each of the questions in
    the Individual Response column. Once you have
    answered all the questions individually, discuss
    your answers as a group and identify areas of
    disagreement. Based on the discussion and the
    examination of appropriate curriculum materials
    (e.g., Curriculum Guide, textbooks), attempt to
    resolve the disagreements and reach consensus.
    Circle the answer that best represents the Group
    Response or Consensus. When your group reaches a
    consensus, EACH PERSON will initial the bottom
    line to indicate agreement with the group
    decision. If any member disagrees with the
    consensus reported for any item, he or she should
    complete and submit a Minority Report Form for
    that item. If the consensus identifies problems
    with the item, the item should be edited to fix
    each of the problems identified. If an item has
    numerous problems, it should be rejected rather
    than repaired.

11
Item Validation Response Form
12
Item Validation Response Form
13
The Paradox of Field Tests
  • We need to field test items before we use them to
    assign scores and grades to students. We have to
    test the items before we test the students.
  • However, if we don't assign grades to the field
    tests, students are unlikely to be motivated to
    do their best.

14
Recommended Field Test Procedure
  • The first year that an RBT-designed course is
    offered will be a field test year.

15
Recommended Field Test Procedure
  • During this field test year, the course will be
    divided into two approximately equal parts based
    on the Course Blueprint and the weights of the
    competencies and objectives. For a one semester,
    block scheduled course, for example, these two
    parts would correspond with quarters.

16
Recommended Field Test Procedure
  • At the end of each quarter (or its equivalency) a
    test composed of items assessing the objectives
    taught during the quarter will be administered.
    Various forms of the test will be prepared prior
    to this administration (see Structure of Field
    Tests).

17
Recommended Field Test Procedure
  • To increase the validity of the field tests,
    grades will be assigned to students based on
    their performance on the field tests. Because
    these are field tests, however, two requirements
    for assigning grades must be met. First, before
    assigning grades, the score achieved by the
    highest-scoring student on the test will be set
    at 100. Then, all other scores will be re-scaled
    based on the highest achieved score. For
    example, if the highest score is 84, then 84 is
    re-scaled as 100 and 16 points (100 - 84) are
    added to each score lower than 100. Second,
    students' performance on the field tests can
    account for no more than 20 percent of the
    students' grades for that quarter. The re-scaling
    of student scores and the assignment of student
    grades will be done by teachers at the local
    level (see the sketch below).
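A minimal sketch of the local re-scaling step described above, assuming raw field test scores are percentages from 0 to 100 (the function name and sample scores are hypothetical):

    # Hypothetical sketch of the local re-scaling step.
    # Assumes raw field test scores are percentages (0-100).
    def rescale(raw_scores):
        """Set the highest achieved score to 100 and add the same
        number of points to every other score."""
        bonus = 100 - max(raw_scores)        # e.g., 100 - 84 = 16
        return [score + bonus for score in raw_scores]

    # Example: the highest raw score is 84, so every score gains 16 points.
    print(rescale([84, 70, 62]))             # [100, 86, 78]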

18
Structure of Field Tests
  • Five hundred (500) items are written for each
    course designed using the RBT. The number of
    items per objective is 5 times the weight of the
    objective in the Course Blueprint. For example,
    an objective with a weight of 5 will have 25
    items associated with it (see the sketch below).
    In addition, there will be 30 to 40 anchor items.
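As an illustration of the five-items-per-weight rule, a small sketch with hypothetical blueprint weights (objective names and weights are made up; weights summing to 100 yield the 500-item pool described above):

    # Hypothetical blueprint weights; items per objective = 5 x weight.
    weights = {"Obj 1.01": 5, "Obj 1.02": 3, "Obj 2.01": 8}

    items_per_objective = {obj: 5 * w for obj, w in weights.items()}
    print(items_per_objective)   # {'Obj 1.01': 25, 'Obj 1.02': 15, 'Obj 2.01': 40}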

19
Structure of Field Tests
  • All items that are approved by the Item
    Validation Team must be included in the field
    test. No differentiation between secure and
    released (classroom) items will be made at this
    time. Rather, this distinction will be made
    based in large part on the results of the field
    test. One anticipated benefit of this situation
    is that teachers will have to learn to use the
    Curriculum Guides.

20
Structure of Field Tests
  • Let's assume that the item validation process
    results in the elimination of 50 items, leaving a
    total item pool of 450 items. Also, let's assume
    that 25 of
    the 30 to 40 anchor items make it through the
    validation process.

21
Structure of Field Tests
  • The remaining 450 items and the 25 anchor items
    will be organized around the objectives.

22
Structure of Field Tests
  • Six (6) forms of the test will be prepared and
    administered each quarter. These tests would
    assess the objectives taught during the quarter,
    based on the sequence of objectives on the Course
    Blueprint. Each item will be tagged in such a
    way that the specific form of the test on which
    the item appears is known.

23
Structure of Field Tests
  • For existing non-RBT courses, test data from the
    previous two years will serve as the field test
    data.

24
Procedures for Reviewing Field Test Data
  • Examine item difficulties and flag items with
    difficulties less than 40 percent. Note: Items
    with difficulties greater than 90 percent are
    problems only if their item discriminations are
    extremely low (see Step 2).
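A minimal sketch of this first step, assuming a 0/1 scored response matrix (students by items); the data and variable names are illustrative only:

    # Hypothetical item-difficulty check: flag items answered correctly
    # by fewer than 40 percent of students.
    scored = [            # scored[s][i] = 1 if student s got item i right
        [1, 0, 1, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
    ]

    n_students = len(scored)
    for item in range(len(scored[0])):
        difficulty = 100 * sum(row[item] for row in scored) / n_students
        if difficulty < 40:
            print(f"Item {item + 1}: difficulty {difficulty:.0f}% -- flagged")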

25
Procedures for Reviewing Field Test Data
  • Examine corrected item-total correlations (i.e.,
    item discriminations) and flag items with
    correlations less than .200.
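A sketch of this second step under the same assumption of a 0/1 scored matrix; "corrected" means the item itself is removed from each student's total before correlating (requires Python 3.10+ for statistics.correlation):

    # Hypothetical corrected item-total correlation check.
    from statistics import correlation

    scored = [
        [1, 0, 1, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
        [0, 0, 1, 0],
    ]

    for item in range(len(scored[0])):
        item_scores = [row[item] for row in scored]
        rest_totals = [sum(row) - row[item] for row in scored]  # total minus this item
        r = correlation(item_scores, rest_totals)
        if r < 0.200:
            print(f"Item {item + 1}: discrimination {r:.3f} -- flagged")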

26
Procedures for Reviewing Field Test Data
  • Examine student responses made to all flagged
    items. That is, for each flagged item, what
    percent of students chose A, B, C, and D?
    Complete a table similar to Table 1 (attached).
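A sketch of this third step for a single flagged item, assuming the raw letter responses are available; the responses below are made up, and the output mirrors the layout of Table 1:

    # Hypothetical distractor analysis for one flagged item:
    # what percent of students chose A, B, C, and D?
    from collections import Counter

    responses = list("ABBCADBBDBABBB")   # letters chosen by each student
    counts = Counter(responses)

    for option in "ABCD":
        pct = 100 * counts.get(option, 0) / len(responses)
        print(f"{option}: {pct:.0f}%")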

27
Procedures for Reviewing Field Test Data
  • Examine the relationship between the flagged
    items and the course objectives. That is, see
    how the flagged items are distributed across
    the objectives.

28
Procedures for Reviewing Field Test Data
  • Make a decision about each flagged item (that
    is, keep or dump). Modifications of flagged
    items are not appropriate because modifications
    would require another field test of the modified
    items.

29
Table 1, Analysis of Flagged Items, Computerized
Accounting
30
Procedure for Combining Standard Setting and
Scaling
  • Use a contrasting groups approach to establish
    cut-scores for advanced and basic for each
    course. For accountability purposes, proficient
    will be defined as the midpoint between the two
    cut-scores.
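As a hedged illustration using the example cut-scores shown later on slide 37, the proficient midpoint would simply be the average of the two cut-scores:

    # Hypothetical midpoint calculation (Computerized Accounting example).
    advanced_cut, basic_cut = 75, 45
    proficient_cut = (advanced_cut + basic_cut) / 2
    print(proficient_cut)   # 60.0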

31
Mean Scores for Students with Different
Anticipated Grades, Computerized Accounting
32
Percent of Anticipated A and Anticipated B
Students, Computerized Accounting
33
Procedure for Combining Standard Setting and
Scaling
  • Set the advanced cut-score to the lowest number
    associated with a grade of A (e.g., 93). Set
    the basic cut-score to the lowest number
    associated with a grade of D.

34
Procedure for Combining Standard Setting and
Scaling
  • Interpolate to arrive at scores corresponding
    with the lowest number associated with a grade of
    B and the lowest number associated with a grade
    of C.
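Reading slides 33 and 34 as mapping the advanced cut-score to the lowest A (93) and the basic cut-score to the lowest D (70) on the uniform grading scale, the interpolation might look like the sketch below; the numbers come from the Computerized Accounting example on slides 37 and 38, and the function name is hypothetical:

    # Hypothetical linear interpolation between the basic and advanced cuts.
    advanced_cut, basic_cut = 75, 45       # test-score cuts (slide 37)
    grade_advanced, grade_basic = 93, 70   # lowest A and lowest D (slide 38)

    def test_score_for_grade(grade):
        """Interpolate the test score corresponding to a grade point."""
        slope = (advanced_cut - basic_cut) / (grade_advanced - grade_basic)
        return basic_cut + slope * (grade - grade_basic)

    print(round(test_score_for_grade(85), 1))   # lowest B -> about 64.6
    print(round(test_score_for_grade(77), 1))   # lowest C -> about 54.1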

35
Procedure for Combining Standard Setting and
Scaling
  • Derive a formula for calculating grade points
    from test scores.

36
Procedure for Combining Standard Setting and
Scaling
  • Use the formula to produce a grading scale for
    each course based on test scores.
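Putting the last few slides together, one possible grading scale expressed in test-score terms, again using the Computerized Accounting example (the slope of 30/23, about 1.3, matches the multiplier shown on slide 40):

    # Hypothetical grading scale in test-score terms (Computerized Accounting).
    basic_cut, grade_basic = 45, 70
    slope = (75 - 45) / (93 - 70)    # about 1.3 test-score points per grade point

    for label, lowest_grade in [("A", 93), ("B", 85), ("C", 77), ("D", 70)]:
        cut = basic_cut + slope * (lowest_grade - grade_basic)
        print(f"{label}: test scores of {cut:.1f} and above")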

37
Example of Combining Standard Setting with Scaling
  • Course 1: Computerized Accounting
  • Advanced (A students vs. B students): 75
  • Basic (D students vs. F students): 45
  • Course 2: Foods I
  • Advanced (A students vs. B students): 70
  • Basic (D students vs. F students): 55

38
Sample Uniform Grading System
  • A 93 -- 100
  • B 85 -- 92
  • C 77 -- 84
  • D 70 -- 76
  • F Below 70

39
Example of Combining Standard Setting with Scaling
40
Example of Combining Standard Setting with Scaling
  • Accounting: Difference in Test Scores = 1.3 x
    Difference in Grade Points
  • Foods I: Difference in Test Scores = 0.65 x
    Difference in Grade Points
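One possible reading of these multipliers, using the cut-scores on slide 37 and the uniform grading scale on slide 38, is that each is the ratio of the test-score range between the basic and advanced cuts to the grade-point range between the lowest D (70) and the lowest A (93):
  • Computerized Accounting: (75 - 45) / (93 - 70) = 30 / 23, or about 1.3
  • Foods I: (70 - 55) / (93 - 70) = 15 / 23, or about 0.65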