1
Evaluation of Student Learning: Test
Construction and Other Practical Strategies
Faculty Professional Development, Fall 2005
  • Dr. Kristi Roberson-Scott

2
Guiding Principles for Evaluation
  • Evaluation should relate directly to
    instructional objectives
  • Each evaluation activity should be designed to
    promote student growth
  • The actual activity should be useful practice in
    itself
  • Feedback should be useable by the student
  • Multiple evaluation strategies should be provided
    to measure achievement of a given
    objective/competency
  • Students should clearly understand the methods of
    evaluation for a given test or activity

3
Questions to Ask Yourself in Designing a Test
  • What objectives will (should) I be testing?
  • What types of items will be included in the test?
  • How long will the test be in terms of time and
    number of items?
  • How much will each objective be worth in terms of
    weighting and number of items?
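A quick illustration of that weighting decision (the objectives, weights, and counts below are hypothetical, not from the slides): convert each objective's weight into a number of items.

# Hypothetical sketch: turn objective weights into item counts
# for a 50-item test. The objectives and weights are made up.
weights = {"Objective 1": 0.40, "Objective 2": 0.35, "Objective 3": 0.25}
total_items = 50

for objective, weight in weights.items():
    n_items = round(weight * total_items)
    print(f"{objective}: {weight:.0%} of the test -> {n_items} items")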

4
Tests as Diagnostic Tools
  • Students demonstrate learning
  • Instructor effectiveness: modify teaching
    strategies or activities
  • Assignment of letter grades

5
Different Types of Tests of Learning
  • Paper-and-Pencil/WebCT Testing
  • Limited Choice Questions (MC, T/F, Matching)
  • Open-Ended Questions (Short Answer, Essay)
  • Performance Testing
  • Acquisition of skills that can be demonstrated
    through action (e.g., music, nursing, etc.)

6
Planning a Test
  • First step: Outline the learning objectives or
    major concepts to be covered by the test
  • Test should be representative of objectives and
    material covered
  • Major student complaint: tests don't fairly cover
    the material that was supposed to be on the test.

7
Planning a Test
  • Second step: Create a test blueprint
  • Third step: Create questions based on the blueprint
  • Match the question type with the appropriate
    level of learning
  • Fourth step: For each check on the blueprint, jot
    down (3x5 cards work well) 3-4 alternative
    question ideas and item types that get at the
    same objective
  • Fifth step: Organize questions and/or ideas by
    item type

8
Planning a Test
  • Sixth step: Eliminate similar questions
  • Seventh step: Walk away from the draft for a
    couple of days
  • Eighth step: Reread all of the items; try doing
    this from the standpoint of a student

9
Planning a Test
  • Ninth step: Organize the questions logically
  • Tenth step: Time yourself actually taking the
    test, then multiply that by about 4, depending on
    the level of the students (see the sketch below)
  • Eleventh step: Analyze the results (item analyses)
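A minimal sketch of the tenth step's rule of thumb (the specific numbers are hypothetical):

# Timing heuristic from the slide: the instructor's own completion
# time, multiplied by about 4, adjusted for the level of the students.
instructor_minutes = 12    # hypothetical timing of your own run-through
multiplier = 4             # the slide suggests ~4; tune to your students
print(f"Allow about {instructor_minutes * multiplier} minutes for the test")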

10
Translating Course Objectives/Competencies into
Test Items
  • Syllabus
  • Specification table: what was taught, and the
    weighting of the areas to be tested
  • Creating a Test Blueprint (see handout)
  • Blueprint: the test plan, i.e., which questions
    test which concepts
  • Plotting the objectives/competencies against some
    hierarchy representing levels of cognitive
    difficulty or depth of processing
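One way to picture that plot (a hypothetical sketch; the content areas, levels, and counts are invented for illustration): each cell of the blueprint records how many items test a content area at a given cognitive level.

# Hypothetical test blueprint: content areas plotted against
# Bloom's cognitive levels; cell values are planned item counts.
blueprint = {
    "Cell biology": {"Knowledge": 4, "Comprehension": 3, "Application": 2},
    "Genetics":     {"Knowledge": 3, "Comprehension": 3, "Application": 3},
    "Evolution":    {"Knowledge": 2, "Comprehension": 2, "Application": 1},
}

# Sanity-check the plan: total items per content area.
for area, cells in blueprint.items():
    print(area, "->", sum(cells.values()), "items")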

11
Thinking Skills
  • What level of learning corresponds to the course
    content?
  • Bloom's Taxonomy of Educational Objectives
  • Knowledge (see handout)
  • Comprehension
  • Application
  • Analysis
  • Synthesis
  • Evaluation

12
Practical Considerations
  • Representative sample of the course content: not
    random, but purposeful, based on the blueprint
  • Representative sample of skill or cognitive
    levels across content
  • Analyze results by level AND content area

13
Question Arrangement on a Test
  • Group by question type
  • Common instructions will save reading time
  • Limit the number of times students have to change
    frame of reference
  • Patterns on test must be logical
  • Arrange from a content standpoint
  • Keep similar concepts together
  • Group by difficulty (easy to hard)

14
Selecting the Right Type of Evaluation
  • How do you know what type of question to use and
    when?
  • It depends on the skill you are testing.
  • Evaluation should always match as closely as
    possible the actual activity you're teaching.
  • Examples: If teaching speech, evaluate an oral
    speech.
  • If testing the ability to write in Spanish, better
    to give an essay.
  • Testing reading: MC, T/F
  • Wouldn't use MC to test creative writing

15
Question Types versus Cognitive Levels of
Learning
16
Constructing the Test
  • Types of Test Questions
  • Multiple-Choice Items
  • True-False Items
  • Matching Items
  • Fill-In, Completion or Short-Answer Items
  • Essay Questions

17
Multiple Choice Items
  • Advantages
  • Extremely versatile: can measure higher-level
    mental processes (application, analysis,
    synthesis, and evaluation)
  • A compromise between a short-answer/essay item and
    a T/F item
  • A wide range of content can be sampled by one
    test
  • Disadvantages
  • Difficult to construct plausible alternative
    responses

18
Types of Multiple Choice Items
  • Four Basic Types
  • Question Type
  • Incomplete Statement Type
  • Right Answer Type
  • Best Answer Type
  • Which Type is Best?
  • Question Type vs. Incomplete Statement
  • Right Answer vs. Best Answer Type

19
Multiple Choice Items
  • Writing the stem first
  • Be sure the stem asks a clear question
  • Stems phrased as questions are usually easier to
    write
  • Stems should not contain a lot of irrelevant
    info.
  • Appropriate reading level/terms
  • Be sure the stem is grammatically correct
  • Avoid negatively stated stems

20
Multiple Choice Items
  • Writing the correct response
  • Use same terms/reading level
  • Avoid too many qualifiers
  • Assign a random position in the answer sequence
    (see the sketch below)
  • Read the stem and correct response together
  • Generate the distractors/alternative responses
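A minimal sketch of assigning the key a random position (the item below is hypothetical): write the correct response, generate the distractors, then shuffle so the key's letter carries no pattern.

import random

correct = "Paris"                        # hypothetical correct response
distractors = ["Lyon", "Marseille", "Nice"]

options = [correct] + distractors
random.shuffle(options)                  # random position for the key
key_letter = "ABCD"[options.index(correct)]
print(options, "-> key:", key_letter)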

21
Multiple Choice Items
  • Other Tips for Constructing MC Items
  • Items should have 3-4 alternatives.
  • Stem should present a single, clearly formulated
    problem
  • Simple, understandable, exclude extraneous words
    from both stem and alternatives
  • Include in the stem any words that are repeated in
    each response
  • Avoid "all of the above" (students can answer
    based on partial information)
  • Avoid "none of the above"

22
Multiple Choice Items
  • Alternative responses/distractors should be
    plausible and as homogeneous as possible
  • Response alternatives should not overlap, e.g.,
    two synonymous terms (arithmetic average/mean)
  • Avoid double negatives ("None of the following are
    part of the brain except which one?")
  • Emphasize negative wording when it must be used
  • Each item should be independent of other items in
    the test
  • Information in the stem of one item should NOT
    help answer another item.

23
True-False Test Items
  • Best suited for testing 3 kinds of info.
  • Knowledge level learning
  • Understanding of misconceptions
  • When there are two logical responses
  • Advantages
  • Sample a large amount of learning per unit of
    student testing time
  • Disadvantages
  • Tends to be very easy
  • 50-50 chance of guessing (see the calculation
    below)
  • Tends to be low in reliability
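To put the guessing point in perspective, a short calculation (illustrative numbers, not from the slides): the chance of reaching 70% on a 20-item T/F test by pure guessing follows a binomial distribution.

from math import comb

# P(at least 14 of 20 correct when each guess succeeds with p = 0.5)
n, cutoff, p = 20, 14, 0.5
p_pass = sum(comb(n, k) * p**k * (1 - p)**(n - k)
             for k in range(cutoff, n + 1))
print(f"P(>= {cutoff}/{n} by guessing) = {p_pass:.3f}")  # about 0.058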

24
Tips for Constructing True/False Items
  • Avoid double negatives
  • Avoid long or complex sentences
  • Specific determiners (always, never, only, etc.)
    should be used with caution
  • Include only one central idea in each statement
  • Avoid emphasizing the trivial
  • Exact quantitative (two, three, four) language is
    better than qualitative (some, few, many)
  • Avoid a pattern of answers

25
Objective Test Item Analyses
  • Evaluating the Effectiveness of Items
  • Why?
  • Scientific way to improve the quality of tests
    and test items
  • Identify poorly written items which mislead
    students
  • Identify areas (competencies) of difficulty
  • Item analyses provide info. on
  • Item difficulty
  • Item discrimination
  • Effectiveness of alternatives in MC Tests
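The two statistics named above have standard textbook formulas (the score data below are made up for illustration): difficulty is the proportion answering correctly, and discrimination compares the top- and bottom-scoring groups.

# Item difficulty p and discrimination D from upper/lower groups
upper_correct, upper_n = 18, 20   # hypothetical: top 27% of scorers
lower_correct, lower_n = 7, 20    # hypothetical: bottom 27% of scorers

difficulty = (upper_correct + lower_correct) / (upper_n + lower_n)
discrimination = upper_correct / upper_n - lower_correct / lower_n
print(f"p = {difficulty:.2f}, D = {discrimination:.2f}")
# Rough rule of thumb: D >= 0.3 discriminates well;
# D near 0 or negative flags an item to revise.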

26
Short-Answer Items
  • Two Types (Question and Incomplete Statement)
  • Advantages
  • Easy to construct
  • Excellent format for measuring who, what, when,
    and where info.
  • Guessing is minimized
  • Student must know the material- rather than
    simply recognize the answer
  • Disadvantages
  • Grading can be time consuming
  • More than one answer can be correct

27
Short Answer Items
  • Tips for Constructing Short Answer Items
  • Better to supply the term and require a
    definition
  • For numerical answers, indicate the degree of
    precision expected and the units in which they
    are to be expressed.
  • Use direct questions rather than incomplete
    statements
  • Try to phrase items so that there is only one
    possible correct response
  • When incomplete statements are used, do not use
    more than one blank within an item.

28
Essay Questions
  • Types of Essay Questions
  • Extended Response Question
  • Great deal of latitude on how to respond to a
    question.
  • Example: Discuss essay and multiple-choice type
    tests.
  • Restricted Response Question
  • More specific, easier to score, improved
    reliability and validity
  • Example: Compare and contrast the relative
    advantages and disadvantages of essay and
    multiple-choice tests with respect to reliability,
    validity, objectivity, and usability.

29
Essay Items
  • Advantages
  • Measures higher learning levels (synthesis,
    evaluation) and is easier to construct than an
    objective test item
  • Students are less likely to answer an essay
    question by guessing
  • Require superior study methods
  • Offer students an opportunity to demonstrate
    their abilities to
  • Organize knowledge
  • Express opinions
  • Foster creativity

30
Essay Items
  • Disadvantages
  • May limit the sampling of material covered
  • Tends to reduce the validity of the test
  • Subjective, unreliable nature of scoring
  • Halo effect: a good or bad impression from a
    student's previous level of performance
  • Written expression
  • Handwriting legibility
  • Grammatical and spelling errors
  • Time Consuming

31
Essay Questions
  • Give students a clear idea of the scope and
    direction intended for the answer
  • Might help to start the question with the
    description of the required behavior (e.g.,
    compare, analyze)
  • Appropriate language level for students
  • Construct questions that require students to
    demonstrate a command of background info, but do
    not simply repeat that info.
  • If a question calls for an opinion, be sure that
    the emphasis is not on the opinion itself but on
    the way it's presented or argued.
  • Use a larger number of shorter, more specific
    questions rather than one or two longer questions
    so that more information can be assessed.

32
Essay Questions
  • You might
  • Give students a pair of sample answers to a
    question of the type you will give on the test.
  • Sketch out a rubric (grading scheme) for each
    question before reading the papers OR randomly
    select a few to read and make up the grading
    scheme based on those answers
  • Give students a writing rubric
  • Detach identifying information and use code
    numbers instead to avoid letting personality
    factors influence you (a minimal sketch follows
    this list).
  • After grading all the papers on one item, reread
    the first few to make sure you maintained
    consistent standards
  • Make clear to students the extent to which factors
    other than content (e.g., grammar, handwriting)
    will influence the grade.
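A minimal sketch of the code-number idea (hypothetical roster): grade by code and consult the name mapping only after all papers are scored.

import random

students = ["Alice", "Bob", "Carmen", "Deepak"]   # hypothetical roster
codes = random.sample(range(100, 1000), len(students))

roster = dict(zip(codes, students))   # code -> name, sealed until grading ends
for code in roster:
    print(f"Paper #{code}")           # the grader sees only the code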

33
Essay Questions
  • Tips for constructing Essay Questions
  • Provide reasonable time limits for each question
  • thinking and writing time
  • Avoid permitting students a choice of questions
  • You will not necessarily get a representative
    sample of student achievement. Only by requiring
    all students to answer all questions can their
    achievement be compared.
  • A definite task should be put forth to the
    student
  • Critical words: compare, contrast, analyze,
    evaluate, etc.

34
Scoring Essay Items
  • Write an outline of the key points (use the
    outline to design a rubric; see the sketch after
    this list)
  • Determine how many points are to be assigned to
    the question as a whole and to the various parts
    within it.
  • If possible, score the test without knowledge of
    the student's name
  • Face Sheet
  • Score all of the answers to one question before
    proceeding to the next question
  • Consistent standard
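A sketch of turning the key-point outline into a rubric (the question parts and point values are hypothetical): each part gets an explicit point value, and an answer's score is the sum of the parts it addresses.

# Hypothetical rubric derived from an outline of key points
rubric = {
    "defines reliability":            3,
    "defines validity":               3,
    "contrasts essay vs. MC formats": 4,
}

# Score one answer by recording which key points it addressed
addressed = {"defines reliability": True,
             "defines validity": False,
             "contrasts essay vs. MC formats": True}
score = sum(pts for part, pts in rubric.items() if addressed[part])
print(f"Score: {score}/{sum(rubric.values())}")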

35
Scoring Essay Exams
  • If possible, score each set of answers within the
    same time frame
  • Handwriting, spelling, neatness
  • Two separate grades?
  • Mastery of material
  • Other

36
Alternative Methods of Assessment
  • Research/Term Papers
  • Research Reviews
  • Reports
  • Case Studies
  • Portfolios
  • Projects
  • Performances
  • Peer evaluation
  • Mastery
  • Simulations

37
Cheating
  • Preventing Cheating
  • Reduce the pressure (multiple evaluations)
  • Make reasonable demands (length/content of exam)
  • Use alternative seating
  • Use alternative forms
  • Be cautious with extra copies

38
Using Assessment/Evaluation to Improve Student
Learning Outcomes
  • Providing feedback to students
  • Closing the assessment/evaluation loop
  • Maximizing student learning

39
Questions?