EDP 301 Presentations

Transcript and Presenter's Notes

1
EDP 301 Presentations
  • Lawrence W. Sherman, Ph.D.
  • PowerPoint Slides for EDP 301
  • Assessment and Evaluation in Educational
    Settings.
  • Linn and Miller (2004) text

2
Chapter 1: Educational Testing and Assessment
Context, Issues, and Trends
  • Accountability demands
  • State, national and international assessment
    programs
  • National content and performance standards
  • Global competition
  • Fairness of uses and interpretations

3
Chapter 1: Educational Testing and Assessment
Context, Issues, and Trends
  • Accountability demands, including state, national,
    and international assessment programs, national
    content and performance standards, as well as
    global competition, all contribute to increased
    demands for testing and assessment. These factors
    have both stimulated and reflected new trends in
    educational measurement. The increased reliance
    on testing and assessment as educational reform
    tools has also raised issues concerning the
    fairness of their uses and their interpretations.

4
Nature of Assessment: Chapter 2 Issues
  • Maximum Performance
  • Function: Determines what individuals can do when
    performing at their best.
  • Example: Aptitude and achievement tests
  • Typical Performance
  • Function: Determines what individuals will do
    under natural conditions.
  • Examples: Attitude, interest, and personality
    inventories; observational techniques; peer
    appraisal.

5
Form of Assessment
  • Fixed-Choice Test
  • Function: Efficient measurement of knowledge and
    skills; an indirect indicator.
  • Example: Standardized multiple-choice tests
  • Complex-Performance Assessment
  • Function: Measurement of performance in contexts
    and on problems valued in their own right.
  • Examples: Hands-on laboratory experiments,
    projects, essays, oral presentations

6
Tests Used in Classroom Instruction
  • Placement
  • Formative
  • Diagnostic
  • Summative

7
Placement
  • Function
  • Determines prerequisite skills, degree of mastery
    of course goals, and/or best mode of learning
  • Examples
  • Readiness tests
  • Aptitude tests
  • Pretests on course objectives
  • Self-report inventories
  • Observational techniques

8
Formative Assessment
  • Function
  • Determines learning progress
  • Provides feedback to reinforce learning
  • Corrects learning errors
  • Examples
  • Teacher-made tests
  • Custom-made tests from textbook publishers
  • Observational techniques

9
Diagnostic Assessment
  • Function
  • Determines causes (intellectual, physical,
    emotional, environmental) of persistent learning
    difficulties.
  • Examples
  • Published diagnostic tests
  • Teacher-made diagnostic tests
  • Observational techniques

10
Summative Evaluation
  • Function
  • Determines end-of-course achievement for
    assigning grades or for certifying mastery of
    objectives
  • Examples
  • Teacher-made survey tests
  • Performance rating scales
  • Product scales

11
Methods of Interpreting Results
  • Criterion-Referenced
  • Function: Describes student performance according
    to a specified domain of clearly defined learning
    tasks (e.g., adds single-digit whole numbers)
  • Examples
  • Teacher-made tests
  • Custom-made tests from test publishers
  • Observational techniques
  • Norm-Referenced
  • Function: Describes student performance according
    to relative position in some known group (e.g.,
    ranks 10th out of 30; top 10 percent)
  • Examples
  • Standardized aptitude and achievement tests
  • Teacher-made survey tests
  • Interest inventories
  • Adjustment inventories
  • (Both interpretations are illustrated in the
    sketch after this list.)
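
The contrast can be made concrete with a few lines of code. The sketch below is illustrative only; the class scores, the maximum score, and the student's score are invented numbers, not examples from the Linn and Miller text.

```python
# Illustrative sketch: criterion- vs. norm-referenced interpretation
# of the same raw score. All numbers are hypothetical.

scores = [12, 15, 18, 20, 22, 25, 27, 28, 29, 30]  # class raw scores
student_score = 27
max_score = 30

# Criterion-referenced: compare the score to a fixed domain standard.
percent_correct = 100 * student_score / max_score
print(f"Criterion-referenced: {percent_correct:.0f}% of the tasks mastered")

# Norm-referenced: compare the score to the other students.
rank = sorted(scores, reverse=True).index(student_score) + 1
pct_below = 100 * sum(s < student_score for s in scores) / len(scores)
print(f"Norm-referenced: ranks {rank} of {len(scores)}; "
      f"scored above {pct_below:.0f}% of the group")
```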

12
Chapter 3: Instructional Goals and Objectives
Foundations for Assessment
  • What types of learning outcomes do you expect
    from your teaching?
  • Knowledge?
  • Understanding?
  • Applications?
  • Thinking skills?
  • Performance skills?
  • Attitudes?
  • Clearly defining desired learning outcomes is the
    first step in good teaching. It is also
    essential to the assessment of student learning.
    Sound assessment requires relating the assessment
    procedures as directly as possible to intended
    learning outcomes.

13
Chapter 3 Instructional Goals
  • Types of Learning Outcomes to Consider
  • Bloom's Taxonomy of Educational Objectives
  • Cognitive Domain
  • Knowledge and intellectual skills/abilities
  • Affective Domain
  • Attitudes, interests, appreciation
  • Psychomotor Domain
  • Perceptual and motor skills

14
Chapter 3: Instructional Goals
  • Other sources for lists of objectives
  • Professional Association Standards
  • MCREL
  • State Content Standards
  • OHIO

15
Chapter 3: Instructional Goals
  • Some Criteria for Selecting Appropriate
    Objectives
  • Do the objectives include all important outcomes
    of the course?
  • Are the objectives in harmony with the content
    standards of the state or district and with
    general goals of the school?
  • Are the objectives in harmony with sound
    principles of learning?
  • Are the objectives realistic in terms of the
    abilities of the students and the time and
    facilities available?

16
Correlation
  • Correlational research tells whether the values
    of two variables are related. It is quite often
    used in determining reliability and validity.

17
Correlation
  • Keep in mind, though, that just because two
    variables are correlated, even highly correlated,
    one variable does not cause the other.

18
Strength of a Correlation
  • A correlation coefficient is a number ranging
    between -1.00 and 1.00 that represents the
    degree and direction of relation between two
    variables.
  • The number tells you the strength of the
    correlation, and the sign tells you the direction
    of the relation.

19
Positive Correlation
  • The higher the number, the stronger the relation.
  • A plus sign tells us that as the values of one
    variable increase, so do the values of the other.
  • Thus the values of both variables are headed in
    the same direction; they are positively correlated.

20
Negative Correlation
  • By contrast, a minus sign tells us that the
    values of the two variables travel in opposite
    directions; they are negatively correlated.
  • As the values of one variable increase, the
    values of the other tend to decrease. This is
    described as an inverse relationship.
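
The coefficient itself is straightforward to compute. The sketch below derives Pearson's r from scratch so that strength (magnitude) and direction (sign) are both visible; the study-hours and score lists are made-up data for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation: magnitude gives strength, sign gives direction."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

study_hours = [1, 2, 3, 4, 5]
exam_scores = [55, 62, 70, 74, 85]  # rise together -> positive r
tv_hours    = [9, 7, 6, 4, 2]       # fall as scores rise -> negative r

print(pearson_r(study_hours, exam_scores))  # approximately +0.99
print(pearson_r(exam_scores, tv_hours))     # strongly negative

# Even a coefficient near +/-1.0 says nothing about causation.
```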

21
Chapter 4: Validity
  • When constructing or selecting assessments, the
    most important questions are:
  • (1) To what extent will the interpretation of the
    scores be appropriate, meaningful, and useful for
    the intended application of the results? and
  • (2) What are the consequences of the particular
    uses and interpretations that are made of the
    results?
  • (3) Remember: a valid test must be reliable!

22
Chapter 4: Validity Issues
  • Nature of Validity
  • Major Considerations in Assessment Validation
  • Content Considerations
  • Construct Considerations
  • Assessment-Criterion Relationships
  • Consideration of Consequences
  • Factors Influencing Validity

23
Nature of Validity
  • Appropriateness of the interpretation of the
    results
  • It is a matter of degree
  • It is specific to some particular use or
    interpretation
  • It is a unitary concept
  • It involves an overall evaluative judgment

24
Major Considerations in Validation
  • Content
  • How it represents the domain of tasks to be
    measured
  • Construct
  • Interpretation as a meaningful measure of some
    characteristic or quality
  • Assessment-Criterion Relationship
  • Prediction of future performance (criterion)
  • Consequences
  • How well results accomplish intended purposes and
    avoid unintended effects

25
Chapter 5: Reliability
  • Next to validity, reliability is the most
    important characteristic of assessment results.
    Reliability
  • (1) provides the consistency that makes validity
    possible (an unreliable test cannot be valid!),
    and
  • (2) indicates the degree to which various kinds
    of generalizations are justifiable.
  • The practicality of the evaluation procedure is,
    of course, also of concern to the busy classroom
    teacher.

26
Chapter 5: Reliability Issues
  • Nature of Reliability
  • Determining Reliability by Correlation Methods
  • Standard Error of Measurement (both illustrated
    in the sketch after this list)
  • Factors Influencing Reliability Measures
  • Reliability of Assessments Evaluated in Terms of
    a Fixed Performance Standard
  • Usability
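
Two of the ideas above can be sketched in a few lines: estimating reliability by correlating two halves of a test (stepped up with the Spearman-Brown formula) and converting that reliability into a standard error of measurement. The half-test correlation and score standard deviation below are invented values, not data from the text.

```python
import math

def spearman_brown(r_half):
    """Step a half-test correlation up to an estimate of full-test reliability."""
    return 2 * r_half / (1 + r_half)

def standard_error_of_measurement(sd, reliability):
    """Expected spread of observed scores around a student's true score."""
    return sd * math.sqrt(1 - reliability)

r_half = 0.70    # hypothetical odd/even half-test correlation
score_sd = 8.0   # hypothetical standard deviation of total scores

reliability = spearman_brown(r_half)                        # ~0.82
sem = standard_error_of_measurement(score_sd, reliability)  # ~3.4 points

print(f"reliability = {reliability:.2f}, SEM = {sem:.1f}")
# A score of 75 would then carry a band of roughly 75 +/- 3.4 points.
```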

27
Chapter 6: Planning Tests; Timing
  • Preparation (Planning)
  • Administration
  • Grading
  • Post-Test Analysis!

28
Think in Terms of a Time Line/Cycle!
Preparation: How much time do you have to make
the test?
Administration: How much time do you have to
give the test (class period length)?
Grading: How much time do you have to grade the
test?
Post-test analysis: How much time do you have to
analyze the test results?
29
Chapter 6: Planning Tests
  • Objective Tests
  • A. Supply Type
  • Short Answer
  • Completion
  • B. Selection Type
  • True-False or Alternative-Response
  • Matching
  • Multiple Choice
  • Performance Assessment
  • Extended Response
  • Restricted Response

30
Table of Specifications (similar to Tables 6.2-6.4 in Chapter 6)
Columns: Content | National Standards | State Standard |
Specific Objective | Bloom's Taxonomy (Knowledge,
Understanding, Application)
Rows (item types):
1. Short Answer
2. True/False
3. Multiple Choice
4. Matching
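
A blueprint like this can also be kept in a small data structure so the coverage of each content-by-taxonomy cell is tallied automatically. The sketch below is a hypothetical illustration; the content areas and tags are invented, not taken from Chapter 6.

```python
from collections import Counter

# Hypothetical blueprint: each planned item tagged with its item type,
# content area, and Bloom's Taxonomy level.
blueprint = [
    ("short answer",    "leaf structure",   "knowledge"),
    ("true/false",      "leaf structure",   "knowledge"),
    ("multiple choice", "photosynthesis",   "understanding"),
    ("multiple choice", "photosynthesis",   "application"),
    ("matching",        "plant scientists", "knowledge"),
]

# Tally the table-of-specifications cells (content area x Bloom level).
cells = Counter((content, level) for _, content, level in blueprint)
for (content, level), n in sorted(cells.items()):
    print(f"{content:16s} {level:14s} {n} item(s)")
```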
31
Chapter 7: Simple Forms
  • Short-Answer
  • True-false
  • Matching

32
Short-Answer Issues (Chapter 7, page 178)
  1. Is this the most appropriate type of item for the
    intended learning outcomes?
  2. Can the items be answered with a number, symbol,
    word or brief phrase?
  3. Has textbook language been avoided?
  4. Have the items been stated so that only one
    response is correct?
  5. Are the answer blanks equal in length?
  6. Are the answer blanks at the end of the items?
  7. Are items free of clues (such as "a" or "an")?
  8. Has the degree of precision been indicated for
    numerical answers?
  9. Have the units been indicated when numerical
    answers are expressed in units?
  10. Have the items been phrased so as to minimize
    spelling errors?
  11. If revised, are the items still relevant to the
    intended learning outcomes?
  12. Have the items been set aside for a time before
    reviewing?

33
Short-Answer Examples (supply-type items)
  • Short answer, using a direct question:
  • What is the name of the man who invented the
    airplane? _________________________
  • Completion, using an incomplete sentence:
  • The name of the man who invented the airplane is
    ___________________.

34
True-False Items (Chapter 7, p. 185)
  1. Is this the most appropriate type of item to use?
  2. Can each statement be clearly judged true or
    false?
  3. Have specific determiners been avoided? (e.g.,
    "usually," "always," etc.)
  4. Have trivial statements been avoided?
  5. Have negative statements (especially double
    negatives) been avoided?
  6. Have the items been stated in simple, clear
    language?
  7. Are opinion statements attributed to some source?
  8. Are the true and false items approximately equal
    in length?
  9. Is there an approximately equal number of true
    and false items?
  10. Has a detectable pattern of answers been avoided?
    (e.g., T, F, T, F, T, F, etc.)
  11. If revised, are the items still relevant to the
    intended learning outcome?
  12. Have the items been set aside for a time before
    reviewing them?

35
Examples of True-False Questions
  • Keep in mind that your directions are important,
    as they link to the structure of your True/False
    items!
  • Directions: Write a "T" for true or an "F" for
    false in the space directly to the left of each
    of the following 10 statements.
  • ___1. The green coloring material in a plant
    leaf is called chlorophyll.
  • Directions: On your scanner form, bubble in "A"
    for true or "B" for false for each of the
    following 10 statements.
  • 1. Larry Sherman is the First Vice President of
    the Ohio Gourd Society.
  • A. True B. False

36
Matching Items (Chapter 7, p. 190)
  1. Is this the most appropriate type of item to use?
  2. Is the material in the two lists homogeneous?
  3. Is the list of responses longer or shorter than
    the list of premises?
  4. Are the responses brief and on the right-hand
    side?
  5. Have the responses been placed in alphabetical or
    numerical order?
  6. Do the directions indicate the basis for
    matching?
  7. Do the directions indicate that each response may
    be used more than once?
  8. Is all of each matching item on the same page?
  9. If revised, are the items still relevant to the
    intended learning outcomes?
  10. Have the items been set aside for a time before
    reviewing them?

37
Example of Matching Questions
Directions: On the line to the left of each
achievement listed in Column A, write the letter
of the man's name in Column B who is noted for
that achievement. Each name in Column B may be
used only once.

Column A (Achievements)          Column B (Names)
____1. Invented the telephone    A. Alexander Graham Bell
____2. Discovered America        B. Christopher Columbus
____3. First United States       C. John Glenn
       astronaut                 D. Abraham Lincoln
____4. First US President        E. Ferdinand Magellan
                                 F. George Washington
                                 G. Eli Whitney
38
Chapter 8: Multiple-Choice Items (p. 214)
  1. Is this the most appropriate type of item to use?
  2. Does each item stem present a meaningful problem?
  3. Are the item stems free of irrelevant material?
  4. Are the item stems stated in positive terms (if
    possible)?
  5. If used, has negative wording been given special
    emphasis (e.g., capitalized, underlined, etc.)?
  6. Are the alternatives grammatically consistent
    with the item stem?
  7. Are the alternative answers brief and free of
    unnecessary words?
  8. Are the alternatives similar in length and form?
  9. Is there only one correct or clearly best answer?
  10. Are the distracters plausible to low achievers?
  11. Are the items free of verbal clues to the answer?
  12. Are the verbal (or numerical) alternatives in
    alphabetical (or numerical) order?
  13. Have "none of the above" and "all of the above"
    been avoided (or used sparingly and
    appropriately)?
  14. If revised, are the items still relevant to the
    intended learning outcomes?
  15. Have the items been set aside for a time before
    reviewing them?

39
Example of Multiple-Choice Items
  • Directions must be associated with the style of
    the question!
  • Directions: Circle the one most correct answer to
    each of the following 10 questions.
  • 1. Who was the first United States astronaut to
    orbit the earth in space?
  • A. John Glenn
  • B. Scott Carpenter
  • C. Virgil Grissom
  • D. Alan Shepard

40
Chapter 10: Measuring Complex Achievement:
Essay Questions
  • Some important learning outcomes may best be
    measured by the use of open-ended essay questions
    or other types of performance assessments.
    Essay questions provide the freedom of response
    that is needed to adequately assess students'
    ability to formulate problems; organize,
    integrate, and evaluate ideas and information;
    and apply knowledge and skills.

41
Essay Questions Checklist
  1. Is this the most appropriate type of task to use?
  2. Are the questions designed to measure
    higher-level learning outcomes?
  3. Are the questions relevant to the intended
    learning outcomes?
  4. Does each question clearly indicate the response
    expected?
  5. Are the students told the bases on which their
    answers will be evaluated?
  6. Have you conceptualized a rubric upon which the
    response will be scored? (A minimal rubric sketch
    follows this list.)
  7. Are generous time limits provided for responding
    to the questions?
  8. Are students told the time limits and/or point
    values for each question?
  9. Are all students required to respond to the same
    question?
  10. If revised, are the questions still relevant to
    the intended learning outcomes?
  11. Have the questions been set aside for a time
    before reviewing them?
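
For item 6, a scoring rubric can be as simple as a set of weighted criteria rated on a fixed scale. The criteria and weights below are hypothetical, offered only to show the mechanics.

```python
# Hypothetical analytic rubric: each criterion is rated 0-4 and
# weighted by its relative importance.
RUBRIC_WEIGHTS = {
    "formulates the problem clearly": 2,
    "organizes and integrates ideas": 2,
    "supports claims with evidence":  1,
    "applies course concepts":        1,
}
MAX_POINTS = 4 * sum(RUBRIC_WEIGHTS.values())  # 24 with these weights

def score_essay(ratings):
    """Weighted total for one essay; ratings maps criterion -> 0-4."""
    return sum(RUBRIC_WEIGHTS[c] * ratings[c] for c in RUBRIC_WEIGHTS)

ratings = {
    "formulates the problem clearly": 3,
    "organizes and integrates ideas": 4,
    "supports claims with evidence":  2,
    "applies course concepts":        3,
}
print(f"{score_essay(ratings)} / {MAX_POINTS}")  # 19 / 24
```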

42
CHAPTER 11 PERFORMANCE-BASED ASSESSMENTS
  • Essay tests are the most common example of a
    performance-based assessment, but there are many
    others, including artistic productions,
    experiments in science, oral presentations, and
    the use of mathematics to solve real-world
    problems. The emphasis is on doing, not merely
    knowing; on PROCESS as well as PRODUCT.

43
Suggestions for constructing performance tasks
  1. Focus on learning outcomes that require complex
    cognitive skills and student performances.
  2. Select or develop tasks that represent both the
    content and the skills that are central to
    important learning outcomes.
  3. Minimize the dependence of task performance on
    skills that are irrelevant to the intended
    purpose of the assessment task.
  4. Provide the necessary scaffolding for students to
    be able to understand the task and what is
    expected.
  5. Construct the task directions so that the
    student's task is clearly indicated.
  6. Clearly communicate performance expectations in
    terms of the scoring rubrics by which the
    performances will be judged.

44
Chapter 12: Portfolios; Key Steps in Defining,
Implementing, and Using Portfolios
  1. Specify purpose
  2. Provide guidelines for selecting portfolio
    entries
  3. Define student role in selection and
    self-evaluation
  4. Specify evaluation criteria
  5. Use portfolios in instruction and communication

45
Chapter 14: Assembling, Administering, and
Appraising Classroom Tests and Assessments
  • Care in preparing an assessment plan and
    constructing relevant test items and assessment
    tasks should be followed by similar care in
    reviewing and editing the items and tasks,
    preparing clear directions, and administering and
    appraising the results. Classroom assessments
    also can be improved by using simple methods to
    analyze student responses and by building a file
    of effective items and tasks.
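
The "simple methods to analyze student responses" mentioned above commonly include an item difficulty index (proportion answering correctly) and a discrimination index (how well the item separates high from low scorers). The sketch below uses invented response data; splitting the class into upper and lower halves is one simple convention (upper/lower 27% groups are another).

```python
def item_analysis(responses):
    """responses: (total_test_score, 1 if item correct else 0) per student.
    Returns (difficulty, discrimination):
      difficulty     = proportion of students answering the item correctly
      discrimination = upper-half correct rate minus lower-half correct rate
    """
    n = len(responses)
    difficulty = sum(correct for _, correct in responses) / n
    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    half = n // 2
    upper = sum(c for _, c in ranked[:half]) / half
    lower = sum(c for _, c in ranked[-half:]) / half
    return difficulty, upper - lower

# Invented data for one item on a 30-point test.
data = [(28, 1), (26, 1), (24, 1), (20, 0), (18, 1), (15, 0), (12, 0), (10, 0)]
difficulty, discrimination = item_analysis(data)
print(f"difficulty = {difficulty:.2f}, discrimination = {discrimination:.2f}")
# 0.50 and 0.50: the item separates high from low scorers, so it is a
# good candidate for the file of effective items.
```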

46
Chapter 14: Assembling, Administering, and Appraising
  • Assembling the Classroom Test
  • Administering and Scoring Classroom Tests and
    Assessments
  • Appraising Classroom Tests and Assessments
  • Building a File of Effective Items and Tasks

47
Flow Chart of Testing Process
(Slides 48-50: flow chart graphics; no transcript available)
51
Chapter 15: Grading and Reporting
  • Grading and reporting student progress is one of
    the more frustrating aspects of teaching; there
    are so many factors to consider, and so many
    decisions to be made. This chapter will remove
    some of the complexity by describing the various
    types of grading and reporting systems and
    providing guidelines for their effective use.
52
Chapter 15: Grading and Reporting
  • Functions of Grading and Reporting Systems
  • Types of Grading and Reporting Systems
  • Multiple Grading and Reporting Systems
  • Assigning Letter Grades (a minimal sketch follows
    this list)
  • Record-Keeping and Grading Software
  • Conducting Parent-Teacher Conferences
  • Reporting Standardized Test Results to Parents
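
One common way to assign letter grades is a fixed percentage scale. The cutoffs below are illustrative assumptions, not a recommendation from the text; Chapter 15 discusses how such choices should fit the school's reporting system.

```python
# Hypothetical fixed-percentage scale; schools set their own cutoffs.
CUTOFFS = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]

def letter_grade(percent):
    """Map a percentage score to a letter grade."""
    for cutoff, grade in CUTOFFS:
        if percent >= cutoff:
            return grade
    return "F"

for pct in (93, 85, 72, 58):
    print(pct, letter_grade(pct))  # A, B, C, F
```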