PROFESSIONAL EDUCATION VI

1
  • PROFESSIONAL EDUCATION VI
  • ASSESSMENT OF LEARNING (Basic Concepts)
  • Prof. Yonardo Agustin Gabuyo

2
Basic Concepts in Assessment of Learning
  • Assessment
  • refers to the collection of data to describe or
    better understand an issue.
  • measures "where we are in relation to where we
    should be?" Many consider it the same as
    Formative Evaluation.

3
  • is a process by which information is obtained
    relative to some known objective or goal.
  • the teacher's way of gathering information about
    what students have learned; teachers use it to
    make important decisions about students' grades,
    the content of future lessons, and the revision
    of the structure or content of a course.

4
Measurement
  • refers to the process by which the attributes
    or dimensions of some physical object are
    determined.
  • is a process of measuring an individual's
    intelligence, personality, attitudes and values,
    achievement, and anything that can be expressed
    quantitatively.
  • it answers the question, "how much?"
5
Evaluation
  • determines "how well did we do what we set out
    to do?" Evaluation is tied to stated goals and
    objectives. Many equate this to summative
    evaluation.
6
Evaluation
  • refers to the process of determining the
    extent to which instructional objectives are
    attained.
  • refers to the comparison of data to a standard
    for the purpose of judging worth or quality.
7
A test is an instrument designed to measure any
quality, ability, skill or knowledge. Testing is
a method used to measure the level of performance
or achievement of the learner.
8
TESTING refers to the administration, scoring and
interpretation of an instrument (procedure)
designed to elicit information about performance
in a sample of a particular area of behavior.
9
ASSESSMENT vs. EVALUATION
Content, timing, primary purpose:
  Assessment - formative; ongoing; to improve learning
  Evaluation - summative; final; to gauge quality
Orientation, focus:
  Assessment - process-oriented (how learning is going)
  Evaluation - product-oriented (what's been learned)
Findings, uses thereof:
  Assessment - diagnostic; identify areas for improvement
  Evaluation - judgmental; arrive at an overall grade/score
10
MODES OF ASSESSMENT
A. Traditional Assessment
  • preparation of the instrument is time consuming
    and the test is prone to cheating.
  • the objective paper-and-pencil test, which
    usually assesses low-level thinking skills.
  • scoring is objective and administration is easy
    because students can take the test at the same
    time.
11
B. Performance Assessment
  • the learner performs a behavior to be measured
    in a "real-world" context.
12
  • The learner demonstrates the desired behavior
    in a real-life context, and the locus of control
    is with the student.

13
B. Performance Assessment
  • a mode of assessment that requires actual
    demonstration of skills or creation of products
    of learning.
  • scoring tends to be subjective without rubrics.
  • preparation of the instrument is relatively
    easy, and it measures behavior that cannot be
    faked.
14
C. Portfolio Assessment
  • a process of gathering multiple indicators of
    students' progress to support course goals in a
    dynamic, ongoing and collaborative process.
  • development is time consuming and rating tends
    to be subjective without rubrics.
  • measures students' growth and development.
15
TYPES OF ASSESSMENT PROCESSES
A. Placement Assessment
  • Determine the entry behavior of the students.
  • Determine the students' performance at the
    beginning of instruction.
16
  • Determine the position of the students in the
    instructional sequence.
  • Determine the mode of evaluation beneficial for
    each student.
17
B. Diagnostic Assessment is given at the start
  • to determine the students' levels of competence.
  • to identify those who have already achieved
    mastery of the requisite learning.
  • to help classify students into tentative small
    groups for instruction.
18
C. Formative Assessment is given to
  • monitor learning progress of the students.
  • provide feedback to both parents and students.
19
  • it answers the question "Where are we in
    relation to where we should be?"
  • this type of assessment can be done informally
    and need not use traditional instruments such as
    quizzes and tests.

20
  • D. Summative Assessment
  • given at the end of a unit
  • to determine if the objectives were achieved.
  • tends to be formal and uses traditional
    instruments such as tests and quizzes.

21
  • it answers the question "How well did we do what
    we set out to do?"
  • determines the extent of the students'
    achievement and competence.
  • provides a basis for assigning grades.

22
  • provides the data from which reports to parents
    and transcripts can be prepared.

23
  • Principles of Quality Assessment
  • 1. Clarity of the Learning Target
  • 2. Appropriateness of the Assessment Method
  • 3. Validity
  • 4. Reliability
  • 5. Fairness
  • 6. Practicality and Efficiency

24
Principles of Quality Assessment
1. Clarity of the Learning Target
  • Learning Target. Clearly stated; focuses on the
    student learning objective rather than teacher
    activity; a meaningful and important target.
  • Skill Assessed. Clearly presented; can you "see"
    how students would demonstrate the skill in the
    task itself?
25
  • Performance Task - Clarity. Could students tell
    exactly what they are supposed to do and how the
    final product should be done?
  • Rubric - Clarity.  Would students understand how
    they are to be evaluated? Are the criteria
    observable and clearly described?

26
2. Appropriateness of the Assessment Method
  • Does it work with the type of task and learning
    target?
  • Does it allow for several levels of performance?
  • Does it assess skills as stated?
  • The type of test used should match the learning
    objective of the subject matter.
27
Two general categories of test items:
1. Objective items - require students to select the
correct response from several alternatives or to
supply a word or short phrase to answer a question
or complete a statement.
2. Subjective or essay items - permit the student
to organize and present an original answer.
28
  • Objective Test
  • includes true-false, fill-in-the-blank, matching
    type, and multiple-choice questions.
  • the word "objective" refers to the scoring and
    indicates there is only one correct answer.
  • objective tests rely heavily on your skill to
    read quickly and to reason out the answer.

29
  • measure both your ability to remember facts and
    figures and your understanding of course
    materials.
  • prepare yourself for high-level critical
    reasoning and making fine discriminations to
    determine the best answer.

30
  • a) Multiple-Choice Items
  • used to measure knowledge outcomes and various
    types of learning outcomes.
  • they are most widely used for measuring
    knowledge, comprehension, and application
    outcomes.
  • scoring is easy, objective, and reliable.

31
Principles of Quality Assessment
Advantages in Using Multiple-Choice Items
  • Multiple-choice items can provide ...
  • versatility in measuring all levels of cognitive
    ability.
  • highly reliable test scores.
  • scoring efficiency and accuracy.
  • objective measurement of student achievement or
    ability.

32
  • Multiple-choice items can provide
  • a wide sampling of content or objectives.
  • a reduced guessing factor when compared to
    true-false items.
  • different response alternatives which can
    provide diagnostic feedback.

33
  • b. True-False Items
  • typically used to measure the ability to
    identify whether statements of fact are correct.
  • the basic format is simply a declarative
    statement that the student must judge as true or
    false.
  • the item is useful for outcomes where there are
    only two possible alternatives.

34
  • True-False Items ...
  • do not discriminate between students of varying
    ability as well as other item types.
  • can often include more irrelevant clues than do
    other item types.
  • can often lead an instructor to favor testing
    of trivial knowledge.

35
  • c. Matching Type Items
  • consist of a column of key words presented on
    the left side of the page and a column of options
    placed on the right side of the page. Students
    are required to match the options with a given
    key word(s).
  • provide objective measurement of students'
    achievement.
  • provide efficient and accurate test scores.

36
  • Matching Type Items
  • if options cannot be used more than once, the
    items are not mutually exclusive: getting one
    answer incorrect automatically means a second
    question is incorrect.
  • all items should be of the same class, and all
    options should be of the same class (e.g., a
    list of events to be matched with a list of
    dates).
37
  • d. Short Answer Items
  • require the examinee to supply the appropriate
    words, numbers, or symbols to answer a question
    or complete a statement.
  • items should require a single-word answer or a
    brief and definite statement.
  • can efficiently measure lower levels of the
    cognitive domain.

38
  • B) Essays or Subjective Tests
  • may include either short-answer questions or
    long general questions. These exams have no one
    specific answer per student.
  • they are usually scored on an opinion basis,
    although there will be certain facts and
    understanding expected in the answer.

39
  • essay tests are generally easier and less time
    consuming to construct than are most objective
    test items.
  • the main reason students fail essay tests is not
    because they cannot write, but because they fail
    to answer the questions fully and specifically,
    or their answer is not well organized.

40
  • students with good writing skills have an
    advantage over students who have difficulty
    expressing themselves through writing.
  • essays are more subjective in nature due to
    their susceptibility to scoring influences.

41
C) PERFORMANCE TEST
  • also known as alternative or authentic
    assessment.
  • is designed to assess the ability of a student
    to perform correctly in a simulated situation
    (i.e., a situation in which the student will
    ultimately be expected to apply his/her
    learning).
42
  • a performance test will simulate to some degree
    a real-life situation to accomplish the
    assessment.
  • in theory, a performance test could be
    constructed for any skill and real-life
    situation.

43
  • most performance tests have been developed for
    the assessment of vocational, managerial,
    administrative, leadership, communication,
    interpersonal and physical education skills in
    various simulated situations.
44
  • Advantages in Using Performance Test Items
  • Performance test items
  • can appropriately measure learning objectives
    which focus on the ability of the students to
    apply skills or knowledge in real-life
    situations.

45
  • usually provide a degree of test validity not
    possible with standard paper-and-pencil test
    items.
  • are useful for measuring learning objectives in
    the psychomotor domain.
46
SUGGESTIONS FOR WRITING PERFORMANCE TEST ITEMS
1. Prepare items that elicit the type of behavior
you want to measure.
2. Clearly identify and explain the simulated
situation to the student.
3. Make the simulated situation as "life-like" as
possible.
47
  • 4. Provide directions which clearly inform the
    students of the type of response called for.
  • 5. When appropriate, clearly state time and
    activity limitations in the directions.
  • 6. Adequately train the observer(s)/scorer(s) to
    ensure that they are fair in scoring the
    appropriate behaviors.

48
  • D) Oral Questioning
  • the most commonly used of all forms of
    assessment in class.
  • assumes that the learner can hear, of course,
    and shares a common language with the assessor.

49
  • the ability to communicate orally is relevant to
    this type of assessment.
50
  • ? The other major role for the "oral" in
    summative assessment is in language learning,
    where the capacity to carry on a conversation at
    an appropriate level of fluency is relatively
    distinct from the ability to read and write the
    language.           

51
E) Observation
  • refers to measurement procedures in which child
    behaviors in the school or classroom are
    systematically monitored, described, classified,
    and analyzed, with particular attention typically
    given to the antecedent and consequent events
    involved in the performance and maintenance of
    such behaviors.
52
F) Self-Reports
  • students are asked to reflect on, make a
    judgment about, and then report on their own or a
    peer's behavior and performance.
53
  • typical evaluation tools could include sentence
    completion, Likert scales, checklists, or
    holistic scales.
  • responses may be used to evaluate both
    performance and attitude.

54
3. Validity
  • is the degree to which the test measures what it
    is intended to measure.
  • it is the usefulness of the test for a given
    purpose.
  • a valid test is always reliable.

55
Approaches in Validating a Test
Procedure - Meaning
1. Face Validity - Done by examining the physical appearance of the test.
2. Content-Related Validity - Done through a careful and critical examination of the objectives of the test so that it reflects the curricular objectives. Compare the test tasks to the test specifications describing the task domain under consideration. Shows how well the sample of test tasks represents the domain of tasks to be measured.
56
3. Criterion-Related Validity - Established statistically such that the set of scores revealed by the test is correlated with the scores obtained on another external predictor or measure. Compare the test scores with another measure of performance obtained at a later date (for prediction) or with another measure of performance obtained concurrently (for estimating present status). Shows how well test performance predicts future performance or estimates current performance on some valued measure other than the test itself (called the criterion).
57
4. Construct-Related Validity - Established statistically by comparing psychological traits or factors that theoretically influence scores on the test. Establish the meaning of the scores on the test by controlling (or examining) the development of the test, evaluating the relationships of the scores with other relevant measures, and experimentally determining what factors influence test performance. Shows how well test performance can be interpreted as a meaningful measure of some characteristic or quality.
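The correlation at the heart of criterion-related validation can be shown with a short computation. The following is a minimal Python sketch, not part of the presentation; the score lists are hypothetical.

def pearson_r(x, y):
    # Pearson correlation coefficient between two score lists.
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

test_scores = [35, 42, 28, 47, 39]   # scores on the test being validated
criterion = [80, 88, 70, 93, 85]     # external criterion, e.g. later course grades
print(round(pearson_r(test_scores, criterion), 2))  # the validity coefficient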
58
  • Factors Affecting Content Validity of Test Items
  • A. Test itself
  • B. The administration and scoring of a test.
  • C. Personal factors influencing how students
    respond to the test.
  • D. Validity is always specific to a particular
    group.

59
Factors Affecting Content Validity of Test Items
A. Test Itself - ways that can reduce the validity
of test results:
1. Unclear directions
2. Poorly constructed test items
3. Ambiguity
4. Inappropriate level of difficulty
5. Improper arrangement of items
60
6. Inadequate time limits
7. Test that is too short
8. Identifiable pattern of answers
9. Test items inappropriate for the outcomes being
measured
10. Reading vocabulary and sentence structure that
are too difficult

61
  • B. The administration and scoring of a test.
  • assessment procedures must be administered
    uniformly to all students. Otherwise, scores will
    vary due to factors other than differences in
    student knowledge and skills.

62
  • the test should be administered with ease,
    clarity and uniformity so that the scores
    obtained are comparable.
  • uniformity can be obtained by setting time
    limits and giving uniform oral instructions.

63
  • insufficient time to complete the test
  • giving assistance to students during the testing
  • subjectivity in scoring essay tests
64
C. Personal factors influencing how students
respond to the test.
  • students might not be mentally prepared for the
    test.
  • students can subconsciously be exercising what
    is called a response set.
65
D. Validity is always specific to a particular
group.
  • the measurement of test results can be
    influenced by such factors as age, sex, ability
    level, educational background and cultural
    background.
66
  • Validity
  • is the most important quality of a test.
  • does not refer to the test itself.
  • generally addresses the question "Does the test
    measure what it is intended to measure?"

67
  • refers to the appropriateness, meaningfulness,
    and usefulness of the specific inferences that
    can be made from test scores.
  • is the extent to which test scores allow
    decision makers to infer how well students have
    attained program objectives.

68
4. Reliability
  • refers to the consistency of scores obtained by
    the same person when retested using the same
    instrument or one that is parallel to it.
  • refers to the results obtained with an
    evaluation instrument and not to the instrument
    itself.
69
  • an estimate of reliability always refers to a
    particular type of consistency.
  • reliability is necessary but not a sufficient
    condition for validity.
  • reliability is primarily statistical.

70
Methods of Computing Reliability Coefficient
Method - Procedure - Type of Reliability Measure
1. Test-Retest Method - Give a test twice to the same group with any time interval between tests. - Measure of Stability
2. Parallel Method (Equivalent Forms) - Give parallel forms of the test with close time intervals between forms. - Measure of Equivalence
3. Split-Half Method - Give a test once. Score equivalent halves of the test, e.g. odd- and even-numbered items. - Measure of Internal Consistency
4. Kuder-Richardson Method - Give the test once, then correlate the proportion/percentage of students passing and not passing a given item. - Measure of Internal Consistency
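As an illustration of the internal-consistency idea in the table above, the following is a minimal Python sketch of the Kuder-Richardson formula 20 (KR-20); it is not part of the presentation, and the response data are hypothetical (1 = correct, 0 = incorrect).

def kuder_richardson_20(scores):
    # KR-20 = (k / (k - 1)) * (1 - sum(p * q) / variance of total scores)
    k = len(scores[0])                  # number of items
    n = len(scores)                     # number of students
    totals = [sum(student) for student in scores]
    mean = sum(totals) / n
    variance = sum((t - mean) ** 2 for t in totals) / n
    pq = 0.0
    for i in range(k):
        p = sum(student[i] for student in scores) / n   # proportion passing item i
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / variance)

responses = [                           # five students, four items each
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(kuder_richardson_20(responses), 2))   # 0.8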
71
Relationship of Reliability and Validity
  • test reliability is requisite to test validity.
  • if a test is not reliable, then validity is
    moot. In other words, if a test is not reliable
    there is no point in discussing its validity,
    because reliability is required before validity
    can be considered in any meaningful way.
72
  • Reliability
  • is the degree to which test scores are free of
    errors of measurement due to things like student
    fatigue, item sampling, and student guessing.
  • if a test is not reliable, it is also not valid.

73
  • 5. Fairness
  • the assessment procedures do not discriminate
    against a particular group of students (for
    example, students from various racial, ethnic, or
    gender groups, or students with disabilities).

74
  • 6. Practicality and Efficiency
  • Teacher's familiarity with the method
  • Time required
  • Complexity of the administration
  • Ease in scoring - the test should be easy to
    score: directions for scoring are clear, the
    scoring key is simple, and provisions for answer
    sheets are made.

75
  • Cost (economy) - the test should be given in the
    cheapest way, which means that answer sheets must
    be provided so that the test can be given again
    from time to time.

76
Development of Classroom Assessment Tools
Steps in Planning for a Test
  • Identifying test objectives
  • Deciding on the type of objective test to be
    prepared
77
  • Preparing a Table of Specifications (TOS)
  • Constructing the draft test items
  • Try-out and validation
78
Identifying Test Objectives. An objective test,
if it is to be comprehensive, must cover the
various levels of Bloom's taxonomy. Each
objective consists of a statement of what is to
be achieved and, preferably, by what percentage
of the students.
79
Cognitive Domain
  • 1. Knowledge
  • recognizes students' ability to use rote
    memorization and recall certain facts. Test
    questions focus on identification and recall of
    information.
  • Sample verbs for stating specific learning
    outcomes:
  • Cite, define, identify, label, list, match,
    name, recognize, reproduce, select, state.

80
  • At the end of the topic, the students should be
    able to identify major food groups without error.
    (instructional objective)
  • Test Items
  • What are the four major food groups?
  • What are the three measures of central
    tendency?

81
  • 2. Comprehension
  • involves students' ability to read course
    content, interpret important information and put
    others' ideas into their own words. Test
    questions should focus on the use of facts, rules
    and principles.
  • Sample verbs for stating specific learning
    outcomes:
  • Classify, convert, describe, distinguish
    between, give examples, interpret, summarize.

82
  • At the end of the lesson, the students should be
    able to summarize the main events of the story in
    grammatically correct English. (instructional
    objective)
  • Summarize the main events of the story in
    grammatically correct English. (test item)

83
  • 3. Application
  • students take new concepts and apply them to new
    situations. Test questions focus on applying
    facts and principles.
  • Sample verbs for stating specific learning
    outcomes:
  • Apply, arrange, compute, construct,
    demonstrate, discover, extend, operate, predict,
    relate, show, solve, use.

84
  • At the end of the lesson, the students should be
    able to write a short poem in iambic pentameter.
    (instructional objective)
  • Write a short poem in iambic pentameter.

85
  • 4. Analysis
  • students have the ability to take new
    information, break it down into parts and
    differentiate between them. Test questions focus
    on separation of a whole into its component
    parts.
  • Sample verbs for stating specific learning
    outcomes:
  • Analyze, associate, determine, diagram,
    differentiate, discriminate, distinguish,
    estimate, point out, infer, outline, separate.

86
  • At the end of the lesson, the students should be
    able to describe the statistical tools needed in
    testing the difference between two means.
    (instructional objective)
  • What kind of statistical test would you run to
    see if there is a significant difference between
    pre-test and post-test?

87
  • 5. Synthesis
  • students are able to take various pieces of
    information and form a whole, creating a pattern
    where one did not previously exist. Test
    questions focus on combining new ideas to form a
    new whole.

88
  • Sample verbs for stating specific learning
    outcomes:
  • Combine, compile, compose, construct, create,
    design, develop, devise, formulate, integrate,
    modify, revise, rewrite, tell, write.

89
  • At the end of the lesson, the students should be
    able to compare and contrast the two types of
    error. (instructional objective)
  • What is the difference between a type I and a
    type II error?

90
  • 6. Evaluation
  • involves students' ability to look at someone
    else's ideas or principles and judge the worth of
    the work and the value of the conclusion.

91
  • Sample verbs for stating specific learning
    outcomes:
  • Appraise, assess, compare, conclude, contrast,
    criticize, evaluate, judge, justify, support.
  • At the end of the lesson, the students should be
    able to draw a conclusion about the relationship
    between two means.

92
  • Example: What should the researcher conclude
    about the relationship in the population?

93
Preparing a Table of Specifications
A table of specifications
  • is a useful guide in determining the type of
    test items that you need to construct. If
    properly prepared, a table of specifications will
    help you limit the coverage of the test and
    identify the necessary skills or cognitive level
    required to answer each test item correctly.
94
  • Gronlund (1990) lists several examples of how a
    table of specifications should be prepared.

95
Format of a Table of Specifications
Specific Objectives - these refer to the intended
learning outcomes stated as specific instructional
objectives covering a particular test topic.
96
  • Cognitive Level - this pertains to the
    intellectual skill or ability needed to correctly
    answer a test item, using Bloom's taxonomy of
    educational objectives. We sometimes refer to
    this as the cognitive demand of a test item.
    Thus, entries in this column could be knowledge,
    comprehension, application, analysis, synthesis
    and evaluation.
97
  • Type of Test Item - this identifies the type or
    kind of test a test item belongs to. Examples of
    entries in this column could be multiple choice,
    true or false, or even essay.
98
  • Item Number - this simply identifies the
    question number as it appears in the test.
  • Total Number of Points - this summarizes the
    score given to a particular test item.
99
(1) Sample Table of Specifications
Specific objective: Solve easy, moderately difficult and difficult problems applying the principles of percentage composition.
Cognitive level: Analysis
Type of test: Multiple choice
Item number: 1 and 2
Total points: 4 points
100
(2) Sample Table of Specifications
Content - Number of Class Sessions - Number of Items - Item Numbers
1. Subtraction Concepts - 4 - 5 - 1-5
2. Subtraction as the Inverse of Addition - 4 - 5 - 6-10
3. Subtraction without Regrouping - 8 - 10 - 11-20
4. Subtraction with Regrouping - 5 - 6 - 21-26
5. Subtraction Involving Zeros - 8 - 10 - 27-36
6. Mental Computation through Estimation - 4 - 5 - 37-41
7. Problem Solving - 7 - 9 - 42-50
TOTAL - 40 - 50 - 1-50
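Sample table (2) allocates items in proportion to the class sessions spent on each topic (50 items over 40 sessions, i.e. 1.25 items per session). The following minimal Python sketch, not part of the presentation, reproduces that arithmetic.

topics = {                    # topic: number of class sessions
    "Subtraction Concepts": 4,
    "Subtraction as the Inverse of Addition": 4,
    "Subtraction without Regrouping": 8,
    "Subtraction with Regrouping": 5,
    "Subtraction Involving Zeros": 8,
    "Mental Computation through Estimation": 4,
    "Problem Solving": 7,
}
total_items = 50
total_sessions = sum(topics.values())          # 40 sessions

start = 1
for topic, sessions in topics.items():
    n = round(sessions * total_items / total_sessions)
    print(f"{topic}: {n} items, numbers {start}-{start + n - 1}")
    start += n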
101
(3) Sample Table of Specifications
Content Class Session K C Ap An Sy Ev
1.Conversion of Unit 3 1 1 1 1 1 1
2. Speed and Velocity 3 1 1 2 1 1
3.Acceleration 2 1 1 1 1
4. Free Falling Bodies 1 1 1
5. Projectile Motion 1 1 1
6. Force 1 1 1
7. Vector 2   2   1   1
8.Work,Energy, Power 3 1 1 2 1 1
9.Conservation of Energy  2 2 1 1
10.Conversation of Momentum 2   1 2   1
TOTAL 20 4 6 8 8 7 7
102
Points to Remember in Preparing a Table of
Specifications
1) Define and limit the subject matter coverage of
the test depending on the length of the test.
2) Decide on the point distribution per subtopic.
3) Decide on the type of test you will construct
per subtopic.
103
  • 4) Make certain that the type of test is
    appropriate to the degree of difficulty of the
    topic.
  • 5) State the specific instructional objectives
    in terms of the specific types of performance
    students are expected to demonstrate at the end
    of instruction.

104
6) Be careful in identifying the necessary
intellectual skill needed to correctly answer the
test item. Use Bloom's taxonomy as a reference.
105
Suggestions for Constructing Short-Answer Items
1) Word the item so that the required answer is
both brief and specific.
2) Do not take statements directly from textbooks
to use as a basis for short-answer items.
106
  • 3) A direct question is generally more desirable
    than an incomplete statement.
  • 4) If the answer is to be expressed in numerical
    units, indicate the type of answer wanted.

107
5) Blanks for answers should be equal in length and
placed in a column to the right of the question.
6) When completion items are used, do not include
too many blanks.
108
  • Examples
  • 1) Poor: An animal that eats the flesh of other
    animals is (carnivorous).
  • Better: An animal that eats the flesh of other
    animals is classified as (carnivorous).
  • 2) Poor: Chlorine is a (halogen).
  • Better: Chlorine belongs to a group of elements
    that combine with metals to form salts. It is
    therefore called a (halogen).

109
Development of Classroom Assessment Tools
Suggestions for Constructing Short-Answer Items
3) Poor: John Glenn made his first orbital flight
around the earth in (1962).
Better: In what year did John Glenn make his first
orbital flight around the earth? (1962)
110
Selecting the Test Format
Selective Test - a test where there are choices for
the answer, like multiple choice, true or false and
matching type.
Supply Test - a test where there are no choices for
the answer, like short answer, completion and
extended-response essay.
111
Construction and Tryouts
  • Item Writing
  • Content Validation
  • Item Tryout
  • Item Analysis
112
  • Item Analysis
  • refers to the process of examining the students'
    responses to each item in the test.

113
  • There are two kinds of item characteristics:
    desirable and undesirable. An item that has
    desirable characteristics can be retained for
    subsequent use, while one with undesirable
    characteristics is either revised or rejected.

114
Use of Item Analysis
  • Item analysis data provide a basis for efficient
    class discussion of the test results.
  • Item analysis data provide a basis for remedial
    work.
  • Item analysis data provide a basis for general
    improvement of classroom instruction.

115
  • Use of Item Analysis
  • Item analysis data provide a basis for increased
    skill in test construction.
  • Item analysis procedures provide a basis for
    constructing a test bank.

116
Three criteria in determining the desirability or
undesirability of an item:
a) difficulty of an item
b) discriminating power of an item
c) measures of attractiveness
Difficulty Index
  • refers to the proportion of students in the
    upper and lower groups who answered an item
    correctly.
117
Development of Classroom Assessment Tools
Level of Difficulty of an Item
Index Range Difficulty Level
0.00-0.20 Very Difficult
0.21-0.40 Difficult
0.41-0.60 Moderately Difficult
0.61-0.80 Easy
0.81-1.00 Very Easy
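As an illustration, the following is a minimal Python sketch, not part of the presentation, that computes a difficulty index from hypothetical upper- and lower-group counts and labels it with the levels in the table above.

def difficulty_index(upper_correct, lower_correct, upper_n, lower_n):
    # proportion of upper- and lower-group students who answered the item correctly
    return (upper_correct + lower_correct) / (upper_n + lower_n)

def difficulty_level(index):
    if index <= 0.20: return "Very Difficult"
    if index <= 0.40: return "Difficult"
    if index <= 0.60: return "Moderately Difficult"
    if index <= 0.80: return "Easy"
    return "Very Easy"

# Hypothetical item: 18 of 25 upper-group and 9 of 25 lower-group students got it right.
p = difficulty_index(18, 9, 25, 25)        # 0.54
print(p, difficulty_level(p))              # Moderately Difficult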
118
Development of Classroom Assessment Tools
Discrimination Index
  • refers to the proportion of the students in the
    upper group who got an item right minus the
    proportion of the students in the lower group who
    got the item right.
119
Development of Classroom Assessment Tools
Level of Discrimination
Index Range Discrimination Level
Below 0.10 Questionable Item
0.11-0.20 Not discriminating
0.21-0.30 Moderately discriminating
0.31-0.40 Discriminating
0.41-1.00 Very Discriminating
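The same hypothetical item can illustrate the discrimination index. This minimal Python sketch is not part of the presentation and uses the levels in the table above.

def discrimination_index(upper_correct, lower_correct, upper_n, lower_n):
    # proportion correct in the upper group minus proportion correct in the lower group
    return upper_correct / upper_n - lower_correct / lower_n

def discrimination_level(index):
    if index <= 0.10: return "Questionable Item"
    if index <= 0.20: return "Not discriminating"
    if index <= 0.30: return "Moderately discriminating"
    if index <= 0.40: return "Discriminating"
    return "Very Discriminating"

d = discrimination_index(18, 9, 25, 25)    # 0.72 - 0.36 = 0.36 (a positive index)
print(d, discrimination_level(d))          # Discriminating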
120
Types of Discrimination Index
  • Positive Discrimination Index
  • more students from the upper group got the item
    right than from the lower group.
  • Negative Discrimination Index
  • more students from the lower group got the item
    right than from the upper group.

121
  • Zero Discrimination Index
  • the number of students from the upper group and
    the lower group who got the item right is equal.

122
MEASURES OF ATTRACTIVENESS
To measure the attractiveness of the incorrect
options (distractors) in a multiple-choice test,
count the number of students who selected each
incorrect option in both the upper and lower
groups. The incorrect options should attract fewer
students from the upper group than from the lower
group.
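The following minimal Python sketch, not part of the presentation, tallies hypothetical responses to one multiple-choice item (key = B) per group so each distractor's attractiveness can be compared.

from collections import Counter

upper_responses = ["B", "B", "C", "B", "B", "A", "B", "B", "D", "B"]
lower_responses = ["A", "C", "B", "D", "A", "C", "B", "C", "D", "A"]

upper_counts = Counter(upper_responses)
lower_counts = Counter(lower_responses)

for option in "ACD":          # the distractors; "B" is the correct answer
    print(option, "upper:", upper_counts[option], "lower:", lower_counts[option])
# A distractor chosen more often by the upper group than by the lower group
# needs to be reviewed or revised.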
123
Rubrics
  • a systematic guideline for evaluating students'
    performance through the use of a detailed
    description of performance standards.
  • used to get consistent scores across all
    students.
124
  • it provides students with feedback regarding
    their weaknesses and strengths, thus enabling
    them to develop their skills.
  • allows students to be more aware of the
    expectations for performance and consequently
    improve their performance.

125
Holistic Rubric vs. Analytic Rubric
  • A Holistic Rubric is more global and does little
    to separate the tasks in any given product, but
    rather views the final product as a set of all
    interrelated tasks contributing to the whole.

126
  • Provides a single score based on an overall
    impression of a student's performance on a task.
  • It may be difficult to provide one overall
    score.
  • Advantage: quick scoring; provides an overview
    of students' achievement.

127
  • Disadvantage: does not provide detailed
    information about student performance in specific
    areas of content and skills.

128
  • Use a holistic rubric when
  • You want a quick snapshot of achievement.
  • A single dimension is adequate to define
    quality.

129
Example of Holistic Rubrics
130
  • Analytic Rubric
  • breaks down the objective or final product into
    component parts; each part is scored
    independently.
  • provides specific feedback along several
    dimensions.
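To contrast the two rubric types, the following is a minimal Python sketch, not part of the presentation; the criteria, 1-4 scale and scores are hypothetical.

holistic_score = 3                  # one overall rating on a 1-4 scale

analytic_scores = {                 # each component rated independently
    "Content": 4,
    "Organization": 3,
    "Grammar and Mechanics": 2,
    "Creativity": 3,
}
total = sum(analytic_scores.values())
print("Analytic total:", total, "of", 4 * len(analytic_scores))
for criterion, score in analytic_scores.items():
    print(f"  {criterion}: {score}/4")   # feedback along several dimensions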

131
  • Analytic Rubric
  • Advantage: more detailed feedback; scoring is
    more consistent across students and graders.
  • Disadvantage: time consuming to score.

132
  • Use an analytic rubric when
  • you want to see relative strengths and
    weaknesses.
  • you want detailed feedback.
  • you want to assess complicated skills or
    performance.
  • you want students to self-assess their
    understanding or performance.

133
Example of Analytic Writing Rubric
134
Example of Analytic Writing Rubric
135
Utilization of Assessment Data
  • Norm-Referenced Interpretation
  • the result is interpreted by comparing a student
    with other students, where some will surely pass
    and some will not.
  • designed to measure the performance of the
    students compared to other students. An
    individual score is compared to others'.
  • usually expressed in terms of percentile, grade
    equivalent or stanine.
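As a small illustration of norm-referenced reporting, the following minimal Python sketch, not part of the presentation, converts a raw score to a percentile rank within a hypothetical class.

def percentile_rank(score, group_scores):
    # percentage of the group scoring below the given raw score
    below = sum(1 for s in group_scores if s < score)
    return 100 * below / len(group_scores)

class_scores = [22, 25, 28, 30, 31, 33, 35, 38, 40, 45]
print(percentile_rank(35, class_scores))   # 60.0: the student outscored 60% of the class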

136
  • Norm-referenced grading is a system typically
    used to evaluate students based on the
    performance of those around them. IQ tests and
    SAT exams would be two examples of this system,
    as well as grading on the curve.
  • Norm-referenced grading is more common in
    schools that emphasize class rank rather than
    understanding of skills or facts.

137
Utilization of Assessment Data
Criterion-Referenced Interpretation
  • the result is interpreted by comparing a student
    against a predefined standard, where all or none
    may pass.
  • designed to measure the performance of students
    compared to a predetermined criterion or
    standard, usually expressed in terms of
    percentage.
138
  • Criterion-referenced evaluation should be used
    to evaluate student performance in classrooms.
  • it is referenced to criteria based on learning
    outcomes described in the provincial curriculum.
  • the criteria reflect a student's performance
    based on specific learning activities.

139
  • a student's performance is compared to
    established criteria rather than to the
    performance of other students.
  • evaluation referenced to a prescribed curriculum
    requires that criteria are established based on
    the learning outcomes listed in the curriculum.