1
A real-time assessment of students' mental models
of sound propagation
  • Zdeslav Hrepic

Dissertation Defense
Kansas State University Physics Education
Research Group
Supported by NSF ROLE Grant REC-0087788
2
Outline
  • Rationale: Why use in-class, real-time
    assessment?
  • Previous research
  • Mental models of sound propagation
  • Hybrid mental models and their role
  • Test construction and validation
  • Results
  • Using the test
  • Further study

3
Real-time, in-class assessment
Uses some form of Class Response System.
Enables quick collection and immediate analysis
of students' responses in the classroom.
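A minimal sketch of the tallying step (hypothetical code, not the class response system software used in this study) can make the idea concrete:

# Minimal sketch of real-time tallying (hypothetical, not the actual
# class response system software).
from collections import Counter

def live_distribution(responses):
    """Return the fraction of the class that chose each answer option."""
    counts = Counter(responses)    # e.g. Counter({'B': 4, 'A': 2, ...})
    total = sum(counts.values())
    return {choice: n / total for choice, n in sorted(counts.items())}

# Hypothetical responses streamed in during a lecture
responses = ['A', 'B', 'B', 'C', 'B', 'A', 'D', 'B']
print(live_distribution(responses))
# {'A': 0.25, 'B': 0.5, 'C': 0.125, 'D': 0.125}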
4
Benefits of class assessment
  • Engages students.
  • Facilitates interactive learning and peer
    instruction (especially in large-enrollment
    classes).
  • Gives immediate feedback to the teacher.
  • Enables the teacher to adjust the teaching before
    the exam rather than after it, according to the
    specific needs of his/her students.
  • Allows a detailed post-lecture analysis.

5
Goal of the study
  • To create a multiple-choice test
  • that can elicit students' mental models of sound
    propagation
  • during the lecture
  • using a class response system and appropriate
    software.

6
Mental model definition
  • A mental model is
  • an internal (mental) representation
  • analogous to the physical world situations or
    processes that it represents
  • and that serves to explain and predict the
    physical world behavior (Greca & Moreira, 2002)
  • A mental model has
  • spatial configuration of identifiable kinds of
    things
  • (a few) principles of how the system works and
  • (certain) predictive power (diSessa, 2002)
  • Mental model state
  • Is defined by the student's consistency (Pure or
    Mixed)

7
Research questions
  • Main question
  • What is the optimal multiple-choice test that can
    elicit students' mental models of sound
    propagation in real time, during
    instruction?
  • (Some of the) sub-questions
  • Is model analysis the optimal analytical tool for
    analyzing students' responses in this test?
  • How do we represent data so the display provides
    a variety of instruction guiding information?
  • How reliable is the test?
  • How valid is the test?

8
Starting point in test creation: Identifying
mental models of sound propagation
  • Hrepic, Z., Zollman, D., & Rebello, S. (2002).
    Identifying students' models of sound
    propagation. Paper presented at the 2002 Physics
    Education Research Conference, Boise, ID.

9
4 basic models - mechanisms of propagation
10
4 basic models - mechanisms of propagation
[Diagram of the models: Wave Model (scientifically accepted model), Ear-Born Sound, Propagating Air, Dependent Entity, Independent Entity (dominant alternative model), and Hybrid Models]
11
Implications of hybrid mental models
  • Implications for analysis of our test
  • Hybrid models cause overlaps in multiple-choice
    questionnaires: more than one model corresponds
    to the same answer choice (illustrated below)
  • Model analysis requires a one-to-one match of
    model and answer choice
  • Implications for teaching
  • A student can give a variety of correct answers
    to standard questions while using a hybrid (wrong)
    model
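A minimal sketch of the overlap problem, with a purely hypothetical answer key (not the actual test items): when a hybrid model shares an answer choice with another model, the choice alone cannot identify the model, which is why model analysis needs the one-to-one match noted above.

# Hypothetical answer key illustrating the overlap caused by hybrid models;
# the choices and model names do not reproduce the actual test.
answer_to_models = {
    'A': {'Wave'},                                    # one-to-one: usable for model analysis
    'B': {'Independent Entity'},
    'C': {'Propagating Air', 'Hybrid (Wave/Entity)'}, # overlap: two models share choice C
}

def models_for(choice):
    """Models that are consistent with a given answer choice."""
    return answer_to_models.get(choice, set())

print(models_for('A'))  # {'Wave'} -> unambiguous
print(models_for('C'))  # two models -> the choice alone cannot identify the model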

12
Constructing the test
  • Four steps of test construction and validation
  • Pilot testing
  • Pre-survey testing
  • Survey testing
  • Post-survey testing

13
Pilot testing
  • Did we miss anything in terms of mental models?
  • Open-ended questionnaire on a large sample
  • Did we miss anything in terms of productive
    questions to determine students' mental models?
  • Battery of semi-structured conceptual questions
    related to sound as a wave phenomenon in a variety
    of situations

14
Test Contexts - 1. Air
How does sound propagate in this situation?
15
Test Contexts - 2. Wall
How does sound propagate in this situation?
16
Test Contexts - 1a, 2a - Vacuum
What happens without the medium (air or wall)?
17
Pre-survey testing
  • A 5-option multiple-choice test is needed
  • Does our choice selection match students'
    needs?
  • Trial with
  • "None of the above"
  • "More than one of the above"
  • Validation through expert reviews
  • Probing and refining the test through student
    interviews

18
Survey testing
  • Surveying - to determine
  • Stability of results
  • across different institutions at equivalent
    educational levels
  • across different course levels at the same
    institutions
  • Instructional sensitivity of the test
  • Correlations between response items
  • Model distributions at different levels - for
    future use
  • Interviewing
  • To validate the new test version
  • To inform and make sense of survey findings

19
Test questions - paraphrased
  • What is the mechanism of sound propagation in the
    air/wall?
  • How do particles of the medium vibrate, if at
    all, while the sound propagates?
  • How do particles of the medium travel, if at all,
    while the sound propagates?
  • What does this motion have to do with sound
    propagation (cause-and-effect relationship)?
  • What does this motion have to do with sound
    propagation (time relationship)?
  • What happens with sound propagation in a vacuum?

20
Displaying the test results
  • Several representations of students' state of
    understanding
  • Available in real time and in post-instruction
    analysis
  • Consistency (see the classification sketch below)
  • Consistent: a student uses one model (Pure model
    state)
  • Inconsistent: a student uses more than one
    model (Mixed model state)
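A minimal sketch of the consistency classification, assuming each of a student's answers has already been coded with the model it expresses (the model labels here are illustrative):

# Minimal sketch: classify a student's model state from model-coded answers.
def model_state(coded_answers):
    """Return 'Pure <model>' if one model is used throughout, else 'Mixed'."""
    models_used = set(coded_answers)
    if len(models_used) == 1:
        return 'Pure ' + models_used.pop()
    return 'Mixed'

print(model_state(['Wave', 'Wave', 'Wave']))                # Pure Wave
print(model_state(['Wave', 'Independent Entity', 'Wave']))  # Mixed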

21
Using a particular model - Pre-Instruction, Calculus-based University (NY)
[Chart: models used consistently vs. inconsistently; N = 100]
22
Using a particular model at least once - Pre-Instruction, Calculus-based University (NY)
[Chart: models used consistently vs. inconsistently; N = 100]
23
Movements of particles of the medium - Pre-Instruction, Calculus-based University (NY)
[Chart: Random Travel, Travel Away From The Source, Vibration on the Spot; N = 100]
24
Model states - Pre-Instruction, Calculus-based University (NY)
[Chart: Pure Wave, Mixed Ear-Wave, Mixed Entity, Pure Other, Mixed Any; N = 100]
25
Correctness - Pre-Instruction, Calculus-based University (NY)
[Chart; N = 100]
26
Survey participants
27
Survey Results
  • Results stable? Differences meaningful?
  • Comparing consistency and correctness
  • Different levels, pre- and post-instruction

28
Comparing correctness and consistency - Different
levels, pre- and post-instruction
29
Comparing correctness and consistency - Different
levels, pre- and post-instruction
30
Comparing model distribution - Different
educational levels
31
Comparing model distribution (grouped models) -
Different educational levels
32
Comparing model distribution (grouped models) -
Different educational levels
33
Comparing model distribution (grouped models) -
Different educational levels
34
Comparing model distribution - Different course
levels
35
Comparing differences in model distribution -
Variability within different educational levels
36
Pre-Post instruction difference
Gain (G) = (post-test score) - (pre-test score)
Normalized gain (h) = gain / (maximum possible gain)
(Hake, 1997)
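A worked example of the two formulas, with illustrative percent-correct scores (not survey data):

# Worked example of the gain formulas (illustrative numbers, not survey data).
pre_score, post_score, max_score = 35.0, 61.0, 100.0   # percent correct

gain = post_score - pre_score                      # G = post-test - pre-test
normalized_gain = gain / (max_score - pre_score)   # h = gain / (maximum possible gain)

print(gain)                       # 26.0
print(round(normalized_gain, 2))  # 0.4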
37
Using a particular model - Pre-Instruction, Calculus-based University (NY)
[Chart: models used consistently vs. inconsistently; N = 100]
38
Using a particular model - Post-Instruction, Calculus-based University (NY)
[Chart: models used consistently vs. inconsistently; N = 95]
39
Movements of particles of the medium - Pre-Instruction, Calculus-based University (NY)
[Chart: Random Travel, Travel Away From The Source, Vibration on the Spot; N = 100]
40
Movements of particles of the medium - Post-Instruction, Calculus-based University (NY)
[Chart: Random Travel, Travel Away From The Source, Vibration on the Spot; N = 95]
41
Correctness - Pre-Instruction, Calculus-based University (NY)
[Chart; N = 100]
42
Correctness - Post-Instruction, Calculus-based University (NY)
[Chart; N = 95]
43
Correlation analysis of answer choices
44
Validity interviews
  • 17 x 4 probes in the interviewed sample.
  • An invalid display of a model would have
    occurred in 6 instances
  • 8.8% of the probes
  • 3 instances because of 5a
  • (another 3 that did not cause an invalid probe)

45
Post-Survey Testing
  • Expert review
  • To validate the post-survey version
  • A few minor items improved
  • Surveying
  • To determine correlations between response items
    and to see whether the changes had the desired effect.
  • Problems fixed
  • Role playing validation
  • To validate the new test version in an additional way
  • Perfect score

46
Test Reliability
  • Reliability pertains to the degree to which a
    test consistently measures what it is supposed to
    measure. (Oosterhof, 2001)
  • Content sampling error
  • Occasion sampling error
  • Examiner Error
  • Scorer Error

47
Reliability addressed: Content sampling error
  • Occurs because students may be more or less lucky
    with how test items correspond to things they
    know.
  • To reduce: test more content
  • To measure: need a parallel form
  • Issues: no parallel form, context dependence
  • Reduced by probing a single model multiple times
  • Addressed by showing meaningful correlations
    between the answer choices (see the sketch below)
  • Not negative if related to the same model (positive
    and frequently significant)
  • Not significantly positive if related to different
    models (except Dependent/Independent Entity models,
    which form a continuum)
  • Pertains only to secondary and tertiary levels,
    but not to primary
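A minimal sketch of the correlation check referenced above, using hypothetical indicator data (1 = the student selected the choice, 0 = did not); choices keyed to the same model should correlate positively, while choices keyed to different models should not:

# Minimal sketch of the answer-choice correlation check
# (hypothetical data, not the survey results).
import numpy as np

q2_wave_choice   = np.array([1, 1, 0, 1, 0, 1, 1, 0])
q3_wave_choice   = np.array([1, 1, 0, 1, 0, 0, 1, 0])  # keyed to the same model
q3_entity_choice = np.array([0, 0, 1, 0, 1, 1, 0, 0])  # keyed to a different model

# Pearson correlations between the indicator variables
print(np.corrcoef(q2_wave_choice, q3_wave_choice)[0, 1])    # positive (same model)
print(np.corrcoef(q2_wave_choice, q3_entity_choice)[0, 1])  # negative / near zero (different models)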

48
Occasion sampling error
  • Occurs because students can be more or less lucky
    with respect to time when the test was
    administered.
  • To reduce: test more often
  • To measure: need multiple administrations of the
    same test
  • Issues: problematic for instructors and students;
    economy
  • Did not probe time stability
  • Addressed by showing
  • Stable results across institutions at the same
    level
  • Meaningful differences between educational levels
  • Meaningful differences between course levels
  • Meaningful differences between pre- and
    post-instruction

49
Examiner error / Scorer error
  • Examiner error occurs because of the differences
    in examiners.
  • Not measurable
  • Was reduced through the standard introduction to
    the test (verbal and written)
  • Scorer error occurs if students' scores depend on
    who happened to mark their work.
  • Not an issue - computerized analysis of results.
  • All four of the threats to reliability are well
    addressed
  • This gives grounds for the statement that the test
    is a reliable instrument.

50
Validity addressed
  • Test Validity
  • The extent to which a test measures what it is
    supposed to measure and nothing else. (Oosterhof,
    2001)
  • Validity concerns the appropriateness of
    inferences and actions that are based on a test's
    scores. (Hanna, 1993, p. 8)
  • Validity is not an attribute of the test, but of
    the interaction of a test with a situation in
    which the test is used to make decisions.
    (Hanna, 1993, p. 382)
  • Content-related evidence of validity
  • Criterion-related evidence of validity
  • Construct-related evidence of validity

51
Content-related evidence of validity
  • Indicates how well the content of a test
    corresponds to the student performance that we
    want to observe. (Oosterhof, 2001)
  • Addressed by
  • Experts' review of the content and correctness of
    the answer choices
  • Demonstrated instructional sensitivity
  • Table of (content) specifications

52
Criterion-related evidence of validity
  • Indicates how well performance on a test
    correlates with performance on relevant criterion
    measures external to the test (Oosterhof, 2001,
    p.55)
  • concurrent validation (compares the test results
    with a parallel, substitute measure)
  • predictive validation (compares the test results
    with follow-up testing)
  • Addressed by
  • Validation through the interviews
  • Think aloud interview protocols
  • Comparisons of students' free answers in interview
    settings with their results on the test
  • Role playing validation
  • Correlation analysis of answer choices

53
Construct-related evidence of validity
  • Establishes a link between the underlying
    psychological construct we wish to measure and
    the visible performance we choose to observe.
    (Oosterhof, 2001, p.46)
  • Addressed by
  • Building the case on previous research
  • Table of (construct) specifications

54
Prospective uses of the test and test questions
  • Formative assessment combined with any
    instructional method/approach
  • traditional
  • progressive
  • misconception oriented
  • Model - cause
  • Misconception - symptom
  • As peer instruction questions (not model-defining)
  • Not recommended as a summative assessment
  • Online package related to the test and analysis
    of data available at
    http://web.phys.ksu.edu/role/sound/

55
Limitations
  • Common to multiple choice tests
  • Answer options do affect students' understanding /
    models
  • Test-taking strategies may obscure results
  • The test projects the "no model" state as a mixed
    model state and possibly as a pure model state.

56
Future research: Unique approach - wide themes
opened
  • Applicability of the approach in other domains of
    physics
  • Is the approach hybrid model-(in)dependent?
  • Applicability in domains of other natural
    sciences?
  • How effectively can teachers implement the
    real-time aspects of this testing approach?
  • Instructional utility of this type of testing:
    Will addressing the underlying models in real
    time help students learn?
  • Possibility of individualized addressing of
    students' models in real time?
  • Applicability of the testing approach in
    eliciting non-cognitive psychological constructs
  • Personality tests: Would it provide information
    that current tests in that field do not?
  • Reduction of items compared to a Likert scale

57
Future research: Specific issues opened
  • Optimal use of the test in combination with
    online homework
  • Saving time
  • Is any classroom benefit counterbalanced (lost)?
  • How applicable is this test at the middle school
    level?
  • How would a branched version of the test look,
    and would it have any advantages with respect to
    this one?
  • Improved simplicity and validity of the test

58
More Information / Feedback
zhrepic@phys.ksu.edu
http://www.phys.ksu.edu/zhrepic/
Thank You!