1
Implementing Web-based Interactive Testing with
Dynamic Interventions in Middle School Classrooms
Using Hot Potatoes Software
IT899 Master's Project, Instructional Design & Technology
Emporia State University
May 2005
Sheryl H. Brown
Photograph retrieved April 28, 2005, from http://www.westengland.devon.sch.uk/images/pictures/at_computer2.jpg
2
Action research involving the development and implementation of interactive web-based testing.
Focus: Writing and implementing interactive assessments that utilize dynamic intervention feedback.
Dynamic Interventions:
• Hints, helps, and prompts
• Who, what, when, where: questions that aid in the construction of meaning
• Guidance
3
Objective & Goals
Objective: Evaluate the use of Hot Potatoes as an effective and efficient tool for creating web-based materials for the regular classroom.
Goals:
1. Introduce new forms of technology integration into reading, with at least one Literacy First classroom, the Title I after-school reading program, and the content area of computer keyboarding classrooms.
2. Increase student motivation, particularly for low and at-risk readers.
3. Model ways teachers could incorporate new technology and interactive testing as differentiated instruction.
4. Evaluate student response to computerized testing.
5. Evaluate test development time and ease of use.
6. Evaluate teacher responses to interactive testing as a means of incorporating online strategies into current classroom management scenarios.
4
Research indicated . . .
Computer-based assessments and online interactive learning activities:
• shift learning from behavioral to learner-centered
• allow for self-pacing and individualization (by adapting content), and create opportunities to use more differentiated instruction
• increase retention and produce higher achievement gains than conventional instruction alone; eliminate drudgery
• impact student attitude and increase learning engagement
• enhance student ability to learn specific concepts (high achievement gains)
• remove obstacles: the computer never tires, never angers, is patient, allows privacy, always praises, is fun, gives immediate feedback, is more objective than teachers, teaches in manageable increments, and is a great motivator
• improve data collection, reporting, and accounting methodology
• augment collaborative and cooperative learning
• free teachers/facilitators; work anytime, anyplace, synchronously/asynchronously; cost effective
• are more often tied to standards when web-based, so parents embrace their validity/value
• create ownership of learning
5
Instructional Design Model
Rapid Prototyping Model
• 1. Test the user interface
• 2. Test the effectiveness of the teaching strategy
• Develop models
• Give clients the product for its intended use
• Gather feedback and reactions

(Tripp & Bichelmeyer, 1990)
6
Instructional Design Process
A. Focused on real-world performance: a typical middle school classroom, in reading and in a content area (computer keyboarding)
B. Instructional team approach: reading coach, after-school staff, reading teachers
C. Needs analysis, surveys, study in reading and testing; learned the basics of Hot Potatoes
D. Implementation: 150 student participants; wrote Hot Potatoes quizzes with feedback mechanisms for training of staff and for students
E. Data collection: formative assessment/evaluation; made revisions, wrote additional pieces, re-implemented
F. Conducted more staff training; summative evaluations with staff and students; summative informal interviews with staff and students
7
Why Use Interactive Feedback?
Feedback shapes student learning motivation when it can . . .
• help learners view abilities as improvable
• present tasks as non-threatening, non-competitive, and task-focused
• enable learners to view the computer as a learning resource, not an entertainment source
• frame difficulties and challenges as positive interactions that do not reflect skill or ability levels
• continually and steadily increase learner self-efficacy
• help learners understand that effort, not ability alone, contributes to success or failure (Hoska, 1993)
8
Quoted from Mason & Bruning (2001): Feedback can take on many forms depending on the levels of verification and elaboration incorporated into the item response. Their review focused on eight commonly used levels of feedback: no-feedback, knowledge-of-response, answer-until-correct, knowledge-of-correct-response, topic-contingent, response-contingent, bug-related, and attribute-isolation.

No-feedback. Often used as a comparison condition, a no-feedback condition simply provides learners with the performance score with no reference to individual test items. This minimal level of feedback contains neither verification nor elaboration, but simply states the learner's number or proportion of correct responses.

Knowledge-of-response. The simplest form of feedback, knowledge-of-response tells learners whether their answers are correct or incorrect. Essential for verification purposes, it does not provide any information that would further the learner's knowledge or provide additional insight into possible errors in understanding.

Answer-until-correct. A modification of knowledge-of-response feedback that often is associated with mastery learning instruction. Answer-until-correct feedback provides verification but no elaboration, and requires the learner to remain on the same test item until the correct answer is selected.

Knowledge-of-correct-response. Knowledge-of-correct-response feedback provides individual item verification and supplies learners with the correct answer. It provides no elaborative information, however, beyond identification of the correct response option.

Topic-contingent. Topic-contingent feedback provides item verification and general elaborative information concerning the target topic. After incorrect responses, learners are returned to passages or other learning material where the correct information is located, or they are given additional information from which they may find the answer. While topic-contingent feedback makes extensive elaborative information available, it depends upon learners to locate the correct answer within the instructional material.

Response-contingent. Response-contingent feedback, also termed extra-instructional feedback, provides both verification and item-specific elaboration. In addition to providing knowledge of the correct response, it gives response-specific feedback that explains why the incorrect answer was wrong and why the correct answer is correct.

Bug-related. Bug-related feedback provides verification and addresses specific errors. It relies on "bug libraries" or rule sets to identify and correct a variety of common student errors. While bug-related feedback does not provide learners with the correct response, it can assist them in identifying procedural errors so that self-correction is possible.

Attribute-isolation. Attribute-isolation feedback provides item verification and highlights the central attributes of the target concept. It focuses learners on key components of the concept to improve general understanding of the phenomenon.
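The lower and higher levels of this taxonomy are easy to contrast in code. The JavaScript sketch below is a hand-rolled illustration (the quiz item and explanation strings are invented), not Mason and Bruning's materials or Hot Potatoes output; it derives knowledge-of-response, knowledge-of-correct-response, and response-contingent feedback from the same item data.

```javascript
// One quiz item with per-distractor explanations, so three feedback
// levels can be generated from the same data. (Hypothetical content.)
const item = {
  correct: "b",
  correctText: "Mitochondria supply the cell's energy.",
  whyWrong: {
    a: "The nucleus stores DNA; it does not produce energy.",
    c: "Ribosomes build proteins, not ATP."
  }
};

// Knowledge-of-response: verification only.
function knowledgeOfResponse(choice) {
  return choice === item.correct ? "Correct." : "Incorrect.";
}

// Knowledge-of-correct-response: verification plus the right answer.
function knowledgeOfCorrectResponse(choice) {
  return choice === item.correct
    ? "Correct."
    : "Incorrect. The correct answer is (" + item.correct + ").";
}

// Response-contingent: verification plus WHY the chosen distractor
// is wrong and why the correct option is right.
function responseContingent(choice) {
  if (choice === item.correct) {
    return "Correct: " + item.correctText;
  }
  return "Incorrect. " + item.whyWrong[choice] +
         " The correct answer is (" + item.correct + ") because " +
         item.correctText.toLowerCase();
}

console.log(knowledgeOfResponse("a"));        // "Incorrect."
console.log(knowledgeOfCorrectResponse("a")); // adds the answer
console.log(responseContingent("a"));         // adds the explanation
```

Wrapping verification like this in a retry loop yields the answer-until-correct behavior that, per the project results, students responded to best.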
9
Appendix J: Types of Feedback Used in the Project
10
What is Hot Potatoes Software?
What it IS and what it is NOT. Curious?
11
That's a lot of NOTs! What about the ISes, the advantages?
It is . . .
12
  • Shareware
  • Web-based
  • JavaScript
  • Provides varying levels of interactivity
  • Global Presence

13
Hot Potatoes: 5 basic parts + The Masher
14
The Different Potatoes
Click on each test icon!
Just Ask Dan! Flash Cards
Just Ask Dan! Crossword
Just Ask Dan! Cloze Exercise
Just Ask Dan! Multiple-Choice Quiz
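For a sense of what sits behind an exercise type like the cloze, the sketch below shows a minimal gap-fill checker in plain JavaScript. The gap data and scoreCloze function are invented for illustration; Hot Potatoes' own JCloze module generates its own (different) JavaScript.

```javascript
// Illustrative cloze (gap-fill) checker: compares each student entry
// against a list of accepted answers, ignoring case and stray spaces.
const gaps = [
  { accepted: ["ran", "sprinted"] }, // gap 1 allows synonyms
  { accepted: ["quickly"] }          // gap 2 has one answer
];

function scoreCloze(entries) {
  let correct = 0;
  entries.forEach((entry, i) => {
    const cleaned = entry.trim().toLowerCase();
    if (gaps[i].accepted.includes(cleaned)) correct += 1;
  });
  // Report a percentage score, as quiz tools typically do.
  return Math.round((correct / gaps.length) * 100);
}

console.log(scoreCloze(["Sprinted", "quickly "])); // 100
console.log(scoreCloze(["walked", "quickly"]));    // 50
```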
15
Project Results by Objectives
• Objective: Introduce technology integration; align with school improvement goals.
  Results: Positive use of technology; a more seamless integration for both teachers and students. Focused on real-world performance: students began to take ownership of learning activities.
• Objective: Model ways to incorporate new technology and online, interactive testing as differentiated instruction.
  Results: Instructional team approach, staff development, learning activities, evaluations; staff had a first-hand view.
• Objective: Increase student motivation; impact instruction in more than one curriculum area.
  Results: Offered online learning assessments/exercises to regular reading classes, supplemental reading learning, and keyboarding students.
• Objective: Evaluate the use of Hot Potatoes as an effective & efficient classroom tool for creating web-based materials.
  Results & Observations: see the next slide.
16
Project Results
• Instructional design model (rapid prototyping): very suitable!
• Positive use of technology; a more seamless integration
• Allows for differentiated instruction/individualization
• Students preferred online to paper/pencil
• Gives students a feeling of control/ownership; decreased fear of tests
• Rewards, praises, encourages, charts personal progress
• Allows for a flexible classroom environment (e.g., centers)
• Increases time-on-task/engagement
• Allows for self-pacing and multiple tries without embarrassment
• Feedback timing was critical: students performed better with immediate, multiple-try feedback
• Staff engagement limited: awed at the benefits, but saw online testing as too much work (teachers felt technologically limited)
17
Conclusions
• Offered a new way to integrate technology, but the learning curve may still be a problem
• Student acceptability very high; enjoyable
• Not foolproof; not true test authoring
• Requires a potential increase in the student/computer ratio
• Represents technology that is on the forefront and will grow
• Requires planning and sound instructional design
18
Hot Potatoes Diet Offers Test Cures!
Students Accept No Dud Spuds!
Give it a try! Go to hotpotatoes.com. There are many possibilities, called "New Skins," for enhancing the online appearance. Just don't be left behind!
Newspaper picture retrieved April 6, 2005, from http://www.redsongbird.com/ClipArt/pages/newspaper.htm
19
References
Alessi, S. M., & Trollip, S. R. (2001). Multimedia for learning (3rd ed.). Boston: Allyn and Bacon.
Atkinson, E. (1999). Key factors influencing pupil motivation in design and technology. Journal of Technology Education, 10(2). Retrieved January 17, 2005, from http://scholar.lib.vt.edu/ejournals/JTE/v10n2/atkinson.html
Beck, J., Mostow, J., Cuneo, A., & Bey, J. (2003, July). Can automated questioning help children's reading comprehension? Proceedings of the Tenth International Conference on Artificial Intelligence in Education (AIED2003), 380-382. Retrieved January 20, 2005, from http://www.2.cs.cmu.edu/listen/pdfs/AIED2003_helping_comprehension_with_automated_questions.pdf
Bostock, S. (1998). Constructivism in mass higher education: A case study. Retrieved February 2, 2005, from http://www.keele.ac.uk/depts/cs/Stephen_Bostock/docs/sin98pa6.htm
Building on technology's promise. (1999). Southwest Educational Development Laboratory Technology Assistance Program. Retrieved January 17, 2005, from http://www.sedl.org/pubs/tec26/cnc.html
Byers, C. (2001). Interactive assessment: An approach to enhance teaching and learning. Journal of Interactive Learning Research, 12, 359-375. Retrieved January 18, 2005, from http://0web7.infotrac.galegroup.com
Clariana, R., Wagner, D., & Murphy, L. (2000). Applying a connectionist description of feedback timing. Educational Technology Research and Development, 48.
20
Clark, L. (2004, May). Computerized adaptive testing: Effective measurement for all students. T.H.E. Journal Online. Retrieved January 19, 2005, from http://www.thejournal.com/magazine/vault/A4814.cfm
Cooper, S. (2004, September). Computerized practice tests boost student achievement. T.H.E. Journal Online. Retrieved January 19, 2005, from http://www.thejournal.com/magazine/vault/articleprintversion.cfm?aid=4997
Cotton, K. (1991). Computer-assisted instruction. Northwest Regional Educational Laboratory. Retrieved January 14, 2005, from http://www.nwrel.org/scpd/sirs/5/cu10.html
Duffey, D. (2004, June). Connecting people and information to improve student achievement. T.H.E. Journal Online. Retrieved January 19, 2005, from http://www.thejournal.com/magazine/vault/A4873.cfm
Field, R. (1999, Spring). Web-based quizzes produced by Hot Potatoes. Newsletter on Philosophy and Computers, APA Newsletters, 98(2). Retrieved January 5, 2005, from http://www.apa.udel.edu/apa/archive/newsletters/v98n2/computers/field.asp
Fletcher, G. (2004, February). The importance of delivering classroom content via technology. T.H.E. Journal Online. Retrieved January 3, 2005, from http://www.thejournal.com/magazine/vault/A4671.cfm
Fogg, B. (2003). Persuasive technology: Using computers to change what we think and do. San Francisco: Morgan Kaufmann Publishers.
Grist, S. (1989). Computerized adaptive tests. ERIC Clearinghouse on Tests, Measurement and Evaluation. Washington, DC: American Institutes for Research.
21
Hartsell, T., & Yuen, S. (2003). Developing on-line exams. American Technical Education Association Journal, 31.
Holmes, M. (2004). Hot Potatoes comes of age. Currents, 1(3).
Hoska, D. M. (1993). Motivating learners through CBI feedback: Developing a positive learner perspective. In J. V. Dempsey & G. C. Sales (Eds.), Interactive instruction and feedback (pp. 105-132). Englewood Cliffs, NJ: Educational Technology.
Horton, W. (2001). Designing web-based training. New York: John Wiley & Sons, Inc.
Hot Potatoes from Half-Baked Software. (2004). British Columbia, Canada: University of Victoria. Retrieved December 2, 2004, from http://web.uvic.ca/hrd/halfbaked/
Johnson, M., & Green, S. (2004). On-line assessment: The impact of mode on students' strategies, perceptions and behaviors. Paper presented at the British Educational Research Association Annual Conference, University of Manchester, UK. Retrieved January 18, 2005, from http://www.leeds.ac.uk/educol/documents/00003729.htm
Joy, M., Muzykantskii, B., Rawles, S., & Evans, M. (2002). An infrastructure for web-based computer-assisted learning. The ACM Journal of Educational Resources, 2(4), 1-19.
King, T., & Duke-Williams, E. (2002). Using computer-aided assessment to test higher level learning outcomes. Retrieved January 14, 2005, from http://www.tech.port.ac.uk/kingt/research/lough/CAA01.html
Kozulin, A., & Garb, E. (2001). Dynamic assessment of EFL text comprehension. Proceedings from the 9th Conference of the European Association for Research on Learning and Instruction. Retrieved from http://www.icelp.org/files/research/DynamicAssessOfEFL.pdf#search='Dynamic%20assessment%20of%20EFL%20text%20comprehension'
22
Kruse, K. (2004). Gagne's nine events of instruction: An introduction. Retrieved January 28, 2005, from http://www.e-learningguru.com/articles/art3_3.htm
Kruse, K. (2004). The magic of learner motivation: The ARCS model. Retrieved January 20, 2005, from http://www.e-learningguru.com/articles/art3_5.htm
Lee, J. (1999). Half-baked ideas lead to hot new software. The Ring, University of Victoria. Retrieved January 5, 2005, from http://ring.uvic.ca/99/june1/halfbaked.html
Mack, R., & Masullo, M. (1997). Educational multimedia: Perspective in evolution. Paper presented at ED-MEDIA and ED-TELECOM, Calgary, Canada. Retrieved January 16, 2005, from http://www.ianr.unl.edu/eduport/R-EDME97.htm
Mason, B., & Bruning, R. (2001). Providing feedback in computer-based instruction: What the research tells us. University of Nebraska-Lincoln, Center for Instructional Innovation. Retrieved February 2, 2005, from http://dwb.unl.edu/Edit/MB/MasonBruning.html
McHenry, B., Griffith, L., & McHenry, J. (2004, April). The potential, pitfalls and promise of computerized testing. T.H.E. Journal Online. Retrieved January 22, 2005, from http://www.thejournal.com/magazine/vault/A4769.cfm
Merrill, D. (2000). Instructional transaction theory: Instructional design based on knowledge objects. Retrieved January 27, 2005, from http://www.id2.usu.edu/Papers/7ReigChp.PDF#search='instructional%20transaction%20theory%20instructional%20design%20based%20on%20knowledge%20objects'
23
Northrup, P. (2001). A framework for designing interactivity into web-based instruction. Educational Technology, 41(2), 31-39. Retrieved January 27, 2005, from http://cops.uwf.edu/itc/itc-research/Framework%20forOnline%20Interaction.pdf
Olson, A. (2004). Technology that moves student achievement and assessment forward. MultiMedia & Internet@Schools, 11(6), 26-29.
Reading and writing tools. (2003, December). T.H.E. Journal Online. Retrieved January 22, 2005, from http://www.thejournal.com/magazine/vault/A4610.cfm
Reiser, R., & Dempsey, J. (2002). Trends and issues in instructional design and technology. Upper Saddle River, NJ: Pearson Education, Inc.
Russo, A. (2002). Mixing technology and testing. School Administrator, 59(4), 6-12. Retrieved January 20, 2005, from http://www.aasa.org/publications/sa/2002_04/russo.htm
Stevenson, B. (2003). The computer as a tool for assessment. Journey into History. Retrieved February 22, 2005, from http://www.journeyintohistory.com/time/computerAssessment.htm
Summers, L. (2000). How to create interactivity in online training. Retrieved January 22, 2005, from http://techrepublic.com.com/5100-6317_11-5032032-1.html
Technology can raise achievement. (2001, July). The CEO Forum / National School Boards Association. Retrieved January 18, 2005, from http://www.nsba.org/site/doc_sbn.asp?TRACKID=&DID=7929&CID=322
Technology can raise achievement. (2001, June). The Electronic School: The School Technology Authority. Retrieved January 20, 2005, from http://www.electronic-school.com/2001/09/0901ewire.html
24
Top 10 considerations when purchasing an assessment system. (2004, September). T.H.E. Journal Online. Retrieved January 19, 2005, from http://www.thejournal.com/magazine/vault/A4993.cfm
Tripp, S., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research and Development, 38(1), 31-44.
Valdez, G., McNabb, M., Foertsch, M., Anderson, M., Hawkes, M., & Raack, L. (2004). Computer-based technology and learning: Evolving uses and expectations. North Central Regional Educational Laboratory. Retrieved January 20, 2005, from http://www.ncrel.org/tplan/cbtl/toc.htm
Wilhelm, J. (2001). Improving comprehension with think-aloud strategies. New York: Scholastic, Inc.
Wilson, B. G., Jonassen, D. H., & Cole, P. (1993). Cognitive approaches to instructional design. In G. M. Piskurich (Ed.), The ASTD handbook of instructional technology. Retrieved February 4, 2005, from http://www.cudenver.edu/~bwilson
Winke, P., & MacGregor, D. (2001). Review of Hot Potatoes. Language Learning and Technology, 5(2). Retrieved January 5, 2005, from http://llt.msu.edu/vol5num2/review3/default.html
Wise, S., & Kingsbury, G. (2001). Practical issues in developing and maintaining a computerized adaptive testing program. Psicologica, 21, 135-155. Retrieved January 19, 2005, from http://www.uv.es/psicologica/articulos1y2.00/wise.pdf#search='practical%20issues%20in%20developing%20and%20maintaining%20a%20computerized%20adaptive%20testing%20program'
Zieba-Warcholak, A. (2004). Creating interactive materials for EAL/ELT. The Onestop Magazine. Retrieved January 14, 2005, from http://www.onestopenglish.com/News/Magazine/Archive/interactive_hotpotatoes.htm
25