1
COMPUTER-BASED ASSESSMENT OF COLLABORATIVE PROBLEM SOLVING
  • Harry O'Neil, University of Southern California / National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
  • Gloria Hsieh, University of Southern California
  • Gregory K. W. K. Chung, UCLA/CRESST

CRESST Conference, Los Angeles, CA, September 15, 2000
2
CRESST MODEL OF LEARNING
Learning (per the model diagram, surrounded by five families of cognitive demands):
  • Content Understanding
  • Collaboration
  • Problem Solving
  • Communication
  • Self-Regulation
3
JUSTIFICATION: WORLD OF WORK
  • The justification for collaborative problem
    solving as a core demand can be found in analyses
    of both the workplace and academic learning
  • O'Neil, Allred, and Baker (1997) reviewed five
    major studies from the workplace readiness
    literature. Each of these studies identified the
    need for (a) higher order thinking skills, (b)
    teamwork, and (c) some form of technology
    fluency. In four of the studies, problem-solving
    skills were specifically identified as essential.

4
JUSTIFICATION: NATIONAL STANDARDS
  • New standards (e.g., National Science Education
    Standards) suggest new assessment approaches
    rather than multiple-choice exams
  • Deeper or higher order learning
  • More robust knowledge representations
  • Integration of mathematics and science
  • Integration of scientific information that
    students can apply to new problems in varied
    settings (i.e., transfer)
  • Integration of content knowledge and problem
    solving
  • More challenging science problems
  • Learning conducted in groups

5
MODELS: PROBLEM-SOLVING DEFINITION
  • Problem solving is cognitive processing directed
    at achieving a goal when no solution method is
    obvious to the problem solver (Mayer & Wittrock, 1996)
  • Problem-solving components
  • Domain-specific knowledge (content
    understanding)
  • Problem-solving strategy
  • Domain-specific strategy in troubleshooting (e.g., malfunction probability, i.e., fix first the component that fails most often; see the sketch below)
  • Self-regulation (metacognition: planning, self-monitoring; motivation: effort, self-efficacy)
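A minimal sketch (not from the slides) of the malfunction-probability heuristic: candidate components are checked in descending order of historical failure rate. The component names and rates below are hypothetical.

```python
# Malfunction-probability troubleshooting heuristic: inspect components
# in descending order of how often they fail. Names and rates are invented.
failure_rates = {
    "power supply": 0.40,
    "cable": 0.35,
    "controller board": 0.20,
    "sensor": 0.05,
}

def troubleshooting_order(rates):
    """Return components sorted so the most failure-prone is checked first."""
    return sorted(rates, key=rates.get, reverse=True)

for component in troubleshooting_order(failure_rates):
    print(f"Check {component} (historical failure rate {failure_rates[component]:.0%})")
```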

6
PROBLEM SOLVING
  • Content Understanding
  • Domain-Dependent Problem-Solving Strategies
  • Self-Regulation
    • Metacognition: Planning, Self-Monitoring
    • Motivation: Effort, Self-Efficacy
7
COMPUTER-BASED PROBLEM-SOLVING TASK (CAETI)
  • Metacognition and motivation are assessed by a paper-and-pencil survey instrument (self-regulation)
  • Create a knowledge map on environmental science
    (Content understanding)
  • Receive feedback on it
  • Using a simulated Web site, search for
    information to improve it (problem-solving
    strategy)
  • Relevance, searches, browsing
  • Construct a final knowledge map
  • Serves as the outcome content understanding measure (a scoring sketch follows below)
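For concreteness, here is a minimal sketch of one way a knowledge map could be scored against an expert map, representing each map as a set of (concept, relation, concept) propositions. The representation, the propositions, and the overlap scoring rule are illustrative assumptions, not CRESST's published algorithm.

```python
# Score a student knowledge map against an expert map by proposition overlap.
# Maps are sets of (concept, relation, concept) triples; data are invented.
student_map = {
    ("bacteria", "causes", "decomposition"),
    ("photosynthesis", "produces", "oxygen"),
    ("greenhouse gases", "influences", "climate"),
}

expert_map = {
    ("bacteria", "causes", "decomposition"),
    ("photosynthesis", "produces", "oxygen"),
    ("photosynthesis", "uses", "sunlight"),
    ("greenhouse gases", "influences", "climate"),
    ("oceans", "part of", "water cycle"),
}

def map_score(student, expert):
    """Fraction of expert propositions that the student map matches."""
    return len(student & expert) / len(expert)

print(f"Content understanding score: {map_score(student_map, expert_map):.2f}")  # 0.60
```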

8
CRESST'S CONCEPT MAPPER
9
CORRELATION COEFFICIENTS: OUTCOME AND PROCESS VARIABLES (N = 38)
10
CONCLUSIONS
  • Computer-based problem-solving assessment is
    feasible
  • Process/product validity evidence is promising
  • Allows real-time scoring/reporting to students
    and teachers
  • Useful for program evaluation and diagnostic
    functions of testing
  • What's next?
  • Generalizability study
  • Collaborative problem solving with a group task

11
TEAMWORK MODEL
12
CRESST ASSESSMENT MODEL OF TEAMWORK
  • Pre-defined process taxonomy
  • Pre-defined messages (see the sketch below)
  • Union-management negotiation / networked concept map
  • Real-time assessment and reporting
  • Networked computers
  • Simulation
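A rough sketch of how pre-defined messages could feed the process taxonomy: each menu message carries a category tag (drawn from the teamwork dimensions listed on the GENERAL LESSONS LEARNED slide), and a team's process scores are simply message counts per category. The message texts and tags below are hypothetical, not the actual CRESST message set.

```python
# Tally teamwork-process scores from a log of pre-defined messages.
# Each message is tagged with one taxonomy category; all data are invented.
from collections import Counter

MESSAGE_CATEGORY = {
    "I suggest we trade wages for job security.": "decision making",
    "You handle salaries; I'll handle benefits.": "coordination",
    "Good point!": "interpersonal",
    "Let's review the goal before we vote.": "leadership",
}

def team_process_scores(message_log):
    """Count how often each teamwork process category occurs in a team's log."""
    return Counter(MESSAGE_CATEGORY[m] for m in message_log)

log = ["Good point!",
       "I suggest we trade wages for job security.",
       "You handle salaries; I'll handle benefits."]
print(team_process_scores(log))
```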
13
CORRELATION BETWEEN TEAM PROCESSES AND OUTCOME MEASURES (N = 26)
14
Nonparametric (Spearman) Correlations Between Team Processes and Post Outcome Measures for Concept Map (N = 14)
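As an illustration of the analysis this table reports, a minimal sketch using SciPy's spearmanr on invented per-team data; the variable names and values are assumptions for demonstration only.

```python
# Spearman (rank-based) correlation between a team process measure and an
# outcome measure. All values below are invented for illustration.
from scipy.stats import spearmanr

process_scores = [3, 7, 2, 9, 5, 6, 4]          # e.g., per-team coordination message counts
outcome_scores = [55, 72, 40, 80, 60, 65, 58]   # e.g., post concept-map scores

rho, p_value = spearmanr(process_scores, outcome_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```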
15
PUZZLE
  • Unfortunately, the concept mapping study (Chung et al., 1999) found that the team process did not predict team outcomes, unlike the union-management negotiation task.
  • We hypothesized that the lack of useful feedback
    in the concept mapping task and low prior
    knowledge may have influenced the results.

16
ONGOING RESEARCH
  • We changed the nature of the task to provide more
    extensive feedback and to create a real group
    task
  • Feedback will be knowledge of response feedback
    versus adaptive knowledge of response feedback
  • A group task is a task where
  • no single individual possesses all the resources
  • no single individual is likely to solve the
    problem or accomplish the task objective without
    at least some input from others (Cohen & Arechevala-Vargas, 1987)
  • One student creates the concept map; the other student does the searches

17
KNOWLEDGE OF RESPONSE FEEDBACK (Schacter et al. study)
  • Your map has been scored against an expert's map in environmental science. The feedback tells you:
  • How much you need to improve each concept in your map (i.e., "A lot," "Some," "A little").
  • Use this feedback to help you search to improve your map.

(Feedback display: each concept appears under one of three columns labeled "A lot," "Some," and "A little." Concepts rated: Atmosphere, Climate, Evaporation, Bacteria, Carbon dioxide, Greenhouse gases, Decomposition, Photosynthesis, Oxygen, Sunlight, Waste, Water cycle, Respiration, Oceans, Nutrients, Consumer, Food chain, Producer.)
Adaptive Knowledge of Response (the above plus the following):
Improvement: You have improved the "food chain" concept from needing "A lot" of improvement to the "Some" improvement category.
Strategy: It is most useful to search for information for the "A lot" and "Some" categories rather than the "A little" category. For example, search for information on atmosphere or climate first, rather than evaporation.
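A minimal sketch, under assumed thresholds, of how the two feedback types above could be generated from per-concept agreement scores. The scores, cut points, and message wording are illustrative assumptions, not the study's actual implementation.

```python
# Generate knowledge-of-response feedback (a category per concept) and the
# adaptive additions (improvement and strategy messages). Data are invented.

def category(score):
    """Map a concept score (0..1 agreement with the expert map) to a category."""
    if score < 0.4:
        return "A lot"
    if score < 0.8:
        return "Some"
    return "A little"

def adaptive_feedback(old_scores, new_scores):
    """Adaptive feedback: report category improvements, then a search hint."""
    messages = []
    for concept, new in new_scores.items():
        old_cat, new_cat = category(old_scores[concept]), category(new)
        if old_cat != new_cat:
            messages.append(f"You improved '{concept}' from '{old_cat}' to '{new_cat}'.")
    worst = min(new_scores, key=new_scores.get)  # weakest concept gets the hint
    messages.append(f"Strategy: search for information on '{worst}' first.")
    return messages

old = {"food chain": 0.30, "atmosphere": 0.20, "evaporation": 0.90}
new = {"food chain": 0.50, "atmosphere": 0.25, "evaporation": 0.90}
print("\n".join(adaptive_feedback(old, new)))
```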
18
GENERAL LESSONS LEARNED
  • Need a model of cognitive learning (the Big 5)
  • Need submodels of process:
    • Problem solving = content understanding + problem-solving strategies + self-regulation
    • Teamwork = adaptability, coordination, decision making, interpersonal skill, leadership, and communication
  • For diagnostic, low-stakes environments, need real-time administration, scoring, and reporting
  • Role of type of task and feedback may be critical for assessment of collaborative problem solving

19
BACK-UP SLIDES
20
ASSESSMENTS FOR TYPES OF LEARNING
Types of learning, with the assessment methodology for each:
  • Content Understanding (facts, procedures, concepts, principles): explanation tasks, concept mapping, multiple-choice, essays
  • Problem Solving (domain-specific strategies, self-regulation, motivation: effort, self-efficacy, anxiety): domain-specific augmented concept mapping with search task, transfer tasks, search strategies
  • Teamwork and Collaboration (coordination, adaptability, leadership, interpersonal skill, decision making): collaborative simulation, self-report, observation
  • Self-Regulation (planning, self-checking, self-efficacy, effort): self-report, observation, inference
  • Communication (comprehension, use of conventions, expression): explanation scored for communication, multimode
21
DOMAIN SPECIFICATIONS EMBEDDED IN THE
UNION/MANAGEMENT NEGOTIATION SOFTWARE
22
Domain Specifications Embedded in the Knowledge
Mapping Software
23
BOOKMARKING APPLET
24
SAMPLE METACOGNITIVE ITEMS
The following questions refer to the ways people
have used to describe themselves. Read each
statement below and indicate how you generally
think or feel. There are no right or wrong
answers. Do not spend too much time on any one
statement. Remember, give the answer that seems
to describe how you generally think or feel.
Note. Formatted as in Section E, Background Questionnaire, Canadian version of the International Adult Literacy Survey (1994). Item a is a planning item; item b is a self-checking item. Kosmicki (1993) reported alpha reliabilities of .86 and .78 for 6-item versions of these scales, respectively.
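For reference, the alpha reliability cited above is Cronbach's coefficient. A small sketch of its standard formula, computed on an invented respondents-by-items response matrix:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# The response matrix below is invented for illustration.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

responses = np.array([
    [4, 5, 4, 4, 5, 4],
    [2, 2, 3, 2, 2, 3],
    [5, 4, 5, 5, 4, 5],
    [3, 3, 2, 3, 3, 3],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```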
25
TEAMWORK PROCESSES
26
SCREEN EXAMPLE
27
FEEDBACK FREQUENCY
  • Lowering the percentage of feedback
  • slows down the acquisition of concepts
  • but facilitates the transfer of knowledge

28
TIMING OF FEEDBACK
  • Delayed-Retention Effect (Delayed > Immediate)
  • Classroom or Programmed Instruction Settings (Immediate > Delayed)
  • Developmental difference:
    • Younger children: Immediate > Delayed
    • Older children: Delayed > Immediate

29
THREE CHARACTERISTICS OF FEEDBACK
  • Complexity of feedback
  • What information is contained in the feedback
    messages
  • Timing of feedback
  • When is the feedback given to students
  • Representation of feedback
  • The form of the feedback presented (text vs.
    graphics)

30
CORRELATIONS BETWEEN TEAMWORK PROCESS SCALES AND OUTCOME MEASURES FOR UNION PARTICIPANTS (N = 48)
31
THE NATURE OF TASKS
Interaction will be positively related to productivity under two conditions:
  • Group Tasks
    • No single individual possesses all the resources
    • No single individual is likely to solve the problem or accomplish the task objectives without at least some input from others
  • Ill-Structured Problem
    • No clear-cut answers or procedures for the problem