Title: Implementing an Integrated Engineering Curriculum at Louisiana Tech University
Louisiana Tech University College of Engineering and Science
Using Item Analysis to Adjust Testing and
Topical Coverage in Individual Courses
Bernd S. W. Schröder
College of Engineering and Science
ABET's requirements
- The institution must evaluate, advise, and
monitor students to determine its success in
meeting program objectives. (From criterion 1)
- (The institution must have) a system of ongoing
evaluation that demonstrates achievement of these
objectives and uses the results to improve the
effectiveness of the program. (2d)
ABET's requirements (cont.)
- Each program must have an assessment process
with documented results. Evidence must be given
that the results are applied to the further
development and improvement of the program. The
assessment process must demonstrate that the
outcomes important to the mission of the
institution and the objectives of the program,
including those listed above, are being
measured. (postlude to a-k)
ABET's requirements and us
- Long-term assessment is the only way to go, but
how can more immediate data be obtained?
- Large feedback loops need to be complemented with
small feedback loops
- All this costs time and money
- Still feels foreign to some faculty
- Free tools that do something for me would be
nice
Data that is immediately available
- Faculty give LOTS of tests
- Tests are graded and returned
- Next term we make a new test
- and we may wonder why things do not improve
Presenter's context
- Adjustment of topical coverage in Louisiana Tech
University's integrated curriculum
- Presenter had not taught all courses previously
- Some material was moved to nontraditional
places
- How do we find out what works well?
Integrated Courses: Freshman Year
[Course schedule grid by quarter: fall, winter, spring]
Plus 1 additional class -- History, English, Art, ...
Integrated Courses: Sophomore Year
[Course schedule grid by quarter: fall, winter, spring]
Plus 1 additional class -- History, English, Art, ...
Implementation Schedule
- AY 1997-98: One pilot group of 40
- AY 1998-99: One pilot group of 120
- AY 1999-2000: Full implementation
Item analysis
- Structured method to analyze (MC) test data
- Can detect good and bad test questions
  - Awkward formulation
  - Blindsided students
- Can detect problem areas in the instruction
  - Difficult material
  - Teaching that was less than optimal
- Plus, data that usually is lost is stored
But I don't give tests ...
- Do you grade projects, presentations, lab reports
with a rubric?
  - Scores are sums of scores on parts
- Do you evaluate surveys? (Gloria asked)
  - Individual questions may have numerical responses
(Likert scale)
- Item analysis is applicable to situations in
which many scores are to be analyzed
Literature
- R. M. Zurawski, "Making the Most of Exams:
Procedures for Item Analysis", The National
Teaching and Learning Forum, vol. 7, no. 6, 1998,
pp. 1-4
  - http://www.ntlf.com
- http://ericae.net/ft/tamu/Espy.htm (Bio!)
- http://ericae.net/ (On-line Library)
Underlying Assumptions in Literature
- Multiple Choice
- Homogeneous test
- Need to separate high from low
- Are these valid for our tests? (This will affect
how we use the data.)
How does it work?
- Input all individual scores in a spreadsheet
  - If you use any calculating device to do this
already, then this step is free
  - The same goes for machine-recorded scores
(multiple choice, surveys)
- Compute averages, correlations etc. (sketch below)
- But what does it tell us? (Presentation based on
actual and cooked sample data.)
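As a concrete illustration, here is a minimal Python sketch of that workflow. The use of pandas, the file name "exam1.csv", and its layout (one row per student, one column per item) are assumptions for the example, not part of the slides:

```python
# Minimal sketch: load a students-x-items score matrix and summarize it.
# The file name and column layout are hypothetical.
import pandas as pd

scores = pd.read_csv("exam1.csv")      # rows: students, columns: items
print(scores.mean())                   # per-item averages
print(scores.sum(axis=1).describe())   # distribution of total scores
print(scores.corr())                   # item-item correlation matrix
```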
Item Difficulty
- Compute the average score of students on the
given item (sketch below)
- Is a high/low average good or bad?
- How do we react?
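A sketch of the computation, assuming 0/1 (correct/incorrect) item scores in a hypothetical numpy matrix; for 0/1 items the per-item average is the familiar difficulty index p, the fraction of students answering correctly:

```python
import numpy as np

# Hypothetical 0/1 score matrix: 6 students x 4 items.
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 0, 0],
])

# Item difficulty: mean score per item (fraction correct for 0/1 items).
difficulty = scores.mean(axis=0)
print(difficulty)   # here items 1 and 4 look easy (0.83), item 3 hard (0.17)
```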
Comparison: Top vs. Bottom
- General idea: high performers should outperform
low performers on all test items
- Compare average scores of the top X% to average
scores of the bottom X%
- Problems on which the top group outscores the
bottom group by about 30% are good separators
(retain)
- Advantage: Simple (sketch below)
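A minimal sketch of this comparison in Python; the 27% split and the sample 0/1 matrix are illustrative assumptions, not from the slides:

```python
import numpy as np

def discrimination(scores, frac=0.27):
    """Upper-lower discrimination per item: mean of the top scorers minus
    mean of the bottom scorers, after ranking students by total score."""
    totals = scores.sum(axis=1)
    order = np.argsort(totals)
    n = max(1, int(len(scores) * frac))
    return scores[order[-n:]].mean(axis=0) - scores[order[:n]].mean(axis=0)

# Hypothetical 0/1 score matrix: 10 students x 3 items.
scores = np.array([
    [1, 1, 1], [1, 1, 0], [1, 1, 1], [1, 0, 1], [0, 1, 1],
    [0, 0, 1], [1, 0, 0], [0, 0, 1], [0, 1, 0], [0, 0, 0],
])
print(discrimination(scores))   # ~0.3 or more suggests a good separator
```

The 27% upper/lower split is the traditional choice in the item-analysis literature; any fraction works as long as the groups are large enough to give stable averages.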
Comparison: Top vs. Bottom (cont.)
- Problems on which the bottom group scores near or
above the top group should be analyzed
  - Is the formulation intelligible?
  - Was the material taught adequately?
  - Was the objective clear to everyone?
  - Does the problem simply address a different
learning style?
  - Is there a problem with the top performers?
Comparison: Student Group vs. Rest
- Can analyze strengths and weaknesses of specific
demographics (even vs. odd, 11-20 vs. rest); sketch below
- Knowing a weakness and doing something about it
unfortunately need not be the same thing. (3NT)
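A sketch under the same assumptions (hypothetical random 0/1 data; the group split is illustrative only):

```python
import numpy as np

# Hypothetical 0/1 score matrix: 20 students x 5 items.
scores = np.random.default_rng(1).integers(0, 2, size=(20, 5))
in_group = np.arange(20) >= 10        # e.g. students 11-20 vs. the rest

gap = scores[in_group].mean(axis=0) - scores[~in_group].mean(axis=0)
print(gap)   # positive: group stronger on that item; negative: weaker
```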
Comparison: Class vs. Class
- If the test is given to several groups of
individuals, then scores of the different groups
can be compared (sketch below)
- Differences in scores can sometimes be traced to
differences in teaching style
- Similarity in scores can reassure faculty that a
particular subject may have been genuinely easy
or hard.
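One way to tabulate this, assuming pandas and invented section labels and scores:

```python
import pandas as pd

# Hypothetical data: one row per student, with a section label per row.
df = pd.DataFrame({
    "section": ["A", "A", "A", "B", "B", "B"],
    "item1":   [1, 1, 0, 0, 1, 0],
    "item2":   [1, 0, 1, 1, 1, 1],
})

# Per-section item averages: large gaps may point to teaching differences,
# while similar scores suggest the item was easy or hard for everyone.
print(df.groupby("section").mean())
```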
Correlation
- Related material should have scores that
correlate
- Individual problem scores should correlate with
the total? (What if very different skills are
tested on the same test?) (sketch below)
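A sketch of an item-total check, assuming 0/1 scores; the "corrected" variant below (correlating each item with the total of the other items) is a standard refinement, though the slides do not specify which form was used:

```python
import numpy as np

# Hypothetical 0/1 score matrix: 8 students x 4 items.
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
])
totals = scores.sum(axis=1)

# Corrected item-total correlation: compare each item against the total
# of the *other* items, so an item is not correlated with itself.
for j in range(scores.shape[1]):
    rest = totals - scores[:, j]
    r = np.corrcoef(scores[:, j], rest)[0, 1]
    print(f"item {j + 1}: r = {r:.2f}")
```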
Correlation and Separation
- Often the two are correlated, but "cross-fires"
can occur
- Questions with the same correlation can have
different separations, and vice versa
- A question may separate well, yet not correlate
well, and vice versa
Distractor Analysis
- An incorrect MC option (distractor) that was not
selected by anyone should be replaced (sketch below)
- Possible by slightly misusing the tool.
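A minimal sketch of tallying distractors, assuming the raw letter choices are available; the answer string below is invented:

```python
from collections import Counter

# Hypothetical raw answers for one MC item (key: B, options A-D).
picks = list("BABBBDBBBB")

print(Counter(picks))   # Counter({'B': 8, 'A': 1, 'D': 1})
# Option C was never chosen: a distractor that fools nobody does no work
# and is a candidate for replacement with a more plausible wrong answer.
```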
Data remains available
- Many faculty look to old tests (their own or
others') when making a new test
- Past problems are often forgotten
- Item analysis provides a detailed record of the
outcomes and allows faculty to re-think testing
and teaching strategies
- Anyone who has spent time thinking about curving
may want to spend this time on item analysis
Consequences of the Evaluation (Gloria's law:
E=mc²)
- Don't panic, keep data with test
- Better test design
- Identification of challenging parts of the class
leads to adjustment in coverage
- Students are better prepared for following classes