Title: Improving Faculty Teaching & Student Learning in Undergraduate Science & Engineering
1. Improving Faculty Teaching & Student Learning in Undergraduate Science & Engineering
- The Promise of New Formative Assessment Tools and Practices
- California State Univ. - Fullerton Workshop: How Learners Learn and Teachers Teach
- January 26, 2004
- By Dr. Myles Boylan
- Division of Undergraduate Education, NSF
- Center for the Advancement of Scholarship on Engineering Education
- Email: Mboylan@nsf.gov and Mboylan@nae.edu
2. Education is Teaching & Learning
- Teaching is based on knowledge of a discipline (e.g. Math or Chemistry)
- Emphasis on clarity, logic, organization
- Learning focuses on students & their developmental context
- nurturing cognitive skill growth
- ensuring receptivity to new knowledge
- improving ability to transform knowledge
3. Reform Education: Best Practices Based on Trial & Error
- STEM reform educators believe
- Constructivism is the keystone of best practice
- ... it more often creates positive student affect
- Constructivist learning literally means that students construct their own meaning
- in labs and field experiences
- through writing and research
- in student teams and discussion groups
- in experiencing multi-disciplinary efforts
4. How have STEM reformers known what practices work?
- Often seat-of-the-pants judgment with some evaluation, often done incompletely and inexpertly
- Measures include
- conventional tests (e.g. the course final)
- pre-tests -- post-tests
- a few, or just one, post-course evaluations (sometimes with control groups)
- new tests and measures designed to test for new knowledge (infrequent)
5. What are the Strengths & Weaknesses of these approaches? (test alignment with the course is key)
- Conventional tests: a clear comparison to past classes, but may not be an ideal test of new knowledge.
- Pre-test/post-test: a value-added approach, but does not directly compare the new method with the old (and gains are often disappointingly small; one common way to summarize such gains is sketched below).
- One-shot evaluation with a control group: the treatment group often differs from the control group, and students in both groups often contaminate the purity of the control by sharing information.
- New tests of student skills: a broader measure of learning, but may not match the skills skeptics value most, e.g. computing df(x)/dx.
- In all cases the impact of the new course may mutate over time; a one-shot evaluation is no guarantee of future success.
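The "disappointingly small" gains mentioned above are often summarized with Hake's normalized gain, a statistic popularized in physics education research for exactly this pre-test/post-test design. A minimal sketch in Python (the function and example scores are illustrative, not taken from the talk):

```python
# Illustration (not from the slides): Hake's normalized gain, a common
# way to report pre-test/post-test results on a 0-100% score scale.
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Return <g> = (post - pre) / (100 - pre), the fraction of the
    possible improvement that was actually realized."""
    if pre_pct >= 100:
        raise ValueError("pre-test score leaves no room for gain")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A class moving from 40% to 55% realizes only a quarter of the
# available gain -- the kind of small result the slide warns about.
print(normalized_gain(40, 55))  # 0.25
```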
8. Several new studies undergird this shift in emphasis
- J.D. Bransford, A.L. Brown, R.R. Cocking, eds., How People Learn: Brain, Mind, Experience, and School, Expanded Ed. (NAS Press, 1999)
- J.W. Pellegrino, N. Chudowsky, R. Glaser, eds., Knowing What Students Know: The Science and Design of Educational Assessment (NAS Press, 2001)
- J. Gollub, M. Bertenthal, J. Labov, P. Curtis, eds., Learning and Understanding: Improving Advanced Study of Mathematics and Science in U.S. High Schools (NAS Press, 2002)
9. Knowing What Students Know: Key Points
- Advances in the cognitive sciences illuminate important aspects of learning & understanding
- Coupled with advances in measurement and technology -> a new science of assessment
- Assessment contexts: classroom and large scale
- assist learning
- measure individual achievement
- evaluate programs (i.e. larger-scale assessment)
- Cognition models + Observations + Interpretation (the book's "assessment triangle")
11. Assessment is Key to Success: Key Points
- Learning = f(prior knowledge, experience)
- This prior context often contains large misconceptions
- Loss of mastery of course material is often fast
- Knowledge transfer is often weak -> practice!
- Most introductory courses go too fast, too far
- Building context-improving learning environments is usually a valuable activity
- Students can learn to gauge their own learning gains
- Students should share the goals for learning
15. The FLAG Web Site: Good Stuff at www.flaguide.org/default.asp
- FLAG: Field-Tested Learning Assessment Guides
- A source of instruments for measuring student understanding of the natural sciences & mathematics
- A repository for new assessment tools that have passed validity and reliability screens (one common reliability screen is sketched below)
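As an aside on those screens: one standard reliability check for multi-item instruments is Cronbach's alpha, which measures how consistently items move together across respondents. A minimal sketch under that assumption (the toy data are invented; the slide does not describe FLAG's actual screening procedure):

```python
# Sketch (assumed, not part of FLAG): Cronbach's alpha computed from a
# respondents-by-items score matrix.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: shape (n_respondents, k_items). Returns alpha <= 1."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents answering 4 Likert items (1-5).
data = np.array([[4, 5, 4, 5],
                 [2, 2, 3, 2],
                 [5, 4, 5, 5],
                 [3, 3, 3, 4],
                 [1, 2, 1, 2]])
print(round(cronbach_alpha(data), 2))  # ~0.97: items move together, so alpha is high
```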
16. The FLAG offers broadly applicable, self-contained, modular Classroom Assessment Techniques (CATs) and discipline-specific Tools for STEM instructors interested in new approaches to evaluating student learning, attitudes, and performance. Each CAT has been developed, tested, and refined in real college and university classrooms. The FLAG also contains an assessment Primer, a section to help you select the most appropriate assessment technique(s) for course Goals, and other Resources.
17. Some Classroom Assessment Techniques
- Attitudinal Surveys
- These can provide information on student perceptions (emotions, feelings, attitudes) of their classroom experience. For example:
- the content of a course
- specific components of a course
- May focus on students' needs in taking a course, how well those needs are met, student interest, and student confidence.
- FLAG survey materials are valid and reliable
18. Some Classroom Assessment Techniques
- ConcepTests (often used in large lecture courses)
- The instructor presents one or more questions during class involving key concepts, along with several possible answers.
- -> The instructor obtains immediate feedback on the level of class understanding.
- -> Students obtain immediate practice in using STEM terminology and concepts.
- -> Students have an opportunity to enhance teamwork and communication skills.
- -> Many instructors have reported substantial improvements in class attendance and attitude.
19. Q5. The mean height of American college men is 70 inches, with standard deviation 3 inches. The mean height of American college women is 65 inches, with standard deviation 4 inches. You conduct an experiment at your university measuring the heights of 100 American men and 100 American women. Which result would most surprise you?
a) One man with height 79 inches
b) One woman with height 77 inches
c) The average height of women at your university is 68 inches
d) The average height of men at your university is 73 inches
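The statistical reasoning behind a question like Q5 can be made explicit: an individual height is judged against the population standard deviation, while a sample mean of n = 100 is judged against the much smaller standard error sd/sqrt(n). A worked sketch (an addition for illustration, not part of the original slide):

```python
# Surprise scales with the z-score; the standard error of a mean of
# n observations shrinks by a factor of sqrt(n).
import math

def z_individual(x, mean, sd):
    return (x - mean) / sd

def z_sample_mean(xbar, mean, sd, n):
    return (xbar - mean) / (sd / math.sqrt(n))

print(z_individual(79, 70, 3))        # (a) man of 79 in:    z = 3.0
print(z_individual(77, 65, 4))        # (b) woman of 77 in:  z = 3.0
print(z_sample_mean(68, 65, 4, 100))  # (c) women's mean 68: z = 7.5
print(z_sample_mean(73, 70, 3, 100))  # (d) men's mean 73:   z = 10.0 -> most surprising
```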
20. Some Classroom Assessment Techniques
- Conceptual Diagnostic Tests
- Aim to assess students' conceptual understanding of key ideas in a discipline, especially those that are prone to misconceptions.
- Discipline-specific, rather than generic. The format typically is multiple-choice.
- Diagnostic knowledge is currently richer in K-12 than in undergraduate education.
- The best-known undergraduate example is the physics Force Concept Inventory.
- Other tests exist, and some ASA projects are working on concept inventories, e.g. in engineering statics, calculus, and thermodynamics (see the 2-page bibliography; a scoring sketch follows below).
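What makes such tests diagnostic is that the wrong answers (distractors) are written around known misconceptions, so tallying which distractors students pick is as informative as the overall score. A hypothetical scoring sketch (the items and answer key are invented, not drawn from any real inventory):

```python
# Hypothetical sketch: per-item tallies of wrong answers reveal which
# misconceptions dominate in a class.
from collections import Counter

answer_key = {"Q1": "b", "Q2": "d", "Q3": "a"}  # invented items
responses = [
    {"Q1": "b", "Q2": "a", "Q3": "a"},
    {"Q1": "c", "Q2": "a", "Q3": "a"},
    {"Q1": "b", "Q2": "d", "Q3": "c"},
]

for item, correct in answer_key.items():
    picks = Counter(r[item] for r in responses)
    pct_correct = 100 * picks[correct] / len(responses)
    wrong = {c: n for c, n in picks.items() if c != correct}
    print(f"{item}: {pct_correct:.0f}% correct; distractors chosen: {wrong}")
```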
21. Some Classroom Assessment Techniques
- Performance Assessment
- Designed to judge student abilities to use specific knowledge and research skills. Most performance assessments require the student to manipulate equipment to solve a problem or make an analysis.
- Made up of three distinct parts:
- -> a performance task,
- -> a format in which the student responds, and
- -> a predetermined scoring system (sketched below).
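A minimal sketch of how those three parts fit together, with the predetermined scoring system expressed as a rubric (the task criteria and point values here are hypothetical, not from the slides):

```python
# Hypothetical rubric: each part of the task has predetermined levels
# and point values fixed before any student is scored.
RUBRIC = {
    "setup":    {"correct apparatus assembled": 2, "partially assembled": 1},
    "analysis": {"valid method and answer": 3, "valid method, wrong answer": 2},
    "report":   {"clear written explanation": 2, "incomplete explanation": 1},
}

def score_performance(observations: dict) -> int:
    """observations maps each rubric part to the level the rater observed."""
    return sum(RUBRIC[part].get(level, 0) for part, level in observations.items())

student = {"setup": "correct apparatus assembled",
           "analysis": "valid method, wrong answer",
           "report": "clear written explanation"}
print(score_performance(student))  # 2 + 2 + 2 = 6 of 7 possible points
```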
22. Student Assessment of Learning Gains (SALG) Instrument
- can spotlight those elements in a course that best support student learning and those that need improvement,
- can be easily individualized,
- provides instant statistical analysis of the results, and
- facilitates formative evaluation throughout a course.
23. Sample SALG Question: Q4. To what extent did you make gains in any of the following?
- 1. Understanding the main concepts
- 2. Understanding the relationship between concepts
- 3. Understanding how ideas in this class relate to those in other science classes
- 4. Understanding the relevance of this field to real-world issues
- Q4 continued on the next slide
24. SALG Q4 continued
- 5. Appreciating this field
- 6. Ability to think through a problem or argument
- 7. Confidence in your ability to do this field
- 8. Feeling comfortable with complex ideas
- 9. Enthusiasm for subject
- Possible responses are: NA, not at all, a little, somewhat, a lot, a great deal.
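To connect this scale to the "instant statistical analysis" promised on the previous SALG slide, responses can be mapped to numbers and averaged, with NA excluded. A sketch under that assumption (the mapping and sample responses are illustrative; SALG's actual analysis is not specified here):

```python
# Illustrative mapping of the SALG response scale to numeric values.
SCALE = {"NA": None, "not at all": 1, "a little": 2,
         "somewhat": 3, "a lot": 4, "a great deal": 5}

def mean_gain(responses):
    """Average the numeric values, skipping NA responses."""
    vals = [SCALE[r] for r in responses if SCALE[r] is not None]
    return sum(vals) / len(vals)

q4_1 = ["somewhat", "a lot", "a great deal", "NA", "a lot"]
print(mean_gain(q4_1))  # (3 + 4 + 5 + 4) / 4 = 4.0
```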
25. Assessment & Course Development: the CIA Model of Course Development
[Diagram: Curriculum, Instruction, and Assessment form the three vertices of a triangle, connected to one another and to Students at the center, with shared Goals labeling every link.]
26. A Generalized Model for Course Development
- translate goals into Measurable Student Outcomes
- determine the Levels of Expertise required to achieve the outcomes (think of Bloom's Taxonomy)
- select both Curriculum and Classroom Assessment Techniques
- choose and implement Instructional Methods
- conduct Assessment and evaluate: were the Measurable Student Outcomes realized? (a bookkeeping sketch follows)
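One way to picture the bookkeeping this model implies: each measurable student outcome is tagged with a Bloom level and a matching assessment technique, so alignment can be checked at a glance. A hypothetical sketch (the outcomes and pairings are invented examples, not from the talk):

```python
# Invented examples: outcomes paired with Bloom levels and assessments.
outcomes = [
    {"outcome": "state the definition of a derivative",
     "bloom": "Knowledge", "assessment": "conventional test item"},
    {"outcome": "interpret a derivative as a rate of change in context",
     "bloom": "Comprehension", "assessment": "ConcepTest"},
    {"outcome": "design an experiment and defend the analysis",
     "bloom": "Synthesis/Evaluation", "assessment": "performance assessment"},
]

for o in outcomes:
    print(f"{o['bloom']:>20}: {o['outcome']} -> {o['assessment']}")
```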
27. Bloom's Taxonomy of Educational Objectives - Knowledge-Based Goals
- Knowledge: facts
- Comprehension: translate, interpret, extrapolate
- Application: apply abstractions & principles to specific problems
- Analysis: separating complex ideas into parts & understanding the relationship of the parts
- Synthesis: finding new patterns & ideas in complex information from disparate sources
- Evaluation: using external evidence to validate ideas
28. Bloom's Taxonomy of Educational Objectives
- also includes Skills-Based Goals and Affective Goals
- There are different ways of representing measurable student outcomes, e.g., as statements about students' knowledge and skills, as questions to be asked of students about their knowledge or skills, or as student statements representing their affective perspective.
32. [Diagram: five levels of student understanding, from least to most advanced]
- Generation: students use the models to generate new knowledge and extend models (graduate school)
- Construction: students integrate understanding of science into full working models (upper division)
- Formulation: students combine uni-relational ideas, building more complex knowledge (lower division)
- Recognition: students begin to recognize normative scientific ideas, attaching meaning to uni-relational concepts (high school)
- Notions: students use real-world ideas, observation, logic, and reasoning to explore scientific problem-solving (middle school)
37. Some Published Assessment Information and Data
1. Libarkin, J.C., Anderson, S.W., Boone, W.J., Beilfuss, M., and Dahl, J., 2002, The Geoscience Concept Test: a new assessment tool based on student misconceptions. EOS, Transactions of the American Geophysical Union, in press.
2. D. Rickey and A.M. Stacy, "The Role of Metacognition in Learning Chemistry," Journal of Chemical Education, 77, 915-920, 2000.
3. http://mc2.cchem.berkeley.edu/index.html (ChemConnections, including some evaluation ideas; see http://mc2.cchem.berkeley.edu/Evaluation/concept.html).
4. Andrea Stone, Kirk Allen, Teri Reed Rhoads, Teri Jo Murphy, Randa Shehab, and Chaitanya Saha, "The Statistics Concepts Inventory: A Pilot Study" (2003). Conference paper accepted for the 33rd ASEE/IEEE Frontiers in Education Conference, November 5-8, 2003, Boulder, CO. See also http://coecs.ou.edu/sci/
38. Four Factors Influencing Effective Change in Colleges and Universities: 1. Alignment
- Alignment is required at all levels for effective system change (well known in K-12, not in post-secondary education).
- Some examples are:
- curriculum development in higher ed with K-12
- assessment methods with learning goals
- course offerings across departments (knowledge & departmental alignment)
- activities of STEM faculty with those of Education and Cognitive Psychology faculty
39. 2. Degree of Success in Re-balancing the Departmental Rewards System
- Traditionally, departmental values are:
- primacy of research over teaching
- primary loyalty to the discipline, not the institution
- duty to cover collectively-approved course material
- For successful reforms, rewards must also reflect respect for teaching & educational scholarship.
40. 3. Extent of Convincing Evidence
- Evidence is a necessary condition for reform.
- Reform needs clear and convincing evidence.
- Even the best students may at first question whether they can know something they have not memorized.
- New content and pedagogy require a period of classroom adjustment to get smoothed out:
- alignment issues
- student expectations about the nature of the course
41. 4. Endorsement by Credible Agencies (because evidence alone is often not a sufficient condition)
- It is hard to prove new methods work when little or no data exist about the effectiveness of old methods.
- New methods imply a need for new assessments.
- Evidence is discounted if the reform violates existing values and accepted social behavior.
- Personal endorsements by faculty with strong research reputations often convince skeptics.
- External agencies with clout have the same effect: funders, accreditors, scientific societies, NAS, ...