1
You've Got a Great Teacher Education Program so
Why Doesn't Anyone Know About It?
  • Mark Girod
  • Chair, Teacher Education
  • Western Oregon University
  • AASCU Academic Affairs Summer Meeting
  • Friday, July 29th, 2011
  • Portland, Oregon

2
Strengths and challenges in your teacher
preparation programs
  • Identify 2-3 strengths of the teacher preparation
    programs on your campus
  • Identify 2-3 challenges of your programs
  • Can you identify the essential challenges in
    teacher preparation today?
  • Bottom line: we need evidence-driven programs with shared outcomes, systematic programs of research, and structured dissemination plans

3
Identifying an outcome measure: articulating the inference chain
  • To get from teacher education to impact on pupils' learning requires a chain of evidence with several critical links: empirical evidence demonstrating the link between teacher preparation programs and teacher candidates' learning, empirical evidence demonstrating the link between teacher candidates' learning and their practices in actual classrooms, and empirical evidence demonstrating the link between graduates' practices and what and how much their pupils learn. Individually, each of these links is complex and challenging to estimate. When they are combined, the challenges are multiplied (p. 303).
  • Cochran-Smith, M. (2005). Studying teacher education: What we know and need to know. Journal of Teacher Education, 56, 301-306.

4
Multi-leveled model
  • This is a complex and multilayered inference chain that requires examination of data at multiple levels, including the following (a record-structure sketch follows this list):
  • Candidate-level variables (GPA, prior coursework, exam scores)
  • We have much of this information about our candidates, but is it in a form we can use?
  • Program-level variables (structure, courses, fieldwork, outcomes)
  • What is the nature of these variables? Are these grouping variables? Independent variables?
  • Candidate learning (performance on anchor assignments)
  • Variability? How do we rate scaffolding in our failure-free systems?
  • Candidate practices in classrooms (observations of teaching, strategies)
  • TWS? Observations? Data quality? Both dependent and mediating? Aggregate?
  • Contextual factors (classroom, school, community variables)
  • Most salient? And are we measuring the right ones?
  • P-12 learning (kind and complexity of outcomes targeted and met...)
  • As evidenced by (assessment)? Aggregation? Definitions? Judgment?
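As a rough illustration of how these levels nest, here is a minimal Python sketch of one possible record structure; every class and field name (CandidateRecord, ContextRecord, pupil_pre, and so on) is an assumption made for illustration, not an existing TWS or program data schema.

```python
# Illustrative sketch only: structure and field names are assumptions,
# not an actual teacher work sample (TWS) data system.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ContextRecord:
    """Classroom / school / community variables for one placement."""
    school_ses: str    # e.g., "low", "mid", "high"
    setting: str       # e.g., "urban", "rural"
    class_size: int


@dataclass
class WorkSampleRecord:
    """Candidate learning and practice evidence plus P-12 outcomes."""
    anchor_scores: List[float]       # ratings on anchor assignments
    observation_scores: List[float]  # observed teaching practice
    pupil_pre: List[float]           # pre-assessment scores, one per pupil
    pupil_post: List[float]          # post-assessment scores, one per pupil


@dataclass
class CandidateRecord:
    """Candidate-level variables, nested in a program and in contexts."""
    gpa: float
    exam_score: float
    program: str                     # program-level grouping variable
    context: ContextRecord
    work_samples: List[WorkSampleRecord] = field(default_factory=list)
```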

5
Necessary conditions
  • At least five conditions seem to be needed if a teacher preparation program is going to serve as a context for research:
  • Persons responsible for the management and
    operation of the program must be inclined toward
    experimentation
  • Persons responsible for the management and
    operation of the program must view it as subject
    to continuous change, and view a systematically
    designed program of research on its effectiveness
    as a major data source for its change.
  • Data of a quality that will support trustworthy
    research must be collected as a normal part of
    program operation.
  • Sophisticated data management, storage, retrieval
    and display capabilities must be available.
  • There must be an advisory structure to ensure that the research that is pursued has value to people in the program as well as to the profession at large.

6
How to run the system
  • Need to establish a conceptual model linking the
    inference chain
  • P-12 learning is the product of:
  • Relationship between teacher/curriculum/learner
  • Nested in contexts
  • Kinds of studies needed:
  • Policy oriented: cost/benefit analyses, program evaluation
  • Practice oriented: follow-up studies, short- and long-term effects
  • Basic research: hypothesis testing, hypothesis generating
  • Quality assurance: form use, rating patterns, inter-rater reliability (see the sketch after this list)
  • Data display: development of procedures to display outcomes
  • Instrumentation: refinement of instruments
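For the quality-assurance item above, here is a minimal sketch of one standard inter-rater reliability check, Cohen's kappa, computed from scratch; the rating scale and the rater data are invented for illustration, not actual TWS ratings.

```python
from collections import Counter


def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same set of work samples."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of samples the raters scored identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal rating rates.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    if p_expected == 1:          # degenerate case: both raters use one category
        return 1.0
    return (p_observed - p_expected) / (1 - p_expected)


# Two raters scoring ten work samples on a 1-3 scale (invented data).
rater_1 = [3, 2, 3, 1, 2, 3, 2, 2, 3, 1]
rater_2 = [3, 2, 2, 1, 2, 3, 2, 3, 3, 1]
print(round(cohens_kappa(rater_1, rater_2), 2))  # about 0.69 for these data
```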

7
And the results
  • Teacher preparation programs as contexts for
    research
  • Empirically validated systems of teacher
    preparation
  • Model building and theory building that would
    lead to an empirically validated field of teacher
    preparation
  • Policy and practices informed by evidence (best
    case scenario)
  • Better teachers and more and better P-12 learning
  • The argument suggests we have too much to lose not to attempt to become an empirically driven business, and TWS is uniquely well positioned to help move us ahead if we can improve the quality of the data and our ability to aggregate
  • 1975

8
So what barriers have kept us from getting there?
  • Discussion
  • What are some of the barriers to data driven
    teacher preparation programs that connect to P-12
    student learning?
  • What will be the consequences if we don't get there?
  • How can we move toward this?

9
What is a Teacher Work Sample?
  • A Teacher Work Sample is an authentic performance assessment, completed in a real-world setting, that demonstrates a candidate's ability to assess, plan, instruct, and reflect in a standards-based educational system and to impact student learning in a positive manner.

10
The revolution of standards-based schooling
  • The standards set for learning in today's schools define the successive bars to be reached
    by students as they progress in their learning,
    and standards-linked assessments indicate where
    students stand at a particular point in time with
    respect to a particular bar, but it is each
    student that needs to reach each bar and the main
    job of teachers is to help each student in each
    classroom make steady progress toward each bar
    that lies immediately ahead.
  • (Del Schalock, 2006)

11
Use for teacher licensure
  • Since 1989, in Oregon, each teacher candidate has
    been required to successfully implement two
    teacher work samples prior to being awarded
    initial licensure.
  • A teacher work sample is an empirically validated performance assessment that includes (see the sketch after this list):
  • A setting description
  • Pre-assessment
  • Learning outcomes
  • Lesson plans
  • Post-assessment
  • Analysis of learning data
  • Reflective analysis
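A minimal sketch of the pre-assessment/post-assessment core of a work sample and one common way to summarize pupil growth, the normalized gain; the 20-point scale, the data, and the choice of normalized gain are illustrative assumptions, not the scoring rule used for Oregon licensure.

```python
def normalized_gain(pre, post, max_score):
    """Proportion of the possible improvement a pupil actually achieved."""
    if max_score == pre:            # pupil already at ceiling on the pre-test
        return 0.0
    return (post - pre) / (max_score - pre)


# One candidate's pre/post scores for a small class (invented data),
# keyed by pupil, on a 20-point unit assessment.
pre_scores = {"pupil_1": 8, "pupil_2": 12, "pupil_3": 5}
post_scores = {"pupil_1": 16, "pupil_2": 18, "pupil_3": 14}

gains = {
    pupil: normalized_gain(pre_scores[pupil], post_scores[pupil], max_score=20)
    for pupil in pre_scores
}
class_mean_gain = sum(gains.values()) / len(gains)
print(gains, round(class_mean_gain, 2))
```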

12
Principles
  • An instructional program needs to be aligned with
    and supportive of what candidates are asked to
    do, including the documentation and reporting
    that is required in completing a work sample.
  • School contexts that model and are supportive of
    what candidates are asked to do need to be
    available for practicum and student teaching
    placements.
  • A supervision, evaluation, and feedback system
    needs to be in place that provides guided
    practice in applying and carrying out the tasks
    teacher work sampling demands of candidates.

13
Principles
  • Judgments about a candidate's effectiveness as a teacher need to take into account the gains in learning made by every student taught.
  • Documentation of a candidate's effectiveness as a teacher needs to be accompanied by observations of practice and descriptions of context, as well as evidence of learning gains by students.
  • Multiple lines of evidence need to be considered
    in reaching a recommendation for licensure, only
    some of which come through teacher work sampling.

14
Principles
  • Multiple reviewers of evidence need to be
    involved in preparing a recommendation for a
    license to teach, only some of whom represent a
    teacher education faculty.
  • Evidence needs to be assembled and reported by a
    teacher education faculty on the confidence that
    can be placed in all lines of evidence collected
    through teacher work sampling that inform a
    licensing decision (the reliability and validity
    of information used).
  • A conceptual map is needed to help inform and
    give meaning to candidates regarding the way in
    which the previous 8 principles inform the TWSM.

15

16
Undergirding skills
  • Candidates prepare products or components of a
    work sample as evidence of their developing
    skillfulness. These skills, when employed with
    acumen, facilitate connections between teaching
    and learning.
  • It is connecting teaching and learning that we
    value and we judge it by examining the products
    of the work sample. Therefore we must distinguish
    between skills and products.
  • See handout distinguishing between the
    undergirding skills and the products developed in
    a teacher work sample.

17
Weaving a tapestry
When a candidate successfully weaves together the
underlying skills of the teacher work sample with
real children, in real 21st century schools, we
argue that they have maximized opportunities for
P-12 student learning. The teacher work sample
stands as evidence of this connecting teaching
and learning.
18
TWS Multi-purposing
  • Use of TWS results by programs
  • Disaggregated data
  • Candidates' ability to teach to state and national standards
  • Candidates' ability to enact best practices in content-based pedagogy linked to national professional standards
  • Candidates' ability to impact student learning
  • Aggregated data
  • Program accountability
  • Program improvement
  • Context for research
  • See handout on empirical foundations and supports
    for TWS

19
Optimism about TWSM
  • Reasonableness: hard to argue that we should not value P-12 student learning
  • Feasibility: can be used in any context, at all levels, and is practically familiar
  • Serves multiple purposes: both a pedagogical model and an assessment tool
  • Empirically validated: TEP-II, a longitudinal study of teachers, used TWS and explained more than 40% of the variance in student learning
  • Schalock, H. D., Schalock, M., & Girod, G. (1997). Teacher work sample methodology as used at Western Oregon State College. In J. Millman (Ed.), Grading teachers, grading schools. Thousand Oaks, CA: Corwin.
  • For more information on TEP-II see http://www.tr.wou.edu/tep/products.html
20
Trustworthiness
  • Validation of TWSM extended at Idaho State
    University and Western Kentucky University
  • Dependable rating system and rater training
  • Bias-free rating
  • Validation of frequency, criticality, necessity,
    and representativeness of TWS actions to actual
    practices
  • Linked TWS performance to levels of P-12 learning
  • Denner, P., Norman, A., Salzman, S., Pankratz, R., & Evans, S. (2004). The Renaissance Partnership teacher work sample: Evidence supporting score generalizability, validity, and quality of student learning assessment. In E. Guyton & J. Rainer Dangel (Eds.), Research linking teacher preparation and student performance. Dubuque, IA: Kendall/Hunt.

21
Systems-level approach
  • Teacher work sampling as a compass
  • Keeps programs focused on P-12 student learning
  • Keeps faculty focused on teaching skills,
    knowledge, and dispositions that help to maximize
    P-12 student learning
  • Keeps institutions focused on university teaching
    as a valued outcome
  • Codifies a systems-level commitment to
    connecting teaching and learning

22
Teacher development
Students responded to the prompt, "Did the process of completing a teacher work sample help you think differently about teaching and learning? If so, how? If not, why not?"
"Analyzing my teaching using the work sample really helped me understand the importance of making adaptations and modifications as needed. It made me realize the importance of alignment between context, instruction, adaptations, and assessment. I don't think I really understood this before."
23
Teacher development
And from another student: "The teacher work sample has helped me develop a sense of necessity when it comes to being sensitive and attentive to details, especially the needs and abilities of my students. It helped me recognize the importance of paying attention to pre-assessment results when designing instruction. I can modify instruction to better work with my students' needs."
24
NCATE's focus on student learning
  • A centerpiece of the NCATE performance-based
    system is collecting and aggregating data to show
    that candidates have the knowledge and skills to
    teach effectively so that students learn, a
    requirement that directly impacts approximately
    two-thirds of all new teacher graduates
    nationally.
  • (Westat, 2006)

25
TEAC's focus on student learning
  • Although less explicit, TEAC also references P-12 student learning, as found in Quality Principle 1.3, Teaching Skill, which states that teachers must act on their knowledge in a caring and professional manner that would lead to appropriate levels of achievement for all their pupils.
  • NOTE: Less than one year ago, NCATE and TEAC consolidated, and teacher preparation accreditation is now under a new organization, the Council for the Accreditation of Educator Preparation (CAEP).

26
Art or science?
  • Teacher education lacks a unifying theory founded
    on empirically defensible assertions
  • If developing a science of teacher education is possible, it includes mapping a complex set of interactions, nested in multiple levels, resting on a shifting contextual milieu

27
TWS well positioned
  • Given attention to each stage of Cochran-Smith's inference chain, and sensitivity to multiple levels of context, teacher work sampling is methodologically well positioned to more systematically explore empirical connections between preparation, practices, and P-12 student learning
  • Though VAM (value-added modeling) is powerful, it is not yet particularly useful for informing the work of teacher education beyond giving us hope that what we do matters!

28
Future directions
  • Maturation of TWSM
  • Ongoing instrument validation → predictive validity
  • Dimensionality of undergirding constructs
  • Codifying scoring procedures
  • Codifying non-negotiables at multiple levels
  • Contextualizing methodology
  • Aggregation of TWS information
  • Contributing to a scholarship of teaching
    education
  • Linking to data warehouses → cross-validation
  • Etc.

29
Research Questions
  • Descriptive study
  • How does the learning of P-12 students vary, as evidenced by data from the TWS, across:
  • Kind and complexity of outcomes pursued and met
  • Individuals and groups of students (ethnicity and
    instructional program)
  • Variations in preparation experiences - planned
    variation studies
  • Variations in context, candidate level variables,
    program level variables
  • Variations in instructional strategies,
    assessment strategies, and efforts to
    differentiate

30
(No Transcript)
31
Initial observations
  • More variance within groups than between groups, for both ethnicity and academic program (see the variance sketch after this list)
  • Between-program variance illustrates pedagogical differences, not program quality
  • Need for a web-based, data-entry portal that
    aggregates P-12 learning gain data
  • Will require within-program conversations about
    common language and expectations
  • Will afford programs the ability to link the
    evidentiary chain and become data-driven settings
    for the improvement of teacher education at the
    program level
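A minimal sketch of the within-group versus between-group comparison behind the first observation, using a one-way variance decomposition; the program groups and gain scores are invented for illustration only.

```python
from statistics import mean, pvariance

# Mean learning gains for candidates in three (invented) program groups.
groups = {
    "elementary": [0.55, 0.62, 0.48, 0.70, 0.58],
    "secondary":  [0.50, 0.66, 0.61, 0.45, 0.59],
    "special_ed": [0.60, 0.52, 0.68, 0.57, 0.63],
}

all_scores = [x for scores in groups.values() for x in scores]
grand_mean = mean(all_scores)
n_total = len(all_scores)

# Between-group variance: how far the group means sit from the grand mean.
between = sum(len(s) * (mean(s) - grand_mean) ** 2 for s in groups.values()) / n_total
# Within-group variance: average spread of candidates around their own group mean.
within = sum(len(s) * pvariance(s) for s in groups.values()) / n_total

# The two components add up to the total (population) variance.
print(f"between = {between:.4f}, within = {within:.4f}")
# If within > between, candidates differ more inside each program
# than the programs differ from one another.
```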

32
P-12 student impact
  • Oregon Collaborative Research Initiative
  • Through examination of 98 teacher work samples representing the learning of 2,400 children:
  • Overwhelmingly, students were focused on low-level cognitive outcomes
  • Only small, non-significant differences for
    student-level effects (race/ethnicity, academic
    program)
  • Only small, non-significant differences for
    school-level effects (SES, urban/rural, size)

33
Learning Gain Aggregator
  • Tension in balancing pedagogical goals with
    evaluation/research goals
  • Replace existing efforts; don't add work to students or faculty
  • Getting instructors on board
  • Using FileMaker Pro for a web-accessible database (see the aggregation sketch below)
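A minimal sketch of what the aggregation step might look like once gain data sit in a shared table; the column names and the use of pandas are assumptions made for illustration (the portal described above uses FileMaker Pro).

```python
import pandas as pd

# Invented rows of the kind a web entry portal might collect:
# one row per pupil per work sample.
records = pd.DataFrame({
    "program":       ["elementary", "elementary", "secondary", "secondary"],
    "candidate":     ["C01", "C01", "C07", "C07"],
    "outcome_level": ["recall", "analysis", "recall", "analysis"],
    "gain":          [0.62, 0.41, 0.55, 0.38],
})

# Program-level view: mean gain and pupil count by program and outcome level.
summary = (
    records.groupby(["program", "outcome_level"])["gain"]
           .agg(["mean", "count"])
           .round(2)
)
print(summary)
```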

34
Candidate data
35
Learning outcome data
36
Outcomes by student data
37
Reporting cuts by fields
38
And a real example
39
And one more real example
40
What will we be able to say and do?
  • Descriptive studies of the kind and complexity of
    outcomes pursued and met by
  • (1) candidates with particular qualities
  • (2) in particular kinds of programs
  • (3) in particular kinds of placement settings
  • (4) working with particular kinds of P-12
    children
  • Ex: What do the learning profiles look like for candidates seeking initial licensure in science, working in urban schools, with African American girls receiving special education services, when pursuing performance outcomes for learning?! (See the query sketch after this list.)
  • Planned variation studies: programs become laboratories for experimentation with a fixed dependent variable
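A minimal sketch of the kind of descriptive query imagined in the example above, reusing the idea of a hypothetical aggregated table from the earlier sketch; every column name and value here is invented.

```python
import pandas as pd

# Hypothetical aggregated table: one row per pupil per work sample.
tws = pd.DataFrame({
    "license_area": ["science", "science", "math", "science"],
    "setting":      ["urban", "rural", "urban", "urban"],
    "pupil_group":  ["A", "B", "A", "A"],
    "outcome_type": ["performance", "recall", "performance", "performance"],
    "gain":         [0.58, 0.44, 0.61, 0.49],
})

# Learning profile for one slice of candidates, settings, and outcome types.
profile = tws[
    (tws["license_area"] == "science")
    & (tws["setting"] == "urban")
    & (tws["outcome_type"] == "performance")
]
print(profile["gain"].describe())
```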

41
Challenges and opportunities... and this is the last slide!
  • Getting faculty to agree to a common vision of
    their work
  • Having faculty leadership to establish proof of
    concept
  • Protecting time and resources to get there
  • Gives faculty a defensible outcome variable for
    all research
  • Provides evidence on program effectiveness
  • Moves teacher preparation ahead
  • Mark Girod girodm@wou.edu
  • Western Oregon University