Title: Using Student Assessment Results to Improve Teacher Knowledge and Practice
1 Using Student Assessment Results to Improve Teacher Knowledge and Practice
NESA FLC, 24–27 October 2013
- Martina Bovell, B.A., Dip. Ed., Grad. Dip. Arts (UWA) - Senior Research Fellow
2 www.acer.edu.au
- 80 years' experience
- Independent, not-for-profit
- Over 300 staff in seven offices: Melbourne, Sydney, Brisbane, Adelaide, Perth, India (New Delhi), UAE (Dubai)
- Four goals:
- Learners and their needs: every learner engaged in challenging learning opportunities appropriate to their readiness and needs
- The Learning Profession: every learning professional highly skilled, knowledgeable and engaged in excellent practice
- Places of Learning: every learning community well resourced and passionately committed to improving outcomes for all learners
- A Learning Society: a society in which every learner experiences success and has an opportunity to achieve their potential
3 ACER and international assessment
- Member of the International Association for the Evaluation of Educational Achievement (IEA).
- Consortium leader for the OECD's Programme for International Student Assessment (PISA), 1998–2012.
4 Purposes of assessment
- clarify educational standards
- monitor trends over time
- evaluate the effectiveness of educational initiatives and programs
- ensure that all students achieve essential skills and knowledge
5 International Schools Assessment (ISA)
- http://www.acer.edu.au/tests/isa
- Over 64,000 students from 312 schools participated in the ISA in October 2012 and February 2013.
6 The ISA
- Grades 3–10
- Mathematics
- Reading
- Expository writing
- Narrative writing
- Grades 8–10
- Science (online)
- New in the 2013–2014 testing cycle
7 How ISA results are reported
- All students who participate in ISA tests have their performance measured against a single scale.
- There is a separate scale for each of the five domains assessed (Reading, Mathematics, Narrative Writing, Expository Writing, and Science).
- The ISA scale score for a student is different from the raw score that the student would get by adding up the number of correctly answered questions on a test.
- The ISA scale allows meaningful comparisons of results between different grade levels and between different calendar years, even though the tests administered are not the same.
- The ISA scales are based on those developed for the Organisation for Economic Co-operation and Development's (OECD's) Programme for International Student Assessment (PISA).
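The raw-score/scale-score distinction above can be illustrated with a toy calculation. This is a hedged sketch only: ISA and PISA scaling actually rests on item response (Rasch) modelling, and the test-form names and equating constants below are invented for illustration.

```python
# Illustrative sketch only: ISA/PISA scaling actually uses item response
# (Rasch) modelling. The test-form names and equating constants below are
# invented for illustration.

def raw_to_scale(raw_score, test_form):
    """Map a raw score onto a hypothetical common scale.

    Each test form gets its own (slope, intercept), estimated so that
    scores from different forms land on one comparable scale.
    """
    equating = {
        "grade5_2012": (8.0, 280.0),   # hypothetical constants
        "grade7_2013": (6.5, 330.0),
    }
    slope, intercept = equating[test_form]
    return intercept + slope * raw_score

# The same raw score on two different (easier/harder) tests maps to
# different scale scores; raw scores alone are not comparable.
print(raw_to_scale(20, "grade5_2012"))  # 440.0
print(raw_to_scale(20, "grade7_2013"))  # 460.0
```

The point of the sketch is the last two lines: a raw score of 20 means different things on different tests, which is why the ISA reports scale scores rather than counts of correct answers.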
8 Types of information
- Whole-grade mean and distribution
- all students, language background, gender
- this year and previous years (longitudinal comparisons)
- compared to all other ISA schools
- compared to all other "like" schools (based on % of students with ESB)
- For grades 8–10, comparison with PISA results (Maths, Reading, Science)
- Classes within whole grades
- class mean and individual student scale scores
- performance by item classification within a domain
- compared to all other ISA schools
- compared to all other "like" schools
- compared to other classes in your school
- Item-by-item results for each student
- by item classification
- compared to other individuals
- compared to all other ISA schools
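As a toy illustration of the whole-grade summaries listed above, the snippet below computes a grade mean, spread, and a breakdown by one background variable. The scale scores and gender labels are invented sample data, not real ISA results.

```python
# Sketch of the whole-grade summaries described above; the scale scores
# and the gender grouping are invented sample data, not real ISA results.
from statistics import mean, stdev

grade = [
    ("F", 431), ("M", 415), ("F", 472), ("M", 440), ("F", 455), ("M", 402),
]
scores = [s for _, s in grade]

# Whole-grade mean and spread
print(round(mean(scores), 1), round(stdev(scores), 1))  # 435.8 25.7

# Breakdown by one background variable (here gender)
for g in ("F", "M"):
    sub = [s for sex, s in grade if sex == g]
    print(g, round(mean(sub), 1))
```

In a real report the same summaries would also be computed for language background and compared against all ISA schools and "like" schools.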
9 ISA Reports
- Paper-based
- Electronic
- tracking
- interactive
- Science online test delivery means faster
turn-around of results
10 Improve teacher knowledge and practice
- A growth model:
- acknowledges that each student is at some point in their learning
- expects every student to make excellent learning progress regardless of their starting point
- assesses growth over time
11 ACER National School Improvement Tool (NSIT)
- A means of pursuing ACER's mission of improving learning
- Endorsed by Australian federal and state governments
- Research-based
- Informs and assists schools' improvement agendas
- Available to all schools
12 http://www.acerinstitute.edu.au/home
- Part of ACER's Institute of Learning's larger school improvement project.
- The Institute's services include:
- Professional learning
- School review services
- Capacity building for school improvement
13 www.acer.edu.au/documents/NSIT.pdf
15 Review of a domain
16 Domain performance levels
17 Domain performance level: LOW
- Teachers do not systematically analyse test and other data for their classes, and make little use of data to reflect on their teaching.
18 Domain performance level: MEDIUM
- An ad hoc approach exists to building staff skills in the analysis, interpretation and use of classroom data.
19 Domain performance level: HIGH
- One or more members of staff have been assigned responsibility for implementing the annual plan, analysing the full range of school data, and summarising, displaying and communicating student outcome data for the school. The school has ensured that appropriate software is available and that at least these assigned staff have been trained to undertake data analyses.
- Time is set aside (e.g. on pupil-free days and in staff meetings) for the discussion of data and the implications of data for school policies and classroom practices. These discussions occur at whole-school and team levels.
20 Domain performance level: OUTSTANDING
- Data are used throughout the school to identify gaps in student learning, to monitor improvement over time and to monitor growth across the years of school.
- A high priority has been given to professional development aimed at building teachers' and leaders' data literacy skills.
- Staff conversations and language reflect a sophisticated understanding of student assessment and data concepts (e.g. value-added growth, improvement, statistical significance).
- Teachers are given class test data electronically and are provided with, and use, software to analyse, display and communicate data on individual and class performances and progress, including pre- and post-test comparisons.
- Teachers routinely use objective data on student achievement as evidence of successful teaching.
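As a small illustration of the pre- and post-test comparisons mentioned above, the sketch below computes per-student and class-average gains. The student names and scale scores are invented sample data.

```python
# Hedged sketch of a pre-/post-test comparison like the one described above.
# Student names and scale scores are invented sample data.
pre  = {"Ana": 410, "Ben": 455, "Caro": 390}
post = {"Ana": 436, "Ben": 461, "Caro": 428}

# Per-student growth between the two testing occasions
for name in pre:
    gain = post[name] - pre[name]
    print(f"{name}: {pre[name]} -> {post[name]} (gain {gain})")

# Average gain across the class
class_gain = sum(post[n] - pre[n] for n in pre) / len(pre)
print(round(class_gain, 1))  # 23.3
```

Even this simple view shows why growth data matter: the lowest-scoring student here made the largest gain, which a single snapshot score would hide.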
21 Data literacy framework
- U.S. Department of Education, Office of Planning, Evaluation and Policy Development (2011). Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports.
- STUDY
- An investigation of teachers' thinking about student data: hypothetical education scenarios, accompanied by data displays and questions, were administered in individual and small-group interviews to teachers in schools identified as exemplary in active data use (50 teachers, 72 small groups).
- PURPOSES
- 1. to investigate how teachers think and reason about data independently, and how they build on each other's understandings when working with data in small groups
- 2. to isolate the difficulties, misconceptions and supports needed
22 What they found
- Teachers' likelihood of using data is affected by how confident they feel about their knowledge and skills.
- Working in small groups appears to promote teachers' engagement with data. Compared with working individually, teachers were:
- more likely to arrive at sound data interpretations
- more likely to use a wider range of skills when making decisions about how to use and interpret data
- able to clarify and frame problems and correct data interpretation errors
- more likely to enjoy discussing data
23 Data literacy skill areas: US study framework
- Data location
- find relevant pieces of data in the data system or display
- Data comprehension
- understand what the data is saying
- Data interpretation
- figure out what the data mean
- Instructional decision making
- select an instructional approach to address the situation identified in the data
- Question posing
- frame instructionally relevant questions that can be addressed by the data
24 Biases when making decisions using data
- Representativeness/availability bias
- When judging the probability of something, we use preconceptions based on how similar two things are, or the extent to which an event matches our previous experience.
- e.g. irrelevant personal characteristics or stereotypes
- Anchoring and adjustment bias
- We make a decision based on an initial calculation without following through on the calculations.
- We ignore data that doesn't agree with our preliminary decisions or biases (we agree with what we expect).
- Confidence in making the decision is not always associated with the quality of the decision making.
- Group decision making can mitigate some of these biases.
25 PISA findings about decision making
- PISA in Focus 26: http://www.oecd.org/pisa/pisainfocus/
- Countries vary in the way they use marks, but they all tend to reward the mastery of skills and attitudes that promote learning.
- Teachers tend to give girls and socio-economically advantaged students better school marks, even if they don't have better performance and attitudes than boys and socio-economically disadvantaged students.
- It seems that marks not only measure students' progress in school; they also indicate the skills, behaviours, habits and attitudes that are valued in school.
26 What biases?
27 Narrative Writing assessment
- Narrative task
- Same tasks and marking criteria for all grades
- Content (0–14): quality and range of ideas; development of plot, characters and setting; the writer's sense of audience and purpose; the overall shape of the writing.
- Language (0–14): sentence and paragraph structure, vocabulary and punctuation, and the writer's voice.
- Spelling (0–11): considers phonetic and visual spelling patterns, the kind of words attempted, and correctness.
28 Reporting
- Content 7 (max score 14)
- Language 6 (max score 14)
- Spelling 5 (max score 11)
- Individual student report comment
- Level 5: Write a story with some developed detail in content and using a variety of sentence forms. Spell correctly many words from a student-level vocabulary.
29 Reading Framework
The ISA definition of reading literacy derives from PISA: "Understanding, using, reflecting on and engaging with written texts, in order to achieve one's goals, to develop one's knowledge and potential, and to participate in society." It goes beyond decoding and literal comprehension and recognises the full scope of situations in which reading plays a role in the lives of students from grades 3 to 10. Three parts: ASPECT, TEXT TYPE, TEXT FORMAT.
30 Reading Framework
ASPECT
- Retrieving information: locating, selecting and collecting one or more pieces of information in a text.
- Interpreting texts: making sense of the text, constructing meaning, making connections and drawing inferences from one or more parts of a text, e.g. cause and effect, compare and contrast, category and example. Involves information that is not stated.
- Reflecting: drawing on knowledge, ideas and values external to the text to evaluate a text; relating a text to one's experience, knowledge and ideas.
TEXT FORMAT
- Continuous texts: composed of sentences that are, in turn, organised into paragraphs. These may fit into even larger structures such as sections, chapters and books. Narrative pieces, exposition, description, argument and instructional passages.
- Noncontinuous texts: essentially, texts composed of one or more lists in which information is presented in, e.g., tables, graphs, maps and diagrams.
31 Feedback from EARCOS
- "This workshop was exactly what was needed to guide us. It really helped the focus and the analysis of data. I hope ISA continues to provide this service to support schools who use this service."
- "Very useful and I am motivated to do my work better."
- "I felt that the workshop has enabled me to speak with some authority on how we can unpack the ISA results. More importantly, I have an insight into how we can use these results to better inform further decisions, ones that are based on student achievement."
- "Very useful. More of these sessions need to be provided to help teachers understand how data can be used to improve learning."
- "Informative and well worth the time. Hands-on, interactive learning."
32 ISA Interactive Diagnostic Report
- A guide to using the interactive diagnostic report that was demonstrated during the conference session is downloadable at:
- http://www.acer.edu.au/documents/ISA_Using_the_Interactive_Diagnostic_Report.pdf
33 Interrogating data
- Focus on the student
- Did the student perform as well as expected?
- Does the performance match expectations/reflect teacher judgement about the student?
- What does the student's response pattern show about the strengths of the student?
- What does the student's response pattern show about the areas of concern for the student?
- Are any areas of concern preventing the student from making progress? What might account for these?
34 Interrogating data
- Focus on the group: group scores
- How does the group achievement relate to the bands?
- How does the class distribution against the bands match expectations about the group?
- Did the group as a whole perform as well as expected?
- Does the relative order of students match expectations about the students?
- Which students have achieved higher than expected, or lower than expected, in relation to others?
- Are there students in the group with similar achievements?
35 Interrogating data
- Focus on the group: group scores
- Do students with similar scale scores have similar or different response patterns?
- What assessment criteria do students perform well on?
- What assessment criteria do students perform less well on?
- What does the group's response pattern show about its strengths?
- What does the group's response pattern show about its areas of concern?
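The first question above, whether students with similar scale scores show similar item response patterns, can be made concrete with a toy comparison. The scores and item results below are invented; a real ISA diagnostic report supplies this item-by-item data.

```python
# Illustrative sketch: do students with similar scale scores show similar
# item response patterns? Scores and item results are invented sample data;
# a real ISA diagnostic report supplies this item-by-item information.
students = {
    "A": {"scale": 452, "items": [1, 1, 0, 1, 0, 1]},  # 1 = item correct
    "B": {"scale": 449, "items": [0, 1, 1, 0, 1, 1]},
}

def pattern_overlap(a, b):
    """Fraction of items on which two students agree (both right or both wrong)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

overlap = pattern_overlap(students["A"]["items"], students["B"]["items"])
# Near-identical scale scores, yet the response patterns differ on most
# items, pointing to different strengths within the domain.
print(round(overlap, 2))  # 0.33
```

Two students can arrive at almost the same scale score via quite different routes, which is exactly why the questions above look past the score to the response pattern.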
36 Interrogating data
- Focus on the teaching program
- Has any teaching impacted on the group's results?
- Are there any areas of concern preventing the whole group from making progress?
37 www.acerinstitute.edu.au/conferences/eppc
Presented by practitioners, for practitioners. ACER recognises that, every day, teachers and school leaders are responsible for improving learning among students. This conference provides an opportunity to report on and celebrate the improvements you have achieved within your classes, across the whole school or within networks of schools. Call for papers now open.
38 School Improvement Tool contact
- Robert Marshall
- Senior Project Director
- Australian Council for Educational Research
- 19 Prospect Hill Road, Camberwell, Victoria, Australia 3124
- Robert.Marshall@acer.edu.au
- +61 3 9277 5346
- 0439 665 965
39
- Martina Bovell
- Senior Research Fellow
- Australian Council for Educational Research
- 7/1329 Hay Street, West Perth, Western Australia 6005
- martina.bovell@acer.edu.au
- +61 8 9235 4821
- +61 439 926 277
40 References
- Barber, M. and Mourshed, M. (2007). How the World's Best-Performing School Systems Come Out on Top. McKinsey and Co.
- Dweck, C.S. (2006). Mindset: The New Psychology of Success. New York: Ballantine Books.
- Fullan, M., Hill, P. and Crevola, C. (2006). Breakthrough. California: Corwin Press.
- International Baccalaureate Organisation (2010). IB Learner Profile Booklet. www.ibo.org
- International Baccalaureate Organisation (2010). Programme Standards and Practices. www.ibo.org
- Masters, G. (2013). Towards a Growth Mindset in Assessment. ACER Occasional Essays 2013. www.acer.edu.au
- Tversky, A. and Kahneman, D. (1982). Judgment under uncertainty: Heuristics and biases. In Judgment under Uncertainty: Heuristics and Biases, eds. D. Kahneman, P. Slovic and A. Tversky, 3-20. New York: Cambridge University Press. Cited in U.S. Department of Education, Office of Planning, Evaluation and Policy Development (2011), Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports.