Title: ASSESSING ICT LITERACY IN AUSTRALIAN SCHOOLS
Slide 1: ASSESSING ICT LITERACY IN AUSTRALIAN SCHOOLS
- John Ainley
- Julian Fraillon
- Chris Freeman
- Juliette Mendelovits
Slide 2: Outline
- Introduction
- National goals and their assessment
- Features of the PMRT assessment of ICT literacy
- Assessment framework
- What is ICT literacy?
- What is progress in ICT literacy?
- Developing an assessment
- Item types
- Processing and marking
- Assessment content
- Conducting the assessment
- Stages and operations
- Data analysis
- Conclusion
Slide 3: ICT Literacy in the National Goals
- Goal
- Students will be confident, creative and productive users of new technologies, particularly information and communication technologies, and understand the impact of those technologies on society
- Assessment
- Assessment of ICT literacy through sample surveys of students in Year 6 and Year 10 every three years, beginning October 2005
Slide 4: Key Features of the ICTL Assessment
- Focus on computer literacy
- Using computers to do computer tasks
- Construction of products or artefacts
- Skill assessment tasks
Slide 5: Features of the methodology
- Uniform presentation
- Delivered on uniformly configured laptop computers
- Mini-labs delivered to schools
- Administered by trained administrators
- Windows platform
- Reflects resource constraints
- Reflects usage patterns
- Collect data on student usage to investigate effects on measures
- Software features
- Simulated environment for skill assessment (SkillCheck)
- Real applications for large tasks
- Seamless movement between simulated and real environments (Sonet)
- Includes other applications (InAlbum, Flag Builder)
Slide 6: Assessment framework
Slide 7: The ICT Literacy Framework
- Definition
- the ability of individuals to use ICT appropriately to access, manage and evaluate information, develop new understandings, and communicate with others in order to participate effectively in society
Slide 8: The ICT Literacy processes
- Accessing information
- Managing information
- Evaluating
- Developing new understandings
- Communicating with others
- Using ICT appropriately
Slide 9: The PISA ICT Literacy framework processes
- Access
- Manage
- Integrate
- Evaluate
- Construct
- Communicate
Slide 10: Comparison of MCEETYA and PISA ICT processes
- MCEETYA processes
- Accessing information
- Managing information
- Evaluating
- Developing new understandings
- Communicating with others
- Using ICT appropriately
- PISA processes
- Access
- Manage
- Integrate
- Evaluate
- Construct
- Communicate
Slide 11: The ICT Literacy framework Strands
- Strand A
- Working with Information
- Strand B
- Creating and sharing information
- Strand C
- Using ICT responsibly
Slide 12: ICT Literacy Domain Processes and Strands
Slide 13: The ICT Literacy Progress Map
Slide 14: Strand C (Using ICT responsibly)
- This strand includes understanding the capacity of ICT to impact on individuals and society, and the consequent responsibility to use and communicate information legally and ethically.
Slide 16: Detail of the ICTL progress map
- Level 1, Strand C (Using ICT responsibly)
- Understands and uses basic terminology and general procedures for ICT. Describes uses of ICT in everyday life.
Slide 18: Detail of the ICTL progress map
- Level 5, Strand C (Using ICT responsibly)
- Identifies the social, legal, economic and ethical consequences associated with using ICT across a range of environments and contexts.
Slide 19: Developing assessment items to reflect the progress map
- Two reasons Amanda provided the source of the data are:
- to inform readers where the information has come from
- to help readers look up the information themselves
- Give one other reason why Amanda may have provided the source of the data below.
- Code 1: Identifies that providing the source of the data acknowledges the efforts of the person who originally completed the work, OR that it provides credibility by demonstrating use of evidence from an external source.
Slide 20: Assessment instrument
Slide 21: The assessment instrument
[Diagram: starting from existing electronic resources, two delivery options are contrasted: native applications offer flexibility, capacity (task size) and functional authenticity; software simulations offer robustness, consistency and data capture]
Slide 22: The assessment instrument
[Diagram: a hybrid instrument combines software simulations (robustness, consistency, data capture) with native applications (flexibility, capacity, functional authenticity)]
Slide 23: The assessment instrument
[Diagram: the hybrid instrument mapped against proficiency levels 1-6 and the three strands, with score points allocated approximately 40 to Strand A, 40 to Strand B and 20 to Strand C]
Slide 24: Assessment characteristics
- Modular structure
- Modules
- based on educational and personal contexts
- tasks around a common theme
- All students complete:
- a General Skills Test
- two Hybrid Assessment Modules (HAMs)
- a Student Background Survey
Slide 25: The assessment instrument
[Diagram: rotated test design; each student completes the General Skills Test, two of HAM 1-6 (HAM 6 in form a or b), and the Student Background Survey]
Slide 26: Hybrid Assessment Modules
- Common theme
- Simulation, multiple-choice and short-answer questions
- Questions are preparation for a Large Task:
- searching for/collecting information
- evaluating information
- reshaping information
- Large Tasks
- electronic artefact using native applications
- 40% of the score points and time in each HAM
Slide 28: Marking
[Diagram: simulation and MCQ tasks are marked automatically; short constructed responses and student products (artefacts) receive human grading; all results feed the student scores database]
Slide 29: Marking large task artefacts
- A. Substance
- e.g. PowerPoint: text adaptation from resources
- 0: Large sections of text copied from the resources and pasted with no editing.
- 3: Relevant sentences have been copied from resources and pasted. Some sentences have been semantically linked with the student's own words.
- 5: The key points from the resources have been rephrased and linked using the student's own words or graphics to suit the purpose and audience.
Slide 30: Marking large task artefacts
- B. Use of technology
- e.g. PowerPoint: use of colour
- 0: No evidence of manipulation of text or background colours to support reading of the slides. Large amounts of text may be difficult or impossible to read due to poor contrast.
- 3: Slides show evidence of careful planning regarding the use of colour. The colours chosen for the text and background are aesthetically compatible and appropriate to the task. There is sufficient contrast to enable all text and images to be seen easily, and there is clear continuity in the use of colours for specific purposes (such as headings).
Slide 31: Marking
[Diagram: automatically marked simulation/MCQ tasks and human-graded short constructed responses and student products (artefacts) feed the student scores database, which undergoes IRT (Rasch) scaling; together with the student details database and the student background survey, the scaled scores feed multilevel modelling]
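Slide 31 refers to IRT (Rasch) scaling of the student scores database. As a minimal sketch of what that model does (the item difficulties and responses below are invented for illustration, not taken from the survey): under the dichotomous Rasch model the probability of a correct response depends only on the gap between student ability theta and item difficulty b, and a student's ability can be estimated by Newton-Raphson on the log-likelihood when item difficulties are known.

```python
import math

def rasch_prob(theta, b):
    """Probability that a student of ability theta answers an item
    of difficulty b correctly, under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iters=50):
    """Maximum-likelihood ability estimate for one student, given
    scored responses (0/1) and known item difficulties.
    Newton-Raphson on the log-likelihood; assumes a mixed score
    (the MLE does not exist for all-correct or all-wrong patterns)."""
    theta = 0.0
    for _ in range(iters):
        probs = [rasch_prob(theta, b) for b in difficulties]
        # Gradient: observed score minus expected score
        grad = sum(x - p for x, p in zip(responses, probs))
        # Observed information: sum of item information p(1 - p)
        info = sum(p * (1 - p) for p in probs)
        if info < 1e-9 or abs(grad) < 1e-9:
            break
        theta += grad / info
    return theta

# Illustrative items and one student's scored responses (invented data)
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]
responses = [1, 1, 1, 0, 0]
theta_hat = estimate_ability(responses, difficulties)
```

At the estimate, the expected score under the model matches the student's observed raw score, which is the defining property of the Rasch maximum-likelihood estimate.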
Slide 32: Conducting the assessment
Slide 33: Conducting the assessment
- Stages in development of tasks
- Pilot
- Field Trial
- Four jurisdictions
- 620 students in 66 schools
- 150% of items, providing redundancy
- Trial items, procedures and infrastructure
- Main Survey
- All jurisdictions
- Decentralisation of test administration
Slide 34: Conducting the assessment
- Two-stage sample
- PPS sample of schools
- 260 schools with Year 6
- 260 schools with Year 10
- Random selection of 15 students per school
- 3,900 Year 6 and 3,900 Year 10 students
- Mainland states: 600 students (40 schools) per year level
- Smaller jurisdictions: 300 students per year level
- Test Administrators
- Administered by trained administrators
- Nominations from state-based liaison officers
- Test administrators as main contact point with schools
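The two-stage design above, a PPS (probability proportional to size) sample of schools followed by a random draw of up to 15 students per school, can be sketched as follows. School names and enrolment sizes are invented for illustration; operational designs also handle stratification and certainty schools, which this sketch omits.

```python
import random

def pps_systematic_sample(schools, n_sample, seed=1):
    """Select n_sample schools with probability proportional to
    enrolment size, using systematic PPS sampling: lay the schools
    along a line of cumulative sizes and take equally spaced hits.
    schools: list of (name, enrolment_size) pairs.
    Note: a school larger than the sampling interval can be hit more
    than once; in practice such schools are taken as certainties."""
    rng = random.Random(seed)
    total = sum(size for _, size in schools)
    interval = total / n_sample
    start = rng.uniform(0, interval)
    hits = [start + i * interval for i in range(n_sample)]
    chosen, cum = [], 0.0
    it = iter(schools)
    name, size = next(it)
    for h in hits:
        # Advance until the current school's interval covers the hit
        while cum + size < h:
            cum += size
            name, size = next(it)
        chosen.append(name)
    return chosen

def sample_students(student_ids, n=15, seed=1):
    """Second stage: simple random sample of up to n students
    from one selected school's enrolment list."""
    rng = random.Random(seed)
    return rng.sample(student_ids, min(n, len(student_ids)))
```

Systematic PPS is only one way to implement the first stage; it is shown here because it makes the "probability proportional to size" property easy to see: a school's chance of selection equals its share of the cumulative size line.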
Slide 35: Conducting the assessment
- Marker Training
- Student products captured electronically
- Marking in a computer-based environment
- Moving from a pen-and-paper environment to an electronic environment
- Logistics
- Geographic distances
- Movement and co-ordination of resources
- Quality of test administrators
Slide 36: Conducting the assessment
- Data Analysis
- Psychometric Model
- General Skills Test and common modules
- Individual analyses to test links by year level
- Concurrent analysis of assessment matrix
- Analytic procedures
- Quest and RUMM
- Analysis by Module
- Analysis by Strand
Slide 37: Conducting the assessment
- Data Analysis
- Trial data indicate effective targeting
- Relations between Strands
- Unidimensionality of variables
- Fit statistics within strands and overall
- Correlations between strands
- Strands A and B versus Strand C
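The strand correlations referred to above can be checked with a plain Pearson correlation between students' strand scores. A minimal sketch follows; the score vectors are invented, and the actual analyses used Quest and RUMM rather than hand-rolled code.

```python
def pearson(xs, ys):
    """Pearson product-moment correlation between two score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Illustrative strand scores for six students (invented data)
strand_a = [12, 15, 9, 20, 14, 17]
strand_b = [11, 16, 8, 19, 13, 18]
strand_c = [5, 7, 4, 9, 6, 8]
r_ab = pearson(strand_a, strand_b)
r_ac = pearson(strand_a, strand_c)
```

High correlations between strands, together with acceptable fit statistics, are what supports reporting the strands on a single unidimensional scale.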
Slide 38: Conclusion
- Assessment tool for ICT Literacy
- Computer-based tasks
- Combines skill assessment and larger tasks
- Sound psychometric properties
- National assessment survey
- Estimate students achieving a proficient standard
- Reported for:
- all students
- jurisdictions
- males and females
- Indigenous and non-Indigenous students
- LBOTE and other students
- geographic locations (metropolitan, provincial and remote)
- socio-economic status categories
Slide 39: Finish