Title: Is Online Testing Viable for Language Courses?
1 Is Online Testing Viable for Language Courses?
TEL_at_York Conference 2007, May 1, 2007
- Norio Ota, Noriko Yabuki-Soh, Alison Devine-Tanimura (DLLL), Mike Street (IT Consultant, ATS)
2 On-line Testing using Moodle
- Panel
- This is Part 2 of our report on developing on-line tests using Moodle, presented last year. Some of the information will be repeated for a new audience. First I would like to talk about the background and make general remarks on the Japanese Section's on-line testing. Next, Alison Devine will discuss the pros and cons of using Moodle to create and implement on-line tests for Japanese from a novice user's viewpoint. Her presentation will help would-be users understand what is involved in developing on-line tests. Mike Street will then discuss technical issues, including problems unique to Moodle and those related to administering tests. He will also comment on some of the on-going and future developments of Moodle. Noriko Yabuki-Soh will discuss how we have developed various types of questions for Japanese tests and examine students' feedback. Last, I will show how Moodle has been used to create advanced-level tests in Japanese and how they were implemented in distant locations, including Halifax and Japan.
3 Background
- The Japanese Section has been developing web-based courses and instructional materials for self-study over the past 11 years, and offered an experimental distance education course for four years. In September 2006 it offered an advanced-level Japanese language course using a distance education format via video-conferencing for students at St. Mary's University in Halifax.
- The TEL initiatives by the Japanese Section of DLLL are as follows:
- Server-based web course development
- Developing interactive instructional materials for self-study
- Developed a distance education course for the elementary-level Japanese language course (tested at Glendon for 4 yrs) using video-conferencing
- Introducing Media Site Live for video-streamed lectures
- Developing web-based on-line tests for the elementary Japanese course, assisted by ATS (2005-07)
- Developed a distance education course for the advanced Japanese course for St. Mary's University in Halifax (2006-07)
4 Web-based Testing (WBT)
- One of the most challenging aspects of web-based on-line language courses is how to implement testing on-line. Web-based testing (WBT) has overcome some of the restrictions of Computer-Based Testing (CBT) and opened a door to developing browser-based tests, which are more flexible and user-friendly. Many on-line course delivery software products have been developed and made available for creating on-line tests, such as WebCT, Hot Potatoes, Sakai, FLE3 and Moodle. All of these products have a quiz- and test-creating component. With the uncertainty of the future of WebCT and other products, we have chosen Moodle to put first-year language tests on-line for the following reasons:
- Free
- Open-source and customizable
- Non-proprietary
- User-friendly
- Ease of installation
- Flexible
- Language support
- Comprehensive
- The main rationale of this project was that implementation of on-line tests would save much time and energy for the two instructors, who would otherwise have had to mark 270 tests four times a year. On-line testing is also a must for developing distance education courses.
5 Challenges for on-line testing for languages
- Language teaching professionals are often reluctant to develop on-line tests due to the following restrictions:
- limited types of questions
- lack of analytical tools (natural language parsing)
- lack of qualitative evaluation
- lack of evaluation of communicative competence
- security issues
- technical issues
- administrative issues
- These are still legitimate concerns, which will be discussed in this session, but on-line testing is used mainly to assess learners' learned knowledge.
6 Implementation and Objectives
- On-line testing is NOT comprehensive
- On-line testing is to assess each learner's knowledge and recognition of:
- Vocabulary
- Expressions
- Conjugations
- Sentence structures
- Basic kana characters and basic sino-Japanese characters (kanji)
- Simple context (communicative understanding)
- Sociolinguistic and pragmatic aspects
- It is to be underlined that the purpose of these tests is to assess each learner's knowledge and recognition of vocabulary, expressions, conjugations, sentence structures, basic kana characters and basic sino-Japanese characters (kanji). Questions which test learners' understanding of context are included in the form of a short dialogue. Understanding of various sociolinguistic and pragmatic aspects, such as honorifics and speech acts, is also tested in terms of learned knowledge. This position can be justified because other aspects of language learning, such as communicative activities, listening recognition and comprehension, reading and writing, are covered in the classroom in terms of experiential knowledge. The Section plans to incorporate audio and video files for listening comprehension and to test knowledge of sociolinguistic and pragmatic aspects in 2007-08.
7 Developing tests
- We embarked on this project by transforming the written versions of the tests into on-line versions with certain modifications, and by inventing new types of questions that would be more suitable to the nature of the program:
- Transforming paper-based tests into web-based versions
- Modifying types of questions
- Developing new types of questions
- Reviewing students' answers and modifying possible answers
- Readjusting answers to introduce a partial marking system
- Developing a questionnaire for students' feedback
- As is always the case with attempts at early stages, we have observed pros and cons ourselves and from the students' responses to the questionnaire.
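The partial marking system mentioned above can be illustrated with a short sketch. This is not Moodle's actual grading code; the function name, the answer data and the credit weights below are all hypothetical, shown in Python only to make the idea concrete: each accepted answer string carries a credit weight, so a near-miss earns partial credit instead of being marked flatly wrong.

```python
# Hypothetical sketch of partial marking for a short-answer question.
# Each accepted answer is paired with a credit weight between 0.0 and 1.0.

def grade_answer(response, accepted):
    """Return the credit (0.0-1.0) of the best-matching accepted answer."""
    normalized = response.strip().lower()
    best = 0.0
    for answer, credit in accepted:
        if normalized == answer.strip().lower():
            best = max(best, credit)
    return best

# Illustrative variants for one question: full credit for the polite form,
# half credit for the plain form (the weights are assumptions, not ours).
accepted = [
    ("tabemasu", 1.0),  # polite non-past "to eat"
    ("taberu", 0.5),    # plain form: partial credit
]

print(grade_answer("Tabemasu ", accepted))  # 1.0
print(grade_answer("taberu", accepted))     # 0.5
print(grade_answer("tabeta", accepted))     # 0.0
```

The same idea generalizes to readjusting answers after reviewing student responses: a frequent near-correct variant can simply be added to the list with a reduced weight.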
8 Outcome
- Learn more about limitations, bugs and positive features of Moodle
- Why were test scores lower this year?
- Academic honesty issues
- Learning process for faculty
- Faculty's willingness to learn, with no immediate resistance to Moodle
- Cooperation between IT consultant and faculty, with a good working relationship and initiatives
- Tech problems are hard for faculty to deal with alone; a tech support person is required at the test site
- Limitations re the test site
9 A Novice's Observations
- Advantages
- Disadvantages
- Suggestions for improvement
10 Physical Advantages
- Paperless = Green
- Remote Access means students can:
- see their final exam
- learn from their mistakes
- query grades via email
- Instructors can:
- reconsider grades anytime / anywhere
11 Design Advantages (1)
- No black streaks from the photocopier, etc.
- GIFs for high-resolution visuals → colour instead of B&W
- Colour-coded Q-prompts → increased clarity of instruction to the test-taker
12 GIFs & Colour Coding
13 Design Advantages (2)
- Re-ordering Q.s
- Delete / replace Q.s
- Display a set number of Q.s per page
14 Sorting Question Order
15 Sorting Question Order
16 Grading Advantages
- Automatic grading → ease & speed for large courses
- Feedback Function → reduced writing time, fair distribution of comments
- Regrade Function → ergonomic
- Typed answers → legible
17 Automatic Grading
18 Feedback Function
19 Question Design Disadvantages
- GIFs for visuals
- Allow cut & paste of clip art / formatted Excel tables
- Create an online community for GIF sharing
- Formatting of Q text
20 Answer Base Disadvantages (1)
- Need to input ALL POSSIBLE strings
- Or manually grade ALL students' tests
- REDUNDANT
- → Have ALL students' answers saved into the Q-Answer Base
- or
- → Display ALL Q.1, Q.2, Q.3
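One way to shrink the set of strings that must be entered by hand is to normalize responses before comparison, so that one stored answer covers many typed variants. The sketch below is an assumption about how such preprocessing could work, not a description of Moodle's answer matching; all names are illustrative. NFKC normalization is useful for Japanese input because it unifies full-width and half-width characters.

```python
import unicodedata

def normalize(text):
    """Collapse trivial variation so one stored answer matches many typed forms."""
    text = unicodedata.normalize("NFKC", text)  # unify full-/half-width forms
    text = text.lower().strip()
    text = " ".join(text.split())               # collapse internal whitespace
    return text

def matches(response, accepted_answers):
    """True if the normalized response equals any normalized accepted answer."""
    norm = normalize(response)
    return any(norm == normalize(a) for a in accepted_answers)

# One stored answer now covers spacing, case and character-width variants.
accepted = ["watashi wa gakusei desu"]
print(matches("  Watashi  wa gakusei desu ", accepted))            # True
print(matches("Ｗａｔａｓｈｉ　ｗａ　ｇａｋｕｓｅｉ　ｄｅｓｕ", accepted))  # True (full-width input)
print(matches("Watashi wa sensei desu", accepted))                 # False
```

This does not remove the need for multiple genuinely different correct answers, but it keeps the answer base from being cluttered with mere formatting variants.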
22 Answer Base Disadvantages (2)
- ONLY (!) 10 answer slots in Q-design
- → a "How many slots?" function
23 ONLY (!) 10 Answer Boxes
24 (Technical issues → Mike)
25 What is tested?
- Evaluation criteria for JP1000:
- Attendance & Participation (Sept.-April): 10%
- Oral Presentation (4 times a year): 25%
- Quizzes (dictation) & Homework (exercises of Japanese characters, every week): 15%
- Tests (on-line, 4 times a year): 50%
- grammar, structure, vocabulary, idiomatic expressions
- reading of Japanese scripts
26 Improvement from last year
- More ease of, and control over, making and conducting tests in general
- Improved format and instructions for each test
- Carefully selected test items (e.g. types and number of questions)
- Effective use of visual aids
- Set number of questions per page
- Two attempts only
27 Test Average (%)
- Test 1 / Test 2 / Test 3 / Test 4
- Paper (2004-05): 78.44 / 55.16 / 64.90 / 66.98
- Online (2005-06): 64.57 / 59.09 / 59.52 / 62.50
- Online (2006-07): 64.43 / 57.39 / 61.11 / 64.97
28 Questionnaire Results: Student Background (n=72)
- 1. What is your level of comfort with computers?
- 1: 1 / 2: 8 / 3: 13 / 4: 29 / 5: 46
- 2. How enthusiastic are you about the use of information technology in your classes?
- 1: 7 / 2: 14 / 3: 25 / 4: 39 / 5: 13
- 3. How satisfied are you with the time and effort you allotted yourself to prepare for the tests?
- 1: 6 / 2: 15 / 3: 40 / 4: 29 / 5: 7
29 Questionnaire Results: Online Testing (n=72)
- 4. How would you rate (your satisfaction with) the number of questions included in each JP1000 test?
- 1: 11 / 2: 25 / 3: 32 / 4: 22 / 5: 7
- 5. How would you rate (your satisfaction with) the time you were given to complete each test?
- 1: 17 / 2: 29 / 3: 25 / 4: 18 / 5: 7
- 6. How would you rate (your satisfaction with) the content of the tests?
- 1: 3 / 2: 13 / 3: 29 / 4: 36 / 5: 15
- 7. How would you rate (your satisfaction with) the organization of the tests?
- 1: 8 / 2: 7 / 3: 26 / 4: 40 / 5: 14
30 Questionnaire Results: Online Testing (Cont'd) (n=72)
- 8. How would you rate the range of knowledge and skills assessed by the computer-based testing?
- 1: 6 / 2: 17 / 3: 38 / 4: 29 / 5: 8
- 9. In your opinion, how accurately did the computer-based assessment measure your knowledge and abilities in comparison to a conventional paper-based test?
- 1: 11 / 2: 26 / 3: 38 / 4: 19 / 5: 6
- 10. Please compare your performance in the computer-based assessment with how you feel you would perform in a paper-based test.
- 1: 4 / 2: 21 / 3: 42 / 4: 22 / 5: 11
31 Questionnaire Results: Open-ended
- 11. Features of the on-line testing that you liked:
- Avoidance of pen errors/conflicts with messy writing
- Easy legibility
- It wasn't messy; I wasn't writing
- Tests were clear. I liked the ability to save before sending.
- It was organized and was easy to do.
- There were no ambiguous answers
- It is more direct and provides ease to answer the question.
- I liked the fact that we have two attempts on the test.
- I liked the use of images to help bring up ideas in the questions ... the comic book dialogues on the test were cool.
- I can relax more somehow when I'm doing the online tests, and concentrate better.
32 Questionnaire Results: Open-ended (Cont'd)
- 12. Features of the on-line testing that you did not like:
- I absolutely hated the timer in the corner of the test. It feels like a bomb is going to explode.
- The timer!
- Saving answers after every 5-10 questions was a bit annoying
- Too many possible variations for a correct answer
- Could only check answers twice.
- Numbers were often hard to differentiate from each other (e.g. 25 & 26 looked similar)
- A small character can make a full answer wrong.
- I sometimes found the technical issues to be quite annoying.
- # of questions asked in 50 mins
- Doesn't really test writing skills (kanji)
- Not having my test evaluated by my instructor
33 Questionnaire Results: Open-ended (Cont'd)
- 13. How could those features (that you did not like) be improved?
- Remove the timer, and make the test easier to navigate by putting all the questions on 1 long page.
- Clearer, larger fonts.
- I have no idea. Maybe instead of Moodle, another program created strictly for Jp classes with the same concepts as Moodle but not on a shared database.
- Clearer instructions, quicker system
- Have a written section for hiragana, katakana, and kanji.
- Auto-save the answers as they are typed.
- Turn down the brightness on the computer screens!
- Allow answers to be typed in hiragana.
- Just by increasing the time limit, maybe by 10 minutes or so, nothing too much
34 Questionnaire Results: Open-ended (Cont'd)
- 14. Any other comments?
- I know that some students complain about the number of questions, but the difficulty level is much lower than written tests from last year. I don't think that online tests make it difficult for students to perform well. If they're well prepared and have studied hard, they should be able to perform well no matter what. So, overall, I think that online tests are good.
- The online testing was pretty good; I think it is a more efficient system than paper, and would prefer online testing over paper testing any day.
- Thank you for your efforts to create an efficient testing system.
- Fewer questions and/or more time. For a 50-minute test, there should be 50 questions.
- All other courses give 2 hours for a 100-question multiple choice test. These tests were so compressed in time that it made it stressful to think and answer questions.
35 (Future development → Mike)