1
Electronic Voting System Usability Issues
  • Benjamin B. Bederson
  • Computer Science Department
  • Human-Computer Interaction Lab
  • University of Maryland, College Park
  • Bongshin Lee, HCIL
  • Robert M. Sherman, HCIL
  • Paul Herrnson, Dept. of Government Politics,
    UMD
  • Richard G. Niemi, Political Science, Univ. of
    Rochester

www.cs.umd.edu/bederson/voting
2
Voting System Importance
  • The manner in which voters cast ballots is
    important
  • Voting technology and ballot design can influence
    • election outcomes
    • voters' trust, which in turn affects future elections

3
Voting System Technologies
  • Wide range of available systems, from
    • paper and tally sheets
    • punch cards with chad
    • mechanical lever machines
  • to more recent technologies
    • optical-scan machines
    • Direct Recording Electronic machines - DREs (touch screens)

4
US Political Scene
  • After the US 2000 general election, there was a
    flurry of activity
    • major problems
      • 4-6 million presidential votes lost in 2000
      • hanging chad, etc.
      • see the Caltech/MIT Voting Technology Report, 2001
    • public awareness of unequal voting systems,
      widespread complaints, and new knowledge of
      system limitations
    • calls for new systems
  • So, how to make a judgment?

5
Other Studies
  • A range of research in this area
    • Historical ballot design studies
    • Caltech/MIT Study
  • But few investigations of usability

6
Learn from the Field of HCI
  • Electronic voting systems offer
    • Control of font size
    • Language switching
    • Accommodation of disabilities
    • Accurate and fast vote tabulation
  • But they have unique challenges
    • Must work for everybody, including the elderly,
      disabled, uneducated, etc.
    • Walk up and use (no training required)
    • No external help (although it is allowed)
    • Not frequently used

7
Our Study
  • Requested by four counties in Maryland to examine
    their new machines (Diebold AccuVote-TS)
  • 30 days from first call to final report
  • No funding
  • Machines to be used by 1,000,000 people in the 2002
    general election
  • We agreed to perform
    • Expert review (5 reviewers)
    • Close-up observation (47 users)
    • Field testing (415 users)

8-14
Study Screens
(Screen images only; no further transcript for slides 9-14)
15
Expert Review
  • Five HCIL faculty/staff analyzed the system
  • Concerns (number of reviewers who raised each)
    • Inconsistent terminology/labeling (5)
    • Color usage (4)
    • Inserting / removing card (4)
    • Help / instructions (4)
    • Layout (long names, many candidates?) (4)
    • Screen glare (3)
    • Changing selection (2)
    • No overvote feedback (2)
    • Privacy (1)

16
Expert Review: Audio-only System
  • Concerns (number of reviewers who raised each)
    • Inappropriate keypad mapping (5)
    • Audio quality (5)
    • No ballot review (3)
    • No button feedback (2)
    • No overvote warning (2)
    • Must review entire ballot (2)
    • Volume control (1)

17
Close Observation
  • Observed 47 members of the UMD community
    • Primarily students
  • Same toy election
  • Used a think-aloud protocol
  • Average time was 2 minutes, 10 seconds
  • One participant couldn't figure out how to
    write in a candidate

18
Close Observation Results
  • Overall ratings (1-9, 9 being positive)
    • Overall comfort: 7.7
    • Screen layout and color: 6.9
  • Representative comments
    • "Easy to use, straightforward"
    • "Excellent idea"
    • "Inserting card was very confusing"
    • "Is it reliable?"
    • "Colors are not well chosen"
    • "Font could be bigger"
    • "Ballot layout was confusing"

19
Close Observation Comments
  • System failure
    • One card got stuck in the reader
  • Card insertion
    • Many participants had difficulty, expecting
      ATM-style interaction
  • Language selection
    • Participants got stuck
  • Undervotes not highlighted (see the sketch after this list)

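The overvote and undervote findings here and in the expert review come down to one validation rule: before the ballot is cast, each contest's selections are checked against the number allowed. A minimal sketch of that check in Python, using hypothetical Contest and review-screen structures (the study did not have access to the AccuVote-TS source, so none of these names are Diebold's):

    from dataclasses import dataclass, field

    @dataclass
    class Contest:
        """One race on the ballot (field names are illustrative)."""
        title: str
        max_selections: int            # 1 for most races, N for "vote for N"
        selections: list = field(default_factory=list)

    def review_feedback(contests):
        """Warnings a DRE review screen could show before the ballot is cast."""
        warnings = []
        for c in contests:
            n = len(c.selections)
            if n > c.max_selections:
                # Overvote: a DRE can refuse the extra selection outright,
                # something paper and punch-card systems cannot do.
                warnings.append((c.title, "overvote"))
            elif n < c.max_selections:
                # Undervote: legal, but worth highlighting so skipping a
                # race is a deliberate choice, not an oversight.
                warnings.append((c.title, "undervote"))
        return warnings

    ballot = [Contest("Governor", 1, ["Smith"]),
              Contest("School Board", 3, ["Lee"])]
    print(review_feedback(ballot))     # [('School Board', 'undervote')]
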
20
Field Study
  • Great expectations
    • Observation and recording of interaction
    • Questionnaire
    • Administered to a representative sample
  • Reality
    • Study executed by election officials
    • No observation of or notes on use
    • Sample drawn from wealthier and more educated districts

21
Field Study Results I
  • Ratings (1-9, 9 being positive; see the bucketing
    sketch after the next slide)
  • Overall impression
    • 8-9: 80%
    • 7: 10%
    • 1-6: 10%
  • Felt comfortable using the system
    • 8-9: 86%
    • 7: 7%
    • 1-6: 7%
  • Ease of reading text
    • 8-9: 86%
    • 7: 8%
    • 1-6: 6%

22
Field Study Results II
  • Terminology used
    • 8-9: 83%
    • 7: 10%
    • 1-6: 7%
  • Correcting mistakes was easy
    • 8-9: 81%
    • 7: 11%
    • 1-6: 8%
  • Trusted votes were recorded properly
    • 8-9: 85%
    • 7: 7%
    • 1-6: 8%

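The percentage buckets on the last two slides (8-9, 7, 1-6) are a simple aggregation of the 415 questionnaire responses. A sketch of that bucketing with made-up scores, since the deck does not publish the raw per-respondent data:

    from collections import Counter

    def bucket_ratings(scores):
        """Group 1-9 questionnaire scores into the deck's three buckets
        and return each bucket's share as a rounded percentage."""
        def bucket(s):
            if s >= 8:
                return "8-9"
            if s == 7:
                return "7"
            return "1-6"
        counts = Counter(bucket(s) for s in scores)
        return {b: round(100 * n / len(scores)) for b, n in counts.items()}

    # Illustrative data shaped to echo the "overall impression" numbers.
    sample = [9] * 332 + [7] * 41 + [5] * 42       # 415 responses
    print(bucket_ratings(sample))                  # {'8-9': 80, '7': 10, '1-6': 10}
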
23
Field Study Interpretation
  • The 10% who rated 6 or lower equals 383,000 voters
    (extrapolation worked out below)
  • City/suburban dwellers rated the system higher
  • Women rated it higher
  • Frequent computer users had lower trust

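The 383,000 figure is a straight extrapolation of the 10% dissatisfied share to the electorate. The implied base is about 3.83 million voters, which suggests a statewide electorate rather than the four counties' 1,000,000 machine users; that base is an inference from the arithmetic, not something the slide states:

    # Extrapolating the field-study share to the electorate.
    dissatisfied_share = 0.10                       # rated 6 or lower
    implied_base = 383_000 / dissatisfied_share     # base the slide implies
    print(f"{implied_base:,.0f} voters")            # 3,830,000 voters (inferred)

    # The same share applied only to the four counties' machines:
    print(f"{dissatisfied_share * 1_000_000:,.0f} voters")   # 100,000 voters
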
24
Diebold AccuVote-TS Deployed at the 2002 General Election
25-30
(Images only; no transcript)
31
Conclusion
  • Our studies leave us optimistic, but concerned
  • With elections called by 1%, leaving 10% of voters
    unconfident is a problem
  • The requirements of DREs are unique, but the
    design issues aren't
    • They are typical of public-access information systems
  • Need closer work with HCI professionals
  • Need qualitative and quantitative user studies
  • Need further field studies
  • Come see the voting panel tomorrow morning

www.cs.umd.edu/bederson/voting