Title: CLASS ASSIGNMENTS
CLASS ASSIGNMENTS
- Project
- 40 persons → 7 groups of 5/6
- Own project
Project (groups of 5/6)
- Initial report 1 page
- List of group
- Which techniques will be used
- How the work will be evaluated
- Final report 15 pages
- Problem
- VR concepts
- Implementation and Results
- Conclusion
- Class presentation
Projects
- 1. Mapping 6DoF to 2DoF
- 2. Primary Depth Cues Evaluation
- 3. Secondary Depth Cues Evaluation
- 4. Point Location Task Evaluation
- 5. Space Ball Device Evaluation
- 6. Gyration Device Evaluation
- 7.
Own project (groups of min. 3)
- Initial report 1 page
- List of group
- Problem
- Approach and Evaluation
- Final report 10 pages
- Problem
- VR concepts
- Implementation and Results
- Conclusion
- Class presentation
Nine Lectures
- Introduction
- Human factors
- Interaction 1 (basic interaction)
- Interaction 2 (two handed interaction)
- Tracking
- Haptics and Auditory Systems
- Distributed / Collaborative / Telepresence
- Augmented Reality
- Usability
Introduction
- Models of interaction
- Constraints
- Types of interaction
- Types of device
- Basic Interaction
- Locomotion
- Body-Centred Interaction (Proprioception)
- Selection and Manipulation
1. Models of Interaction
- Virtual Reality Model
- The user uses their body as an interface to the world
- The system responds to everything they do or say
- Extended Desktop Model
- The user needs tools to do 3D tasks
Virtual Reality Model
- Need to track the user precisely and interpret what they do
- Focus is on users exploring the environment
- Tension between realistic and non-realistic responses of the environment
- Mundane responses are those where the world responds as if it were controlled by the laws of physics
- Magical responses are everything else (automatic doors, objects in vacuum, etc.)
Limits of the VR Model
- Can't track the user over very large areas
- e.g. some form of locomotion metaphor will be required for long-distance travel
- Physical constraints of systems
- Limited precision and tracking points
- Lack of physical force feedback
Extended Desktop Model
- Focus on analysing a task and creating devices that fit the task
- Study the ergonomics of the device and its applicability/suitability for the role
Limits of the ED Model
- 3D tasks are quite complicated to perform
- Tasks can become very specialised
- Leads to a proliferation of physical (and virtual) devices
Fakespace Cubic Mouse
Types of Physical Devices
3DConnexion Spacemouse
Polhemus Isotrak 3-Ball
Logitech 3D Mouse
Ascension Wanda
3DConnexion Spaceball
Inition 3DiStick
Types of Physical Devices
- Wands with 6 DoF sensor
- 3-4 buttons
- 2D joystick
Input Devices
- 6DOF position tracking systems
- Head tracking
- Hand(s) tracking
Logical Input Types
- Continuous functions
- Wand joystick (x in [-1, 1], y in [-1, 1])
- Tracked position (x, y, z)
- Tracked orientation (h, p, r)
- Discrete events
- Wand buttons (on/off), sketched below
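A minimal sketch (in Python) of how these logical input types might be represented as plain data; the type names WandState and ButtonEvent are illustrative, not taken from any particular VR toolkit.

```python
from dataclasses import dataclass
from typing import Tuple

# Illustrative data types for the logical inputs listed above.
@dataclass
class WandState:
    joystick: Tuple[float, float]             # continuous: x, y each in [-1, 1]
    position: Tuple[float, float, float]      # continuous: tracked (x, y, z)
    orientation: Tuple[float, float, float]   # continuous: (h, p, r)

@dataclass
class ButtonEvent:
    button: int     # which wand button
    pressed: bool   # discrete event: on/off transition
```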
Virtual Devices
The Problem
- How does one map a particular input device into a particular virtual device? (see the sketch below)
- How to start/stop the interaction
- How to drive the interaction
[Diagram: input devices → f? → virtual devices]
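One possible shape for the mapping f, sketched under the assumption that a wand button starts and stops the interaction and the tracked pose drives it; VirtualCursor and the field names are illustrative only.

```python
# Sketch of a mapping f from the wand inputs to a simple virtual device
# (a 3D cursor). A button press starts/stops the interaction; the tracked
# position drives it while it is active.
class VirtualCursor:
    def __init__(self):
        self.active = False
        self.position = (0.0, 0.0, 0.0)

    def handle_button(self, event):
        # Discrete event: start/stop the interaction with button 0.
        if event.button == 0:
            self.active = event.pressed

    def update(self, wand_state):
        # Continuous input: drive the interaction while active.
        if self.active:
            self.position = wand_state.position
```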
Basic Interaction Tasks
[Diagram: the user acting on an object (translate, rotate, scale)]
Basic Math
- Coordinate systems
- Nodes
- Transformations between coordinate systems
- Links
- Notation
- P_W = T_WO P_O
- T_RW = T_WO T_RO
Basic Math: Grabbing
- Object-hand transform (see the sketch below)
- T_OH = T_OW T_WR T_RT T_TH
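A minimal numpy sketch of the grabbing maths, assuming 4x4 homogeneous matrices with the convention that T_AB maps points in frame B into frame A (O = object, W = world, R = room, T = tracker, H = hand); the function names are illustrative.

```python
import numpy as np

def object_hand_transform(T_OW, T_WR, T_RT, T_TH):
    # Chain object <- world <- room <- tracker <- hand; frozen at grab time.
    return T_OW @ T_WR @ T_RT @ T_TH            # = T_OH

def grabbed_object_pose(T_WH, T_OH):
    # While grabbed, the object follows the hand rigidly:
    # its world pose is T_WO = T_WH * inverse(T_OH).
    return T_WH @ np.linalg.inv(T_OH)
```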
2. Basic Interaction Tasks
- Locomotion
- How to effect movement through the space
- Selection
- How to indicate an object of interest
- Manipulation
- How to move an object of interest
Locomotion
- User points (somehow) in the direction of motion
- User presses a button (see the sketch below)
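A sketch of pointer-directed steering, assuming the wand orientation is available as a 3x3 rotation matrix whose negative z column is the pointing direction; the speed constant and names are arbitrary choices for illustration.

```python
import numpy as np

SPEED = 1.5  # metres per second (arbitrary)

def update_viewpoint(viewpoint_pos, hand_rotation, button_down, dt):
    """Move the viewpoint along the wand's pointing direction while the button is held."""
    if not button_down:
        return viewpoint_pos
    forward = -hand_rotation[:, 2]     # wand's pointing direction
    return viewpoint_pos + SPEED * dt * forward
```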
Selection and Manipulation
- User points at the object with their hand
- User selects by pressing a button
- User grabs by pressing a 2nd button
- Object is rigidly attached to the hand coordinate system (see the sketch below)
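A sketch of the pointing/selection step, assuming objects are approximated by bounding spheres; once an object is selected, grabbing would freeze its transform relative to the hand as in the earlier grabbing sketch. pick() and the object fields are illustrative.

```python
import numpy as np

def pick(hand_pos, hand_dir, objects):
    """Return the nearest object whose bounding sphere the hand ray hits."""
    hand_dir = hand_dir / np.linalg.norm(hand_dir)
    best, best_t = None, np.inf
    for obj in objects:                       # obj: dict with 'centre', 'radius'
        to_centre = obj['centre'] - hand_pos
        t = np.dot(to_centre, hand_dir)       # closest approach along the ray
        if t < 0:
            continue                          # behind the hand
        miss = np.linalg.norm(to_centre - t * hand_dir)
        if miss <= obj['radius'] and t < best_t:
            best, best_t = obj, t
    return best
```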
Selection Only
- Occlusion selection
- Similar to selection with a mouse
- Put hand over the object (occlude it) to select it (see the sketch below)
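Occlusion selection can be sketched by reusing the ray test above: the hand occludes an object when the ray from the eye through the hand hits it. pick() is the hypothetical helper from the previous sketch, and positions are assumed to be numpy arrays.

```python
def occlusion_select(eye_pos, hand_pos, objects):
    # The hand "covers" whatever the eye-through-hand ray hits first.
    return pick(eye_pos, hand_pos - eye_pos, objects)
```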
3. Locomotion
- "Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques", Bowman, Koller and Hodges
- One of the first rigorous studies of some of the trade-offs between different travel techniques
Taxonomy of Travel
Bowman, Koller and Hodges
Quality Factors
- 1. Speed (appropriate velocity)
- 2. Accuracy (proximity to the desired target)
- 3. Spatial Awareness (the user's implicit knowledge of his position and orientation within the environment during and after travel)
- 4. Ease of Learning (the ability of a novice user to use the technique)
- 5. Ease of Use (the complexity or cognitive load of the technique from the user's point of view)
- 6. Information Gathering (the user's ability to actively obtain information from the environment during travel)
- 7. Presence (the user's sense of immersion or being within the environment)
Experiment 1
- Absolute motion task
- Gaze v. point AND constrained v. unconstrained
- Bowman et al. expected gaze to be better
- Neck muscles are more stable
- More immediate feedback
- Eight subjects, each doing four sets of 80 trials (five repetitions of 4 target distances and 4 target sizes)
Experiment 1
- No difference between techniques
- Significant factors were target distance and size
Experiment 2
- Relative motion task
- No prior expectation
- Need forward and reverse direction
- Nine subjects, four sets of 20 trials
Experiment 2
- Obvious difference
- Can't point at the target and look at the departure point simultaneously
Bowman, Koller and Hodges
Summary of First Two Experiments
Experiment 3
- Testing spatial awareness based on four travel variations
- Constant speed (slow)
- Constant speed (fast)
- Variable speed (smooth acceleration)
- Jump (instant translation)
- Concern is that jumps and other fast transitions confuse users
Experiment 3
- However, there was no main effect
- What does this tell us?
4. Body-Centred Interaction
- Proprioception is defined by Oliver Sacks as "... that continuous but unconscious sensory flow from the movable parts of our body (muscles, tendons, joints), by which their position and tone and motion is continually monitored and adjusted, but in a way which is hidden from us because it is automatic and unconscious"
- Proprioception is a resource for us to exploit in designing virtual reality interfaces
Body-Centred Interaction
- Participants can use their own body
- Can estimate distances
- Can perform actions without looking
- If we render that body (or other sufficient feedback), then the user can identify the interaction metaphors with their own body
BCI Examples
- Walking on the spot
- Head motion has a characteristic pattern when the user walks on the spot (see the sketch below)
- Physically closer to the real action
- Self-scaling
- Using gestures to scale one's own body as a way of re-scaling the world
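An illustrative sketch of how walking on the spot might be detected from the characteristic head motion, by thresholding the recent vertical oscillation of the tracked head position; real walking-in-place detectors are considerably more sophisticated, and the threshold here is an arbitrary assumption.

```python
import numpy as np

def is_stepping(recent_head_heights, min_amplitude=0.02):
    """recent_head_heights: head y-positions (metres) over, say, the last second."""
    heights = np.asarray(recent_head_heights)
    # Stepping on the spot produces a bobbing motion larger than normal standing sway.
    return heights.max() - heights.min() > min_amplitude
```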
5. Selection and Manipulation
- "Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction", Mine, Brooks Jr. and Sequin
- One of the first papers to discuss a range of selection and manipulation tasks
Body-Relative Interaction
- Provides
- Physical frame of reference in which to work
- More direct and precise sense of control
- Eyes-off interaction
- Enables
- Direct object manipulation (for sense of position of object)
- Physical Mnemonics (objects fixed relative to body)
- Gestural Actions (invoking commands)
Working within Arm's Reach
- Provides a more direct mapping between hand motion and object motion
- Provides finer angular precision of motion
- Automatic scaling mechanism developed, so the user can interact with objects lying at any distance as though within arm's reach
- User doesn't always notice the scaling
Scaled-World Grab for Manipulation
- Automatically scale the world, so that the selected object is within arm's reach (see the sketch below)
- Near and far objects are easily moved
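A sketch of the automatic scaling idea: scale the world uniformly about the user's head so the selected object lands roughly at the hand's distance. This is only the basic idea; the exact formulation in Mine et al. may differ, and the function names are illustrative.

```python
import numpy as np

def scaled_world_factor(head_pos, hand_pos, object_pos):
    # Ratio that brings the selected object to roughly arm's reach.
    reach = np.linalg.norm(hand_pos - head_pos)
    dist = np.linalg.norm(object_pos - head_pos)
    return reach / dist          # < 1 shrinks a distant world towards the user

def scale_about(centre, positions, factor):
    """Scale world-space positions uniformly about `centre` (e.g. the head)."""
    return centre + factor * (positions - centre)
```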
Mine, Brooks Jr., Sequin
Previous Techniques
- Ray-Based
- Ray is centred on the user's hand
- All manipulations are relative to hand motion
- Translation in the beam direction is hard
- Rotation in local object coordinates is nearly impossible
Mark Mine, http://www.cs.unc.edu/mine/isaac.html
Previous Techniques
- Object-Centred
- Select with a ray as before
- Local movements of the hand are copied to the object's local coordinates
Mark Mine, http://www.cs.unc.edu/mine/isaac.html
Scaled-World Grab for Locomotion
- User transports himself by grabbing an object in the desired travel direction and pulling himself towards it
- User can view the point of interest from all sides very simply
- For exploring nearby objects, virtual walking is more suitable; for going much further, invoking a separate scaling operation or switching to an alternate movement mode is better
Physical Mnemonics
- Storing virtual objects and controls relative to the user's body
- Pull-down menus
- Hand-held widgets
- Field-of-view-relative mode switching
Pull-Down Menus
- Problems with virtual menus
- Heads-up menus are difficult to manage
- Menus fixed in the world often get lost
- Could enable the menu with ...
- Virtual button (too small)
- Physical button (low acceptability)
- Instead, hide menus around the body, e.g. above the FOV
Hand-Held Widgets
- Hold controls in the hands, rather than on objects
- Use relative motion of the hands to effect widget changes
FOV-Relative Mode Switching
- Change behaviour depending on whether a limb is visible (see the sketch below)
- Hand visible: use occlusion selection
- Hand not visible: use ray selection
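A sketch of the mode switch, assuming visibility is approximated by the angle between the gaze direction and the eye-to-hand vector; occlusion_select() and pick() refer to the earlier illustrative sketches, and the field-of-view half-angle is an assumed value.

```python
import numpy as np

def select(eye_pos, gaze_dir, hand_pos, hand_dir, objects, half_fov_deg=45.0):
    to_hand = hand_pos - eye_pos
    cos_angle = np.dot(to_hand, gaze_dir) / (
        np.linalg.norm(to_hand) * np.linalg.norm(gaze_dir))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    if angle < half_fov_deg:
        return occlusion_select(eye_pos, hand_pos, objects)  # hand visible
    return pick(hand_pos, hand_dir, objects)                 # hand not visible
```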
Gestural Actions
- Head-butt zoom
- Look-at menus
- Two-handed flying
- Over-the-shoulder deletion
Experiment 1
- Align a docking cube with a target cube as quickly as possible
- Comparing three manipulation techniques
- Object in hand
- Object at fixed distance
- Object at variable distance (scaled by arm extension)
Experiment 1
- 18 subjects
- In hand was significantly faster
Mine, Brooks Jr, Sequin
Experiment 2
- Virtual widget comparison
- Comparing
- Widget in hand
- Widget fixed in space
- 18 subjects (as before)
- Performance measured by accuracy, not time
Experiment 2
- Widget in hand was significantly better
Mine, Brooks Jr, Sequin
Other Work
- Still an active field
- Shadow cone
- Shadow-based interaction
- Worlds in miniature
- Voodoo dolls
- Etc
Summary
- A lot of work has been done and is being done in 3D interaction
- Covered locomotion, selection and manipulation
- However, it is still quite tedious to use most 3D user interfaces
- Lack of precision is probably the main problem
- However, people are able to interact
Two-Handed Input
- Exploits the relationship between dominant and non-dominant hands
Cooperative Bimanual Action, Hinckley et al., CHI 1997
Personal Interaction Panel, Vienna University of Technology