Title: Visualisation, Animation and Virtual Reality
1 Visualisation, Animation and Virtual Reality
- Lecture 9
- Tracking And Interaction
2 Introduction
- Bob Hobbs
- K210
- r.g.hobbs_at_staffs.ac.uk
3 How does VR work?
- We live in a 3D world
- We have developed many methods to make sense of the world around us; VR techniques have to try to recreate these methods
4 2D images can be confusing
5 Shadows and highlighting create the illusion of 3D
6 But which is closer?
7 Occlusion
8 A simple shape
Defined as a set of points or vertices, with surfaces or facets defined by spline boundaries created by joining points with lines
9 More complex shapes
- Shape defined as polygons (triangles)
- Rounded surfaces created by more polygons
10 Defining the surface
- Colour
- variation over surface
- Texture
- rough, smooth, etc.
- Lighting
- creates shadowing
- Reflectance
- dependent on texture and colour
11 Realism added by surface mapping
12 Lighting and reflectance
- Exhibits shadowing and shading
- Gouraud shading
- Ray-tracing used to calculate light paths based on reflectance values
13 Perspective and Z-buffering
- Objects appear smaller further away
- Zero-point
- Uses Z co-ordinate to compute
- Relative position
- Occlusion
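The Z-buffer idea above can be sketched in a few lines of Python (illustrative only, not any particular graphics API): each pixel remembers the nearest depth seen so far, so nearer fragments overwrite farther ones and occlusion falls out regardless of draw order.

```python
# Minimal z-buffer sketch: smaller z = nearer the camera.
def render(fragments, width, height, far=float("inf")):
    """fragments: list of (x, y, z, colour) samples already rasterised."""
    zbuf = [[far] * width for _ in range(height)]
    image = [[None] * width for _ in range(height)]
    for x, y, z, colour in fragments:
        if z < zbuf[y][x]:          # nearer than anything drawn here so far
            zbuf[y][x] = z
            image[y][x] = colour
    return image

# Two overlapping fragments: the red one (z=2) is nearer, so it wins
# even though it is submitted first.
img = render([(0, 0, 2.0, "red"), (0, 0, 5.0, "blue")], width=1, height=1)
```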
14 Perspective
15 Depth of field
- Further-away objects become hazier
- Focuses attention on nearer objects
- Occurs naturally but must be added to a virtual environment
16 Blue hazing with distance
- look at a distant hill or building
- fuzzy, less contrast, bluish tinge
- scattering effect of air
- our brains get used to it
- blue objects seem further away
- red ones closer
- use in visualisation and VR (also used in garden design!)
17 Anti-aliasing
- Sharp contrast looks unreal
- Curved lines appear stepped
- Edges blurred, removing the stepping
- More natural
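One common way to get this blurring, supersampling, can be sketched as follows (a minimal illustration, not how any specific renderer does it): render at a higher resolution, then average blocks of samples so a hard edge becomes a gradient instead of a staircase.

```python
# Anti-aliasing by supersampling: average factor x factor sample blocks.
def downsample(hires, factor):
    """hires: greyscale image as a list of rows of floats in [0, 1]."""
    h, w = len(hires) // factor, len(hires[0]) // factor
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = sum(hires[y * factor + dy][x * factor + dx]
                        for dy in range(factor) for dx in range(factor))
            out[y][x] = total / (factor * factor)
    return out

# A vertical black/white edge sampled at 2x: the pixel straddling the
# edge averages to grey (0.5), softening the step.
hires = [[0, 1, 1, 1],
         [0, 1, 1, 1]]
lores = downsample(hires, 2)
```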
18 Underlying geometry
- Vertices
- Lines
- Faces
- Transforms
19 Geometry pipeline
Animation/Interaction (time) → Modelling (shapes) → Shading (reflection and lighting) → Transformation (viewing) → Hidden-surface elimination → Imaging pipeline
20 Imaging pipeline
Geometry pipeline → Rasterisation and sampling → Texture mapping → Image composition → Intensity and colour quantisation → Framebuffer/Display (computer monitor)
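The two pipelines can be sketched for a single vertex (a toy illustration; the function names and constants are made up, not a real API): the geometry pipeline transforms and perspective-projects the point, then the imaging pipeline maps it to a pixel and quantises its intensity for the framebuffer.

```python
def geometry_pipeline(point, camera_z, focal=1.0):
    """Viewing transformation (camera on the z axis) then perspective divide."""
    x, y, z = point
    z_cam = z - camera_z
    return (focal * x / z_cam, focal * y / z_cam)

def imaging_pipeline(ndc, intensity, width=640, height=480):
    """Rasterise normalised coords in [-1, 1] to a pixel; quantise to 8 bits."""
    px = int((ndc[0] + 1) / 2 * (width - 1))
    py = int((1 - (ndc[1] + 1) / 2) * (height - 1))  # screen y grows downwards
    return (px, py, int(intensity * 255 + 0.5))      # colour quantisation

ndc = geometry_pipeline((0.0, 0.0, 5.0), camera_z=0.0)  # point dead ahead
pixel = imaging_pipeline(ndc, intensity=0.5)            # centre pixel, mid grey
```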
21 Example
22 Wireframe model - orthographic views
23 Perspective view
24 Depth cue
25 Hidden line removal, add colour
26 Constant shading - ambient
27 Faceted shading - flat
28 Gouraud shading, no specular highlights
29 Specular highlights added
30 Phong shading
31 Texture mapping
32 Texture mapping
33 Reflections, shadows, bump mapping
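The shading progression above (ambient → flat → Gouraud → specular highlights) is usually described by the Phong reflection model; a per-point sketch, with illustrative coefficient values:

```python
import math

def normalise(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, to_light, to_viewer, ka=0.1, kd=0.6, ks=0.3, shininess=32):
    """Intensity = ambient + diffuse + specular at one surface point."""
    n, l, v = normalise(normal), normalise(to_light), normalise(to_viewer)
    diffuse = max(dot(n, l), 0.0)
    r = tuple(2 * dot(n, l) * nc - lc for nc, lc in zip(n, l))  # reflect l about n
    specular = max(dot(r, v), 0.0) ** shininess if diffuse > 0 else 0.0
    return ka + kd * diffuse + ks * specular

# Light and viewer straight along the normal: full diffuse plus the full
# specular highlight, i.e. 0.1 + 0.6 + 0.3 = 1.0
i = phong((0, 0, 1), (0, 0, 1), (0, 0, 1))
```

Flat shading evaluates this once per facet, Gouraud shading evaluates it at the vertices and interpolates colours, and Phong shading interpolates the normals and evaluates it per pixel.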
34 Basic Analysis
- 1. Define points in 3D space
- 2. Define lines and facets which join points together
- 3. Define light sources to generate shadows and shading
- 4. Apply texture to facets
- 5. Define reflectance properties and colour of surface
- 6. Redraw image as viewpoint changes, applying perspective and occlusion to induce reality
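The six steps above reduce to a build-once, redraw-per-frame loop. A runnable toy version (lighting and texture omitted; the x/z, y/z projection follows the perspective rule earlier in the lecture):

```python
def project(point, viewpoint):
    """Perspective-project a point relative to the current viewpoint."""
    x, y, z = (p - v for p, v in zip(point, viewpoint))
    return (x / z, y / z)            # farther points shrink toward the centre

scene = [(1.0, 1.0, 5.0)]            # step 1: define points in 3D space

frames = []
for vz in (0.0, 1.0, 2.0):           # step 6: viewpoint walks toward the point
    frames.append(project(scene[0], (0.0, 0.0, vz)))

# As the viewpoint approaches, the projected point moves outward from the
# centre, i.e. the object appears to grow.
```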
35 Viewing Frustum
- Controls visibility, depth and perspective of scene
- Linked to virtual camera
- Stereo has a frustum for each eye
36 Recap
Initialize world
Calculate Geometry
Draw Wire Frame
Render Surfaces
Enhance Surfaces and lighting
Sensor input and output
37 Stereo
- Human visual system gets two slightly different images, one from each eye
- Two new camera attributes: distance to the zero-parallax plane and eye separation
- Zero parallax: objects at the distance of the projection plane appear at the screen depth
- Positive parallax: projected objects are on the same side as the corresponding eye; objects appear behind the screen
- Negative parallax: projected objects are on the opposite side to the corresponding eye; objects appear in front of the screen
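The parallax rule can be checked numerically (an illustrative sketch; the eye separation and screen distance are made-up example values): project a point onto the screen plane from each eye and take the signed horizontal difference, right image minus left image.

```python
def parallax(point_z, screen_z, eye_sep):
    """Signed horizontal parallax on the screen for a point centred between
    the eyes; positive => behind the screen, negative => in front."""
    half = eye_sep / 2.0
    # Rays from each eye (at x = -half and x = +half, z = 0) to the point at
    # (0, point_z), intersected with the screen plane z = screen_z.
    left_x = -half + half * screen_z / point_z
    right_x = half - half * screen_z / point_z
    return right_x - left_x

sep, screen = 0.065, 2.0              # ~65 mm eye separation, screen 2 m away
behind = parallax(4.0, screen, sep)   # point beyond the screen: positive
at = parallax(2.0, screen, sep)       # point on the screen plane: zero
front = parallax(1.0, screen, sep)    # point nearer than the screen: negative
```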
39 Symmetric frustum and trim
- Computing stereo pairs with an asymmetric frustum
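The asymmetric (off-axis) frustum can be sketched as follows (illustrative values and names; the bounds correspond to the left/right parameters of a glFrustum-style projection call): each eye keeps its projection plane parallel to the screen, and the horizontal clip bounds are the screen edges seen from that eye, scaled back to the near plane.

```python
def stereo_frustum(screen_half_width, screen_dist, near, eye_x):
    """Near-plane (left, right) bounds for an eye at horizontal offset eye_x
    from the screen centre line."""
    scale = near / screen_dist
    left = (-screen_half_width - eye_x) * scale
    right = (screen_half_width - eye_x) * scale
    return left, right

near, dist, half_w, sep = 0.1, 2.0, 1.0, 0.065   # metres, example numbers
l_left, l_right = stereo_frustum(half_w, dist, near, -sep / 2)  # left eye
r_left, r_right = stereo_frustum(half_w, dist, near, +sep / 2)  # right eye
# Each frustum is asymmetric (|left| != |right|) and the two are mirror
# images, unlike the toe-in approach on the next slide, which converges two
# symmetric frusta and so introduces vertical parallax.
```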
40 Toe-in camera (symmetric frustum)
41 Projection Mechanisms
- Goal: presenting left and right images independently to each eye
- Main determinant of quality is the degree of cross-talk or interference
- Requires "perfect" synchronisation of left and right images
- Active stereo
- Passive stereo
42 Active stereo
- 120 Hz (frame-sequential stereo), 60 Hz per eye
- Flicker becomes objectionable for most people around 110 Hz
- Project onto any surface
- Good-quality glasses cost upwards of 300
- Works with monitors for personal viewing
43 Passive Stereo
- 60-80 Hz per eye
- Optionally circular polarisers
- Most suited to public exhibitions (cheap glasses)
44 Z-Screen
- 120 Hz (frame-sequential stereo)
- Projector or monitor
45 Anaglyphic Stereo
Like 3D stereo Polaroid glasses
46 Sensing position
47-51 Head tracking (figures)
52 Different sensing methods
53 Accelerometer / Phantom / Fast Track (figures)
54-55 How the tracker works (figures: transmitter and receiver)
56 Head tracking
57 Head tracking
- Latency
- Filtering, to keep the view steady
- Transients of sound
59 Human Dynamics
- Users described as participants
- Basic interaction involves control of camera (viewpoint)
- Exploratory navigation/locomotion
- Walk-through systems
- More advanced environments allow interaction
- Touch, selection, manipulation
- Referred to as direct manipulation
60 Components of interaction
- VR model
- Simulation of body
- Interaction with virtual body
- Object pair collision
- General collision detection
61 VR Model
- Goal of Being There
- Presence or Telepresence
- Held and Durlach 1992, Draper 1998
- Must model expectations → realism
- Ideal VR model must immerse the participant in visual, audio, touch, smell and taste
- Humans can process several audio streams and can focus on and segregate one
- Wenzel 1992
62 VR model - Immersion
- Surrounds body
- fills visual field
- extensive
- inclusive (replaces reality)
- Vivid
- human body
- in a CAVE the actual body can obscure the projection of virtual objects
- in an HMD the body must be represented
63 VR model - HCI
- Mouse and keyboard have two problems
- gulf of execution
- gulf of evaluation
- Hutchins 1986
- Direct Manipulation paradigm
- Tracked HMD is the simplest form - 1-to-1 mapping, low cognitive overhead
- Using a mouse, actions must be mapped to different translations
64 VR Model - Interaction
- Immersion and tracking rely on registration
- Registration implies that motion of limbs is accurate
- Better appreciation of 3D environment
- Cannot lose interaction - reduces gulf of execution
- Gulf of evaluation reduced when the whole virtual body is used
- Slater and Usoh 1994, Mine 1997
65 Simulation of Body
- Body model is the description of the interface
- eyes are the visual interface, ears the audio interface
- geometric description drawn from an egocentric point of view
- description of hand and fingers forms the basis of the grasping simulation for picking up objects (Boulic 1996)
66 Simulation of Body - Building the body
- The more points representing the body, the more realistic the movement
- Up to 90 points for motion capture in animation
- Standard for human skeleton (H-Anim 1999)
- More typically head, torso and both hands
- Movement inferred from limited points
- Inverse kinematics problem - infinite possibilities of movement in the virtual environment, needs a consistent restraint
- Elbow position in a 4-tracker system (Badler, 1993)
67 H-Anim joint hierarchy (figure)
Humanoid root; Sacroiliac; L/R Hip, Knee, Ankle, Midtarsal; vl5; L/R Shoulder, Elbow, Wrist; Skullbase
68 Simulation of Body - Tracking the participant
- Choice of system depends on 5 factors
- accuracy, resolution, range, lag, update rate
- Many different tracking technologies
- Meyer 1992
- frequency and time
- ultrasonic time-of-flight measurement
- pulsed infra-red
- GPS
- optical gyroscopes
- phase difference
69 Simulation of Body - Tracking the participant
- Spatial scan
- outside-in
- inside-out
- Inertial sensing
- mechanical gyroscope
- accelerometer
- Mechanical linkages
- Direct field sensing
70 Interaction with virtual body
- Limitations mean reliance on metaphors for
- object manipulation (grasping and moving)
- locomotion (movement)
- Limitations in haptics mean that restraint on the virtual environment exists
71
- Sensors in joints detect position
- 3D viewer updates
- Robot applies force to joints
- Force is felt on hand
72 Object Manipulation
Scene-graph diagrams: Grasping - Object P is reparented from the World to under Hand H (World → Body B → Hand H → Object P), while Object O stays under the World. Releasing - Object P is reparented from Hand H back under the World.
73 Object Manipulation
- Hand posture may not be tracked - makes grasping difficult
- Must establish a point at which union is deemed to have taken place
- Moved by repositioning in the scene graph
- Robinett and Holloway 1992
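"Repositioning in the scene graph" can be sketched with a toy node class (illustrative, not any particular scene-graph library): grasping reparents the object under the hand node, so it inherits the hand's motion for free; releasing reparents it back under the world.

```python
class Node:
    """Minimal scene-graph node: a name, a parent, and children."""
    def __init__(self, name, parent=None):
        self.name, self.children, self.parent = name, [], None
        if parent:
            parent.attach(self)

    def attach(self, child):
        if child.parent:                          # detach from old parent
            child.parent.children.remove(child)
        child.parent = self
        self.children.append(child)

world = Node("World")
body = Node("Body B", world)
hand = Node("Hand H", body)
obj = Node("Object P", world)

hand.attach(obj)              # grasp: object now moves with the hand
grasped = obj.parent.name
world.attach(obj)             # release: object stays where the world holds it
released = obj.parent.name
```

In a full implementation the object's transform is also adjusted at each reparenting so it does not visually jump between coordinate frames.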
74 Locomotion
- Tracker has a limited range
- Must use a locomotion metaphor to move greater distances
- Locomotion is on an even plane; virtual terrain may not be
- Collision detection can be employed to raise or lower the participant accordingly
75 Directions of locomotion
- Fly in direction of aim
- Fly in direction of pointing
- Fly in direction of gaze
- Fly in direction of torso
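All four variants share the same per-frame step, differing only in which tracked vector supplies the direction. A minimal sketch (speed, frame time and positions are made-up example values):

```python
import math

def fly(position, direction, speed, dt):
    """Advance position along a (normalised) direction vector for one frame.
    direction can be the aim, pointing, gaze or torso vector."""
    n = math.sqrt(sum(c * c for c in direction))
    return tuple(p + speed * dt * c / n for p, c in zip(position, direction))

pos = (0.0, 1.7, 0.0)                 # roughly standing eye height
gaze = (0.0, 0.0, -1.0)               # looking down -z, as in most 3D APIs
for _ in range(10):                   # ten frames at 2 m/s, 50 ms per frame
    pos = fly(pos, gaze, speed=2.0, dt=0.05)
# After 0.5 s the participant has flown 1 m along the gaze direction.
```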
76 Books and Articles
- The Handbook of Virtual Environments (2002), Kay Stanney (ed.), Lawrence Erlbaum.
- Isdale, J., 1998, What is VR? http://www.isdale.com/jerry/VR/WhatIsVR.html
- Kalawsky, R., 1993, The Science of Virtual Reality and Virtual Environments, Addison Wesley.
- Rheingold, H., 1991, Virtual Reality, Secker and Warburg, London.
- Wilson, J.R., D'Cruz, M., Cobb, S. and Eastgate, R., 1996, Virtual Reality for Industrial Applications, Nottingham University Press.
77 Resources
- www.vrweb.com (VR solutions company)
- www.barco.com/projection_systems/virtual_and_augmented_reality/
- www.sgi.com (VR solutions company)
- www.ptc.com (free modelling program)
- www.sense8.com (trial VR program)
- www.crystalspace.com (free games engine)