Title: Virtual Environments as Hybrid Systems
1 Interacting in task-oriented virtual worlds
Lakshmi Sastry, Michael Wilson and David Boyd
Virtual Reality and Multimedia Group, Information Technology Department,
CLRC Rutherford Appleton Laboratory
http://www.itd.clrc.ac.uk/Activity/INQUISITIVE
2 The presentation
- We are developing a portable interaction toolkit for VR applications which will improve support for developing user interaction within task-oriented virtual environments.
- Why an interaction toolkit?
- The toolkit architecture and components
- A simple component
- Issues and conclusion
3 Why the toolkit?
- Within CLRC, we have:
  - The applications:
    - one-off, large-scale engineering design and deployment projects
    - maintenance and training projects
    - visualization of (computational) analysis and observation data
  - The purpose:
    - design review (confirm design and sign off, installation and maintenance planning)
    - training
    - real-time interactive visualization
4 Why the toolkit? (continued)
- Within CLRC, we have:
  - The users:
    - teams of engineers, expert users of CAD packages such as ProEngineer
    - scientists with computational, experimental and observation data, using various domain-specific visualization packages
  - The data:
    - detailed engineering designs, with requirements dictated by science
    - data from scientific experiments, where the numbers are dictated by nature
5 Why the toolkit? (continued)
- Virtual Reality (VR) interaction techniques have the potential to deliver intuitive user interfaces for such post-process applications.
- Facilities for user interface and interaction development within today's VR systems are rudimentary, limited and limiting:
  - limitations on data conversion and optimization
  - limited appropriateness or usability of interaction techniques
  - variation in the nature of the worlds
  - variation in the scale of the world
  - variation in the tasks and users
  - dynamic tailorability and adaptability are absent
6 Example: engineering design review using dvMockup. Selection of menu items and objects uses the virtual hand metaphor only. Limitation: no remote contact is possible with distant objects.
7 Why the toolkit? (continued)
- Development and deployment costs are high.
- To overcome these problems:
  - we are creating a generic interaction toolkit with portable modules
  - this makes a significant contribution to the rapid development and successful application of 3D VR interaction techniques to a wide range of virtual environments.
8 Toolkit architecture
- To use existing VR tools as development platforms and develop portable interaction modules with a customisable API that maps onto the VR tool.
- The four basic tasks of user interaction are:
  - navigation
  - selection
  - manipulation
  - data input
- To provide support for higher-level tasks that can be implemented as a combination of basic tasks.
- Each basic interaction task can be realised using a number of possible interaction techniques (see the sketch below):
  - Navigation: point-fly, magic carpet, indirect object manipulation (e.g. car)
  - Selection: command, virtual hand, remote wand, miniature world
  - Manipulation: virtual hand, remote wand, command
  - Data input: typing, voice, indirect object manipulation (e.g. keypad)
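The binding of techniques to basic tasks can be pictured as a small registry. The following C++ sketch is illustrative only; the names (InteractionTechnique, TechniqueRegistry, bind, lookup) are assumptions, not the toolkit's actual API.

```cpp
// Illustrative sketch only: basic interaction tasks and the techniques
// that realise them, behind a common interface so an application can
// swap the technique used for each task.
#include <map>
#include <memory>
#include <string>
#include <utility>

enum class BasicTask { Navigation, Selection, Manipulation, DataInput };

// One interaction technique (point-fly, virtual hand, remote wand, ...).
struct InteractionTechnique {
    virtual ~InteractionTechnique() = default;
    virtual std::string name() const = 0;
};

struct PointFly : InteractionTechnique {
    std::string name() const override { return "point-fly"; }
};

// Registry binding one technique to each basic task for the current session.
class TechniqueRegistry {
public:
    void bind(BasicTask task, std::unique_ptr<InteractionTechnique> t) {
        bindings_[task] = std::move(t);
    }
    InteractionTechnique* lookup(BasicTask task) const {
        auto it = bindings_.find(task);
        return it == bindings_.end() ? nullptr : it->second.get();
    }
private:
    std::map<BasicTask, std::unique_ptr<InteractionTechnique>> bindings_;
};
```

An application would then, for example, bind PointFly to BasicTask::Navigation and look the technique up whenever a navigation action arrives.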
9 Toolkit architecture (continued)
- Support for modalities: a single device, such as a Space Mouse, used for navigation, selection and manipulation interchangeably.
- Fine tuning of devices: an API for setting the tolerance for measures and triggers (see the sketch below).
- Based on the above analysis, the main functional components which the toolkit provides are:
  - a set of interaction techniques for the four classes of basic interaction tasks
  - a set of generic virtual interaction objects, such as a toolbox
  - a run-time interaction framework
  - a mapping to the VE
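The device fine-tuning idea could look roughly like the following sketch; DeviceConfig, DeviceTolerance and filter_measure are hypothetical names used for illustration, not the toolkit's real interface.

```cpp
// Illustrative sketch only: per-device tolerances for measures (continuous
// values) and triggers (discrete events), so one device such as a Space
// Mouse can drive navigation, selection and manipulation interchangeably.
struct DeviceTolerance {
    double measure_deadzone = 0.02;   // ignore motion below this magnitude
    double trigger_threshold = 0.5;   // value above which a trigger fires
};

struct DeviceConfig {
    DeviceTolerance tolerance;
    bool navigation = true;           // modalities this device may drive
    bool selection = true;
    bool manipulation = true;
};

// Apply the deadzone to a raw measure before any technique sees it.
inline double filter_measure(double raw, const DeviceConfig& cfg) {
    double dz = cfg.tolerance.measure_deadzone;
    return (raw > -dz && raw < dz) ? 0.0 : raw;
}
```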
10 Using the Interaction Toolkit
High-level task: moving a cup
  decomposes into
Basic interaction tasks: navigate, select, manipulate
  supported by
Interaction techniques: point-fly, hand select, hand-grasp move
  implemented on
Interaction objects: virtual hand
  and
Application object: cup
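A minimal sketch of this decomposition, with purely illustrative names (Step, move_cup_task):

```cpp
// Illustrative sketch only: a higher-level task expressed as a sequence
// of basic interaction tasks, each served by a chosen technique.
#include <string>
#include <vector>

struct Step {
    std::string basic_task;   // navigate, select, manipulate, data input
    std::string technique;    // technique chosen to realise it
};

// "Moving a cup" decomposed as on this slide.
std::vector<Step> move_cup_task() {
    return {
        { "navigate",   "point-fly"       },
        { "select",     "virtual-hand"    },
        { "manipulate", "hand-grasp-move" },
    };
}
```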
11 Relationship between interaction toolkit, input devices, VR system and application
12 Runtime Interaction Framework
- The first component of the runtime interaction framework is a Contextual Interpreter, which (see the sketch below):
  - obtains the measures and triggers from devices
  - converts these into the VE co-ordinate system, taking into account modality and device tolerance
  - takes account of dynamic constraints and the current state of the interaction objects
  - interprets the measures and triggers in that context
  - calls the appropriate interaction techniques to generate the event tokens
  - is independent of the host VR system.
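A rough sketch of what such a Contextual Interpreter could look like; all names here (ContextualInterpreter, DeviceSample, EventToken, interpret) are assumptions for illustration, and modality, dynamic constraints and interaction-object state are omitted.

```cpp
// Illustrative sketch only: a Contextual Interpreter takes raw device
// measures/triggers, scales them into VE co-ordinates, applies the device
// tolerance, and emits event tokens for the Interaction Manager.
#include <cmath>
#include <optional>
#include <string>

struct DeviceSample { double x = 0, y = 0, z = 0; bool trigger = false; };
struct EventToken   { std::string type; double x = 0, y = 0, z = 0; };

class ContextualInterpreter {
public:
    ContextualInterpreter(double world_scale, double deadzone)
        : scale_(world_scale), deadzone_(deadzone) {}

    // Convert one raw sample into zero or one event token.
    std::optional<EventToken> interpret(const DeviceSample& s) const {
        double x = s.x * scale_, y = s.y * scale_, z = s.z * scale_;
        if (!s.trigger && std::abs(x) < deadzone_ &&
            std::abs(y) < deadzone_ && std::abs(z) < deadzone_)
            return std::nullopt;                  // below tolerance: ignore
        return EventToken{ s.trigger ? "select" : "move", x, y, z };
    }

private:
    double scale_;     // device-to-VE co-ordinate scaling
    double deadzone_;  // device tolerance
};
```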
13 Runtime Interaction Framework (continued)
- The second component of the runtime interaction framework is an Interaction Manager, which:
  - monitors the changing state of user interaction within the VE
  - receives the event tokens from the Contextual Interpreter
  - queries the current state within the VR system's runtime object database
  - communicates the update required.
- The operation of the Interaction Manager must be customised for each host VR system (see the sketch below).
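A corresponding sketch of the Interaction Manager idea, again with hypothetical names (HostBackend, InteractionManager, objectAt, moveObject); only the backend layer would need rewriting for each host VR system such as Maverik or dvMockup.

```cpp
// Illustrative sketch only: the Interaction Manager consumes event tokens,
// queries the host VR system's runtime object database through a small
// backend interface, and sends back the required update.
#include <string>

struct EventToken { std::string type; double x = 0, y = 0, z = 0; };

// Per-host mapping layer; a Maverik or dvMockup backend would implement
// these calls against the real runtime object database.
struct HostBackend {
    virtual ~HostBackend() = default;
    virtual int  objectAt(double x, double y, double z) = 0;          // query
    virtual void moveObject(int id, double x, double y, double z) = 0; // update
};

class InteractionManager {
public:
    explicit InteractionManager(HostBackend& host) : host_(host) {}

    void handle(const EventToken& e) {
        if (e.type == "select")
            selected_ = host_.objectAt(e.x, e.y, e.z);
        else if (e.type == "move" && selected_ >= 0)
            host_.moveObject(selected_, e.x, e.y, e.z);
    }

private:
    HostBackend& host_;
    int selected_ = -1;   // currently selected object, -1 if none
};
```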
14 Object class relationship to I/O devices, Maverik and the application
Figure: block diagram linking the user and the input/output devices to the interaction toolkit (device configuration, parsing the mode of interaction, resolving local actions and generating event tokens; 2D/3D mouse based navigation, selection and manipulation; modules for object class implementation, querying the VE state and sending VE update and action requests to Maverik or the application; window pane, buttons, text, virtual hand etc.; SMS) and to the Maverik renderer, Maverik kernel, Maverik application classes and the application.
15 Interaction Objects - widgets
- Window Pane (fixed, billboard, head-up or movable): role is to display a menu
- Meter/Dial: displays values at selected locations
- Attachment: red pin/display
- Pointer: single or double handed
- Slider Scale: to change the range of values displayed
- Constrainable Tool: inherits constraints from the environment
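Two of the widget families above could be modelled roughly as follows; Placement, WindowPane and SliderScale are illustrative names, not the toolkit's actual classes.

```cpp
// Illustrative sketch only: two of the widget families listed above.
#include <algorithm>

// How a window pane (menu display) is anchored in the VE.
enum class Placement { Fixed, Billboard, HeadUp, Movable };

struct WindowPane {
    Placement placement = Placement::Fixed;
};

// Slider scale used to change the range of values displayed.
struct SliderScale {
    double lo = 0.0, hi = 1.0;
    void rescale(double new_lo, double new_hi) {
        lo = std::min(new_lo, new_hi);
        hi = std::max(new_lo, new_hi);
    }
};
```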
16 Examples of Interaction Objects
- Generic toolkit:
  - examples shown are from Maverik
  - they also work in dvMockup
- The object examples at run time use all of the interaction toolkit:
  - Contextual Interpreter
  - Interaction Manager
17 Styles of menu - fixed position - a demo.
18 Styles of menu - billboard - a demo.
19 Styles of menu - head-up - a demo.
20 Styles of menu - movable - a demo.
21 Complex Interaction Object: a remote controller for data visualization - menu display
22 A remote controller for data visualization
23 Figure: a red-lining tool for engineering design review using dvMockup. The red pinhead in the left-hand image indicates an attached annotation, which can be activated as shown in the right-hand image.
24 Issues and Conclusion
- The scale of individual VEs seems to impose a lot of tweaking at the integration stage.
- A higher-level GUI is needed for the application designer.