Title: More V&V Insights from Foundations '02 for Distributed Simulation
1. More V&V Insights from Foundations '02 for Distributed Simulation
- Dr. Dale K. Pace
- JHU/APL
- 11100 Johns Hopkins Road
- Laurel, Maryland 20723-6099
- (240) 228-5650 / (240) 228-5910 (FAX)
- dale.pace@jhuapl.edu
- Presented by Simone M. Youngblood
The work reported in this paper was performed
under sponsorship of the Defense Modeling and
Simulation Office (DMSO), but its views are those
of the author and should not be construed to
represent views of DMSO or of any other
organization or agency, public or private.
2. Comment
Paper 03S-SIW-007, presented at the Spring Simulation Interoperability Workshop (SIW), contains a number of insights for distributed simulation from the Foundations '02 V&V Workshop. This paper (03E-SIW-003) presents additional insights about distributed simulation V&V.
3. Agenda
- Foundations '02 Background
- From the Spring SIW paper
- Distributed Simulation V&V Insights
  - policy, guidance, and standards
  - basic M&S theory as foundation for distributed simulation V&V
  - roles of primary verification technology and automation
  - practical concerns about network limitations
  - the role of Subject Matter Experts (SMEs)
  - estimation of VV&A resources
  - research needs for distributed simulation V&V
- Other V&V Insights
- Additional Insights
4. Why Foundations '02? To address four needs:
- Articulate state of M&S V&V art.
- Update V&V bibliography.
- Comprehensive and coherent statement of M&S V&V research needs.
- V&V info exchanges so that best practices may be employed by all.
5. Invited Paper and Special Interest Topic Session Topics
- A1 Verification Technologies
- A2 Validation Methods and Technologies
- A4 SME Use in M&S V&V
- A5-T5 (combined session) Formal Methods in V&V
- A6 M&S Foundations
- B1 Computational Science/Engineering V&V
- B2 V&V for HWIL/System in Loop M&S
- B3 V&V of M&S with People or HBR
- B4 Estimating V&V Resource Requirements
- B5 V&V of M&S with Adaptive Processing
- B6 V&V of Multi-Resolution M&S
- T1 V&V Education in Academia
- T2 Managing V&V
- T3 V&V Research
- T4 V&V Issues re M&S Reuse
- T6 V&V Tools, Templates, and Resources
- T7 V&V Policies, Guides, Standards
- T8 V&V Education in the Workplace
(Red italics: topic addressed distributed simulation explicitly)
6. Foundations '02 Sponsors
- ACM TOMACS
- Aegis Technology Group, Inc.
- ASME/AMD
- ACIMS
- Boeing Phantom Works
- CentER, Tilburg University (Netherlands)
- Clemson University
- DMSO -- main initiating sponsor
- FAA Flight Standards Service
- Gesellschaft für Informatik (Bonn, Germany)
- Illgen Simulation Technologies
- IMC
- JHU/APL -- facility provider
- JASA
- JANNAF M&S Subcommittee -- initiating sponsor
- McLeod Institute of Simulation Sciences, CSU/Chico
- MoD (UK) Synthetic Environment Coordination Office
- MSIAC
- NASA Ames Research Center
- NIST
- NTSA -- hosting sponsor
- ONR
- Shodor Education Foundation, Inc.
- SISO
- SCS
- SURVIAC
- USACM
- UCF/IST
7. Diverse Communities' Participation: 198 Attendees (US, UK, Germany, France, Canada, Belgium)
8. Four Keynotes
- The Importance of Credible M&S for the Defense Community (Anthony Cerri, Experimentation Engineering Dept M&S Div. Chief, Joint Forces Cmd)
- The Essential Role of Credible, Correct Simulation in Assuring the Safety of America's Nuclear Stockpile (David Crandall, Dept of Energy Defense Programs)
- Software Quality Assurance at NASA (Linda Rosenberg, NASA Chief Scientist for Software Quality Assurance)
- The Road to the Future: How to Implement Needed Research (Randall Shumaker, Director, UCF/IST)
(Addressed distributed simulation explicitly)
9. Synopses of Challenges
- Management: how to do what we know
  - Qualitative Assessment
  - Formal Assessment Processes
  - Costs/Resources
- Research: things we need to know how to do
  - Inference
  - Adaptation
  - Aggregation
  - Human Involvement/Representation
10. Policy, Standards, and Guidance Insights
- Various documents exist (RPG, AIAA CFD V&V Guide, etc.), but most are not as specific as the Food and Drug Administration (FDA) guidance for software validation
- Distributed simulation VV&A assumes appropriate V&V information about simulation elements, without guidance for how to deal with the absence of such information
11. Basic M&S Theory Insights
- Most simulationists ignore this at both practical and fundamental levels
- Paradigms exist (e.g., DEVS) that try to employ comprehensive theory, but these are not as widely used as one might want (see sketch below)
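The DEVS bullet above only points at the formalism; as a hedged illustration (not material from Foundations '02), the following minimal Python sketch shows the four functions an atomic DEVS model must supply. The generator model, its names, and the toy coordinator are assumptions made for this example.

# Minimal atomic DEVS sketch: a generator that emits a "job" every 5 time units.
# Illustrative only; names and structure are assumptions, not from Foundations '02.

INFINITY = float("inf")

class GeneratorDEVS:
    """Atomic DEVS model defined by states, time advance, transitions, and output."""

    def __init__(self, period=5.0):
        self.period = period
        self.state = {"phase": "active", "count": 0}   # the state set S

    def time_advance(self):                            # ta(s)
        return self.period if self.state["phase"] == "active" else INFINITY

    def output(self):                                  # lambda(s), called just before delta_int
        return ("job", self.state["count"])

    def internal_transition(self):                     # delta_int(s)
        self.state["count"] += 1

    def external_transition(self, elapsed, event):     # delta_ext(s, e, x)
        if event == "stop":
            self.state["phase"] = "passive"

# A tiny coordinator loop that runs the model for 20 time units.
def run(model, end_time=20.0):
    t = 0.0
    while True:
        t_next = t + model.time_advance()
        if t_next > end_time:
            break
        print(f"t={t_next:.1f}  output={model.output()}")
        model.internal_transition()
        t = t_next

run(GeneratorDEVS())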
12. Verification Technology and Automation Insights
- Automation of verification technology is essential for modern large and complex simulations
- Verification automation emphasizes specifications rather than code, since automation can ensure that code complies with its specifications (see sketch below)
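As a hedged illustration of specification-centered verification automation (not a tool or example from the workshop), the sketch below states a specification as executable properties and automatically checks an implementation against it over many generated cases; the dead-reckoning function and its properties are hypothetical.

# Illustrative sketch of specification-driven verification: an executable
# specification (properties) is checked automatically against an implementation.
# The implementation and properties are hypothetical examples.
import random

def dead_reckon(position, velocity, dt):
    """Implementation under test: simple first-order dead reckoning."""
    return position + velocity * dt

def spec_holds(position, velocity, dt):
    """Specification expressed as properties rather than as code."""
    new_pos = dead_reckon(position, velocity, dt)
    # Property 1: zero elapsed time leaves the position unchanged.
    if dt == 0 and new_pos != position:
        return False
    # Property 2: displacement equals velocity * dt (within floating-point tolerance).
    if abs((new_pos - position) - velocity * dt) > 1e-9:
        return False
    return True

# Automated check over many generated cases, standing in for a verification tool.
random.seed(1)
for _ in range(10_000):
    p = random.uniform(-1e3, 1e3)
    v = random.uniform(-50, 50)
    t = random.choice([0.0, random.uniform(0.0, 10.0)])
    assert spec_holds(p, v, t), f"specification violated for {(p, v, t)}"
print("implementation complies with the stated specification on all generated cases")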
13. Network Limitations Related Insights
- VV&A of distributed simulation requires validity for individual simulations, compatibility among them, and infrastructure that does not interfere with validity of the whole -- network limitations can have major impacts (see sketch below)
- Live simulations can have variations due to individual system performance differences
- Live simulations may not permit testing with actual simulation participants prior to the event
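A hedged illustration of the network-limitation point (not from the paper): the sketch below models one federate receiving another's state updates after a one-way network delay and dead reckoning from the stale state, showing how latency alone injects error into the distributed picture even when each federate is valid on its own. The trajectory, update rate, and latency values are invented for illustration.

# Federate A owns a decelerating entity; federate B receives state updates delayed
# by `latency` seconds and extrapolates from the last update that has arrived.

UPDATE_PERIOD = 0.1        # seconds between state updates sent by federate A

def true_state(t):
    """Ground truth in federate A: v(t) = 10 - 2t m/s, so position = 10t - t^2."""
    return 10.0 * t - t * t, 10.0 - 2.0 * t        # (position, velocity)

def perceived_position(t, latency):
    """What federate B believes at time t, given a one-way network delay."""
    # Newest update whose message has already arrived at B.
    last_update = max(0.0, ((t - latency) // UPDATE_PERIOD) * UPDATE_PERIOD)
    pos, vel = true_state(last_update)
    return pos + vel * (t - last_update)           # dead reckoning from stale state

t = 4.0
true_pos, _ = true_state(t)
for latency in (0.0, 0.2, 0.5, 1.0):               # one-way delays in seconds
    error = abs(true_pos - perceived_position(t, latency))
    print(f"latency = {latency:.1f} s  ->  position error at t = {t} s: {error:.2f} m")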
14. SME-Related Insights
- SMEs have special roles in distributed simulation VV&A, especially if simulation testing is limited
- SME selection and use can be improved if quality processes are employed
- The VV&A Agent may not be allowed to control SME selection
15. Estimation of VV&A Resources
- Acceptable risk level is the primary driver for determining required VV&A resources, BUT "too little, too late" is the common VV&A resource situation (see notional sketch below)
- VV&A cost data are limited and do not comply with a common cost breakdown structure
- A number of VV&A resource estimation approaches have been employed
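As a purely notional sketch of the first bullet, the code below scales a simple cost-breakdown baseline by the level of acceptable residual risk. Every activity name, baseline figure, and scaling rule is an invented placeholder, not an estimation approach reported at Foundations '02.

# Notional only: acceptable risk driving a VV&A resource estimate against a simple
# cost breakdown structure. All numbers and categories are illustrative placeholders.

# Baseline effort (staff-weeks) for a thorough V&V effort, by activity.
BASELINE_EFFORT = {
    "requirements/conceptual model validation": 12,
    "design and code verification":             20,
    "results validation against referent data": 16,
    "documentation and accreditation support":   6,
}

def estimate_vva_effort(acceptable_risk):
    """Scale the thorough-effort baseline by how much residual risk is tolerable.

    acceptable_risk: 0.0 (no residual risk tolerated) .. 1.0 (any risk tolerated).
    Lower risk tolerance retains more of the baseline effort.
    """
    retained_fraction = 1.0 - acceptable_risk
    return {activity: weeks * retained_fraction
            for activity, weeks in BASELINE_EFFORT.items()}

for risk in (0.1, 0.3, 0.6):
    plan = estimate_vva_effort(risk)
    print(f"acceptable risk {risk:.1f}: total {sum(plan.values()):.1f} staff-weeks")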
16. Distributed Simulation V&V Research Needs
- Distributed simulation V&V has the same research needs as simulation V&V in general:
  - Inference
  - Adaptation
  - Aggregation
  - Human Involvement/Representation
17. More V&V Insights about Distributed Simulation
- Reliable prediction from simulation use demands quantifiable validation (see sketch below)
- Simulation and referent data uncertainties have to be appropriately identified and characterized
- Distributed simulation may lack information items needed for desired assessment
- Impact of simulation architecture and allocation of functions among distributed simulation elements must be clearly understood; this is especially true for high-fidelity distributed simulations that must run in real time
- More care is needed in statistical assessment of simulation capabilities than is often taken
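A hedged illustration of quantifiable validation (not an analysis from the paper): the sketch below compares a simulation output sample with referent data using a two-sample Kolmogorov-Smirnov test from SciPy. The data are synthetic placeholders and the acceptance threshold is an assumption.

# Comparing simulation output with referent data statistically rather than by eyeball.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
referent_data = rng.normal(loc=100.0, scale=5.0, size=60)    # e.g., measured detection ranges (km)
sim_results   = rng.normal(loc=101.5, scale=6.0, size=200)   # simulation output for the same quantity

# Two-sample Kolmogorov-Smirnov test: are the samples consistent with one distribution?
statistic, p_value = ks_2samp(sim_results, referent_data)
print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")

# A quantified acceptance criterion (threshold chosen by the accreditation authority).
alpha = 0.05
if p_value < alpha:
    print("Reject agreement at the 5% level: simulation and referent differ detectably.")
else:
    print("No detectable disagreement at the 5% level for this output quantity.")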
18. A Schematic for Characterization of Simulation Predictive Capability
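One commonly used way to characterize simulation predictive capability in the computational-science V&V literature (offered here as a hedged supplement, not necessarily the formulation in this slide's schematic) compares the simulation/referent difference with the combined comparison uncertainty:

E = S - D, \qquad u_{\mathrm{val}} = \sqrt{u_{\mathrm{num}}^{2} + u_{\mathrm{input}}^{2} + u_{D}^{2}}

where E is the comparison error between the simulation result S and the referent datum D, and u_num, u_input, and u_D are the numerical, input-parameter, and referent-data uncertainties. When |E| is large relative to u_val, the discrepancy is attributable to the model itself; when |E| <= u_val, any model error is hidden within the uncertainty of the comparison.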
19. Simulation Results / Real-World Data Analysis Issues
20. Conclusions
- Foundations '02 has many valuable V&V insights that need to be used
- For the Foundations '02 proceedings on CD:
  - Contact Steve Branch (SCS)
  - steve@scs.org
- Foundations '02 materials available at:
  - https://www.dmso.mil/public/transition/vva/foundations