1 Joint Metrics Development and Implementation in Support of SIAP Assessments in a Distributed Simulation Environment
- Brett Zombro and Laura Bennett
- Systems Planning and Analysis, Inc.
- 4900 Seminary Road, Suite 400
- Alexandria, VA 22311
- April 20, 2004
2 Contents
- Definition of the Single Integrated Air Picture (SIAP)
- Evaluation Methodology
- The SIAP Metrics Hierarchy
- Sample Attribute Measure: Clarity
- The Joint Approval Process
- Implementation: Associated Software Format and Tools
- Joint Distributed Engineering Plant (JDEP) Integration
- Future Directions
3 Purpose
- Provide the Integrated Air and Missile Defense community with an evaluation methodology based on a Jointly developed hierarchy of metrics that are quantifiable, testable, and measurable in both simulated and real environments to
  - Evaluate the current state of the SIAP for multi-platform interoperability.
  - Predict the relationship between engineering changes and warfighter benefit.
  - Prescribe an engineering path towards a fully operational SIAP and measure progress.
- Integrate Jointly developed assessment methodology and tools within a state-of-the-art distributed simulation facility.
4 Definition of the SIAP
- The SIAP is the air track portion of the Common Tactical Picture (CTP) and is the product of fused, common, continual, and unambiguous tracks of airborne objects of interest in the surveillance area.
- The SIAP is derived from real-time and near-real-time data and consists of correlated air object tracks and associated information.
- Theater Air and Missile Defense Capstone Requirements Document (TAMD CRD), U.S. Joint Forces Command, 2001.
5 Measurement of SIAP Attributes
6 SIAP Attributes Flow from CRD Requirements
- The minimum set of air vehicle SIAP attributes includes all quantified SIAP requirements specified in the TAMD and CID CRDs for air objects.
7 SIAP Attribute Core Measures
8 MOE Defined
- MOE: A measure of operational success that must be closely related to the objective of the mission or operation being evaluated.
- MOEs quantify, at various levels (tactical, operational, strategic), capabilities of direct importance to the warfighter.
- DSMC, Glossary of Defense Acquisition Acronyms and Terms, 10th Ed. (2001).
9 MOP Defined
- MOP: A measure of a system's technical performance expressed as speed, payload, range, time on station, frequency, or other distinctly quantifiable performance features.
- MOPs quantify aspects of system/subsystem performance which influence the SIAP, but are more immediately affected by system engineering choices than are the Attributes.
- DSMC, Glossary of Defense Acquisition Acronyms and Terms, 10th Ed. (2001).
10 SIAP Functional Areas and Measures of
Performance (MOPs)
11 Measures of Effectiveness (MOE) Metric Examples
12 Air Vehicle SIAP Attributes (1)
- Completeness: The measure of the portion of true air objects that are included in the SIAP. The air picture is complete when all objects are detected, tracked, and reported.
- Clarity: The measure of the portion of the SIAP that contains ambiguous and/or spurious tracks. The air picture is clear when it does not include ambiguous or spurious tracks.
- Continuity: The measure of how accurately the SIAP maintains track numbers over time. The air picture is continuous when the track number assigned to an object does not change.
- Kinematic Accuracy: The measure of how accurately the TAMD Family of Systems (FoS) reports track position and velocity. The air picture is kinematically accurate when the position and velocity of each assigned track agree with the position and velocity of the associated object.
13 Air Vehicle SIAP Attributes (2)
- Commonality: The measure of consistency of the air picture held by TAMD Family of Systems (FoS) participants. The air picture is common when the assigned tracks held by each participant have the same track number, position, and ID.
- ID Completeness: The ID is complete when all tracked objects are labeled in a state other than unknown.
- ID Accuracy: The ID is accurate when all tracked objects are labeled correctly.
- ID Clarity: The ID is clear when no tracked object is labeled with conflicting ID states.
14 Sample Attribute Measure: Clarity (1)
- The instantaneous system track picture ambiguity Am(tk) at participant m at time tk is defined in terms of:
- NAm(tk): the number of assigned, uncorrelated tracks held by participant m at time tk.
- JTm(tk): the number of objects with at least one assigned track held by participant m at time tk.
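The equation image on this slide did not survive conversion. A plausible reconstruction, consistent with the variable definitions above and with "the portion of the SIAP that contains ambiguous tracks" (the fraction of assigned tracks in excess of one per tracked object), is:

```latex
% Hedged reconstruction of the lost equation image; consult the SIAP
% Attributes Technical Report, Version 2.0, for the authoritative form.
A_m(t_k) = \frac{N_{A_m}(t_k) - J_{T_m}(t_k)}{N_{A_m}(t_k)}
```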
15 Sample Attribute Measure: Clarity (2)
- The instantaneous system measure of the percentage of tracks that are spurious, Sm(tk), as measured by participant m at time tk, is defined in terms of:
- Nm(tk): the number of tracks held by participant m at time tk.
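The two instantaneous clarity measures can be sketched in code. This is an illustrative sketch only (not the PET implementation), assuming the forms A = (NA − JT)/NA and S = (N − NA)/N, where each held track carries the truth-object ID it was matched to by track-to-truth assignment (None if spurious):

```python
# Illustrative sketch (not the PET implementation): instantaneous clarity
# measures at one participant and one scoring time, under the assumed forms
# A = (NA - JT) / NA and S = (N - NA) / N.

def instantaneous_clarity(assignments):
    """assignments: one entry per held track, giving the truth-object id
    the track was assigned to, or None if the track is spurious."""
    n_tracks = len(assignments)                        # N_m(t_k)
    assigned = [a for a in assignments if a is not None]
    n_assigned = len(assigned)                         # NA_m(t_k)
    n_objects = len(set(assigned))                     # JT_m(t_k)
    ambiguity = (n_assigned - n_objects) / n_assigned if n_assigned else 0.0
    spurious = (n_tracks - n_assigned) / n_tracks if n_tracks else 0.0
    return ambiguity, spurious

# Example: 5 tracks; two are duplicate tracks on object "obj1",
# and one track matches no truth object (spurious).
amb, spur = instantaneous_clarity(["obj1", "obj1", "obj2", "obj3", None])
# 4 assigned tracks on 3 objects -> ambiguity 0.25; 1 of 5 tracks spurious -> 0.2
```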
16 Roll-up Metric (1)
- The aggregative form across time and number of participants for the clarity attribute, in the case of ambiguous tracks, is a tracked-object weighted average of Am(tk).
17 Roll-up Metric (2)
- The aggregative form across time and number of participants for the clarity attribute, in the case of spurious tracks, is a track weighted average of Sm(tk).
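The roll-up equation images were lost in conversion. A hedged sketch, consistent with the back-up slides ("tracked-object weighted" for ambiguity, "track weighted" for the spurious fraction), sums over participants m and scoring times tk:

```latex
% Hedged sketch of the weighted-average roll-ups; the weights (J_{T_m} for
% ambiguity, N_m for the spurious fraction) are an assumption read off the
% prose descriptions "tracked-object weighted" and "track weighted".
A = \frac{\sum_m \sum_k J_{T_m}(t_k)\, A_m(t_k)}{\sum_m \sum_k J_{T_m}(t_k)},
\qquad
S = \frac{\sum_m \sum_k N_m(t_k)\, S_m(t_k)}{\sum_m \sum_k N_m(t_k)}
```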
18 The Joint Approval Process
- The JSSEO Working Group of Subject Matter Experts (SMEs) from the four Services, JTAMDO, JFCOM, and MDA has as its primary mission to ensure that
  - SIAP metrics are jointly selected.
  - SIAP metrics are rigorously defined.
  - An implementation plan is in place for automated metrics tool development.
- This provides the joint community with a common frame of reference for quantifying and assessing the aggregate performance of a given SIAP configuration.
19 Implementation: Associated Software Format and Tools
- Standard Air Track Data Format: Developed by the SIAP Metrics Working Group for standardized input to the track-to-truth matching algorithm and for data transfer to the metrics scorer.
- Track-to-Truth Matching: ARCTIC
  - The Automated Reconstruction and Correlation Tool for Interoperability Characterization (ARCTIC) is an assignment algorithm suite developed by the Center for Naval Analyses (CNA) for input to the metrics scorer.
- Metric Computational Software: PET
  - The Performance Evaluation Tool (PET) is metrics scoring and visualization software developed by the Naval Surface Warfare Center (NSWC) Corona, CA.
20 Track Assignment Procedure
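The figure for this slide did not survive conversion. The procedure it depicts, matching reported tracks to truth objects before scoring, can be sketched as follows. This is a minimal greedy nearest-neighbor sketch with hypothetical names; ARCTIC's actual assignment algorithm suite is far more sophisticated:

```python
# Minimal sketch of a track-to-truth assignment step (hypothetical names,
# not the ARCTIC algorithm). Each track is matched to the nearest truth
# object inside a distance gate; tracks left unmatched score as spurious.
import math

def assign_tracks(tracks, truths, gate=5.0):
    """tracks/truths: dicts id -> (x, y). Returns track_id -> truth_id or
    None (spurious). Greedy: cheapest in-gate pairs claimed first."""
    pairs = []
    for tid, tp in tracks.items():
        for oid, op in truths.items():
            d = math.dist(tp, op)
            if d <= gate:
                pairs.append((d, tid, oid))
    pairs.sort()                        # cheapest candidate pairs first
    assigned = {tid: None for tid in tracks}
    for d, tid, oid in pairs:
        if assigned[tid] is None:       # note: one truth object may collect
            assigned[tid] = oid         # several tracks (ambiguous tracks)
    return assigned

tracks = {"T1": (0.0, 0.1), "T2": (0.2, 0.0), "T3": (40.0, 40.0)}
truths = {"A": (0.0, 0.0), "B": (10.0, 10.0)}
result = assign_tracks(tracks, truths)
# T1 and T2 both gate on object A (ambiguous); T3 matches nothing (spurious)
```

Allowing several tracks per truth object, rather than forcing a one-to-one match, is what lets the downstream scorer count ambiguous tracks at all.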
21 JDEP Integration
- The Joint Distributed Engineering Plant (JDEP) is a simulation capability that links distributed components in user-tailored federations for hardware-in-the-loop, software-in-the-loop, and simulation venues.
- The role of the JDEP Data Extraction Federate is to extract relevant data to calculate metrics in post-event data analysis.
- SIAP metrics and associated tools and data formats will form an integral part of the JDEP kit given to JSSEO partner programs for internal testing, verification, validation, and accreditation.
- JDEP will enable the user community to implement the SIAP metrics across a broad range of field test and simulation environments.
22 JDEP Technical Framework for 2004 Distributed HWIL Event
23 Summary
- The Joint SIAP Systems Engineering Organization (JSSEO) Documented a SIAP Evaluation Methodology Emanating from the CID and TAMD Capstone Requirements Document (CRD) Guidelines.
- Precise Definitions of SIAP Attributes are Quantifiable, Testable, and Measurable in Real and Simulation Environments.
- MOPs and MOEs Enable Linkages in a Metrics Hierarchy for Defining SIAP-related Warfighting Capability.
- SIAP Attribute Definitions Were Vetted through the Services and Joint Agencies, Providing a Common Frame of Reference.
- The SIAP Evaluation Methodology Forms an Integral Part of the JDEP Simulation Environment for the Assessment of JSSEO Partner Programs.
24 Future Directions
- Establish MOPs for Designated, High-priority SIAP Problem Domains, e.g., Multi-source Integration (MSI) Systems, Data Registration, Common Time Reference (CTR), and Formation Tracking.
- Formulate Comprehensive Sets of MOPs to Evaluate Candidate Stand-alone MSI Systems as a Step Towards the Evaluative Support of Fully-Integrated MSI Systems.
- Develop Network-based Metrics to Represent Interconnections between Local/Remote Fusion Nodes in the Tactical Data Link-16 Framework and in Planned Peer-to-Peer Networks.
- Formulate Multi-dimensional Metric Models to Account for Synergies and Trade-offs in the Fusion Process.
- Transition to Real-time Capability for Rapid Performance Feedback in Adaptive Systems on the Digital Battlefield.
25 Back-ups
26 Objectives
- Develop common understanding of the SIAP Attributes in terms of
  - Definition
  - Mathematical Derivation
- Institutionalize the SIAP Attributes and their Aggregative Formulations as the standard for quantifying the air vehicle component of the SIAP.
- Provide consolidated sets of MOEs and MOPs, and develop common understanding of their proposed use.
- Establish an implementation plan using standardized data formats, track matching algorithms, and metrics scoring in a distributed simulation environment.
27 SIAP Attribute Definitions
- Equations are included for illustrative purposes. For explanatory details, consult the SIAP Attributes Technical Report, Version 2.0.
28 Completeness
- The air picture is complete when all objects are detected, tracked, and reported.
- The completeness Cm(t) at participant m at time t is the portion of true air objects in the surveillance area that are included in the SIAP at participant m.
- The roll-up measure of completeness C is an object weighted average across time and participants.
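The completeness equation images were lost in conversion. A hedged sketch, introducing J(t) for the number of true air objects in the surveillance area and JTm(t) for the number of those objects tracked at participant m (both symbols are assumptions consistent with the attribute definition):

```latex
% Hedged reconstruction; see the SIAP Attributes Technical Report,
% Version 2.0, for the authoritative form.
C_m(t) = \frac{J_{T_m}(t)}{J(t)},
\qquad
C = \frac{\sum_m \sum_k J(t_k)\, C_m(t_k)}{\sum_m \sum_k J(t_k)}
```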
29 Clarity
- The air picture is clear when it does not include ambiguous or spurious tracks.
- Tracks are ambiguous when more than one track, assigned to the same object, is displayable to some participant.
- A track is spurious when it is not assigned to any object.
30 Ambiguous Tracks
- Tracks are ambiguous when more than one track, assigned to the same object, is displayable to some participant.
- The track picture ambiguity Am(t) is measured at each participant m at time t.
- The roll-up measure of ambiguity A is a tracked-object weighted average across time and participants.
31 Spurious Tracks
- A track is spurious when it is not assigned to any object.
- The percentage of tracks that are spurious, Sm(t), is measured at each participant m at time t.
- The roll-up measure of the percentage of spurious tracks S is a track weighted average across time and participants.
32 Continuity
- The air picture is continuous when the track number assigned to an object does not change.
- Characteristic Track Lifetime: the reciprocal of the average rate of track number changes.
- Longest Track Segment: the ratio of the longest track segment associated with an object to the time the object is in the AOI.
33 Characteristic Track Lifetime
- The reciprocal of the average rate of track number changes.
- The rate of track changes for object j at participant m is measured over the time the object is in the AOI.
- The characteristic track lifetime LTm at participant m is the reciprocal of the average of these rates.
- The roll-up of characteristic track lifetime is obtained from the weighted average of the track number change rate across participants.
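The per-participant computation can be sketched directly from the prose definition. This is an illustrative sketch with a hypothetical helper name, not the PET implementation:

```python
# Illustrative sketch (hypothetical name, not the PET implementation):
# characteristic track lifetime at one participant. For each object j,
# the track-number change rate is (number of track number changes) /
# (time in AOI); the characteristic lifetime is the reciprocal of the
# average rate over objects.

def characteristic_lifetime(objects):
    """objects: list of (num_track_number_changes, time_in_aoi_seconds)."""
    rates = [changes / t_aoi for changes, t_aoi in objects if t_aoi > 0]
    avg_rate = sum(rates) / len(rates)
    return float("inf") if avg_rate == 0 else 1.0 / avg_rate

# Two objects: one changed track number twice in 100 s (rate 0.02/s),
# one changed once in 50 s (rate 0.02/s) -> average rate 0.02/s,
# so the characteristic lifetime is 50 s.
lt = characteristic_lifetime([(2, 100.0), (1, 50.0)])
```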
34 Longest Track Segment
- The ratio of the longest track segment associated with an object to the time that the object is in the AOI.
- The longest track segment, as a percentage of time, LSj,m, is measured at each participant m.
- The roll-up measure of continuity LS is a time weighted average across objects and participants.
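The equation images were lost in conversion. A hedged sketch following the prose, with Tmax(j,m) the duration of the longest single-track-number segment on object j at participant m and Tj the time object j spends in the AOI (symbol names are assumptions):

```latex
% Hedged reconstruction; see the SIAP Attributes Technical Report,
% Version 2.0, for the authoritative form.
LS_{j,m} = 100\% \times \frac{T^{\max}_{j,m}}{T_j},
\qquad
LS = \frac{\sum_m \sum_j T_j \, LS_{j,m}}{\sum_m \sum_j T_j}
```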
35 Kinematic Accuracy
- The air picture is kinematically accurate when the position and the velocity of each assigned track agree with the position and the velocity of the associated object.
- The position and velocity accuracies, PAj,n,m(t) and VAj,n,m(t), for track n associated with object j at participant m at time t are scored against the position and velocity of the associated object.
36 Kinematic Accuracy (Cont'd)
- The velocity accuracy metric is analogous to the position accuracy.
- The roll-up measures of position and velocity accuracy, PA and VA, are appropriately defined averages across participants and weighted averages across scoring times t and assigned tracks.
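The accuracy equation images were lost in conversion. One hedged form consistent with the prose (track state scored against the associated object's truth state) compares error magnitudes, with x and v the track's reported position and velocity and the "true" quantities the associated object's truth values (all symbol names are assumptions):

```latex
% Hedged sketch only; see the SIAP Attributes Technical Report,
% Version 2.0, for the authoritative form.
PA_{j,n,m}(t) = \left\lVert \vec{x}_{n,m}(t) - \vec{x}^{\,true}_j(t) \right\rVert,
\qquad
VA_{j,n,m}(t) = \left\lVert \vec{v}_{n,m}(t) - \vec{v}^{\,true}_j(t) \right\rVert
```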
37 ID Attributes
- The ID is
  - Complete when all tracked objects are labeled in a state other than unknown.
  - Accurate when all tracked objects are labeled correctly.
  - Clear when no tracked object is labeled with conflicting ID states.
38 ID Completeness
- The ID is complete when all tracked objects are labeled in a state other than unknown.
- The ID completeness CIDm(t) is measured at each participant m at time t.
- The roll-up measure of ID completeness CID is a tracked-object weighted average across time and participants.
39 ID Accuracy
- The ID correctness IDCm(t) is the fraction of tracked objects with correct IDs at participant m at time t.
- The roll-up measure of ID correctness IDC is the tracked-object weighted average across time and participants.
40 ID Clarity
- The ID is clear (unambiguous) when no tracked object is labeled with conflicting ID states.
- The ID ambiguity IDAm(t) is measured at each participant m at time t.
- The roll-up measure of ID ambiguity IDA is the tracked-object weighted average across time and participants.
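The three instantaneous ID measures can be sketched together. This is an illustrative sketch (hypothetical names, not the PET implementation), assuming each tracked object carries the set of ID labels held on its assigned tracks:

```python
# Illustrative sketch (not the PET implementation): instantaneous ID
# completeness, correctness, and ambiguity at one participant, read off
# the prose definitions. `truth` maps object id -> correct ID label.

def id_measures(object_labels, truth):
    """object_labels: dict object_id -> set of ID labels on its tracks."""
    n = len(object_labels)
    # complete: labeled in a state other than unknown
    known = sum(1 for labels in object_labels.values()
                if labels and labels != {"unknown"})
    # accurate: labeled correctly (and unambiguously)
    correct = sum(1 for oid, labels in object_labels.items()
                  if labels == {truth[oid]})
    # ambiguous: conflicting ID states on the same object
    conflicting = sum(1 for labels in object_labels.values() if len(labels) > 1)
    return known / n, correct / n, conflicting / n

truth = {"o1": "hostile", "o2": "friend", "o3": "friend"}
labels = {"o1": {"hostile"},            # complete, correct, clear
          "o2": {"unknown"},            # incomplete
          "o3": {"friend", "hostile"}}  # conflicting ID states (ambiguous)
cid, idc, ida = id_measures(labels, truth)
# completeness 2/3, correctness 1/3, ambiguity 1/3
```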
41 Commonality
- The air picture is common when the assigned tracks held by each participant have the same track number, position, and ID.
- The commonality CM(t) is measured across participants at each time t.
- The roll-up measure of commonality CM is a track weighted average across time.
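The commonality equation images were lost in conversion. One hedged reading of the prose counts assigned tracks reported consistently (same track number, position within tolerance, and same ID) by every participant, NC(t), against all assigned tracks NA(t); both symbols are assumptions:

```latex
% Hedged sketch only; see the SIAP Attributes Technical Report,
% Version 2.0, for the authoritative form.
CM(t) = \frac{N_C(t)}{N_A(t)},
\qquad
CM = \frac{\sum_k N_A(t_k)\, CM(t_k)}{\sum_k N_A(t_k)}
```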