1
ATML Test Description
  • January 2006

2
Overview
  • Format for exchanging the test description
    information defining test performance, test
    conditions, diagnostic requirements, and support
    equipment to locate, align, and verify proper
    operation of a UUT.
  • Purpose
  • Support the development of TPSs that will be used
    in an automatic test environment
  • Test Program generation
  • Test Requirement Document development and
    maintenance
  • Test Description analysis, etc.
  • Promote and facilitate interoperability between
    components of ATSs where UUT test requirement
    definitions need to be shared.
  • Ex. Rehosting test requirements between ATS
    platforms

3
Overview
  • Rehosting benefits
  • Current TPSs are implemented with tight coupling
    between components, and the components are
    typically developed specifically for that
    particular ATS architecture.
  • Once the test program is fielded, the
    requirements and strategies used to initially
    develop the TPS typically become obsolete.
  • As the ATS is replaced or achieves some level of
    obsolescence, it is typical to re-host the
    implementation of the TPS.
  • This is a more expensive and time-consuming task
    than implementing the UUT test requirements on the
    newer Automatic Test Equipment (ATE), or as part
    of instruments replaced within the existing ATE.

4
Overview
5
Overview...
  • UUT Description
  • UUT description by name and nomenclature

6
Overview...
  • Documentation
  • References to all Documents, Drawings, Diagrams,
    Parts Lists associated with the UUT and its
    sub-assemblies (used to verify UUT operation).

7
Overview...
  • General Data
  • All of the General Data that may be of use in
    developing test scenarios.

8
Overview...
  • Power Requirements

9
Overview...
  • Interface Definition

10
Overview...
  • Interface Definition (contd)

11
Overview...
  • Interface Requirements
  • The characteristics of equipment and circuitry
    required to test the UUT, excluding the Test
    Equipment (e.g. UUT Connector information).

12
Overview...
  • Interface Requirements / Electrical

Reference to Connector Definition
13
Overview...
  • Interface Requirements / Pin Function

14
Overview...
  • Interface Requirements / Mechanical, EO

15
Overview...
  • Performance Characteristics
  • Detailed description of the performance
    characteristics of the UUT.

16
Overview...
  • Detailed Test Information
  • Sufficient data for each UUT test to completely
    describe all required input conditions and
    measurements.

17
Overview...
  • Failure / Fault Data
  • UUT Design Fault Data (Faults and Failures)

18
Overview...
  • Failure

19
Overview...
  • Fault

Reference to Component
Reference to Component / Pin
20
Overview...
  • Components

21
Overview...
  • ATPG
  • All system-level information sufficient to
    identify any Automatic Test Program Generation
    (ATPG) tool(s) (e.g. LASAR)

22
Overview...
  • Detailed Test Information - Layers
  • 1. Test Groups
  • Subtypes
  • Sequences describe fault trees as a sequence of
    steps
  • Parallel,
  • Reasoner, ...
  • Call Tests and other Test Groups
  • 2. Tests
  • Describe stimuli, measurements, limits and
    behavior (ex. actions)
  • Tests that have common behavior may reference the
    same Test Template
  • 3. Test Templates
  • Contain data items common to multiple Tests
  • Optional (a structural sketch of these layers
    follows this list)
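
A minimal instance sketch of the three layers described above. The element and attribute names (DetailedTestInformation, TestTemplates, Tests, TestGroups, testTemplateID) are illustrative assumptions, not the normative ATML Test Description names:

  <DetailedTestInformation>
    <TestTemplates>
      <!-- Layer 3 (optional): data items shared by several Tests -->
      <TestTemplate ID="tt1" name="DCMeasurementTemplate"/>
    </TestTemplates>
    <Tests>
      <!-- Layer 2: stimuli, measurements, limits and behavior;
           both Tests reference the same template -->
      <Test ID="t1" name="MeasureVcc" testTemplateID="tt1"/>
      <Test ID="t2" name="MeasureVref" testTemplateID="tt1"/>
    </Tests>
    <TestGroups>
      <!-- Layer 1: a sequence (fault tree) that calls Tests
           and other Test Groups -->
      <TestGroup ID="tg1" name="PowerSupplySequence"/>
    </TestGroups>
  </DetailedTestInformation>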

23
Overview...
  • Test
  • Contents
  • Outcomes
  • Outcome is a special output used for sequencing
  • Parameters (inputs)
  • Test Results (outputs)
  • Conditions
  • Behavior
  • Sequence of Actions
  • User-defined
  • Free-form description
  • Ref. to Test Template (optional)

24
Overview...
  • Test Group
  • Contents
  • Outcomes
  • Parameters
  • Test Results
  • Conditions
  • Initialization / Termination
  • References to other Test Entities
  • Possible entry point (independently executable
    entity)

25
Overview...
  • Test Sequence
  • Specialization of Test Group
  • Additional contents
  • Steps
  • Reference to Test or a Test Group (ex. Sequence)
  • For each possible Outcome of the Test or Test
    Group
  • 1. Reference to the Next Step, plus Components to
    Adjust (optional), or
  • 2. Reference to a Sequence Outcome
  • Entry Points: reference to a Step (an illustrative
    Step fragment follows this list)
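
A sketch of a single Step showing the outcome-driven routing described above. The names Step, TestRef, Result, NextStepRef, AdjustComponent and SequenceOutcomeRef are assumed for illustration and may differ from the actual schema:

  <Step ID="s1">
    <TestRef testID="t1"/>
    <!-- one Result entry per possible Outcome of the referenced Test -->
    <Result outcome="Passed">
      <NextStepRef stepID="s2"/>
    </Result>
    <Result outcome="Failed">
      <!-- optional components to adjust -->
      <AdjustComponent componentID="c7"/>
      <SequenceOutcomeRef outcomeID="seqFailed"/>
    </Result>
  </Step>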

26
Schema Integration
  • Adjusted root schema (from Draft N) for style
    consistency
  • Streamlined the definition of UUT interface
  • Created a new type, Connection, to allow the
    enforcement of referential integrity; used in
    Parameters and in PowerRequirements (one possible
    enforcement mechanism is sketched after this list)
  • Enhanced the assignment of fault data to
    Outcomes, to better support the integration
  • Individual Faults can now be specified for Test
    Entity Outcomes (Draft 6 supported only faulty
    components)
  • Connected the two parts of the schema through
    references
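
Referential integrity of this kind can be enforced in XML Schema with key/keyref identity constraints. The sketch below only illustrates that general mechanism; the Connector/Connection element names and the connectorID attribute are assumptions, not the actual Connection type:

  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="TestDescription">
      <xs:complexType>
        <xs:sequence>
          <xs:element name="Connector" maxOccurs="unbounded">
            <xs:complexType>
              <xs:attribute name="ID" type="xs:string" use="required"/>
            </xs:complexType>
          </xs:element>
          <xs:element name="Connection" maxOccurs="unbounded">
            <xs:complexType>
              <xs:attribute name="connectorID" type="xs:string" use="required"/>
            </xs:complexType>
          </xs:element>
        </xs:sequence>
      </xs:complexType>
      <!-- every Connection/@connectorID must match a Connector/@ID -->
      <xs:key name="connectorKey">
        <xs:selector xpath="Connector"/>
        <xs:field xpath="@ID"/>
      </xs:key>
      <xs:keyref name="connectionToConnector" refer="connectorKey">
        <xs:selector xpath="Connection"/>
        <xs:field xpath="@connectorID"/>
      </xs:keyref>
    </xs:element>
  </xs:schema>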

27
Schema Integration...
  • Removed Fault Ambiguity Group data
  • It can be reconstructed from fault information
    already present in instance documents
  • Under UUT Power Requirements, replaced the type
    Characteristics from Common with a new, simpler
    type called Requirements.
  • Assigned a new type to the individual parameters
    of power sources, to allow the specification of
    max/min limits and tolerances for multiple
    parameters of the same power source (see the
    sketch after this list)
  • Note: Some elements are simply defined as strings
    (ex. SignalConditioning, Fixtures, etc.); these
    definitions may be extended or deleted when
    verifying consistency with the UUT Description
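
A sketch of per-parameter limits on a power source. All names here (PowerRequirements, PowerSource, Parameter and the nominal/min/max/tolerance attributes) are assumptions used only to illustrate the idea:

  <PowerRequirements>
    <PowerSource name="+5V_Main">
      <!-- each parameter of the same source carries its own limits -->
      <Parameter name="Voltage" unit="V" nominal="5.0" min="4.75" max="5.25"/>
      <Parameter name="Current" unit="A" max="2.0"/>
      <Parameter name="RippleVoltage" unit="mV" nominal="0" tolerance="50"/>
    </PowerSource>
  </PowerRequirements>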

28
Test Templates (Design Change)
  • During the last meeting a solution was proposed
    to simplify referencing of Test Templates by
    eliminating the Parameter elements residing under
    Test, when the parameters are inherited from the
    test template.
  • A similar solution cannot be implemented for Test
    Results and Outcomes. All Test Result elements
    must exist under Test for referencing purposes.
    Similarly, all Outcome elements must exist under
    TestGroup, for referencing purposes.
  • While potentially reducing the amount of data in
    instance documents, this solution would make the
    design inconsistent.
  • The design from Draft 6 includes a set of rules
    that cannot be expressed in the XML schema (for
    example, when a Test inherits from a Test
    Template, all Test Result, Parameter and Outcome
    elements must contain references to the
    corresponding Template elements).
  • The new solution simplifies the referencing of
    Test Templates by using the order of elements in
    collections

29
Test Templates (Design Change)...
  • Test Templates
  • Optional
  • Used to model commonality in Test functionality

30
Test Templates (Design Change)...
  • Overriding Test Template data in Test
  • When a Test inherits from a Test Template, it
    shall contain exactly one corresponding element
    for each of the Outcomes, Parameters and Test
    Results of the Test Template. The correspondence
    between Test elements and Test Template elements
    shall be determined by their order in the
    collections. For example, the second Parameter of
    the Test inherits from the second Parameter of
    the Test Template (see the sketch after this list).
  • When a Test inherits from a Test Template, all
    its Outcome elements shall have FromTestTemplate
    children. This rule models the fact that Outcomes
    inherited from the Test Template cannot be
    overridden.
  • When a Test inherits from a Test Template, its
    Behavior element shall have a FromTestTemplate
    child. This rule models the fact that Behavior
    inherited from the Test Template cannot be
    overridden.
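
A sketch of the ordering rule and the FromTestTemplate markers described above. The FromTestTemplate, Outcome, Parameter and Behavior names come from the slides; the wrapper elements and the testTemplateID reference attribute are assumptions:

  <TestTemplate ID="tt1">
    <Parameters>
      <Parameter name="lowLimit"/>   <!-- first -->
      <Parameter name="highLimit"/>  <!-- second -->
    </Parameters>
    <Outcomes>
      <Outcome name="Passed"/>
      <Outcome name="Failed"/>
    </Outcomes>
    <Behavior/>
  </TestTemplate>

  <Test ID="t1" testTemplateID="tt1">
    <Parameters>
      <!-- correspondence is by position: first to first, second to second -->
      <Parameter name="lowLimit"/>
      <Parameter name="highLimit"/>
    </Parameters>
    <Outcomes>
      <!-- inherited Outcomes cannot be overridden -->
      <Outcome><FromTestTemplate/></Outcome>
      <Outcome><FromTestTemplate/></Outcome>
    </Outcomes>
    <!-- inherited Behavior cannot be overridden -->
    <Behavior><FromTestTemplate/></Behavior>
  </Test>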

31
Description of Measured Values (Design Change)
  • The original solution was not extensible
  • The set of data types was represented by an
    enumeration simple type, which is not extensible.
  • The representation of default values for
    collections and arrays was weak: consistency
    between the descriptive part and the default
    value part had to be maintained by the producer.
  • New solution
  • Changed former Measurement type into
    ValueDescription type, similar to cValue.
  • Matching data types for the child elements.
  • The data types for the terminal elements
    (doubleDescription, doubleArrayDescription, etc.)
    are similar to the types from Common, but instead
    of a required value they have an optional default
    value (see the sketch after this list).
  • Advantages
  • Ensures extensibility
  • Offers a better mechanism for representing
    default values
  • Is more consistent with Value associated types
    from Common
  • May be included in the Common schema
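
A sketch of what such a descriptive terminal type could look like in XSD. Only the optional-default idea and the type names doubleDescription/doubleArrayDescription come from the slide; the unit and defaultValue attributes and the exact structure are assumptions:

  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <!-- descriptive counterpart of a double value: the value itself
         is an optional default rather than required content -->
    <xs:complexType name="doubleDescription">
      <xs:attribute name="unit" type="xs:string"/>
      <xs:attribute name="defaultValue" type="xs:double" use="optional"/>
    </xs:complexType>
    <!-- array variant: zero or more default items -->
    <xs:complexType name="doubleArrayDescription">
      <xs:sequence>
        <xs:element name="DefaultItem" type="xs:double"
                    minOccurs="0" maxOccurs="unbounded"/>
      </xs:sequence>
      <xs:attribute name="unit" type="xs:string"/>
    </xs:complexType>
  </xs:schema>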

32
Description of Measured Values (Design Change)...
33
Description of Measured Values (Design Change)...
34
Description of Measured Values (Design Change)...
35
Default Value for Signal Measurement (Design
Change)
  • Added Default Value
  • Applies to the measured signal attribute
  • The ATML type must be consistent with the type of
    the measured attribute
  • To Do: Define a mapping from IEEE 1641 signal
    attribute types to ATML data types

36
Names of Parameters and Test Results (Design
Change)
  • The name attribute of Parameter/Local is now
    required
  • For consistency, the name attribute of
    TestResult/Local is also required
  • Q: Add uniqueness constraints (name to be unique
    within a Test Entity / Test Template)?
  • Note that ID already provides a unique identifier
  • However, having two parameters with the same name
    may be confusing... (an illustrative constraint is
    sketched after this list)
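
If such a constraint were adopted, xs:unique is the standard XML Schema mechanism. The sketch below is illustrative only and assumes a simplified Test element with local Parameter children:

  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="Test">
      <xs:complexType>
        <xs:sequence>
          <xs:element name="Parameter" maxOccurs="unbounded">
            <xs:complexType>
              <xs:attribute name="name" type="xs:string" use="required"/>
            </xs:complexType>
          </xs:element>
        </xs:sequence>
      </xs:complexType>
      <!-- Parameter names must be unique within one Test -->
      <xs:unique name="uniqueParameterName">
        <xs:selector xpath="Parameter"/>
        <xs:field xpath="@name"/>
      </xs:unique>
    </xs:element>
  </xs:schema>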

37
Conditions (Design Change)
  • Removed Exit Conditions
  • In AI-ESTATE, the intent of preconditions is to
    override the optimization process.
  • I interpreted this as verification (off-line or
    at run-time), and not automatic execution of a
    procedure that would make the condition happen.
  • This is consistent with use cases from NI, TYX
    and Indra
  • Without automatic execution, the concept of Exit
    Conditions is no longer applicable.
  • Abandoned Setup Actions
  • During the last meeting a solution was proposed
    to assign Setup Actions to values of state
    variables. The Setup Actions would be
    automatically executed to set the state variable.
  • The solution was proposed in response to the
    question "what should the application executive
    do if more than one Test or Test Group can make a
    precondition true?". As automatic execution is no
    longer envisioned, this is no longer a problem
  • Preserved Post Conditions
  • As a side effect of the above solution, Tests are
    no longer required to change state variables. But
    this situation may occur in applications. Thus, I
    believe Post Conditions associated with Tests are
    still necessary. This is consistent with
    "effects" in AI-ESTATE (an illustrative condition
    fragment follows this list).
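
A sketch of pre- and post-conditions over state variables attached to a Test, illustrating the verified-not-executed interpretation above. Every name here (Preconditions, PostConditions, Condition, stateVariable, value) is an assumption:

  <Test ID="t1" name="MeasureVcc">
    <Preconditions>
      <!-- verified (off-line or at run time), never executed automatically -->
      <Condition stateVariable="UUTPower" value="ON"/>
    </Preconditions>
    <PostConditions>
      <!-- the effect this Test has on the UUT state -->
      <Condition stateVariable="UUTPower" value="OFF"/>
    </PostConditions>
  </Test>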

38
Conditions (Design Change)...
  • Q: Rename Post Conditions to Effects?
  • For consistency with AI-ESTATE
  • Q: Use predicates instead of state variables?
  • The two approaches are equivalent.
  • Predicates are consistent with AI-ESTATE.
  • Not clear how predicates are used in AI-ESTATE.
  • Effects imply that predicates become true.
  • In applications, Tests could cause predicates to
    become false (ex. "Power is ON" becomes false
    when power is turned off)

39
Specification of Fault/Failure Data (Design
Enhancement)
  • Fault/failure data can now be specified for any
    Test Entity in the fault tree
  • In Draft 6, it could be specified only for
    terminal Test Entities.
  • This ensures consistency with AI-ESTATE.

Applicable to any Test or Test Group
40
Other Design Changes
  • Multiple Test Results can be referenced as the
    source of a Parameter.
  • TestGroup_Unspecified and TestGroup_Parallel can
    now reference Test Entities (Tests and Test
    Groups). Before, they could only reference Tests.
  • Initialization and Termination can now reference
    a Test Entity (Test or Test Group). Before they
    could only reference a Test Group.

41
Open Issues
  • Associate Post-Conditions with Test Outcomes
  • During the last meeting a solution was proposed
    to associate post-conditions with specific
    Outcomes of Tests and Test Groups. This was in
    response to the question "what does it mean for
    post-conditions if a Test returns a Failed
    outcome?".
  • There are two problems with this solution
  • In AI-ESTATE effects do not depend on the
    Outcome.
  • It is likely that a Test would change the UUT
    state in the same manner, regardless of the
    Passed/Failed Outcome it returns.
  • The Outcome is typically calculated based on a
    measurement, while the state change is the result
    of stimulus operations, which typically occur
    before the measurement.
  • Proposed solution
  • Keep the current design: post-conditions are
    associated with Test Entities
  • Add rule: If a Test Entity returns an Aborted
    outcome, the values of the state variables
    referenced by its post-conditions shall be
    considered unknown. Pre-conditions relative to
    those state variables shall be considered not
    fulfilled.
  • In general, it is impossible to know whether the
    Test failed before or after changing the UUT
    state.

42
Open Issues...
  • Q: Do adjust actions apply to terminal steps?
  • In the current design, they do not.
  • To support the feature, move AdjustComponent one
    level up, under Result.

43
Open Issues...
  • Q: Should we support parametrized Test Group
    calls?
  • The current design does not
  • Test is a call
  • Test Group is a definition
  • Possible solutions
  • 1. Create a Test Group Call element (sketched
    after this list).
  • A sequence Step would reference this element,
    which in turn would reference a Test Group.
  • Associate parametric data and fault data to the
    Test Group Call element.
  • Two Test Group Calls could reference the same
    Test Group, but two Steps cannot reference the
    same Test Group Call.
  • 2. Enable a Test to be a Test Group call (in
    addition to the currently supported behaviors).
  • In this case, parametric data and fault data
    would be associated to the Test (this is
    currently supported), but not to the Test Group.
  • For parametric data associated with a Test Group,
    we may support a description (i.e., data type and
    unit, but not value).
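
A sketch of option 1. All names (TestGroupCall, TestGroupCallRef, testGroupID, callID, FaultData, ComponentRef) are hypothetical and only illustrate the indirection described above:

  <TestGroupCall ID="call1" testGroupID="tg1">
    <!-- parametric data and fault data live on the call, not on the Test Group -->
    <Parameters>
      <Parameter name="lowLimit" value="4.75"/>
    </Parameters>
    <FaultData>
      <ComponentRef componentID="c7"/>
    </FaultData>
  </TestGroupCall>

  <Step ID="s3">
    <!-- a sequence Step references the call, which in turn references the Test Group -->
    <TestGroupCallRef callID="call1"/>
  </Step>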

44
Open Issues...
  • Q: Do we want to be able to define Conditions for
    Test Templates and inherit them in Tests?
  • This may make sense, as the common functionality
    captured by the Test Template is likely to
    include the effect on UUT state.
  • Not supported now.
  • Q: Add support for variables?
  • Many test executives support variables. Not
    supporting them in Test Description would limit
    the exchange of Test Description data between
    these test executives.
  • On the other hand, the significant differences
    between the sequencing models of test executives
    impose serious limitations on data exchange.
  • Q: DiagTestDescription/EntryPoints/EntryPoint is
    a reference to TestEntity, which can be a Test
    Group (ex. Sequence) or a Test. This means that
    Tests could be executed individually. Is this a
    good idea?

45
Indra Issues
  • Why is the AdjustComponent element under
    TestGroupSequence and not under Outcome or
    TestEntity? Would it not be the same case as
    "ReplaceComponent"?
  • In typical DTIs there is an "Adjust" field at
    the same level as the "Replace" field.
  • Current design
  • Replace Components apply to all types of Test
    Entities, to support static fault trees, as well
    as reasoners.
  • Adjust Components apply only to steps of
    sequences, to support modeling of TRDs
  • May be changed.
  • Q: Where to move AdjustComponent?

46
Indra Issues...
  • Don't allow the "ReplaceComponents" element at
    Test Group level
  • Fault data should be assigned to a particular
    call of the Test Group, rather than to the Test
    Group itself
  • This is similar to the problem described before
    for Parameters

47
Common.xsd Issues
  • The set of units is not extensible. This may
    cause problems when reverse engineering existing
    TPSs, where incorrect or non-standard units are
    used.
  • Suggestion: design an extensible solution (one
    possible approach is sketched after this list).
  • Collection has a global unit (applies to all
    items) and also a local unit.
  • The global unit is not necessary (the intent of
    Collection is to represent a set of heterogeneous
    items) and may be confusing (what does it mean if
    both are present and different?).
  • Suggestion: remove the unit attribute of
    Collection. Also, adjust CollectionDescription
    for consistency.
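
One possible extensible design, shown only as a sketch: keep the standard units in an enumeration, but union it with xs:string so that legacy, non-standard unit strings still validate. The type names StandardUnit and Unit are assumptions:

  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <!-- standard units remain enumerated -->
    <xs:simpleType name="StandardUnit">
      <xs:restriction base="xs:string">
        <xs:enumeration value="V"/>
        <xs:enumeration value="A"/>
        <xs:enumeration value="Hz"/>
      </xs:restriction>
    </xs:simpleType>
    <!-- the union accepts any string, so reverse-engineered TPSs with
         incorrect or non-standard units do not fail validation -->
    <xs:simpleType name="Unit">
      <xs:union memberTypes="StandardUnit xs:string"/>
    </xs:simpleType>
  </xs:schema>

The trade-off is that the union no longer rejects misspelled standard units; an alternative would be a separate attribute for non-standard unit strings.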

48
STDTSFLib.xsd Issues
  • Removed attributes abstract="false" from
    elements. The original schema generates a
    validation error in XML Spy (the .NET parser
    accepts it)
  • Note: Removing this attribute does not change the
    operation of the schema. It may be accommodated
    in the stylesheet that converts XML to XSD (a
    before/after illustration follows this list).
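
A hypothetical before/after illustration (the StdSignal element and StdSignalType names are invented). Because the XSD default for abstract is false, dropping the explicit attribute leaves the declaration's meaning unchanged:

  <!-- before: explicit abstract="false" (the XSD default value) -->
  <xs:element name="StdSignal" type="StdSignalType" abstract="false"/>

  <!-- after: attribute removed; abstract still defaults to false -->
  <xs:element name="StdSignal" type="StdSignalType"/>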

49
STDTSFLib.xsd Issues...
  • Replaced the character µ with u. The original
    schema generated a validation error with the .NET
    parser (XML Spy does not complain)
  • Note: This substitution does not change the
    operation of the schema. It may be accommodated
    in the stylesheet that converts XML to XSD.

50
STDTSFLib.xsd Issues...
  • Temporarily removed limits for values of
    attributes derived from Physical. The original
    schema generates validation errors in both XML
    Spy and the .NET parser
  • Note: Removing these limits eliminates a useful
    validation feature. We should find a different
    way of expressing the constraints.

51
STDTSFLib.xsd Issues...
  • The sample StdSignals.xml does not validate with
    XML Spy (2006 SP2) but validates with the .NET
    parser. An IDREFS attribute generates the
    validation error (a hypothetical fragment of the
    same shape follows this list)
  • Note: I believe this worked with an earlier
    version of XML Spy...
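
The fragment below is not the actual StdSignals.xml content; it only illustrates the general shape of an IDREFS attribute, i.e. a whitespace-separated list of IDs declared elsewhere in the same document:

  <StdSignals>
    <Signal ID="sig1"/>
    <Signal ID="sig2"/>
    <!-- memberSignals would be declared with type xs:IDREFS in the schema -->
    <SignalGroup memberSignals="sig1 sig2"/>
  </StdSignals>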