Transcript and Presenter's Notes

Title: DHS Proposal


1
SAIL Public/Private Partnership Prospectus
To provide IT decision makers with timely and
fact-based information to solidify IT capital
investments
Discussion Briefing on the DHS Solution Architecture
Integration Lab (SAIL Pilot)
Jonathan Houk, SAIL PM, DHS CIO Staff, Jonathan.houk@dhs.gov
John Weiler, E.D., john@ICHnet.org, www.ICHnet.org/sail.htm, 703-768-0400
August 9, 2004
This document is confidential and is intended
solely for the use and information of the client
to whom it is addressed.
2
The impetus for establishing an IT public/private
partnership is overwhelming, and DHS and ICH are
leading the way!
  • "(IT Leadership) shall develop ... a process
    for analyzing, tracking, and evaluating the risks
    and results of all major capital investments made
    by an executive agency for information systems ...
    and shall include explicit criteria for
    analyzing the projected and actual costs,
    benefits, and risks associated with the
    investments." Clinger-Cohen Act
  • "Most (PMs) are struggling with the complexity
    and a few have failed miserably. The complexities
    are numerous and less than obvious. ...
    Requirements must flow into an architecture that
    can truly exploit the advantages of COTS.
    Contractors must shift from 'design and build
    unique product' to 'buy and integrate standard
    products.' (Everyone) freely admits that they have
    made every mistake imaginable along the way.
    Unfortunately, others can't imagine the mistakes
    they are about to make." AF Scientific Advisory
    Board Report (April 2000)
  • "Current architectural initiatives (C4ISR, JTA)
    don't quite fit the E-Business problem space.
    Rate of technology change exacerbates the
    problem ... the lack of an easily understood
    process modeling technique is a root cause for
    the lack of participation of the general user in
    the definition of business processes." ECCWG (a
    partnership between OSD and IAC, AFCEA, ICH,
    NDIA), 2000 report to the DEPSECDEF
  • "... the concept of the Interoperability
    Clearinghouse is sound and vital. Its developing
    role as an honest broker of all interoperability
    technologies, no matter what the source, is
    especially needed. Such efforts should be
    supported by any organization that wants to stop
    putting all of its money into maintaining archaic
    software and obtuse data formats, and instead
    start focusing on bottom-line issues of
    productivity and cost-effective use of
    information technology." Assessment by DoD's
    leading FFRDC (June 2000)
  • "Since the value of the ICH to our programs
    increases rapidly through results sharing, we
    encourage the defense community and IT industry
    to participate directly in the public service
    initiative in terms of sponsorship and lessons
    learned." Office of the Secretary of Defense, DCIO
    (Jan 2000)
  • "A neutral party should facilitate reaching
    consensus on the many areas necessary for
    adoption of component-based architectures. Rather
    than attempting to achieve global consensus (i.e.,
    standards), it should be developed among a few
    motivated agencies and the results provided to
    other agencies to adopt as they are able." IAC
    EA SIG Recommendations to OMB and the CIO Council
    (2003)

3
Meeting Agenda
  • What is SAIL?
  • What problem is it solving?
  • How does it solve the problem?
  • Why is it a better mousetrap?
  • What does a SAIL-enabled engagement look like?

"A neutral party should facilitate reaching
consensus on the many areas necessary for
adoption of component-based architectures. Rather
than attempting to achieve global consensus (i.e.,
standards), it should be developed among a few
motivated agencies and the results provided to
other agencies to adopt as they are able." IAC
EA SIG Recommendations to OMB and the CIO Council
4
  • What is SAIL?

A research and validation collaboratory
(public/private partnership) that provides IT
decision makers with evidence-based comparative
data (analysis of alternatives) for selecting
commercial IT solutions
"... the concept of the Interoperability
Clearinghouse is sound and vital. Its developing
role as an honest broker of all interoperability
technologies, no matter what the source, is
especially needed. Such efforts should be
supported by any organization that wants to stop
putting all of its money into maintaining archaic
software and obtuse data formats, and instead
start focusing on bottom-line issues of
productivity and cost-effective use of
information technology." Assessment by leading
FFRDC, 2000
5
SAIL provides an in-context architecture view of
commercial IT solutions that is pre-validated,
vetted, and evidence-based
  • Utilizes a proven standards-body-based vetting
    process for evaluation frameworks, evaluation
    templates, and evaluation outcomes
  • Provides an open forum for exchanging best
    practices
  • Leverages an industry-wide knowledge base of
    domain expertise
  • Organized as a not-for-profit public/private
    partnership with processes for eliminating
    conflict-of-interest concerns
  • Provides standardized evaluation frameworks and
    templates that eliminate the obfuscation of facts
    that exists in the industry

6
Current IT research and validation mechanisms are
ineffective, due to the complexity of the
fast-paced IT market
  • No links between EA artifacts and commercial IT
    offerings
  • Current sources and methods are not producing
    actionable decision models (GAO)
  • No formal mechanism for deriving selection
    criteria from the Business Reference Model (IAC EA
    SIG)
  • No body of knowledge from which PMs can judge
    competing offerings (ECCWG Report)
  • No mechanisms for assessing risks, composability,
    or interoperability of commercial solutions (AF
    SAB Report)
  • Current contracting processes result in 80%
    failure rates of major IT programs! (GAO, IDG)

PMs feel overwhelmed by offerings, ill-equipped to
evaluate, outpaced by the market, unsure whether
they are meeting best practices, and conflicted.

7
Removing the Obfuscations: Providing a clear and
unambiguous view of what is possible
  • For a variety of reasons, timely and reliable
    information on technology form, fit, and function
    is unavailable in an architectural context
  • Vendors are struggling just as hard as IT
    decision makers to keep up with the changes
    dictated by rapidly changing market forces
  • It is too hard and time-consuming for IT decision
    makers to assess the market capability and best
    practices required to make informed decisions
    about solution alternatives
  • The procurement process was not designed to
    support the IT planning and architecture process.
    It does not provide access to the wide range of
    knowledge sources needed to get through the
    discovery process

"Select, recommend, plan, guide, and assist
initiative teams in the deployment of
technologies that are proven, stable,
interoperable, portable, secure, and scalable.
Facilitate the migration and transition of
E-Government initiatives from legacy and
'inward-driven' architectures, to architectures
that embrace component-driven methodologies and
technology reuse." OMB FEA-PMO objective
8
How does SAIL work? A true public/private
partnership working towards common goals
  • At its core, SAIL is the ICHnet.org Architecture
    Assurance Method (TM) for vetting commercial
    solution offerings in an open and conflict-free
    environment
  • Evaluation efforts can be sponsored either by
    agencies, to assess alternative commercial
    offerings that solve a particular solution domain
    need, OR by vendors who would like to see their
    product evaluated against the ICHnet.org
    Assessment Framework and Common Criteria
  • All Solution Architecture Frameworks are derived
    from implementation best practices and are built
    around real-world problems to ensure the
    relevance of evaluation criteria to
    context-specific decision-making requirements,
    not an abstract academic exercise
  • SAIL's shared knowledge repository provides a
    mechanism for architecture re-use
  • The Virtual Lab concept leverages existing
    testing, facilities, and member organization
    experts
  • A shared-cost, fee-for-service model provides the
    lowest overall cost and best value to all partners

9
An adaptive IT research and validation consortium
that assures the leveraging of industry best
practices and lessons learned
Facilitate the Discovery
What's the problem domain space? What are the
requirements? What are the drivers? What are the
organizational constraints?
Define Potential Solution Sets
What is the architectural framework of
the solution? What are the relevant factors for
evaluation? What are the implementation and
life-cycle risk factors that need to be considered?
Analysis of Alternatives
What is the comparative landscape? What evidence
is there to support vendor claims? What direction
is the market going in? How well are the vendor
and product positioned?
EDS Business Case Analysis, 2001: "The leveraging
of our efforts with other parties through the
formulation of a non-profit consortium is the
most cost-effective and efficient way of
achieving the goal of interoperability assurance
among heterogeneous systems. This ICH capability
will augment our capability and provide us much
more information about products, standards, and
viable enterprise solution sets than we could
ever realize through our own internal efforts."
10
SAIL info exchange focuses on propagating the SRM
to align agency business needs with technical
solutions based on key metrics
[Diagram: traditional top-down EA reference models (BRM) carry
business-driver and performance metrics for core business/mission
objectives, user/integrator best practices, business processes and
infrastructure, security profiles, and effectiveness/efficiency.
These feed the SAIL Solution Lexicon of service-component metrics
(SRM), spanning application service-component layer 1 through
infrastructure service-component layer N, which aligns the SAIL
Solution Frameworks with business needs. Common Criteria and vendor
solution templates then supply technical solution metrics
(interoperability, fit, finish) across application layer 1 through
common infrastructure layer M, yielding secure solutions.]
11
Strength of Evidence Risk Metrics: the collaborative
solution vetting process significantly increases
confidence
[Chart: the pre-acquisition confidence level rises from 25 to 50 to
85 as evidence sources strengthen from bi-directional vendor claims,
through functional/conformance testing, to implementation successes
and integration testing. The gap between the ICH strength-of-evidence
curve and a time-and-materials baseline represents avoidable risk in
the pre-acquisition success factor.]
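
A minimal sketch of how the chart's tiers could be encoded, assuming
confidence is simply a function of the strongest evidence tier
reached; the tier names and the 25/50/85 levels come from the chart,
everything else is illustrative:

    # Hypothetical encoding of the strength-of-evidence tiers above.
    EVIDENCE_CONFIDENCE = {
        "bi-directional vendor claim": 25,
        "functional/conformance testing": 50,
        "implementation successes + integration testing": 85,
    }

    def confidence(strongest_tier):
        """Pre-acquisition confidence for the strongest evidence tier reached."""
        return EVIDENCE_CONFIDENCE[strongest_tier]

    print(confidence("functional/conformance testing"))  # 50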
12
SAIL's accelerated process provides DHS
executives with timely access to the information
they need to make sound decisions

Problem Identification
  • What's the problem we're trying to solve? Has
    it been solved before?
  • What are the metrics we're trying to
    accomplish? Do they already exist?
  • What impact will this have on citizens,
    businesses, and stakeholders?

Process and Requirements
  • Creation of common service component templates
  • Identification of as-is and target processes
    and infrastructure
  • Definition of critical success factors,
    business, and technical requirements

Pattern Creation and Component Alignment
(System Integrators: domain expertise)
  • Creation of usage patterns / use cases
  • Creation of business, technical, and
    infrastructure patterns
  • Identification of supporting components /
    products; capture of evidence of compliance

Criteria and Component Assessment
(Accelerated Assessment Process, AAP)
  • Creation of common criteria assessment
    definitions / factors
  • Creation of weighting algorithms (i.e., risk,
    cost, benefit, mission); a scoring sketch follows
    at the end of this slide
  • Evidence-based product assessment (using
    criteria definitions)

Component Pattern Recognition
(Solution Architecture Integration Lab (SAIL):
component / product assessment, objectivity;
Product Providers: evidence assessment)
  • Down-selection of product(s) based on clearly
    defined business patterns
  • Creation of notional end state
  • Identification of linkages and potential problem
    points

Publishing
  • Publishing of assessment criteria
  • Hand-over to client; perform weighting
    assessment
  • Development of alternatives (product suites)
    and associated services

Prototype
  • Prototype selection of products (prior to
    procurement)
  • Prove results; create evidence
  • Engage in procurement process
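
A minimal sketch of the kind of weighting algorithm the assessment
step describes, assuming a simple weighted-sum model; the criteria
names (risk, cost, benefit, mission) come from the slide, while the
weights, scores, and product names are invented for illustration:

    # Hypothetical weighted-criteria scoring for an analysis of alternatives.
    CRITERIA_WEIGHTS = {"risk": 0.30, "cost": 0.25, "benefit": 0.25, "mission": 0.20}

    def weighted_score(scores):
        """Combine per-criterion scores (0-10) into one weighted total."""
        return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

    alternatives = {  # evidence-based scores would come from SAIL assessments
        "Product A": {"risk": 7, "cost": 5, "benefit": 8, "mission": 9},
        "Product B": {"risk": 4, "cost": 8, "benefit": 6, "mission": 7},
    }

    # Rank the alternatives by weighted score, highest first.
    for name in sorted(alternatives, key=lambda n: -weighted_score(alternatives[n])):
        print(name, round(weighted_score(alternatives[name]), 2))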
13
Value Prop: Information sharing and collaboration
reduce the time, cost, and risk of redundant IT
research and validation efforts
[Chart: confidence level (low to high) plotted against lifecycle
phases (strategy, discovery, architecture validation, acquisition,
implementation) for Projects A, B, and C relative to an acceptable
risk level; redundant market research and testing leaves a risk
delta and inconclusive findings, while SAIL collaborative research
and validation reduces the validation resources (cost, timeline)
required.]
The E-Gov Act requires agency CIOs to identify and
pursue investments that could be shared across
the agency or with agencies to address similar
missions and constituencies.
14
Why the S.A.I.L. public/private partnership is the
right choice
  • SAIL leverages collaborative methods and tools
    already proven in both government and industry
    (DARPA, GSA, Discovery, PRC, OSD Health Affairs,
    CCIA)
  • Current pre-acquisition R&D approaches are
    expensive, do not tap existing lessons learned,
    and do not produce desired outcomes
  • Trade associations provide vendor access, but no
    reliable solution information
  • SAIL's ability to tap directly into knowledge
    sources (best practices and lessons learned)
    greatly exceeds any single vendor's greenfield
    research efforts
  • Legislation requires new processes to leverage
    best practices and commercial offerings
  • SAIL is a low-cost, low-risk, high-return
    alternative to familiar failure patterns

15
Target private/public partnership members that
facilitate industry-wide knowledge development
and sharing

Government/Industry Users
  • Dept of Homeland Security
  • US Customs
  • TSA
  • Homeland Security Advisory Council
  • DOT/FAA
  • OSD HD
  • NRO
  • OMB/GAO FMS
  • Navy PEO IT/FORCENET
  • DOJ

Vendors & VARs
  • Dell
  • Microsoft
  • Sun Microsystems
  • BEA Systems
  • Intel Corp.
  • Novell
  • BroadVision
  • CISCO Systems

SAIL Value Chain Partners working towards
common goals:
  • Greater planning effectiveness
  • Common EA nomenclature
  • Common solution models
  • Shared best practices
  • Shared testing results
  • Shared expertise
  • Adaptive resource pool

Non-Profits
  • Aerospace Corp
  • IDA.org
  • Battelle
  • Financial Services Technology Consortium
  • Harvard School of E-Business
  • George Mason University
  • GW University
  • ISSA.org
  • Professional Services Council
  • Object Management Group
  • Components.org
  • Computer & Communications Industry Assoc.

Consultants & Solution Integrators
  • Computer Sciences Corp.
  • Lockheed Martin
  • Titan Corporation
  • Harris Corp
  • SRA
  • Northrop Grumman
  • Securities Industry Automation Corporation (NYSE)
  • CGI/AMS

16
S.A.I.L.'s 3-phase research and validation
process leverages best practices and
implementation/testing results while managing
risk/OCI
[Process flow diagram:
Phase 1, SOA Validation: agency PMs (users) supply business
requirements, policy, and guidance; value chain analysis and
business patterns, combined with business reference models and the
industry best practices and lessons learned of integrators and
consultants, yield common business models.
Phase 2, Best Practices Alignment: integration lessons learned and
the solutions knowledge base feed solution patterns and
solution-component common criteria. If a solution already exists,
down-select among solution alternatives; if not, model a new
solution from MDA service components supplied by vendors.
Phase 3, Analysis of Alternatives: testing data and evidence are
normalized into solution frameworks (MDA service components), and
solution architectures are validated and demonstrated. When the AoA
is complete, the A11-300 solution architecture is validated and the
outputs are validated business cases and solution acquisition
models.]
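
A minimal control-flow sketch of the three-phase process in the
diagram, with every step stubbed out; the function names and return
values are hypothetical placeholders, not SAIL deliverables:

    # Hypothetical driver mirroring the diagram's branches; all steps are stubs.
    def phase1_soa_validation(requirements):
        """Value chain analysis + best practices -> common business models."""
        return ["common business model"]

    def solution_exists(models):
        return True  # knowledge-base hit on existing solution patterns

    def down_select(models):
        return ["Candidate A", "Candidate B"]  # Phase 2: existing solutions

    def model_new_solution(models):
        return ["new MDA service-component solution"]

    def validate_and_demonstrate(candidates):
        """Phase 3: normalize frameworks, validate against testing evidence."""
        return {"validated_architectures": candidates, "aoa_complete": True}

    models = phase1_soa_validation("business requirements, policy, guidance")
    candidates = (down_select(models) if solution_exists(models)
                  else model_new_solution(models))
    print(validate_and_demonstrate(candidates))  # -> validated business cases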
17
BACKUP SLIDES: Validation Process Flows,
Past Performance
18
Case Study: Most Innovative Media Company
(Discovery Channel)
Challenge: select an enterprise web infrastructure
to integrate stovepipe applications
  • Applied the ICH Solutions Validation Program
  • Performed architecture baseline assessment
  • Provided guidance and selection support for
    web-app server, VPN, portal, and last-mile
    wireless connectivity
Outcomes
  • Validated requirements against marketplace
    offerings
  • Improved confidence in technology decisions
  • Delayed VPN implementation
  • Purchased web application server, database, and
    media products
  • Deployed system without a hitch
  • Significantly reduced time/cost to implementation

19
Case Study: World's Largest Healthcare Project
($4.1 billion government-wide e-healthcare program)
Challenge: develop an enterprise architecture for
patient record integration
  • Applied the ICH Architecture Immersion Program
  • Developed architecture validation criteria for
    the GCPR Program Office
  • Developed product selection guidelines for the
    prime contractor
  • Applied the ICH Architecture Assurance Method
Outcomes
  • Enabled award based on unambiguous design specs
  • Augmented UML/MDA to address legacy and COTS
    capabilities
  • Ensured viability of chosen technologies
  • Met HIPAA requirements
  • Met security requirements
  • Provided integration framework for web
    infrastructure
  • Assured implementation success

"The ICH repository data and analysis
methodologies were very helpful in supporting a
quick turnaround for the Information Assurance
section on COTS security products. Highly
detailed ICH technology domain and product
evaluation data comprised over 60% of this
urgently needed architecture report." GCPR
Program Manager, Northrop Grumman/PRC
20
Case Study: World's Largest Research Agency
(DARPA), ICH Distributed Component-based
Architecture Modeling Project
Challenge: model and simulate information
assurance and interoperability attributes
  • Developed a method for representing EA heuristics
    and used AI tools to model solutions
  • Developed common criteria to represent each major
    IT class (DB, OS, middleware)
  • Created profiles and templates that can represent
    the complexity of the market
  • Applied the Java Expert System Shell (JESS) to
    build a rules-based AI tool (see the sketch below)
  • Took existing system specifications from DII COE
    to make decisions
Outcomes
  • Developed an approach for modeling complex
    solutions based on a rules-based engine
  • Developed a means of representing security
    attributes in a solution architecture
  • Developed a tool that could do "what if" analysis
    on the fly based on partial and incomplete data
  • Demonstrated how component-based architectures
    could be applied to improve the richness of our
    planning and EA processes as a means of managing
    risk
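
A minimal sketch of this style of rule-based "what if" analysis,
tolerant of partial data; the original project used JESS (a Java
rule engine), and the components, attributes, and rules below are
invented for illustration:

    # Hypothetical rule-based interoperability check over partial facts.
    facts = {
        "middleware": {"protocol": "CORBA", "encryption": None},  # incomplete data
        "database":   {"protocol": "CORBA", "encryption": "TLS"},
    }

    def check_interoperability(a, b):
        """Fire simple rules; unknown attributes yield 'indeterminate', not failure."""
        fa, fb = facts[a], facts[b]
        if fa["protocol"] != fb["protocol"]:
            return "incompatible: protocol mismatch"
        if None in (fa["encryption"], fb["encryption"]):
            return "indeterminate: encryption attributes incomplete"
        return "compatible"

    print(check_interoperability("middleware", "database"))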

"The unique work you've done to frame issues
about interoperability effects on the IA of systems
is particularly encouraging. I believe it can
contribute to work in the field of IA
metrics and composition ongoing at DARPA and
other agencies. I was pleasantly surprised at
the effective and efficient use of funding I was
able to invest in this effort, as well as your
ability to leverage other funding sources. You
guys came through!" DARPA PM regarding the ICH
Distributed Component-based Architecture Modeling
Program (DCAM)
21
Case Study: World's Largest Intelligence Agency
Challenge: a means of integrating diverse
communities via the web
  • Applied the Architecture Validation Program
  • Developed common criteria for the emerging portal
    market
  • Evaluated the selection of an enterprise portal
    for a pilot project
  • Developed impact analysis on the enterprise
    architecture
  • Maintained a view of the evolving marketplace
Outcomes
  • Enhanced and normalized portal selection criteria
  • Identified key features/functional areas for
    testing
  • Applied commercial best practices for successful
    production rollout
  • Improved understanding and alignment of
    technology to the problem domain

22
Case Study: GSA Financial Systems Enterprise
Architecture
Challenge: make EA actionable and eliminate
redundant financial management systems
  • Applied Value Chain Analysis
  • Developed metrics for FMS implementation success
  • Evaluated current EA products
  • Developed a Value Chain assessment model
  • Moved the EA effort into the CFO office
Outcomes
  • Enhanced and normalized existing EA products
  • Identified key business processes required for
    implementation
  • Enabled senior management to interact with the EA
    process for the first time
  • Helped GSA go from "red" to "green" based on the
    Value Chain effort
  • Identified over $100M in potential savings via the
    ICH Value Chain Approach

23
Value Chain Benefit: Agency Leadership, Congress,
GAO & OMB
  • Provides a view into the planning process
  • Establishes metrics for mission fulfillment
  • Aligns IT capital investment with agency mission
  • Leverages industry best practices/lessons learned
  • Creates understandable, actionable project
    transition plans
  • Catalyst for implementing the President's
    Management Agenda

[Diagram: policies and compliance flow between Congress, agency
leadership, COOs, CFOs, and auditors.]
24
Value Chain Benefit: CIO, CTO, Chief Architects,
Chief Security Officer
[Diagram: IT management (CIO/CTO/CAO) delivers IT program plans and
AoAs with resolved COI to Congress, agency leadership, COOs, CFOs,
and auditors.]
  • Identifies viable COTS solution frameworks
    quickly
  • Models architectures and inferences to identify
    linkages with strong and weak track records
  • Saves and shares models using common architecture
    terms (MDA)
  • Provides "what if" modeling and analysis
  • Provides in-context, real-time, just-in-time
    research data
  • Use of SAIL feeds data back to the knowledge base
25
Value Chain Benefit: Solution Integrators &
Consultants
[Diagram: integrators exchange re-usable blueprints and integration
expertise with IT management (CIO/CTO/CAO) and agency leadership.]
  • Provides solid evidence for making COTS decisions
  • Shares integration success record with potential
    new customers (who are looking!)
  • Finds potential COTS component matches based on
    prior similar-context successes
  • Provides "what if" modeling and analysis for
    component composition
  • Provides design differentiation to enhance
    quality marketing
  • Use of the tool feeds data back to the knowledge
    base
26
Value Chain Benefit: Standards Bodies and
Industry Groups
[Diagram: standards and industry groups supply standards templates
to IT management (CIO/CTO/CAO) and integrators, driving adoption.]
  • Captures use cases that support and justify
    standards
  • Models correct use of standards in selecting COTS
    products
  • Links pertinent standards to product
    features/functions
  • Provides a dynamic but solid compliance record
    (A-119)
  • Enables collaboration between disparate industry
    groups
27
Value Chain Benefit: COTS Vendors, Open Source,
Small Businesses, Public Domain (GOTS)
[Diagram: COTS vendors, component builders, and small businesses
supply specifications and references to standards development
organizations and integrators, driving technology adoption by IT
management (CIO/CTO/CAO).]
  • Provides buyers with the exact product information
    they need, when they need it
  • Certifies vendor claims based on 3rd-party
    validation/history data
  • Increases buyers' chances for successful use of
    vendor products -> happy customers!
  • Generates normalized COTS solution blueprints
28
Solutions Template Ontology
  • SAIL OUTCOME templates will be based on an
    ontology that builds a lexicon for clear and
    precise communications
  • The ontology will start with the DHS FEA-PMO
    models as the highest-level construct; this will
    help ensure architectural relevance of the
    OUTCOMES and ease of cross-agency use of the
    knowledge repository
  • Vendors would be advised to adopt the ontology in
    their product literature in an effort to better
    serve their customers by providing clearer
    information about their product features,
    functions, interfaces, and past performance
  • Individual consortium members can protect the
    IP invested in their sources and methodologies, as
    the only things published are OUTCOMES
  • Consortium members agree to participate in the
    development and maintenance of the OUTCOME
    templates
  • Their competitive advantage, embodied in their
    sources and methods, is protected
  • Solution Profiles will be maintained as XML
    templates to enable direct insertion into
    agency-specific reference models (a sketch of such
    a template follows this list)
  • SAIL templates will provide equal access to
    small innovative businesses, VARs, Open Source,
    and GOTS
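
A minimal sketch of an XML Solution Profile of the kind described
above; all element and attribute names here are hypothetical, since
the actual SAIL/FEA-PMO schema is not given in this deck:

    # Builds and prints a hypothetical solution-profile XML template.
    import xml.etree.ElementTree as ET

    profile = ET.Element("SolutionProfile", domain="EnterprisePortal")
    product = ET.SubElement(profile, "Product",
                            vendor="ExampleCorp", name="ExamplePortal")
    for feature, status in [("LDAP authentication", "verified"),
                            ("508 compliance", "vendor-claimed")]:
        node = ET.SubElement(product, "Feature", status=status)
        node.text = feature

    # The serialized profile could be inserted into an agency reference model.
    print(ET.tostring(profile, encoding="unicode"))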

29
SAIL Will Help DHS Develop Cross-Agency
Horizontal Solutions, Decreasing Time to Market
and Increasing Probability of Success
  • For each horizontal solution, SAIL will engage
    (based on past performance) a consortium of
    domain-expert vendors (4-5) with deep experience
    in the specific solution area to evaluate
    potential solutions and help DHS select a subset
    that can meet cross-agency needs
  • Their past experience will allow SAIL to cut
    through industry claims and go directly to actual
    experiences with potential solutions
  • This allows the SAIL team to more effectively
    develop criteria to evaluate vendor solutions
  • Through their experience, the team brings to the
    table an initial survey of potential solutions
    that would be difficult or impossible for a
    single vendor
  • The team knows which factors are most important
    to a successful implementation
  • In evaluating a vendor's response to the
    criteria, the SAIL team can effectively evaluate
    the validity of the response based on a large
    experience base: evidence grounded in real-world
    experience

The result is an AoA: a pre-vetted set of vendor
solutions and a BPA for software solutions that
provides a clear baseline for agencies and a
framework they can use for adding their own
specific criteria
30
SAIL Consortium Governance
  • Consortium members establish standardized
    common criteria templates for each domain
  • Consortium members have a vote on each domain
    template through participation on the SAIL
    Technology Review Board
  • Domain-specific vendors are provided a vote which
    is weighted with the Review Board vote; there
    must be at least three domain vendors
    participating for their vote to be counted, in
    order to reduce the potential dominance of a
    single vendor (see the sketch after this list)
  • The founding consortium members will be
    responsible for providing a meta-model for domain
    OUTCOME templates and criteria for evaluating the
    fit and relevance of individual domain templates
  • The SAIL R&D team will develop a knowledge
    repository for housing the SAIL IP and best
    practices
  • No preference will be given to integrators or
    vendors that participate as consortium members
    in the vetting process; all industry offerings
    will be judged on an even playing field
  • Consortium members whose solution offerings are
    being evaluated will not be able to judge their
    own or competitors' offerings
  • Research and validation teams will be selected
    based on best qualifications and demonstrated
    past experience in the domain under evaluation
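
A minimal sketch of the vote-weighting rule described above. Only
the three-vendor minimum comes from the slide; the equal 50/50 split
between the Review Board and the vendor bloc is an assumption:

    # Hypothetical tally combining Review Board and domain-vendor votes.
    def tally(board_votes, vendor_votes):
        """Approval fraction; vendor votes count only if >= 3 vendors vote."""
        board = sum(board_votes) / len(board_votes)
        if len(vendor_votes) < 3:      # guard against single-vendor dominance
            return board
        vendors = sum(vendor_votes) / len(vendor_votes)
        return 0.5 * board + 0.5 * vendors  # assumed equal weighting

    print(tally([True, True, False], [True, False]))       # vendor bloc ignored
    print(tally([True, True, False], [True, True, True]))  # vendor bloc counted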

31
SAIL Funding Structure
  • SAIL will begin operations upon approval of an
    unsolicited proposal based on DHS/TSA established
    guidelines
  • SAIL will be operated on a fee-for-service model
    (501(c)(6)) in which any organization (vendor,
    integrator, user) can engage to research and
    validate a solution domain. A basic membership
    fee will cover minimal service levels, access to
    templates, and SAWG activities
  • SAIL will set up a certification process based on
    the NIAP model. Certification fees would cover
    only the costs associated with the evaluation of
    the selected solution sets, including costs for
    processing the evaluation through the SAIL
    mechanism. Cost will be significantly less than
    greenfield testing/research, as it will leverage
    existing testing, research, and implementation
    results provided by the solution provider and
    independently verified
  • A cross-agency agreement will be put into place
    (as for EAMS) to stand up a shared knowledge
    repository. The cost of standing up and
    maintaining the KMS will be equally shared among
    agency partners
  • Members have donated a working prototype and a
    series of solution templates, and will integrate
    other related mechanisms as needed (i.e., EAMS)
  • Most of the IV&V services being offered are
    already on GSA Schedule 70 and MOBIS. This
    demonstrates the viability of the process and the
    ability to deliver

32
Structure of SAIL Knowledge Exchange: structure
of the resulting Solution Framework (ebXML)