Case Study: Ireland

1
Case Study: Ireland
  • METIS Workshop, 4-6 July 2007


Data Management System (DMS)
2
Presentation Agenda
  • Introduction and Overview
  • Statistical Metadata Systems and the Statistical
    Cycle
  • Statistical Metadata in each phase of the cycle
  • Systems and Design Issues
  • Organisational & Cultural Issues

3
Introduction & Overview
Project Governance
4
Introduction & Overview
Overall Strategy
  • Main drivers
    - EU requirement to move to Open Systems
    - storage of all CSO data in a RDBMS
    - DMS to be business-led with metadata-driven
      processes
    - DMS to require use of common classifications
      (CARS)
    - DMS to require use of a common dissemination
      database
  • CSO produced
    - IT Strategy for 1999-2002 & beyond (April 1999)
    - Data Warehouse / Data Management Strategy
      (November 1999)
  • CGEY (10-week contract) produced an
    implementation plan for the CSO's IT & Data
    Management Strategies (first quarter 2001)

5
Introduction & Overview: Project Objectives
  • ITSIP - Information Technology Strategic
    Implementation Programme
  • Deliver a set of applications to meet the survey
    processing and dissemination needs of the CSO
  • Migrate existing DEC Alpha-based applications to a
    client-server environment
  • Implement the new applications within the CSO
    Corporate Data Model
  • Interface these applications with the existing
    client-server and Sybase systems

6
Introduction & Overview: Project Goals Obtained
  • Legacy Situation
    - stove-pipe approach to survey processing
    - 150 systems written & maintained centrally
    - 250 end-user applications written & maintained
      locally
    - SAS V6.12, PC SAS V8.02, Excel, Access
  • Data Management System (DMS)
    - consolidates legacy processes into a suite of
      survey processing systems
    - nine corporate applications reside on a corporate
      database storing all data and metadata required
      in the survey-processing lifecycle
    - promotes consistency and reuse across the various
      survey areas

7
Introduction & Overview
Project Background
  • Stage A contract (6 months) awarded to Accenture,
    who compiled the Requirements Specification &
    High-Level Architectural Design (April 2003)
  • Stage B contract (30 months) awarded to Cognizant
    Technology Solutions Ltd., Chennai, India
  • Project currently at performance-testing phase
  • CSO is the first Irish Government office to use an
    onshore/offshore outsourcing model
  • Cognizant staff onsite have ranged from 2-17,
    depending on need
  • Offshore team has ranged from 20-50, depending on
    project phase
  • CSO ITSIP team: 15 staff
  • All of the above were fixed-price contracts

8
Introduction & Overview: Stage A Review (Accenture)
  • Requirements Analysis Phase
    - bottom-up approach through 20 workshops with
      over 100 CSO business users
    - production of:
      - 51 "As Is" process descriptions with 61 process
        maps
      - Data Model (Swedish Data Model for Aggregation &
        Dissemination)
      - 44 "To Be" process descriptions
      - consolidation of existing processes into 9 survey
        processing applications plus a security
        application
  • Design Phase
    - High-Level Architectural Requirements
    - High-Level Architectural Design
    - High-Level Performance Model
    - High-Level Interface Requirements and Design
      Specification
    - Web Enablement Specification

9
Introduction & Overview: Stage B Review (CTS)
  • Stage B involved:
    - further validation of the Stage A system design
      for the baseline DMS
    - building the DMS
    - migration of historic data and integrity metadata
      from legacy systems
    - User Acceptance Testing (UAT)
    - migration of metadata from the UAT environment to
      the production environment

10
Introduction & Overview: Stage B Review (CTS)
  • Original schedule: 3 Nov 2003 - 2 May 2006
  • Latest schedule: 3 Nov 2003 - 29 August 2007
  • Delay of 16 months arose because of:
    - delay in initial increment deliveries due to new
      requirements
    - delay in CSO testing due to underestimation of
      time required
    - extra functionality in the DMS
    - change in design needed for better performance
    - change from Windows to Unix for Sybase to cope
      with production load
    - reworking of Java code to meet QA standards

11
Introduction & Overview: Data Migration Approach
  • Business areas identified, for 100 surveys:
    - minimum data and integrity metadata required to
      support normal survey processing
    - all required back versions of data, including all
      historic data required to be migrated (back to
      1939 in some cases)
    - any additional data which should be migrated
  • Cognizant produced the required ETL scripts using
    Informatica (data restructured into cube format
    to use classifications)
  • ETL scripts run to move data to the UAT environment
  • The same scripts will move all data to the
    Production environment (including the latest
    processed periods)
  • Minimum integrity metadata migrated to all
    relevant databases because of application
    dependency on the same metadata
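The restructuring step can be illustrated with a minimal sketch: pivot flat legacy records into a cube keyed by classification code and period, rejecting codes that are not in the common classification. This is Python for brevity only; the actual migration used Informatica ETL scripts, and all field and code names here are hypothetical.

```python
# Illustrative sketch only: pivot flat legacy survey records into a
# "cube" keyed by (classification code, period), so every cell is
# addressable via a common (CARS-style) classification.
# All names are hypothetical; the real migration used Informatica.

def to_cube(records, classification):
    """records: iterable of dicts with 'nace', 'period', 'value'.
    classification: set of valid classification codes.
    Returns {(code, period): summed value}; rejects unknown codes."""
    cube = {}
    for rec in records:
        code = rec["nace"]
        if code not in classification:
            raise ValueError(f"code {code!r} not in classification")
        key = (code, rec["period"])
        cube[key] = cube.get(key, 0) + rec["value"]
    return cube

# Two legacy rows for the same cell are summed into one cube cell.
legacy = [
    {"nace": "C10", "period": "2006Q4", "value": 120},
    {"nace": "C10", "period": "2006Q4", "value": 80},
    {"nace": "C11", "period": "2006Q4", "value": 55},
]
cube = to_cube(legacy, classification={"C10", "C11"})
```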

12
Introduction & Overview: Process Metadata Migration Approach
  • Business areas should, and would, only enter
    process metadata once (in the UAT environment)
  • Business areas identified process metadata
    entered during UAT:
    - for Survey Instance-specific modules (SS, SM, DC
      & IMP): UAT Survey instance and Production Start
      Survey instance
    - for Survey-specific modules (Reg. M., Agg. &
      Diss.): list of Registers, Aggregate, Weight &
      Disseminate Tables to be available in Production
  • Cognizant produced the required ETL scripts to move
    this process metadata from the UAT to the
    Production environment
  • Comparison reports of metadata residing in UAT &
    Production will be used to validate the migration
    process
  • The ultimate check will be another parallel run in
    the Production environment to ensure that all
    migrated metadata (process & minimal integrity
    metadata) is consistent and correct
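The comparison-report idea amounts to a set comparison between the two environments. A minimal sketch (Python for illustration; the real reports are DMS deliverables, and the metadata keys and values shown here are hypothetical):

```python
# Illustrative sketch of a UAT-vs-Production metadata comparison:
# report keys missing from Production, keys unexpectedly present,
# and keys present in both but with differing values.

def compare_metadata(uat, prod):
    """uat, prod: dicts mapping a metadata key to its value."""
    missing = sorted(set(uat) - set(prod))
    unexpected = sorted(set(prod) - set(uat))
    mismatched = sorted(k for k in set(uat) & set(prod)
                        if uat[k] != prod[k])
    return {"missing": missing,
            "unexpected": unexpected,
            "mismatched": mismatched}

# Hypothetical example: one imputation group migrated incorrectly.
uat = {"edit_rule_1": "value > 0", "imp_group_A": "ratio"}
prod = {"edit_rule_1": "value > 0", "imp_group_A": "mean"}
report = compare_metadata(uat, prod)
```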

13
Introduction & Overview: Recommendations for Others
  • Consider carefully the organisation's capacity
    for insourcing / outsourcing development work
  • Consider the time scale for implementation of the
    solution
  • Manage the change process well
  • Understand the complexity of the solution and, at
    the procurement stage, reject very low bids
  • Assume the contractor has no knowledge of your
    business
  • Ensure adequate in-house skills in IT design so
    the IT Partner's assumptions can be validated
  • Ensure adequate in-house skills in the IT Partner's
    development tools and proposed application
    infrastructure
  • Don't accept the IT Partner's project plan lightly
    where your office's resources are concerned

14
Introduction & Overview: Recommendations for Others
  • Don't underestimate the resources needed to (1)
    manage the project and (2) keep abreast of all
    project documentation
  • Consider carefully which items are for sign-off,
    for review and for information by you - these will
    have financial implications later
  • QA is more than just ticking boxes; throughout the
    software development lifecycle it should include:
    - reviewing the decisions taken to arrive at
      technical solutions
    - examining the underlying deliverable
    - adherence to agreed standards
  • Allocate adequate time to reviewing the test
    process and test cases
  • Managing the contract requires high-level expert
    resources with project management, statistical
    and IT skills
  • Organisational support and commitment from top
    management are critical

15
Introduction & Overview: Future Challenges
  • DMS is to go live in September 2007
  • Six-month gradual implementation
  • New SAS environment as we move from SAS V6 (on
    the VAX) and PC SAS V8.02
  • A new IT Strategy is required

16
Statistical Metadata Systems: Process Model
[process model diagram, featuring SAS]
17
Statistical Metadata Systems: DMS Applications & Metadata
  • Register Management
    - Create Register
    - Define Register Variables
    - Set up Register Coding
  • Sample Selection
    - Set up Sample Selection Criteria
    - Define Stratification Groups
  • Data Capture
    - Create Data Capture Form
    - Define Variable Characteristics
    - Set up Coding Rules
    - Set up Import Details
    - Set up Edit Rules and Validations
    - Version control of data
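The edit rules set up in Data Capture are metadata: declarative checks applied to every captured record. A minimal sketch of that idea (Python for brevity; the rule names, record layout and thresholds are hypothetical, not the DMS schema):

```python
# Illustrative sketch of metadata-driven edit rules: rules live in a
# table (here, a list of (name, predicate) pairs) and are applied to
# each captured record; failures are reported by rule name.
# All names and thresholds below are hypothetical.

EDIT_RULES = [
    ("turnover_nonneg", lambda r: r["turnover"] >= 0),
    ("employees_range", lambda r: 0 <= r["employees"] <= 100000),
    ("avg_wage_check",  lambda r: r["employees"] == 0
                                  or r["wages"] / r["employees"] < 500000),
]

def validate(record):
    """Return the names of the edit rules the record fails."""
    return [name for name, rule in EDIT_RULES if not rule(record)]

ok = validate({"turnover": 1000, "employees": 10, "wages": 400000})
bad = validate({"turnover": -5, "employees": 10, "wages": 400000})
```

Because the rules are data rather than code, a Survey Administrator can change them without redeploying the application, which is the point of a metadata-driven process.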

18
Statistical Metadata Systems: DMS Applications & Metadata
  • Imputation
    - Set up Imputation Groups
    - Set up Imputation Rules
  • Aggregation
    - Define Groups, Data Columns, Tables
    - Create Weights and Weight Tables
    - Macro edits and Confidentiality Rules
  • Dissemination
    - Create Disseminate Tables
    - Define additional Data Column attributes
  • Seasonal Adjustment
    - Set up Seasonal Adjustment Rules
  • Survey Management
    - Set up Post-Out details

19
Statistical Metadata Systems: Existing Systems
  • CBR: central repository for all enterprises
    engaged in economic activity
  • CARS: database containing all classifications
    and concordances
  • SPROCET: re-usable survey processing template
    used by the Industrial surveys in the CSO
  • BoPFACTS: data processing and survey management
    system used by the Balance of Payments section
  • SAS: SAS V6.12 and PC SAS V8.02
  • External Data Capture Applications: Blaise,
    Scanning

20
Statistical Metadata Systems: Mapping the DMS to the CMF Life Cycle
  • Register Management → Survey Preparation (2)
  • Sample Selection → Survey Plan & Design (1)
  • Survey Management → Survey Preparation (2)
  • Data Capture → Data Collection (3), Input
    Processing (4), Derivation (5)
  • Imputation → Estimation (5)
  • Aggregation → Aggregation (5)
  • Dissemination → Dissemination (7)
  • Respondent Management → Post Survey Evaluation (8)
  • The DMS is a processing tool, not an analysis
    tool; therefore CMF Life Cycle phase (6),
    Analysis, cannot be linked to the DMS.
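The mapping above can be expressed as a simple lookup table, which also makes the gap at phase (6) explicit. An illustrative Python sketch; the DMS itself does not expose such an API, and the phase labels follow this slide:

```python
# The DMS-module-to-CMF-phase mapping from the slide above,
# as a lookup table keyed by module name.

DMS_TO_CMF = {
    "Register Management": [2],
    "Sample Selection": [1],
    "Survey Management": [2],
    "Data Capture": [3, 4, 5],
    "Imputation": [5],
    "Aggregation": [5],
    "Dissemination": [7],
    "Respondent Management": [8],
}

def modules_for_phase(phase):
    """Return the DMS modules that cover a given CMF phase."""
    return sorted(m for m, phases in DMS_TO_CMF.items()
                  if phase in phases)

# Phase 6 (Analysis) maps to no module: the DMS is a processing
# tool, not an analysis tool.
```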

21
Statistical Metadata in the Statistical Cycle
Input Metadata Examples
22
Statistical Metadata in the Statistical Cycle
Output Metadata Example
23
Statistical Metadata in the Statistical Cycle
Output Metadata Example
24
Systems and Design Issues: Technical Starting Point
  • Sybase
    - in-house knowledge in Sybase (ASE) technologies
    - SAS access for complex analysis
      (SAS did not bid for the tender)
  • Link to Classifications and Related Standards
    (CARS) system
    - all disseminated data groups must link to a CARS
      classification
  • Windows platform
    - not possible due to performance issues
      identified with Sybase transactions, hence the
      move to Solaris

25
Systems and Design Issues: Technical Overview
[Architecture diagram: a PC client (IE6, JRE 1.4.2_05)
communicates over T3 (RMI) with a WebLogic cluster on
Windows; the cluster connects via JDBC to Sybase ASE
and IQ databases on Unix, each with failover; other
components shown are CARS, SAS, SSA Names3, the
Filestore and the CBR.]
26
Systems and Design Issues: Database Layer
  • The CSO has now established in-house skills in
    both Sybase ASE & IQ technologies
  • High-level technical architecture:
    - Data Capture, Imputation: Sybase ASE
    - Aggregation, Dissemination: Sybase IQ
  • Two types of table:
    - core DMS tables (survey metadata)
    - survey-specific tables (data)
  • All complex numerical processing is performed
    within the database layer through the use of
    stored procedures
  • Use of Veritas clustering software on the database
    layer to facilitate database failover

27
Systems and Design Issues: WebLogic / J2EE Mid-Tier
  • J2EE Application Server (WebLogic)
    - Stateless Session Beans
    - JMS Queues
    - JDBC connection to ASE / IQ databases
  • Application Security
    - users validated against the corporate Active
      Directory service
    - within the DMS database, validated users have
      assigned roles / privileges
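The security model above is two-step: authenticate the user against the corporate directory, then authorise each action via roles and privileges stored in the DMS database. A minimal sketch of that flow (Python for illustration; the DMS itself is Java, and every user name, role and privilege here is hypothetical):

```python
# Illustrative sketch of the two-step DMS security check.
# Step 1: is the user known to the corporate directory?
# Step 2: do the user's DMS roles grant the requested privilege?
# All names below are hypothetical stand-ins.

DIRECTORY_USERS = {"jsmith"}                      # stands in for Active Directory
DMS_ROLES = {"jsmith": {"survey_administrator"}}  # stands in for DMS role tables
ROLE_PRIVILEGES = {
    "survey_administrator": {"define_survey", "run_survey",
                             "assign_access"},
    "dms_administrator": {"*"},                   # highest level of access
}

def can(user, privilege):
    if user not in DIRECTORY_USERS:               # step 1: authentication
        return False
    for role in DMS_ROLES.get(user, set()):       # step 2: authorisation
        privs = ROLE_PRIVILEGES.get(role, set())
        if "*" in privs or privilege in privs:
            return True
    return False
```

Keeping the role tables in the database, separate from directory authentication, matches the deck's split between corporate identity and survey-level privileges.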

28
Systems and Design Issues: Client Layer
  • The DMS presents a complex GUI
  • Fat client using Java Swing technology
  • The client is deployed using Java Web Start
    technology
    - centrally managed releases
    - quick deployment to the client desktop
  • The client uses Java RMI to communicate with the
    J2EE server (currently via the WebLogic T3
    protocol)

29
Systems and Design Issues: Other Components
  • Filestore
    - shared network drive on which data to be
      imported to / exported from the DMS resides
  • SAS
    - required for Seasonal Adjustment
    - required for import / export of SAS datasets
      to/from the DMS
  • CARS (Classifications), from Statistics New Zealand
    - all data to be disseminated must use a CARS
      classification
  • CBR (Central Business Register), from Statistics
    New Zealand
    - hierarchical database
  • SSA Names3
    - duplicate matching / searching of registers

30
Organisational & Cultural Issues: Roles within the CSO
  • DMS Administrator (I.T.)
    - highest level of access to the DMS
    - supports the DMS
    - manages the day-to-day interaction with the DMS
  • Survey Administrator (Statistician)
    - defines the survey
    - runs the survey
    - assigns staff survey access and privileges

31
Organisational & Cultural Issues: DMS Maintenance
  • In the future the DMS will be supported by:
    - Cognizant Technology Solutions Ltd.
      - 1-year maintenance contract
      - provision for a 5-year support contract
    - the CSO Java Development Team
    - the CSO WebLogic Team

32
Thank You for Your Attention