1
Safety-Critical Systems 2: Requirement Engineering
  • T-79.5303 Spring 2006
  • Ilkka Herttua

2
Safety Context Diagram
  • Human
  • Process: operating rules
  • System: hardware, software
3
Critical Applications
  • Computer-based systems used in avionics, chemical
    process plants and nuclear power plants.
  • A failure in such a system endangers human lives
    directly or through environmental pollution, and can
    have large-scale economic consequences.

4
Safety Definition
  • Safety
  • Safety is the property of a system that it will
    not endanger human life or the environment.
  • Safety-Critical System
  • A system that is intended to achieve, on its
    own, the necessary level of safety integrity for
    the implementation of the required safety
    functions.

5
Developing safety-related systems
  • To achieve safety:
  • - safety requirements (avoiding hazards and
    risks)
  • - quality management (follow-up of the process)
  • - design / system architecture (reliability)
  • - defined design and manufacturing processes
  • - certification and approval processes
  • - known behaviour of the system in all
    conditions

6
Overall safety lifecycle

7
Risk Analysis
  • Risk is a combination of the severity (class) and
    frequency (probability) of the hazardous event.
  • Risk analysis is the process of evaluating the
    probability of hazardous events.
  • The value of a life??
  • The value of a life is estimated at between GBP 0.75M
    and GBP 2M.
  • Figures used in the USA are higher.

8
Risk Analysis
  • Severity classes:
  • - Catastrophic: multiple deaths (>10)
  • - Critical: a death or severe injuries
  • - Marginal: a severe injury
  • - Insignificant: a minor injury
  • Frequency categories (events/year):
  • - Frequent: 0.1
  • - Probable: 0.01
  • - Occasional: 0.001
  • - Remote: 0.0001
  • - Improbable: 0.00001
  • - Incredible: 0.000001
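The two scales above are typically combined in a risk matrix. A minimal Python sketch of that idea; the risk level assigned to each frequency/severity pair below is an illustrative assumption, not taken from these slides or from any particular standard:

```python
# Illustrative risk matrix: combines a frequency category and a severity
# class into a qualitative risk level. The cell values are assumptions for
# illustration, not taken from the slides or from a specific standard.

SEVERITY = ["catastrophic", "critical", "marginal", "insignificant"]

RISK_MATRIX = {
    #               catastrophic    critical        marginal       insignificant
    "frequent":   ["intolerable",  "intolerable",  "intolerable", "undesirable"],
    "probable":   ["intolerable",  "intolerable",  "undesirable", "tolerable"],
    "occasional": ["intolerable",  "undesirable",  "undesirable", "tolerable"],
    "remote":     ["undesirable",  "undesirable",  "tolerable",   "negligible"],
    "improbable": ["undesirable",  "tolerable",    "tolerable",   "negligible"],
    "incredible": ["tolerable",    "negligible",   "negligible",  "negligible"],
}

def risk_level(frequency: str, severity: str) -> str:
    """Combine a frequency category and a severity class into a risk level."""
    return RISK_MATRIX[frequency][SEVERITY.index(severity)]

print(risk_level("occasional", "critical"))  # -> undesirable
```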

9
Hazard Analysis
  • A hazard is a situation in which there is actual or
    potential danger to people or to the environment.
  • Analytical techniques:
  • - Failure modes and effects analysis (FMEA)
  • - Failure modes, effects and criticality
    analysis (FMECA)
  • - Hazard and operability studies (HAZOP)
  • - Event tree analysis (ETA)
  • - Fault tree analysis (FTA)
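For the first two techniques, the working artefact is essentially a worksheet of failure modes scored for criticality. A minimal sketch of one FMEA/FMECA row; the field set, the 1..10 scales and the conventional RPN = severity x occurrence x detection score are assumptions for illustration, not taken from the slides:

```python
from dataclasses import dataclass

@dataclass
class FmeaRow:
    """One row of an FMEA/FMECA worksheet (illustrative field set)."""
    component: str
    failure_mode: str
    effect: str
    severity: int    # 1..10 scale (10 = worst consequence)
    occurrence: int  # 1..10 scale (10 = most frequent)
    detection: int   # 1..10 scale (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        """Risk priority number: severity * occurrence * detection."""
        return self.severity * self.occurrence * self.detection

# Hypothetical example row.
row = FmeaRow("pressure relief valve", "fails to open on demand",
              "vessel over-pressurises", severity=9, occurrence=3, detection=7)
print(row.rpn)  # 189 -- the highest-RPN failure modes are analysed first
```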

10
Fault Tree Analysis 1
The diagram shows a heater controller for a tank
of toxic liquid. The computer controls the heater
using a power switch on the basis of information
obtained from a temperature sensor. The sensor is
connected to the computer via an electronic
interface that supplies a binary signal
indicating when the liquid is up to its required
temperature. The top event of the fault tree is
the liquid being heated above its required
temperature.
 
11
Fault tree symbols:
- Fault event not fully traced to its source
- Basic event (input)
- Fault event resulting from other events
- OR gate
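A minimal sketch of how the quantitative side of such a fault tree can be evaluated. Only the OR-gate structure follows the heater example above; the basic events and their probabilities are hypothetical, and independence of the basic events is assumed:

```python
from math import prod

def p_or(*probs: float) -> float:
    """Probability of an OR gate over independent basic events."""
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical basic-event probabilities (per demand); an AND gate would
# instead multiply the probabilities of its independent inputs.
p_sensor_stuck_low = 1e-4   # sensor/interface keeps signalling "below temperature"
p_switch_stuck_on  = 1e-5   # power switch stuck closed
p_computer_failure = 1e-6   # computer fails to command the switch off

# Top event: liquid heated above its required temperature.
p_top = p_or(p_sensor_stuck_low, p_switch_stuck_on, p_computer_failure)
print(f"P(top event) = {p_top:.2e}")  # ~1.11e-04
```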
12
Risk acceptability
  • National/international decisions on the level of
    acceptable loss (ethical, political and economic)
  • Risk analysis and evaluation principles:
  • ALARP: as low as reasonably practicable (UK, USA)
  • Societal risk has to be examined when there is
    a possibility of a catastrophe involving a large
    number of casualties.
  • GAMAB: Globalement Au Moins Aussi Bon, "globally at
    least as good as before" (France)
  • All new systems must offer a level of risk globally
    at least as good as that offered by any equivalent
    existing system.
  • MEM: minimum endogenous mortality
  • A hazard due to a new system must not significantly
    augment the minimum endogenous mortality figure for
    an individual.

13
Risk acceptability
  • Tolerable hazard rate (THR): a hazard rate which
    guarantees that the resulting risk does not
    exceed a target individual risk.
  • SIL 4: 10^-9 < THR < 10^-8 per hour
    and per function
  • SIL 3: 10^-8 < THR < 10^-7
  • SIL 2: 10^-7 < THR < 10^-6
  • SIL 1: 10^-6 < THR < 10^-5
  • Potential loss of life (PLL): expected number of
    casualties per year
  • SIL: safety integrity level
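A small sketch of the THR table above as a lookup; the band edges follow the slide, while the treatment of exact boundary values and of rates outside the four bands is an assumption:

```python
def sil_for_thr(thr: float) -> int | None:
    """Map a tolerable hazard rate (per hour, per function) to a SIL band."""
    bands = [
        (1e-9, 1e-8, 4),
        (1e-8, 1e-7, 3),
        (1e-7, 1e-6, 2),
        (1e-6, 1e-5, 1),
    ]
    for low, high, sil in bands:
        if low <= thr < high:
            return sil
    return None  # THR outside the tabulated SIL 1..4 bands

print(sil_for_thr(5e-8))  # -> 3
```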

14
Current situation / critical systems
  • Based on data on recent failures of critical
    systems, the following can be concluded:
  • Failures are becoming more distributed and are
    often nation-wide (e.g. air traffic control, and
    commercial systems such as credit-card
    authorisation denials).
  • The source of failure is more rarely in hardware
    (physical faults), and more frequently in system
    design or end-user operation/interaction
    (software).
  • The harm caused by failures is mostly economic,
    but sometimes health and safety are also
    affected.
  • Failures can impact many different aspects of
    dependability (dependability: the ability to deliver
    service that can justifiably be trusted).

15
Examples of computer failures in critical systems
16
V - Lifecycle model
17
Safety Requirements
  • Requirements are stakeholders' (customers') demands:
    what they want the system to do, not how it should
    do it (the "how" belongs to the specification).
  • Safety requirements define what the system
    must do and must not do in order to ensure
    safety: both positive and negative functionality.

18
Specification
  • The specification gives the supplier instructions on
    how to build the system. It is derived from the
    required functionality, i.e. the requirements.
  • Requirements (R) + Domain Knowledge (D) =>
    Specification (S)
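A toy numerical illustration (all figures hypothetical) of how a specification that R and D justify can stop satisfying R when D is wrong, anticipating the subway example on the next slide:

```python
# R (requirement): the train must stop before the next signal.
# D (domain knowledge): signal spacing and braking performance.
# S (specification): apply the emergency brakes at the signal; D is what
#    justifies that S satisfies R. All numbers here are hypothetical.

def braking_distance_m(speed_mps: float, deceleration_mps2: float) -> float:
    """Stopping distance from speed v at constant deceleration a: v**2 / (2*a)."""
    return speed_mps ** 2 / (2 * deceleration_mps2)

signal_spacing_m = 300.0   # D: distance between signals (assumed)
deceleration = 0.8         # D: emergency-brake deceleration in m/s^2 (assumed)

for label, speed in [("trains as assumed in D", 15.0), ("heavier, faster trains", 22.0)]:
    d = braking_distance_m(speed, deceleration)
    print(f"{label}: braking distance {d:.0f} m, "
          f"requirement satisfied: {d <= signal_spacing_m}")
```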

19
Where do we go wrong?
  • Many system failures are not failures to
    understand R, the requirements; they are mistakes in
    D, the domain knowledge.
  • A NYC subway train crashed into the rear end of
    another train on 5 June 1995. The motorman ran
    through a red light. The safety system did apply
    the emergency brakes. However, the ...signal
    spacing was set in 1918, when trains were
    shorter, lighter and slower, and the emergency
    brake system could not stop the train in time.
  • Are you sure?

20
Requirement Engineering: Right Requirements
  • Ways to refine requirements:
  • - complete: linked to hazards (possible
    dangerous events)
  • - correct: testing and modelling
  • - consistent: semi-formal or formal language
  • - unambiguous: text in plain English

21
Requirement Engineering
  • Tool: DOORS (Telelogic)
  • Database and configuration management
  • History, traceability and linking

22
Requirements Management with DOORS
Slides provided by Telelogic / Quality Systems Software
23
DOORS: Dynamic Object Oriented Requirements System
(Diagram: DOORS capabilities)
- Interfaces
- Configuration management
- Requirements links
- Efficiency
- Reports and analysis
- Multi-user database, user accounts
- Change proposal system; filters, views
- Text processing; templates, standards
Capture, link, trace, analyse, administer
24
Terminology in DOORS
- Project
- Module
25
Traceability in DOORS
(Diagram: links between Requirement, Architectural Design,
Specification and Test Plan)
Follow customer amendments through all the documentation.
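A minimal Python sketch of the idea behind such links; this is not the DOORS API, and the identifiers and object texts are invented for illustration (the requirement text is the boat example from the scenario slide that follows). Objects live in modules, and typed links make the impact of a customer amendment traceable through specification and test documents:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Obj:
    """A requirements object, e.g. one numbered statement in a module."""
    ident: str    # e.g. "REQ-1" (identifiers invented for illustration)
    module: str   # e.g. "User Requirements", "Specification", "Test Plan"
    text: str

class TraceDB:
    def __init__(self) -> None:
        self.objects: dict[str, Obj] = {}
        self.links: dict[str, list[str]] = defaultdict(list)  # source -> targets

    def add(self, obj: Obj) -> None:
        self.objects[obj.ident] = obj

    def link(self, source: str, target: str) -> None:
        self.links[source].append(target)

    def impact(self, ident: str) -> list[Obj]:
        """All objects reachable from `ident` via links: what a change touches."""
        seen, stack, reached = set(), [ident], []
        while stack:
            for target in self.links[stack.pop()]:
                if target not in seen:
                    seen.add(target)
                    reached.append(self.objects[target])
                    stack.append(target)
        return reached

db = TraceDB()
db.add(Obj("REQ-1", "User Requirements",
           "Two people shall be able to lift the boat onto the car roof."))
db.add(Obj("SPEC-3", "Specification", "Hypothetical derived design constraint."))
db.add(Obj("TEST-7", "Test Plan", "Hypothetical verification step for SPEC-3."))
db.link("REQ-1", "SPEC-3")
db.link("SPEC-3", "TEST-7")
print([o.ident for o in db.impact("REQ-1")])  # ['SPEC-3', 'TEST-7']
```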
26
Traceability - Requirements from Scenarios
(Diagram: goal hierarchy for the overall goal "to have
sailed and survived", with sub-goals such as boat loaded,
boat lifted, boat on car, boat unloaded, boat rigged
(mast, centre-plate, rudder), ready to sail, sailed,
boat manoeuvred (tacked, gybed), cruised, boat capsized,
boat righted, coastguard contacted, returned home and
gone ashore. Traceability links connect the goals to
user requirements.)
Example user requirements traced to goals:
- Two people shall be able to lift the boat onto
  the roof of the average saloon car.
- The sailor shall be able to perform a tacking
  manoeuvre.
- The sailor shall be able to contact the
  coastguard when the boat is capsized.
27
References
  • Telecommunications: AT&T, Alcatel, British
    Telecom, General Dynamics, ITT, L3 Comm, MCI
    Worldcom, Motorola, Nokia, Nortel, Tellabs
  • Defense/Aerospace: Boeing, Jet Propulsion Labs,
    Lockheed Martin, Raytheon
  • Equipment Manufacturers: Cadence, Carrier, Cisco
    Systems, Hewlett Packard, Kodak, Otis Elevator,
    Pitney Bowes, Xerox
  • Automotive: BMW, Chrysler Daimler-Benz, Ford,
    General Motors, Rolls-Royce
  • Financial/Insurance: Citicorp, Experian, Freddie
    Mac, Mastercard, NASD/NASDAQ/ASE, Nations Bank,
    Norwest Financial Services, Prudential, State
    Farm, UNUM, USAA, VISA
  • Government: CND, FDA, FAA, MoD, NIMA, NASA, NSA,
    DISA, IRS, DOD
  • Healthcare/Medical: Abbott Labs, Beckman
    Instruments, GE Medical, HP Medical,
    Kaiser Permanente, Siemens Medical
  • Systems Integrators: Booz Allen, CSC, EDS, IBM,
    Litton/PRC, Mitre, SAIC, Unisys

28
(No Transcript)
29
V - Lifecycle model
30
Additional home assignments
  • From Neil Storey's book Safety-Critical Computer
    Systems:
  • 1.12 (Please define primary, functional and
    indirect safety)
  • 2.4 (Please define unavailability)
  • Email by 1 March to herttua@eurolock.org