Humans in safety critical systems - PowerPoint PPT Presentation

Author: Erik Hollnagel

Transcript and Presenter's Notes

1
Humans in safety critical systems
2
Estimated number of human errors
(Chart: percentage of accidents in which a human action is attributed as cause, on a 0-100 scale, plotted by year from 1960 to 1995. Note: the diagram shows the attribution of human errors as causes, which may be different from the contribution of human errors to incidents / accidents.)
3
What is an error?
An error occurs when the actual outcomes differ from the intended outcomes (actual outcomes ≠ intended outcomes). Actions can be classified by outcome (a minimal coding of these categories is sketched after the list):
  • Correctly performed actions
  • Incorrect actions
    • Detected and recovered
    • Detected but tolerated
    • Detected but not recovered (overt effects)
    • Undetected (latent effects)
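A minimal sketch (in Python, with hypothetical names not taken from the slides) of how the outcome categories above could be coded, e.g. when classifying incident reports:

    from enum import Enum, auto

    class ActionOutcome(Enum):
        """Outcome categories for an action, following the taxonomy above."""
        CORRECTLY_PERFORMED = auto()
        DETECTED_AND_RECOVERED = auto()
        DETECTED_BUT_TOLERATED = auto()
        DETECTED_NOT_RECOVERED = auto()   # overt effects
        UNDETECTED = auto()               # latent effects

    def is_error(actual_outcome, intended_outcome):
        """An error exists when the actual outcome differs from the intended one."""
        return actual_outcome != intended_outcome

    print(is_error("valve closed", "valve open"))  # True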
4
Humans and system safety
Technology-centred view:
  • Humans are a major source of failure. It is therefore desirable to design the human out of the system.
  • Automation permits the system to function when the limits of human capability have been reached.
  • Automation is cost-effective because it reduces the skill requirements on the operators.
Human-centred view:
  • Humans are the main resource during unexpected events. It is therefore necessary to keep them in the system.
  • The conditions for transition between automation and human control are often vague and context dependent.
  • Automation does not use humans effectively, but leaves them with the tasks that cannot be automated - because they are too complex or too trivial.
Conclusion: Humans are necessary to ensure safety.
5
Ironies of automation
The basic automation philosophy is that the human operator is unreliable and inefficient, and therefore should be eliminated from the system.
  • Irony 1: Designer errors can be a major source of operating problems.
  • Irony 2: The designer, who tries to eliminate the operator, still leaves the operator to do the tasks which the designer cannot think how to automate.
Lisanne Bainbridge (1987), Ironies of Automation
6
Automation double-bind
Given a safety critical event, the double-bind is:
  • Humans are fallible, and should therefore be designed out of the system.
  • Design teams are fallible, therefore humans are required in the system.
7
Maintaining control
Being in control of the situation means: knowing what has happened, knowing what will happen, and having the capacity to evaluate and plan.
What causes the loss of control?
  • Unexpected events
  • Acute time pressure
  • Not knowing what happens
  • Not knowing what to do
  • Not having the necessary resources
What can help maintain or regain control?
  • Sufficient time
  • Good predictions of future events
  • Reduced task load
  • Clear alternatives or procedures
8
Cyclical HMI model
(Diagram: the team's current understanding directs / controls the next action; the next action provides / produces information / feedback; the information / feedback modifies the current understanding. Goals for what to do when something unusual happens: Identify, Diagnose, Evaluate, Action.)
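As a rough illustration only (the slide describes a conceptual model, not an algorithm; all names below are invented), the cycle can be sketched in Python as a loop in which the current understanding directs the next action, the action produces feedback, and the feedback modifies the understanding:

    def run_cycle(goals, observe, steps=4):
        """Toy version of the cyclical model: current understanding directs /
        controls the next action; the action provides / produces information /
        feedback; the feedback modifies the current understanding."""
        understanding = set()  # what the operator / team currently knows
        for _ in range(steps):
            # The current understanding and the goals direct the next action.
            action = next((g for g in goals if g not in understanding), "monitor")
            # Acting on the process provides / produces information / feedback.
            feedback = observe(action)
            # The feedback modifies the current understanding.
            understanding.add(feedback)
        return understanding

    # Invented example: the process simply confirms each action that was taken.
    print(run_cycle(["identify", "diagnose", "evaluate", "act"], observe=lambda a: a))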
9
Effects of misunderstanding
(Diagram: unexpected information / feedback increases the demands on interpretation, while the dynamics of the process leave only limited time for interpretation. The result is an incorrect or incomplete understanding, which leads to inadequate actions; the loss of accuracy in turn increases the unexpected information, and the operator may lose control of the situation.)
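Purely as a toy numerical illustration of this loop (none of the numbers come from the slides), letting the share of unexpected feedback erode the accuracy of the understanding, which in turn produces more unexpected feedback:

    def misunderstanding_spiral(accuracy=0.9, unexpected=0.2, time_for_interpretation=0.5, steps=5):
        """Toy model: more unexpected feedback -> higher interpretation demand ->
        (with limited time) less accurate understanding -> more inadequate
        actions -> even more unexpected feedback."""
        for step in range(steps):
            demand = unexpected / time_for_interpretation        # demand on interpretation
            accuracy = max(0.0, accuracy - 0.1 * demand)         # limited time erodes accuracy
            unexpected = min(1.0, unexpected + 0.2 * (1.0 - accuracy))  # loss of accuracy feeds back
            print(f"step {step}: accuracy={accuracy:.2f}, unexpected feedback={unexpected:.2f}")

    misunderstanding_spiral()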
10
Prevention and protection
(Diagram: barrier functions placed between an initiating event (incorrect action) and an accident.)
  • Prevention (control barriers): active or passive barrier functions that prevent the initiating event from occurring.
  • Protection (safety barriers): active barrier functions that deflect consequences.
  • Protection (boundaries): passive barrier functions that minimise consequences.
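A very small sketch (invented names and outcomes, not from the slide) of where the three kinds of barrier functions act relative to the initiating event:

    def scenario_outcome(prevention_holds, safety_barrier_holds, boundary_holds):
        """Prevention acts before the initiating event; protection acts after it,
        by deflecting or minimising the consequences."""
        if prevention_holds:
            return "no initiating event"        # prevention (control barriers)
        # The incorrect action has occurred; protection now limits the outcome.
        if safety_barrier_holds:
            return "consequences deflected"     # protection (safety barriers)
        if boundary_holds:
            return "consequences minimised"     # protection (boundaries)
        return "accident"

    print(scenario_outcome(prevention_holds=False, safety_barrier_holds=False, boundary_holds=True))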
11
Types of barrier systems
  • Material barriers: physically prevent an action from being carried out, or prevent the consequences from spreading.
  • Functional (active or dynamic) barriers: hinder the action via preconditions (logical, physical, temporal) and interlocks (passwords, synchronisation, locks); see the sketch after this list.
  • Symbolic barriers (perceptual, conceptual barriers): require an act of interpretation to work, i.e. an intelligent and perceiving agent (signs, signals, alarms, warnings).
  • Immaterial barriers (non-material barriers): not physically present in the situation; rely on internalised knowledge (rules, restrictions, laws).
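For the functional barriers above, a minimal Python sketch of an interlock that hinders an action through preconditions; the pump, the password and the checks are all hypothetical:

    def start_pump(password, inlet_valve_open, maintenance_lock_engaged):
        """Functional barrier: the action only proceeds when the logical,
        physical and interlock preconditions are all satisfied."""
        if maintenance_lock_engaged:
            raise PermissionError("interlock engaged: maintenance in progress")
        if password != "expected-secret":       # logical precondition ("soft")
            raise PermissionError("invalid password")
        if not inlet_valve_open:                # physical precondition
            raise RuntimeError("inlet valve must be open before starting the pump")
        return "pump started"

    print(start_pump("expected-secret", inlet_valve_open=True, maintenance_lock_engaged=False))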

12
Barrier system types
  • Physical, material
    • Obstructions, hindrances, ...
  • Functional
    • Mechanical (interlocks)
    • Logical, spatial, temporal
  • Symbolic
    • Signs, signals
    • Procedures
    • Interface design
  • Immaterial
    • Rules, laws

13
Barrier systems on the road
(Photos of road examples: one physical barrier, which works even when not seen, and three symbolic barriers, which require interpretation.)
14
Classification of barriers
  • Material, physical
    • Containing: walls, fences, tanks, valves
    • Restraining: safety belts, cages
    • Keeping together: safety glass
    • Dissipating: air bags, sprinklers
  • Functional
    • Preventing (hard): locks, brakes, interlocks
    • Preventing (soft): passwords, codes, logic
    • Hindering: distance, delays, synchronisation
  • Symbolic
    • Countering: function coding, labels, warnings
    • Regulating: instructions, procedures
    • Indicating: signs, signals, alarms
    • Permitting: work permits, passes
    • Communicating: clearance, approval
  • Immaterial
    • Monitoring
    • Prescribing: rules, restrictions, laws
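The classification above can be captured directly as a data structure; a plain Python dictionary is one option (the wording of the examples follows the slide, which lists no examples for Monitoring):

    BARRIER_CLASSIFICATION = {
        "Material, physical": {
            "Containing": ["walls", "fences", "tanks", "valves"],
            "Restraining": ["safety belts", "cages"],
            "Keeping together": ["safety glass"],
            "Dissipating": ["air bags", "sprinklers"],
        },
        "Functional": {
            "Preventing (hard)": ["locks", "brakes", "interlocks"],
            "Preventing (soft)": ["passwords", "codes", "logic"],
            "Hindering": ["distance", "delays", "synchronisation"],
        },
        "Symbolic": {
            "Countering": ["function coding", "labels", "warnings"],
            "Regulating": ["instructions", "procedures"],
            "Indicating": ["signs", "signals", "alarms"],
            "Permitting": ["work permits", "passes"],
            "Communicating": ["clearance", "approval"],
        },
        "Immaterial": {
            "Monitoring": [],
            "Prescribing": ["rules", "restrictions", "laws"],
        },
    }

    # Example lookup: which barrier system does the "Indicating" function belong to?
    system = next(s for s, funcs in BARRIER_CLASSIFICATION.items() if "Indicating" in funcs)
    print(system)  # Symbolic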
15
Barrier evaluation criteria
  • Efficiency: how efficient the barrier is expected to be in achieving its purpose.
  • Robustness: how resistant the barrier is with respect to the variability of the environment (working practices, degraded information, unexpected events, etc.).
  • Delay: the time from conception to implementation.
  • Resources required: the costs of building and maintaining the barrier.
  • Safety relevance: applicability to safety critical tasks.
  • Evaluation: how easy it is to verify that the barrier works.
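A small sketch of how the criteria could be recorded for a single barrier (the field names follow the slide; the example values are invented):

    from dataclasses import dataclass

    @dataclass
    class BarrierEvaluation:
        """Evaluation of one barrier against the criteria listed above."""
        efficiency: str        # how well it achieves its purpose
        robustness: str        # resistance to variability of the environment
        delay: str             # time from conception to implementation
        resources: str         # cost of building and maintaining the barrier
        safety_relevance: str  # applicability to safety critical tasks
        evaluation: str        # how easy it is to verify that it works

    # Invented example: a password interlock (a "soft" functional barrier).
    password_interlock = BarrierEvaluation(
        efficiency="medium", robustness="low", delay="short",
        resources="low", safety_relevance="medium", evaluation="easy",
    )
    print(password_interlock)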