1
Lecture on Safety and Reliability of
Human-Machine Systems  Adam M. Gadomski, 1998
  • Lecture on
  • Safety and Reliability of Human-Machine Systems
  • (in Italian: Sicurezza e Affidabilità dei Sistemi
    Uomo-Macchina)
  • Adam M. Gadomski
  • E-mail: gadomski_a@casaccia.enea.it
  • URL: http://wwwerg.casaccia.enea.it/ing/tispi/
    gadomski/gadomski.html
  • 1998/99

2
Presentation outline
  • Definitions: Reliability, Safety and Human Errors
  • Human-Machine Systems: Low-risk systems, High-risk
    systems
  • Human Errors: Operator - Designer - Organization
  • User Modeling for Decision Support
  • Reduction of Human Errors: From Passive DSSs to
    Intelligent DSSs
  • Some Examples

3
  • Reliability and Safety
  • Reliability problem - generation of economic
    losses.
  • Characterized by the loss of
    function over a given period of time under a
    given set of operational conditions.
  • Safety problem - generation of health,
    environmental and cultural losses:
  • direct losses to the human body (harm,
    injury).
  • A 2x2 matrix crosses the two indicators:
    Safety (effects): Yes / No, against
    Reliability (causes): Yes / No.
  • As the matrix shows, Safety and Reliability are
    either independent (wrong design) or dependent
    indicators of the system utility.

4
Human errors
  • Human error
  • Human action or inaction that can produce
    unintended results (*) or system failures (**).
  • (*) ISO/IEC Information Technology
    Vocabulary, 1996
  • (**) NUREG-1624

[Diagram: machine failures and human errors lead, through complex
consequence interrelations, to reliability problems and safety
problems.]
5
  • Human-Machine Systems
  • High-risk systems (most important: safety
    problems)
  • Human errors cause high losses: disasters
    (off-site),
  • accidents (on-site), incidents, human deaths.
  • - Nuclear and chemical plants, public
    transportation systems, banks ...
  • Low-risk systems (most important: reliability
    problems)
  • Human errors cause only long-term, low
    economic losses or quality problems.
  • - Office systems, public information
    systems, travel and sales systems, the Internet ...

6
  • Causes of Human Errors

[Diagram: error-propagation chain from the Machine (controlled
system/processes) through the Control and Measurement System and the
Computer Console (hardware and software) to the Human operator.]
7
  • Propagation of Human Errors' Consequences
    (Losses)

[Diagram: design errors, organization errors and stress-caused errors
propagate from the Machine, the Organization and the Environment to
their consequences (losses).]
Remark: the critical modifiable element is the
Human-Computer Interface system.
8
  • Direct Human User Errors: sensorial, mental
    (rational, emotional):
  • Erroneous perception
  • of images and texts
  • Erroneous requests
  • for information
  • Erroneous manipulation

[Diagram: possible propagation among these error types.]
9
  • Sources of errors from the user perspective
  • Designer Errors cause Operator Errors!
  • Main designer error: adapting the system interface
    to his own needs.
  • 1. Neglecting human factors and the cognitive
    importance scale at the level of human sensing
    and manipulation:
  • - too much information (images, texts) on the
    screen,
  • - unclear hierarchy (criteria) of the
    information presentation,
  • - mode of presentation: use of size,
    structure, color, voice,
  • - choice of the proper control buttons
    (importance scale): placement, abbreviations,
  • - lack of the possibility of correcting
    erroneous commands.

10
  • 2. Neglecting the necessity of the user's
    understanding of the plant/process global and
    particular situation - unclear User-Computer
    Cooperation:
  • - lack of hierarchical monitoring of the
    situation:
  • flat representation of the intervention
    domain,
  • - lack of explanations on operator
    request,
  • - lack of warnings,
  • - lack of suggestions.
  • ⇒ The user should always know what the system may
    offer.
  • ⇒ The system should help users to understand the
    machine.

11
  • Organization Errors cause Operator Errors!
  • 1. Related to human decisional problems:
    insufficiently clear duties and responsibilities of
    the human operator/user:
  • - the system offers forbidden or never-
    requested functions,
  • - the role of the user is modified during
    machine exploitation,
  • - stress caused by individual
    responsibility: too high individual
  • risk, or too low individual
    responsibility.
  • 2. Lack of proper instructions and training
    (competencies) creates a gap between possible
    interventions and tasks received from superiors.
  • 3. Knowledge support: lack of easy access
    to organisation experts.
  • 4. Co-operation: insufficiently precise
    definition of co-operation conditions.
  • 5. Ergonomics: improper organisation of the
    workplace.

12
  • Integrated Solution
  • To design Active/Intelligent Decision Support
    Systems:
  • - Reduction of operator functions; this
    requires a function allocation and
  • the design of new cooperation
    functions.
  • - Active support structured on the
    levels of:
  • 1. Data presentation / manual
    manipulations - goal-oriented.
  • 2. Data processing: selected data
    processing/calculations -
  • task-oriented.
  • 3. Mechanical reasoning
    (qualitative): implementation of
  • decision-making components.

13
  • Flexible interaction: strategy of the
    human-computer interface,
  • Control of the role of the operator (operator's
    competencies, responsibilities, access to
    information),
  • Understanding support: textual and graphical
    languages, information density,
  • Decision support: related to information,
    preferences, knowledge management,
  • Active intervention support: suggested solutions,
    explanations.
  • An integrated, role-dependent
  • user/operator modeling is necessary.

14
  • User Modeling Frameworks - Artificial
    Intelligence
  • IPK (Information, Preference, Knowledge)
    framework [Gadomski, 1989]
  • BDI (Beliefs, Desires, Intentions) framework
    [Anand Rao et al., 1991]: strong human subjective
    metaphor.
  • CKI (Communication, Know-How, Intentions) model
    [M. Singh, 1994]

Current tendency: Active DSSs designed by the
application of the human metaphor.
15
Basic concepts of IPK
  • Information: how does a particular situation look
    (before, now, in the future)?
  • - facts, measurements, observations
  • Knowledge: how may a situation be classified and
    modeled, and what is it possible to do
    in this type of situation?
  • - descriptive frames, rules,
    procedures, methods
  • Preferences: what is more important? what is
    more efficient?
  • Goal: what should be achieved?
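The IPK triad above can be sketched as a plain data structure plus one decision step: Information filters which actions apply, Knowledge supplies the condition-action rules, Preferences ranks the applicable actions. This is an illustrative sketch only, not Gadomski's formalisation; all field, rule and fact names below are invented.

```python
# Minimal, hypothetical sketch of an IPK-structured agent.
from dataclasses import dataclass, field

@dataclass
class IPKAgent:
    information: dict = field(default_factory=dict)   # facts: how the situation looks
    knowledge: list = field(default_factory=list)     # rules: what it is possible to do
    preferences: list = field(default_factory=list)   # criteria: what matters more

    def decide(self, goal):
        """Pick the applicable action ranked highest by the preferences.
        (goal is kept in the signature for illustration; this toy ranks
        purely by the preference list.)"""
        applicable = [act for cond, act in self.knowledge
                      if self.information.get(cond)]
        ranked = sorted(applicable,
                        key=lambda act: self.preferences.index(act)
                        if act in self.preferences else len(self.preferences))
        return ranked[0] if ranked else None

agent = IPKAgent(
    information={"alarm_active": True, "pressure_high": True},
    knowledge=[("alarm_active", "notify_operator"),
               ("pressure_high", "open_relief_valve")],
    preferences=["open_relief_valve", "notify_operator"],  # safety first
)
print(agent.decide(goal="restore_safe_state"))  # -> open_relief_valve
```

The point of the separation is that the same Knowledge can serve different users once role-dependent Preferences and Information access are swapped in.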

16
  • Application of the IPK to
  • Abstract Intelligent Agent construction

Possible domain-independent reasoning
mechanisms: deductive, inductive, abductive,
case-based ... different logics.
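One of the domain-independent mechanisms listed above, deductive reasoning, can be sketched as simple forward chaining over if-then rules: apply every rule whose premises hold until no new fact is derived. The rule and fact names below are invented for illustration.

```python
# Toy forward-chaining (deductive) reasoner over if-then rules.
def forward_chain(facts, rules):
    """Apply rules (premises, conclusion) until no new fact is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical plant-domain rules.
rules = [({"pump_failure"}, "coolant_flow_low"),
         ({"coolant_flow_low"}, "core_temp_rising"),
         ({"core_temp_rising"}, "alarm")]
derived = forward_chain({"pump_failure"}, rules)
print(sorted(derived))
```

Inductive, abductive and case-based mechanisms would replace this loop with generalisation from examples, hypothesis generation from observed effects, and retrieval of similar past cases, respectively.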
17
Abstract Intelligent Agent (AIA)
  • "abstract" because such a model of an intelligent
    agent is independent of its application domain, of
    the specific role of the decision-maker, and
    independent of its software implementation
    environment.
  • The AIA is dependent on its architecture
    constraints.

18
  • Basic architecture element of the AIA
  • [Gadomski, 1993]

19
EXTERNAL WORLD OR SIMULATOR
Multi-Agent Structure of the Abstract Intelligent
Agent
20
  • Abstract Intelligent
  • Agent:
  • Knowledge
  • Preferences
  • Information

Role model: competencies, responsibilities, duties,
access to information.
Decisional errors: out of competencies, wrong choice
criteria, improper or insufficient information.
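The role model above can be sketched as a guard that intercepts decisional errors before they reach the machine: a requested action outside the role's competencies, or a data request outside its information access, is flagged. The roles, actions and data items below are hypothetical.

```python
# Hypothetical role model used to detect decisional-error causes.
ROLE_MODEL = {
    "operator": {"competencies": {"start_pump", "open_valve"},
                 "access": {"pressure", "temperature"}},
    "supervisor": {"competencies": {"start_pump", "open_valve", "shutdown"},
                   "access": {"pressure", "temperature", "audit_log"}},
}

def check_request(role, action=None, data=None):
    """Return a list of detected decisional-error causes (empty if OK)."""
    model = ROLE_MODEL[role]
    errors = []
    if action is not None and action not in model["competencies"]:
        errors.append("out of competencies")
    if data is not None and data not in model["access"]:
        errors.append("insufficient information access")
    return errors

print(check_request("operator", action="shutdown"))    # -> ['out of competencies']
print(check_request("supervisor", action="shutdown"))  # -> []
```

In a real IDSS such a check would also feed the "wrong choice criteria" case, by comparing the user's choice against the role's preference criteria.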
21
  • The routine software engineer's task is:
  • To design software systems which satisfy the
    user's production goal (user requirements).
  • What more is needed?
  • To satisfy also safety and economic goals.
  • It means:
  • User Modeling is a new paradigm in the software
    life cycle.

22
  • New Life Cycle: Production, Safety and Economic
    Goals [M. Lind, 1992]

[Diagram: three parallel design strands - design of physical processes
(production goal), design of computer support processes, and design of
the human decision-making process (safety and economics goals) - linked
by mutual constraints and integration/modifications, each with its own
modeling and testing: production and control, human factors and
cognitive reasoning processes, and H/S processes (structural
modeling).]
23
New components required in the Software Life-Cycle
  • Identification of possible causes and mechanisms
    of human errors and of possible consequences
  • - Cause-Consequence analysis.
  • Ideal user/operator functional modeling.
  • Allocation of functions and the definition of new
    interface functions.
  • Design of additional cooperation functions.
  • User training in the new conditions.
  • These require:
  • New Systems Technologies

24
  • Active/Intelligent Decision Support Systems
  • Can be viewed as computerized interfaces for
    fitting passive DSS functions to the
    requirements, properties and preferences of man.
  • Eliminate the redundancy of alternatives that are
    not relevant at the moment.
  • Suggest choices determined by criteria defined
    on higher abstraction levels.
  • Are based on the goal-driven paradigm.

Passive Decision Support Systems
Passive Decision Support Systems (Information
Systems) were the first attempt at computer aid for
plant operators and emergency managers.
Unfortunately, their application requires from their
users continuous learning and training, for which
typical emergency managers are not sufficiently
motivated. A large part of the user's decisions
relies on the choice of a concrete button from menu
bars or menu tools that are parts of visualized
hierarchical menu structures (menu-driven paradigm).
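The contrast between the menu-driven and the goal-driven paradigm can be sketched as two selection functions: the passive style presents every alternative, while the active style first filters out alternatives not applicable to the current situation and then ranks the rest by a higher-level criterion. The alternatives, their preconditions and the risk-reduction scores below are invented; real systems are far richer.

```python
# Hypothetical catalogue of intervention alternatives.
ALTERNATIVES = [
    {"action": "evacuate_area", "needs": {"toxic_release"}, "risk_reduction": 9},
    {"action": "close_valve_12", "needs": {"leak_detected"}, "risk_reduction": 6},
    {"action": "print_report", "needs": set(), "risk_reduction": 0},
]

def menu_driven(_situation):
    # Passive DSS: show everything; the user must choose.
    return [a["action"] for a in ALTERNATIVES]

def goal_driven(situation):
    # Active DSS: keep only currently applicable alternatives and
    # suggest first the one that best serves the safety goal.
    applicable = [a for a in ALTERNATIVES if a["needs"] <= situation]
    applicable.sort(key=lambda a: a["risk_reduction"], reverse=True)
    return [a["action"] for a in applicable]

situation = {"leak_detected"}
print(menu_driven(situation))  # every action, regardless of relevance
print(goal_driven(situation))  # -> ['close_valve_12', 'print_report']
```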
25
  • Human-Computer Cooperation
  • [Gadomski et al., 1995]

[Diagram: the EMERGENCY MANAGER and an Active DECISION SUPPORT SYSTEM
(IDSS, an Intelligent Agent) cooperate over the EMERGENCY DOMAIN. The
IDSS holds Information (current emergency-domain data), Knowledge
(rules, models, plans, strategies) and Preferences (risk, role and
resource criteria); it continuously monitors images and measured data,
performs data acquisition, offers dialogue, suggestions and
explanations, supports intervention decisions, and cooperates with the
human organization and its experts.]
26
  • Mental error reduction: the IPK Architecture.

27
EXAMPLE
- Requirement specification phase
- Prototyping phase

28
  • Examples of different ADSSs and IDSSs
  • FIT, the Institute for Applied Information
    Technology, Germany's national research center
    for information technology.
  • FABEL
  • A distributed AI-based support system for complex
    architectural design tasks; integrates case-based
    and rule-based methods.
  • GeoMed
  • Distributed open geographical information systems
    - implemented as extensions to the World Wide
    Web - which support urban and regional planning
    as multi-party / multi-goal processes.
  • KIKon
  • A knowledge-based system for the configuration of
    telecommunication services and
  • customer-premise installations.
  • ZENO
  • Develops and evaluates AI-based tools for
    mediation in real-world cooperative planning and
    design tasks.

29
Evolution of DSSs - ENEA's Example
  • 1990 - Passive DSS: information support with
    large databases.
  • ISEM (Information Technology
    Support for Emergency Management):
  • multi-actor, large territorial
    emergencies.
  • 1993 - CAT (Computer Aided Tutoring):
    recognition of human errors.
  • MUSTER (Multi-User System for
    Training and Evaluating Environmental
  • Emergency Response), Genoa Oil Port.
    Goal: training support for emergency
  • managers' cooperation.
  • 1995 - Active DSS: implementation of some
    mental functions; GIS.
  • CIPRODS (Civil Italian PRotection
    Overview and Decision Support):
  • supervision of territorial
    emergencies on the national level.
  • 1996 - Active DSS: some mental functions
    inserted as autonomous
  • software tools with a graphical
    interface. GEO: Emergency Management
  • on Oil Transport Devices (lines
    and deposits).
  • 1997/8 - Intelligent DSS: user role modeling.
    The user must know - what?
  • The system must know - how? IDA
    (Intelligent Decision Advisor): multipurpose
    agent-based system.

MINDES
30
Example 1: A Cognitive Functional Architecture
of an ADSS - the system suggests possible actions
in a concrete application domain.
EDSS (Emergency Decision Support System): CIPRODS
General Architecture
[Di Costanzo et al., 1995]
What happens?
What will happen or could happen?
What to do?
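The three questions above can be read as an assess-predict-advise pipeline. The sketch below is a hypothetical illustration of that structure only, not the CIPRODS implementation; the threshold, event and action names are invented.

```python
# Hypothetical assess -> predict -> advise pipeline.
def what_happens(measurements):
    """Situation assessment: interpret raw measurements as events."""
    return {"leak"} if measurements.get("pressure_drop", 0) > 0.2 else set()

def what_will_happen(events):
    """Prognosis: project current events into possible future ones."""
    return {"contamination"} if "leak" in events else set()

def what_to_do(prognosis):
    """Advice: suggest interventions that counter the prognosis."""
    if "contamination" in prognosis:
        return ["isolate_line", "alert_authorities"]
    return []

events = what_happens({"pressure_drop": 0.5})
advice = what_to_do(what_will_happen(events))
print(advice)  # -> ['isolate_line', 'alert_authorities']
```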
31
Example 2: Schema of function allocation
between an Active DSS and its user
[Balducelli, Gadomski, 1997].
32
  • Example 3: Dynamic Human Modeling - Cooperation
    Training

[Diagram: Trainees 1, 2 and 3, each modeled by K, P and D components,
cooperate under a Tutor (Training Supervisor). Balducelli et al., 1994]
33
  • Some References
  • Human Reliability and Safety Analysis Data
    Handbook, David I. Gertman and Harold S. Blackman,
    John Wiley & Sons, ISBN 0-471-59110-6.
  • NASA Safety Policy and Requirements Document,
    NASA Handbook NHB 1700.1 (V1-B), June 1, 1993.
  • Most advanced: the nuclear power industry and
    associated government regulatory agencies - yet
    this does little to broaden the analysis of human
    reliability in other applications, such as air
    traffic control and other human-in-the-loop
    situations.
  • IAEA instructions and manuals.
  • US Nuclear Regulatory Commission Reports.
  • Stress and Operator Decision Making in Coping
    with Emergencies, T. Kontogiannis, Int. J.
    Human-Computer Studies (1996), v. 45.

34
  • ENEA
  • A.M. Gadomski, V. Nanni. Intelligent Computer Aid
    for Operators: a TOGA-Based Conceptual Framework.
    Proceedings of the Second International Conference
    on Automation, Robotics, and Computer Vision,
    Singapore, Sept. 1992.
  • A.M. Gadomski, S. Bologna, G. Di Costanzo.
    Intelligent Decision Support for Cooperating
    Emergency Managers: the TOGA-Based
    Conceptualization Framework. Proceedings of
    TIEMEC 1995, The International Emergency
    Management and Engineering Conference, J.D.
    Sullivan, J.L. Wybo, L. Buisson (Eds), Nice, May
    1995.
  • C. Balducelli, S. Bologna, G. Di Costanzo, A.M.
    Gadomski, G. Vicoli. Computer Aided Training for
    Cooperating Emergency Managers: Some Results of
    the MUSTER Project. Proceedings of the MemBrain
    Conference, Oslo, 1995.
  • A.M. Gadomski, C. Balducelli, S. Bologna, G.
    Di Costanzo. Integrated Parallel Bottom-up and
    Top-down Approach to the Development of
    Agent-based Intelligent DSSs for Emergency
    Management. Proceedings of the International
    Emergency Management Society Conference,
    TIEMS'98: Disaster and Emergency Management,
    Washington, May 1998.