Title: Experimental Crew Training to deal with Automation Surprises
1. Experimental Crew Training to Deal with Automation Surprises
- René Amalberti
- DGAC France
2. Outline of the presentation
- Human factors and automation in the 90s
- Transitioning on glass cockpits
- First accidents, evidence for automation-related problems
- Human factors and automation now
- Old folks and new vision of problems
- Coping with complexity, regulations, and change of context
- Design and training strategies
3. Human factors and automation in the 90s
- Promises and first drawbacks
4. The ALT HOLD Automation Surprise, B737-300 (Sarter & Woods, 1992)
- disconnect the autopilot (press either autopilot disconnect switch, OR detune the ILS frequency, OR override the control column)
- set both flight directors to OFF
- re-engage the autopilot
- press the ALT HOLD switch
[Diagram: the same action sequence ends in different modes depending on context; context 1 runs through VNAV PTH, context 2 through VNAV SPD, with LVL CHG, APP (VOR/LOC G/S), and TO also shown]
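The surprise is that the same button press yields different final modes depending on the context in effect. A minimal Python sketch of that idea, using an invented two-entry transition table rather than the real B737-300 autoflight logic:

    # Toy mode-transition table: NOT the real B737-300 logic, just an
    # illustration of the context-dependent behaviour behind the surprise.
    TRANSITIONS = {
        ("VNAV PTH", "press ALT HOLD"): "ALT HOLD",  # context 1: expected result
        ("VNAV SPD", "press ALT HOLD"): "LVL CHG",   # context 2: surprising result
    }

    def final_mode(initial_mode, actions):
        mode = initial_mode
        for action in actions:
            # undefined (mode, action) pairs leave the mode unchanged
            mode = TRANSITIONS.get((mode, action), mode)
        return mode

    same_actions = ["press ALT HOLD"]
    print(final_mode("VNAV PTH", same_actions))  # ALT HOLD
    print(final_mode("VNAV SPD", same_actions))  # LVL CHG

The point of the sketch: nothing about the action sequence itself tells the pilot which branch of the table will fire; only knowledge of the current context does.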
5. Drawbacks of Automation: Complacency or Over-reliance
- Automation is so efficient and reliable that it can induce complacency
- Monitoring a system that runs almost perfectly is boring
- Such reliability tends to transform active monitoring into passive monitoring
- Pilots tend to check that the automation behaves as intended instead of flying the aircraft!
6. Discovering automation and transitioning on glass cockpits
7. Factors affecting communication
[Chart: English level of pilots with poor proficiency in English, by age. Source: Amalberti & Racca, 1988]
8. Drawbacks of Automation: Difficulties in Programming
- Programming tends to prematurely determine the representation a pilot has of the next phases of the flight
- Consequences of errors shift into the future (a small sketch of this follows)
- Database aids can turn into traps
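A small illustration of why such errors surface late: a mistyped waypoint is accepted silently when the route is programmed, and only produces an observable deviation once that leg becomes active. The waypoint names and the loop are invented for the sketch:

    # Sketch: a programming error is silent at entry time and only becomes
    # observable much later, when the erroneous leg becomes active.
    # Waypoint names are invented for illustration.
    intended = ["WPT01", "WPT02", "WPT03", "WPT04"]
    entered = list(intended)
    entered[2] = "WPT30"  # typo: accepted by the box without complaint

    for leg, waypoint in enumerate(entered):
        # ...minutes of normal flight pass on each leg...
        status = "OK" if waypoint == intended[leg] else "DEVIATION, detected only now"
        print(f"leg {leg}: flying to {waypoint} -> {status}")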
9. Overconfidence in aid systems (Smith et al., 1994)
- 30 crews were asked to choose a route between Oklahoma City and New Orleans, minimising weather problems.
- Group 1: standard weather equipment. Group 2: standard equipment plus, on demand, an automated system. Group 3: standard equipment plus the automated system displaying the optimum option.
- The automated system, when used, provided a non-optimal option.
- The first group chose routes avoiding the bad weather, but 75% of the other pilots accepted the system's recommendation or, at best, corrected it slightly.
[Map: candidate routes relative to the bad weather area]
10. Human factors and automation now
- Evolution of ideas
11. Poor situation awareness and false expectations about crew behaviour
A320 simulator study
- 10 crews experiencing rare software failures
- Modification of flight simulator capacities
- LOFT scenario: LYON - MADRID
[Diagram: experimental setup; the instructor in the control room plays ATC, linked to the cockpit by microphone]
12. System failures
13. Type of scenario
- ATC requests to speed up to Mach 0.81; erroneous selected MACH
- A few minutes later: autothrust fault
- Altitude preset inoperative
- Radar regulation by ATC
- ATC requests flight level 350 (impossible due to an MEL item); end of climb at FL 310
- After 15 sec: reversion of descent mode from HDG-V/S to TRK-FPA
- ATC requests to maintain level 210 due to traffic
- Erroneous selected SPEED
- ATC requests direct navigation to another waypoint; the SID is interrupted
- Two MEL items before departure: PACK 2 inoperative (max flight level 310), spoilers 2 and 4 inoperative
- Go-around due to adverse wind conditions
- Flaps jam
- Second approach and landing
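Read as an experimental script, the slide amounts to a failure-injection schedule for the simulator control room. A minimal sketch of one way to encode it; the event labels come from the list above, while the phase tags and their ordering are assumptions for illustration, not the experiment's actual script:

    # Sketch of a LOFT failure-injection schedule (phases are assumed).
    schedule = [
        ("pre-departure", "MEL: PACK 2 inoperative (max FL 310)"),
        ("pre-departure", "MEL: spoilers 2 and 4 inoperative"),
        ("climb",         "ATC: direct to waypoint, SID interrupted"),
        ("climb",         "failure: erroneous selected SPEED"),
        ("cruise",        "failure: reversion of descent mode HDG-V/S to TRK-FPA"),
        ("descent",       "failure: altitude preset inoperative"),
        ("descent",       "failure: autothrust fault"),
        ("approach",      "go-around: adverse wind conditions"),
        ("2nd approach",  "failure: flaps jam"),
    ]

    for phase, event in schedule:
        print(f"[{phase:>13}] instructor injects: {event}")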
14. Detection time (in seconds) as a function of failure type and context
15. Dominant type of co-operation during failure diagnosis and recovery (N = 61 failures)
16. Example

1. Detection test
- Captain: "Well, did you see, what's rather funny is that I have Track here, which corresponds to FD Track (FD = flight director), but with Heading information" (the captain works the push button of the FCU (Flight Control Unit) to illustrate his explanation to the First Officer), "do you agree? And when I put Heading, I get FD Heading but I have Track information, and it takes a Track... therefore it would be better, eh... we are on the course, so I would rather have an FMA HDG indication, what do you think? No? What do you think?"
- First Officer: "Here it's on Track, and it has taken the reference Track..."
- Captain: "Here it has taken the track, so it's a nuisance..."
- First Officer: "With the heading..., yes..."
- Captain: "Therefore, what I would prefer is to put Heading to get Heading..."
- ATC: "Flight XX, left heading 275"
- First Officer: "Left heading 275, Flight XX" (the captain puts the course index on 275)
- Captain: "OK, if you agree, I propose to keep the Track-FPA information here..."
- First Officer: "Yes"
- Captain: "You have the FMA Heading information, eh, Heading, which is taken effectively by eh... no, it is not taken, eh... it's heading over that way..., it's not heading?"
- First Officer: "275"

2. Diagnosis
- Captain: "Good track" (he presses the button) "with a DV which is like that, Heading and 275. OK, we are going to take the course. Check the safety: 10,000 feet for the moment, no problem."
- Captain: "Right now let's resume HDG-V/S: 272, which is confirmed here, and the course taken is 275 with a Track information which is not up to standard... and if I display Track with this switch" (he presses the button) "I get HDG and it will search... for the 272 which is displayed. I am not so confident when we have to turn... but it works..."

3. Action
- Captain: "Right, I propose my earlier solution with the heading" (he presses the button). "I suggest keeping Heading on, and turning off the autopilot when we have to turn; otherwise we turn on the autopilot and use the Heading mode with the track symbology. That should work."
17. Main results
- All crews were able to land safely. Most of them had an overall feeling of high or extreme workload. The detection time of software failures ranged from 2 seconds to 3 minutes, depending on the workload level and the course of action at the moment of the failure.
- All crews showed a tendency to cycle the failing system several times to test the failure before trying to reset the system in order to recover it (resetting a fuse, or performing a global function reset, e.g. resetting the FMGS). Many crews were also tempted to reset again a few minutes after an initial failed procedure, showing that software malfunctions are often inexplicable to pilots.
- Five of the crews decided to revert to manual control when experiencing software bugs. Conversely, five crews decided to fly through the failure, maintaining the remaining automation functions in order to manage workload. (A sketch of this escalation-and-choice pattern follows.)
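The reset behaviour described above reads as an escalation pattern: cycle the system, try a local then a global reset, and only then choose between manual reversion and flying through the failure. A minimal sketch of that observed pattern; the functions and probabilities are invented placeholders, not a recommended procedure:

    import random

    def reset_works(action):
        # placeholder: whether a given reset clears the fault (invented odds)
        return random.random() < 0.3

    def workload_is_manageable():
        # placeholder for the crew's self-assessed workload margin
        return random.random() < 0.5

    def attempt_recovery(system):
        # escalation observed in the study: cycle, local reset, global reset
        for action in ("cycle the system", "reset the fuse", "global FMGS reset"):
            print(f"{system}: trying '{action}'")
            if reset_works(action):
                return "recovered"
        # the malfunction stays inexplicable: pick a workload strategy
        if workload_is_manageable():
            return "revert to manual control"
        return "fly through the failure, keeping remaining automation"

    print(attempt_recovery("autothrust"))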
18. Coping with complexity, regulations and change of context
19. Impact of system complexity on cognition
20. The minimum acceptable contract
[Diagram: the pilot's representation of their personal chance of success, built from past experience, metaknowledge, and self-evaluation (fatigue, etc.), is compared against an acceptance threshold that separates plans rated "easy to do" and "challenging" from "unrealistic" ones]
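One way to read the diagram is as a decision rule: the self-assessed chance of success must clear an acceptance threshold before a plan is taken on. A minimal sketch with invented numbers:

    # Sketch of the "minimum acceptable contract" as a decision rule.
    # All values are invented for illustration.
    ACCEPTANCE_THRESHOLD = 0.6

    def chance_of_success(past_experience, fatigue_penalty):
        # metaknowledge: prior success rate degraded by self-evaluation (fatigue, etc.)
        return max(0.0, past_experience - fatigue_penalty)

    for plan, prior in [("easy to do", 0.95), ("challenging", 0.70), ("unrealistic", 0.20)]:
        estimate = chance_of_success(prior, fatigue_penalty=0.10)
        verdict = "accept" if estimate >= ACCEPTANCE_THRESHOLD else "reject"
        print(f"{plan}: estimated chance {estimate:.2f} -> {verdict}")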
21. Systemic migration to boundaries
[Diagram: operations drift away from the expected safe space of action, as defined by professional standards, under performance pressure, migrating toward a very unsafe space and, at the boundary, accident]
22. Accident
[Diagram: failure and accident with three families of barriers: prevention, recovery, and mitigation. After Jean Pariès, Dédale SA]
23. Accident
[Same diagram, revised: failure and accident with prevention, recovery, and mitigation barriers. After Jean Pariès, Dédale SA, revised by Amalberti & Auroy, Nov 2001]
24. 1991, 44 flight hours
- Ambiguities of error definition
- An average of 2 errors per hour; almost 30 identified by the certification team, and only 10 mentioned in the report and considered relevant for the certification goal
25. Feedback from the FAA/JAA report on "The Interfaces Between Flight Crews and Modern Flight Deck Systems", June 1996
- Reduction of time in training courses
- Strong belief (evidence?) that learning cognitive skills needs less time than learning motor skills
- Back to the minimum
- Ambiguities for decision making
26. Modelling the natural safety control of the situation
[Diagram: performance as a function of work intensity/workload and of system and situation complexity; the controllable area is bounded by non-controllable areas marked "too many questions unexplored", "too many errors", "too many tasks piling up", and "late error detection, cognitive surprises"; maximum performance lies close to these boundaries]
27. Performance and technology-induced problems
[Diagram: performance rests on comprehension management and workload management; when both are lost the outcome is loss of control; candidate answers: new solutions? back to manual?]
28. Safety strategies must change with improvement of the safety level
[Diagram: risk of catastrophic event per trial, on a scale from 1 down to 10^-7. Amateur systems rely on technical knowledge and learn from accidents and local errors; safe systems add safety regulations and barriers, reporting systems, work organisation and safety culture, moving from local to general lessons and from accidents to near incidents and individual errors; ultra-safe systems require making risks more visible, cleaning regulations, simplifying barriers, and a return to human responsibility; no systems exist beyond this point]
29. Technology and Human Error
- Denmark, June 2000: a patient broke wind while having surgery and set fire to his genitals. The 30-year-old man was having a mole removed from his bottom with an electric knife when his attack of flatulence was ignited by a spark. His genitals, which were soaked in surgical spirit, caught fire. The man, who is suing the hospital, said: "When I woke up, my penis and scrotum were burning like hell. Besides the pain, I can't have sex with my wife." Surgeons at the hospital in Kjellerup said: "It was an unfortunate accident."