Research into the human factors related to aircraft accidents and
incidents has highlighted Decision Making as a crucial element. Pilots
usually intend to fly safely, but they sometimes make errors. It has
been observed that the majority of fatal crashes are attributable to
decision errors rather than to perceptual or execution errors.
While we cannot eliminate human error, a thorough understanding of human
factors principles can lead to appropriate strategies, means and
practical tools to prevent most errors, better detect and manage them,
and mitigate their adverse impact on aviation safety.
Decision errors in aviation are typically not slips or lapses but
mistakes. In other words, the problem lies not with a failure to
execute a correct decision but with making a wrong or poor decision in
the first place.
Human Factors research and theory have described, through several
models, the characteristics of human decision making, which differs
considerably from the way in which aircraft systems, for example, 'make
decisions'.
The SHELL model, for instance, provides a framework that illustrates
the various components and the interfaces, or interactions, between the
different subsystems involved in aircraft operations.
The LIVEWARE constitutes the hub of the model, the most critical as well
as the most flexible component in the system. Adverse mental states may
contribute to poor decision making.
Pilots' behaviours and motivations affect decision making, and training
aims to improve the decision making process.
Five hazardous attitudes increase the risk of poor decisions. They are
shown in the table below. These attitudes must be carefully addressed in
training. Safer attitudes, often referred to as "antidotes", are also
identified in the table. Compliance with SOPs is a common, powerful
antidote.
HAZARDOUS ATTITUDES AND THEIR ANTIDOTES
Anti-authority: "Don't tell me what to do!" This attitude is found in people who do not like anyone telling them what to do. In a sense, they tend to regard rules, regulations, and procedures as unnecessary. Antidote: "Follow the rules. They are usually right."
Impulsivity: "Must do something now!" This is the attitude of people who frequently feel the need to do something, anything, immediately. They do not take the time to think about what they are about to do; therefore, they often do not select the best alternative. Antidote: "Not so fast. Think first."
Invulnerability: "It won't happen to me." Many people feel that accidents happen only to others, but can't happen to them. They never really feel or believe that they will be personally involved. Pilots who think this way are more likely to take chances and increase risk. Antidote: "It could happen to me."
Macho/Egocentric: "I can do it -- I'll show them." Pilots with this type of attitude often take risks to prove that they are good and to impress others. Antidote: "Taking chances is foolish."
Resignation: "What's the use? There is nothing I can do." The pilot will leave the action to others, for better or worse. Sometimes, such pilots will even go along with unreasonable requests just to be a "nice guy". Antidote: "I am not helpless. I can make a difference."
There are a number of behavioural traps and biases that can distort
decision making. Pilots should be aware of these traps and take steps to
avoid getting caught.
Peer Pressure
Confirmation bias (fixation)
Overconfidence
Loss-Aversion Bias
Anchoring Bias (attentional tunnel)
Complacency
Decision making biases lead to poor decisions and put the safety of the
flight at risk.
Knowing the biases is important but is not enough: biases should be
actively combated!
When exploring the factors that contribute to decision errors, a common
pattern is the pilot's decision to continue with the original plan even
when conditions suggest that other courses of action would be more
prudent or appropriate. Factors that foster this "plan continuation"
include:
Situational factors (ambiguity)
Erroneous risk perception & risk management
Goal Conflicts
Workload & Stress
Many models have been developed to describe decision making. Two of
these are presented below.
NASA research describes a decision process model for aviation that
involves two components: Situation Assessment (SA) and choosing a Course
of Action (CoA).
Situation assessment and awareness are crucial. They involve defining
the situation or problem, as well as assessing the level of risk
associated with it and the amount of time available for solving it.
They also include an awareness of what the situation will be in the
future.
Once the problem is defined, the course of action is selected from the
options available (known about) in the situation. Once the pilot
understands a situation, an acceptable course of action is often easily
identified.
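Purely as an illustration, the two-stage structure of this model can be sketched in code. The cues, thresholds and option ordering below are invented for the example and are not part of the NASA model itself:

```python
from dataclasses import dataclass

@dataclass
class Situation:
    """Hypothetical summary of a situation assessment (SA)."""
    problem: str                # what is wrong, as currently understood
    risk_level: str             # e.g. "low", "medium", "high"
    time_available_min: float   # estimated time before a decision is needed

def assess_situation(cues: dict) -> Situation:
    """Stage 1 (SA): define the problem, the associated risk and the time
    available. The rules below are placeholders, not operational criteria."""
    problem = cues.get("anomaly", "none")
    risk_level = "high" if cues.get("weather_deteriorating") else "medium"
    time_available = cues.get("fuel_endurance_min", 0) - cues.get("time_to_alternate_min", 0)
    return Situation(problem, risk_level, time_available)

def choose_course_of_action(situation: Situation, options: list[str]) -> str:
    """Stage 2 (CoA): select an acceptable option from those known about.
    Here: the most conservative option when risk is high or time is short."""
    if situation.risk_level == "high" or situation.time_available_min < 15:
        return options[0]   # by convention here, options[0] is the safest (e.g. divert)
    return options[-1]      # otherwise continue with the preferred plan

# Example run with made-up cues and options
cues = {"anomaly": "generator failure", "weather_deteriorating": True,
        "fuel_endurance_min": 90, "time_to_alternate_min": 40}
options = ["divert to alternate", "hold and troubleshoot", "continue to destination"]
print(choose_course_of_action(assess_situation(cues), options))
```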
The OODA loop is a simple model based on the Observe, Orientate, Decide
and Act stages; it originates from the military fighter pilot community.
Developed for single-pilot operations, it describes the control of
behaviour in a rapidly changing environment.
Observation, orientation, and action occur continuously and
simultaneously in flight (skill-based behaviour).
The decide stage is dependent on remaining resources. During periods of
rapid change, these can be very limited (hence the importance of flight
preparation).
Orientation (the safety-oriented approach) is the most important part of
the OODA loop model because it shapes the way we observe, the way we
decide, and the way we act.
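As a sketch only (the stage functions, the SOP table and the capacity threshold below are invented for the example), the continuous character of the loop and the fallback on pre-planned responses when spare capacity is low can be pictured as follows:

```python
import itertools

def observe(cue_stream):
    """Gather the latest raw cues (here: from a hypothetical iterator)."""
    return next(cue_stream)

def orient(cues, sops):
    """Interpret the cues in the light of training and SOPs. In the OODA
    model, this stage shapes how we observe, decide and act."""
    return {"phase": cues["phase"],
            "pre_planned_response": sops.get(cues["phase"])}

def decide(picture, spare_capacity):
    """Deciding consumes the remaining mental resources; when capacity is
    low, fall back on the pre-planned (briefed) response."""
    if spare_capacity < 0.2 and picture["pre_planned_response"]:
        return picture["pre_planned_response"]
    return "re-assess and re-plan"

def act(decision):
    print("Acting:", decision)

# Example run: the four stages repeat continuously, cycle after cycle
sops = {"approach": "fly the briefed missed approach"}
cue_stream = itertools.cycle([{"phase": "cruise"}, {"phase": "approach"}])
for _ in range(4):                      # four passes through the loop
    picture = orient(observe(cue_stream), sops)
    act(decide(picture, spare_capacity=0.1))
```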
Because decision making is not always perfect and may involve shortcuts,
pilots should be trained to better prepare and review their decisions,
as time allows.
The following strategies can improve decision making. Training pilots in
these strategies will help them make better decisions.
SOPs are widely used throughout the commercial aviation community as a
means to manage risk. Establishing safety-oriented SOPs (including
personal and weather minimums) will provide pilots with pre-planned
responses that manage the risks and break the "chain of events" leading
to accidents.
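A minimal sketch of such a pre-planned response, assuming hypothetical personal minimums (the figures and field names below are illustrative, not regulatory or recommended values):

```python
# Hypothetical personal minimums; each pilot or operator sets their own.
PERSONAL_MINIMUMS = {
    "ceiling_ft": 2000,      # minimum acceptable cloud base
    "visibility_km": 5.0,    # minimum acceptable visibility
    "crosswind_kt": 15,      # maximum acceptable crosswind component
}

def go_no_go(reported: dict) -> tuple[bool, list[str]]:
    """Compare reported conditions with the pre-planned minimums. Because
    the criteria were set in advance, they are not renegotiated under
    pressure on the day of the flight."""
    reasons = []
    if reported["ceiling_ft"] < PERSONAL_MINIMUMS["ceiling_ft"]:
        reasons.append("ceiling below personal minimum")
    if reported["visibility_km"] < PERSONAL_MINIMUMS["visibility_km"]:
        reasons.append("visibility below personal minimum")
    if reported["crosswind_kt"] > PERSONAL_MINIMUMS["crosswind_kt"]:
        reasons.append("crosswind above personal maximum")
    return (len(reasons) == 0, reasons)

go, reasons = go_no_go({"ceiling_ft": 1500, "visibility_km": 8.0, "crosswind_kt": 10})
print("GO" if go else "NO-GO", reasons)
```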
Planning conducted prior to a flight, in a low-stress environment, can
enable a pilot to produce a safe strategy for the flight (i.e., the
pilot can be proactive and plan ahead to select a safe route and
establish "decision points" for each flight phase).
Research has suggested that having a plan B (a safety net) encourages
continuation and possibly riskier behaviour: it is naturally easier to
take a risk when you know you can count on a plan B. However, pilots
rarely assess their plan B properly, so the protection can be weaker
than expected.
Simulators can allow training decision making in high stress, high
workload situations with poor or conflicting information. Training
scenarios can be tailored to the trainees' needs. In addition, simulators
allow exploration of the consequences of poor decisions without
endangering the safety of the aircraft and its occupants.