HindSight 15
Published in May 2012, HindSight 15 focuses on ‘Emergency and unusual situations in the air’.
Editorial
by Tzvetomir Blajev
"Can training and everyday practice with normal operations provide the skills and knowledge to deal with the unexpected, unfamiliar, and very often never-experienced-before, situations? Surprising situations do happen to us. They sometimes trigger a physiological reaction known as the startle response. This is a knee-jerk and instinctive reaction to a sudden, unexpected external stimulus like someone firing a gun behind us when we were not expecting it.
The startle reflex normally triggers within 100 milliseconds of the stimulus; it is pre-emotional and has both physiological and subjective dimensions. The subjective dimension is very similar to fear or anger. Startle prompts an increase in the speed of your reaction and helps to focus attention. The reaction is autonomic for at least the first 1-3 seconds but may continue for as much as 20 seconds in extreme circumstances. In the jungle this is the fight-or-flight response when encountering something unexpected. And the unexpected in the jungle is rarely good. Even if it is good, it is better to err on the safe side. Pure survival!
Coming back to modern times – pilots and controllers are sometimes confronted with rapid-onset, dynamically developing situations with different, sometimes opposing, strategies available. How should the situation be assessed to find the best strategy in the blink of an eye, while at the same time potentially coping with impulses for autonomic, startle-triggered reactions?
Let us take as an example the situation where the stall protection system of an aircraft has been activated. In the past, pilots flying multi-crew public transport aircraft were generally advised that their response to such a warning should involve only the minimum reduction in aircraft attitude needed to recover from the ‘edge’ of the fully stalled condition. This advice implied that the loss of altitude resulting from the recommended response should be minimised. This implication soon became a widely accepted objective in its own right. The essential recovery strategy became obscured by a secondary consideration. The result was that pilots who did not understand the aerodynamics of the stall simply responded to this rare event in a way that failed to restore a normal flight condition. What is really important is that if an initial response is not effective, the result may be a much more difficult situation, which in some flight conditions can follow very fast.
Do you believe that we in ATC never encounter similar situations? Really? Think about the example in the case study of this edition of HindSight. On initial contact with the crew there is an indication of ‘fuel at minimum’. What should the controller do first? Give the requested ‘direct to the destination’, ask if the crew wish to declare an emergency, or first coordinate with the adjacent ATC centre? The controller elected to immediately give a direct route and this, in hindsight, may have helped to save the lives of the people in the aeroplane.
Knowing in hindsight what happened and what would have been the best decision(s) is very easy, but how do we know this when the decision is still to be taken? More than one option may seem credible and there may not be enough time to analyse them. Hindsight bias is one of those features of human thinking that delivers results different from those we would get if we analysed the situation with the help of statistics. The idea of ‘hindsight bias’, which also inspired the name of our magazine, was formulated by two scientists in the 1970s – Amos Tversky and Daniel Kahneman.
A few months ago, Kahneman published another book [1] which I would like to spend some time discussing from the perspective of our current theme. In his new book Kahneman describes our thinking process as consisting of two systems – System 1 and System 2. System 1 thinks fast, is unconscious, intuitive and effort-free. System 2 thinks slow, is conscious and analytical. System 1 recognises patterns in a fraction of a second, and ‘automatically produces an adequate solution to the challenges’. System 2 is systematic but tires easily; therefore it usually accepts what the fast System 1 tells it.
These systems are not actually two distinct agents in our heads. Not really, says Kahneman. Rather, they are “useful fictions” – useful because they help explain the traits of the human mind.
One may think System 2 is in charge, but the reality is that System 1 is the boss most of the time. This is for good reason: System 1 is for the most part very good at what it does; it is very sensitive to subtle environmental cues and signs of danger. It kept our ancient ancestors alive. There is simply too much going on around us for System 2 to analyse everything in depth.
Another benefit of System 1 is the ‘expert intuition’ which comes from experience. Expert intuition can be learnt by prolonged exposure to situations that are “sufficiently regular to be predictable”, provided that quick feedback is given to the expert on whether he did the right or the wrong thing. This is how experts develop their unconscious “pattern recognition” mechanism to get the right answer quickly. A trained expert (Kahneman gives as an example a firefighter) can unconsciously, and almost immediately, produce the right response to complex emergencies.
All the marvels of System 1 come at a price: the high speed has to be paid for. System 1 works in an oversimplified world: it assumes WYSIATI (“what you see is all there is”) and it has no doubt whatsoever in its thinking process. System 1 is notoriously bad at the sort of statistical thinking often required for good decisions; it jumps easily to conclusions and is subject to various irrational biases, including the already mentioned hindsight bias. Speed is achieved at the expense of precision. System 1 is “quick and dirty”. We do not want our reaction in aviation emergency situations to be like this, do we?
But System 1 does well most of the time; it is because of System 1 that we have our (good) performance and intuitive expertise. Not relying on it would deny us all its benefits as well. Kahneman implies that knowing the fallacies of our behaviour will not help much to overcome them. It helps if more than one person is involved and they cooperate. Because it is easier to recognise someone else’s errors than our own, working in a trained team, with ongoing feedback mechanisms, is part of the ideal solution.
System 1 is always working, but the situations that happen to us occur with varying degrees of surprise for it. Similarly, the amount of time available for our System 2 to take over from System 1 and analyse the issue in depth before making a decision varies. I was thinking about how to map graphically the distinct situations created by the different combinations of surprise for System 1 and available time for System 2.
I have attempted to visually represent the diversity of these combinations above.
There are situations, represented in green, where either there is sufficient time for the crew to adopt knowledge-based strategies or the situation can be reasonably expected, such as wind shear encountered when approaching to land at an airport with significant convective weather in the vicinity. In this latter case the expectation can trigger a pre-briefing of the actions required if an actual encounter occurs, so that although the situation is sudden and there is no time for System 2 thinking, the strategy after the encounter is still knowledge-based. An example of such a team knowledge-based strategy is the Airbus A380 emergency landing in Singapore in 2010.
Other situations involve unforeseen or highly unlikely events but with sufficient time available for either personal or team System 2-type thinking. The available knowledge-based strategies are generic rather than specific.
Sometimes, I hope very rarely, a situation develops suddenly and is both unexpected and unknown. Then there is not much in the way of a preformed strategy available. What one needs to do is prevent, or reduce the likelihood of, such situations.
There are known but unexpected situations with sufficient time for personal reflection but not for the use of team resources. An example of this would be the Airbus A320 ditching in the Hudson River in 2009 after the loss of almost all engine thrust following a multiple-engine bird strike at low level.
Finally, there are those cases that combine unexpected but relatively frequent and known situations with sudden development and no time for reflection. If these cannot be prevented then the best strategy is to train for them extensively so that an optimum reaction becomes second nature and is more likely to be intuitively applied if needed. This is the famous rule for becoming an expert by spending 10,000 hours on training and practice. Take your time!
Intuitive reaction is not always bad; it helped us survive in the Darwinian sense. Flying modern aircraft and providing air traffic control to them, however, is less about reactions from the jungle and more about preparation. It is true that the design of aircraft and ATC systems should be human-centred, accommodating instinctive human reactions. But this assumes that someone knows everything about humans and their reactions and will successfully integrate this knowledge into the design of machines and procedures.
Obviously, this is not fully achievable and there will be situations that surprise us. If these situations have potentially dangerous outcomes, if what is at stake is an accident, then when confronted with emergencies one should be equipped to adopt the best available strategy, one which minimises ‘blink’ and maximises ‘think’. The challenge is how to train professionals to ‘think slow’, but faster."
Enjoy reading HindSight!
Editor's Notes:
- [1] “Thinking, Fast and Slow”, Daniel Kahneman, New York, 2011
HindSight 15 Articles
- Serious about Safety - by Joe Sultana
- Emergency and unusual situations in the air - by Carlos Artero
- Sudoku of teamwork - by Maciej Szczukowski
- This is a dangerous issue - by Professor Sidney Dekker
- Being prepared – for worse than ‘expected’! - by Captain Ed Pooley
- Interceptions of Civil Aircraft - Operation of SSR and ACAS II - by Richard “Sid” Lawrence
- Request for Support Message: Reduced Runway Length Operations during Construction/Work in Progress – ATIS and Radiotelephony Messages
- Case Study - The garden party - by Bengt Collin
- E.R. - by Alberto Iovino
- The small technical problem... - by Eileen Senger
- It all went quiet - by Harry Nelson
- Emergency and unusual situations – whose world view? - by Anne Isaac
- TCAS II version 7.1 has arrived - by Stanislaw Drozdowski
- 17 minutes - by John Barrass
- Beyond the outcome - by Bert Ruitenberg
- Fixed wing or helicopter?
Other editions of HindSight are available on SKYbrary.
Related Articles
Articles within SKYbrary which cover the topics discussed in HindSight 15 include:
- See the dedicated Category on SKYbrary: Aircraft Emergency and Unusual Situations
- Guidelines for Dealing with Unusual/Emergency Situations in ATC
- Emergency or Abnormal Situation
- Airborne Collision Avoidance System (ACAS)
- Fire in the Air
Related OGHFA Material
- Unexpected Events Training (OGHFA BN)
- Pilot Judgment and Expertise (OGHFA BN)
- Threat Management Training (OGHFA BN)