Published in December 2011, HindSight 14 focuses on Training for Safety.
Training is necessary but we need to learn
by Tzvetomir Blajev
"We have a proverb in my home country that can be approximately translated into English as “one learns while one lives”. I have always thought that in aviation it is actually the other way round…
How do we learn in aviation from our experience and from the experience of others? And, do we really learn the lessons? Let me highlight a unique event.
On 17 January 2008, a Boeing 777 crash-landed short of the runway at London Heathrow after a loss of engine power on short final. The reduction of thrust was the result of ice restricting the fuel feed system, which led to a loss of airspeed and the aircraft touching down 330 m short of the paved surface of Runway 27L. The investigation identified that the restriction occurred at the Fuel Oil Heat Exchanger. Ice had formed within the fuel system, from water that occurred naturally in the fuel, while the aircraft operated with low fuel flows over a long period and localised fuel temperatures were in an area described as the ‘sticky range’. The Fuel Oil Heat Exchanger, although compliant with the applicable certification requirements, was shown to be susceptible to restriction when presented with soft ice in high concentration, at a fuel temperature below -10°C and a fuel flow above flight idle.
What makes this event unique, in my opinion, is the following “probable causal factor” included in the Final Report of the Investigation carried out by the UK Air Accidents Investigation Branch and published on 9 February 2010:
“Certification requirements, with which the aircraft and engine fuel systems had to comply, did not take account of this phenomenon as the risk was unrecognised at that time.”
In other words, the aviation industry had not fully understood the behaviour of an aircraft fuel system under conditions of prolonged low fuel flow in a particular fuel temperature range. Before the accident, we were not aware of such a potential scenario; this event was the first from which the lesson could be learned. Even supposing that an ideal mechanism existed in aviation to immediately spread everyday lessons learnt, we could not have known about this, because it had not happened to anyone before.
How many accidents happen to us for which we, collectively, do not know the reasons and the available mitigations beforehand? I am always impressed by the excellent aviation accident reviews of Jim Burin, Director of Technical Programs at the Flight Safety Foundation (FSF). Jim presents a summary of accidents from the previous year during the FSF annual seminars, examining the “big killer” types and scenarios, their causal factors, distributions by flight phase and other interesting aspects.
What surprises me is one fact, often emphasised by Jim: each year we see the same old accident types; the same scenarios keep repeating themselves, with the same, or at least very similar, combinations of factors, for which we, as an aviation industry, have long had reliable prevention strategies. By a reliable strategy here I do not mean telling people to “just try harder” or to “maintain better situational awareness next time”. No, what I mean are things like the Enhanced Ground Proximity Warning System, which has proved to be a very reliable mitigation for Controlled Flight Into Terrain. One of the few exceptions in the last decade, where we, collectively, did not know enough to be able to prevent the accident, was the above-mentioned accident at London Heathrow.
So, why do we not learn the lessons? There are, I think, two major explanations. The first is that the knowledge, no matter how smart we are individually, does not belong to a single person. As Friedrich Hayek puts it in “The Use of Knowledge in Society”: “…the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. Or, to put it briefly, it is a problem of the utilization of knowledge which is not given to anyone in its totality.”
If accidents happen and we, collectively, know why they happen, then we need to find mechanisms to make the collective awareness available and accessible to individuals. The second explanation for why we repeat our mistakes is that it is not enough just to know. What is needed is the transfer of knowledge into daily practice, to shape the behaviour of individuals and organisations, to implement what has been learnt.
Where is the theme of training in all this? Well, I believe that training is needed to address both major reasons for accident repetition. Training helps to consolidate knowledge and to establish reliable professional behaviour. What is even more important is that training is just a part of another process, part of the bigger picture of learning. Having effective team briefings, sharing common explanations for risks, social networking and even “camp fire storytelling” are all parts of a healthy learning culture.
I remember my first years as a controller, when sometimes a more experienced colleague of mine would ask me to sit to one side and would tell me, in a kindly way, “a similar story that took place years ago” or “how things can be done even better”. This informal and intuitive coaching helped us a lot and was, I now realise, a big part of our learning.
The recurrence of similar accidents tells us that there is a lot more learning we need to do – collectively, to consolidate the knowledge, and individually, to apply it in our daily work.
Enjoy reading HindSight!"