Complacency

Use of the term complacency in aviation safety is sometimes disputed.

Definition

A state of self-satisfaction with one's own performance coupled with an unawareness of danger, trouble, or controversy.

Description

Critics of the term “complacency” often point to the lack of a precise definition. They argue that the concept has not yet been adequately conceptualized, and that any use of the term merely contributes to an illusion of understanding of what causes risk.

The use and definition of complacency is often discussed in terms of Folk Modelling. Folk models share the following characteristics:

  • Folk models substitute one big term for another instead of defining the big term by breaking it down into smaller, more specific ones (we call this decomposition, or deconstruction). So instead of explaining human error, you simply say “complacency”, but you still have not explained anything.
  • Folk models are difficult to prove wrong, because they do not have a definition in terms of smaller components that are observable in people's real behaviour. Folk models may seem glib; they appeal to popular (supposed) understandings of difficult phenomena.
  • Folk models easily lead to overgeneralization. Before you know it, you may see “complacency” and “loss of situation awareness” everywhere. This is possible because the concepts are so ill-defined. You are not bound to particular definitions, so you may interpret the concepts any way you like.

Critics also point to the variety of “substitution” definitions in the literature, where another label is simply used in place of complacency. Here is what a good sample of the literature has equated complacency with:

  • Overconfidence
  • Self-satisfaction
  • Trait that can lead to a reduced awareness of danger
  • State of confidence plus contentment
  • Low index of suspicion
  • Unjustified assumption of satisfactory system state
  • Loss of situation awareness, and unpreparedness to react in a timely manner when the system fails

For example, self-satisfaction takes the place of complacency and is assumed to speak for itself. Looking for “self-satisfaction” in a controller’s behaviour is not any better or more convincing than looking for “complacency”. There is no explanation (or breakdown) of a psychological mechanism that makes self-satisfaction emerge, and which in turn produces a lack of Vigilance in ATM.

It can be argued that if the literature cannot provide ways to define and identify the phenomenon in question, then it is easy to claim its existence in real situations but very hard to demonstrate it rigorously. To be complacent, some argue, an observer must be shown to sample a variable less often than is optimal, given the dynamics of what is going on in the system at that time. But it is very difficult to rigorously define the optimal sampling rate in supervisory monitoring, and the same difficulty arises when testing such a claim against any particular situation in ATC.
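As an illustration of why an “optimal” rate is so hard to pin down (drawn from classical monitoring research rather than from this article), one well-known approach, Senders' instrument-sampling model, ties the required sampling frequency to the bandwidth of the monitored signal via the Nyquist rate:

$f_{\text{sample}} \ge 2W$

where $W$ is the effective bandwidth (in Hz) of the variable being monitored. Applying such a criterion in ATC would require knowing the bandwidth of every relevant variable at every moment, which is rarely possible; this is exactly the difficulty the critics point to.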

Another criticism is that we cannot claim that somebody was complacent simply because he or she missed a piece of data that we, in hindsight, find important. Complacency, after all, is about under-sampling or defective monitoring (which is impossible to establish because the optimal sampling rate cannot be defined). It is not about whether people detected signals. Detectability is a function of the signal-to-noise ratio and of somebody’s response criterion (i.e. when do I have enough evidence to act), not of sampling strategy.
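The distinction being drawn here is the standard one from signal detection theory. As an illustrative sketch (the usual equal-variance Gaussian formulation, not taken from this article), sensitivity and response criterion can be written as

$d' = z(H) - z(F), \qquad c = -\tfrac{1}{2}\big[z(H) + z(F)\big]$

where $H$ is the hit rate, $F$ the false-alarm rate and $z$ the inverse of the standard normal distribution function. $d'$ captures the signal-to-noise separation and $c$ captures how much evidence the observer requires before acting; neither quantity says anything about how often the display was sampled.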
