Continuation Bias

From SKYbrary Wiki


Plan Continuation Bias

Article Information
Category: Human Behaviour
Content source: SKYbrary
Content control: SKYbrary

Definition

(Plan) Continuation Bias is the unconscious cognitive tendency to continue with an original plan in spite of changing conditions.

Discussion

The following explanation of continuation bias is derived from a Transportation Safety Board of Canada accident report.

To make decisions effectively, a pilot or controller needs an accurate understanding of the situation, an appreciation of its implications, the ability to formulate a plan and contingencies, and the capacity to implement the best course of action. Equally important is the ability to recognize changes in the situation and to reinitiate the decision-making process so that changes are accounted for and plans modified accordingly. If the potential implications of the situation are not adequately considered during the decision-making process, there is an increased risk that the decision and its associated action will result in an adverse outcome that leads to an undesired aircraft state.

A number of different factors can adversely affect a pilot's decision-making process. For example, increased workload can degrade a pilot's ability to perceive and evaluate cues from the environment and may result in attentional narrowing. In many cases, this attentional narrowing can lead to Confirmation Bias, which causes people to seek out cues that support the desired course of action, to the possible exclusion of critical cues that may support an alternate, less desirable hypothesis. The danger this presents is that potentially serious outcomes may not be given the appropriate level of consideration when attempting to determine the best possible course of action.

One specific form of confirmation bias is (plan) continuation bias, or plan continuation error. Once a plan is made and committed to, it becomes increasingly difficult for stimuli or conditions in the environment to be recognized as necessitating a change to the plan. Often, as workload increases, the stimuli or conditions will appear obvious to people external to the situation; however, it can be very difficult for a pilot caught up in the plan to recognize the saliency of the cues and the need to alter the plan.

When continuation bias interferes with the pilot's ability to detect important cues, or if the pilot fails to recognize the implications of those cues, breakdowns in situational awareness (SA) occur. These breakdowns in SA can result in non-optimal decisions being made, which could compromise safety.

In a review by the U.S. National Aeronautics and Space Administration (NASA) Ames Research Center of 37 accidents investigated by the National Transportation Safety Board, it was determined that almost 75% of the tactical decision errors involved were related to decisions to continue with the original plan of action despite the presence of cues suggesting an alternative course of action. Dekker (2006) suggests that continuation bias occurs when the cues used to formulate the initial plan are considered very strong. For example, if the plan seemed sound based on the information available at the time, subsequent cues indicating otherwise may not be given equal weight in the decision-making process.

It is therefore important to recognize that continuation bias can occur, and for pilots to remain cognizant of the risks of failing to analyse changes in the situation carefully and to consider the implications of those changes, in order to determine whether a revised course of action is more appropriate. As workload increases, particularly in a single-pilot scenario, less and less mental capacity is available to process these changes and to consider the potential impact they may have on the original plan.

Accidents and Incidents

SKYbrary includes the following reports relating to events where continuation bias was considered to be a factor:

  • WW24, vicinity Norfolk Island South Pacific, 2009 (On 18 November 2009, an IAI Westwind on a medevac mission failed to make a planned night landing at Norfolk Island in unanticipated adverse weather and was intentionally ditched offshore because of insufficient fuel to reach the nearest alternate. The fuselage broke in two on water contact but all six occupants escaped from the rapidly sinking wreckage and were eventually rescued. The Investigation initially completed in 2012 was reopened after concerns about its conduct and a new Final Report in 2017 confirmed that the direct cause was flawed crew decision-making but also highlighted ineffective regulatory oversight and inadequate Operator procedures.)
  • DHC6, Jomson Nepal, 2013 (On 16 May 2013, a DHC6-300 on a domestic passenger flight made a tailwind touchdown at excessive speed on the 740 metre-long runway in the direction opposite to the notified direction in use and, after departing the side of the runway during deceleration, re-entered the runway and attempted to take off. This failed and the aircraft breached the perimeter fence and fell into a river. The Investigation identified inappropriate actions of the aircraft commander in respect of both the initial landing and his response to the subsequent runway excursion and also cited the absence of effective CRM.)
  • A306, vicinity Birmingham AL USA, 2013 (On 14 August 2013, a UPS Airbus A300-600 crashed short of the runway at Birmingham Alabama on a night IMC non-precision approach after the crew failed to go around at 1000 ft aal when unstabilised and then continued descent below MDA until terrain impact. The Investigation attributed the accident to the individually poor performance of both pilots, to performance deficiencies previously exhibited by the Captain in recurrent training, and to the First Officer's failure to call in fatigued and unfit to fly after mismanaging her off-duty time. A video was produced by the NTSB to further highlight human factors aspects.)
  • B738, Katowice Poland, 2007 (On 28 October 2007, a Boeing 737-800 under the command of a Training Captain occupying the supernumerary crew seat touched down off an ILS Cat 1 approach 870 metres short of the runway at Katowice in fog at night with the AP still engaged. The somewhat protracted investigation did not lead to a Final Report until over 10 years later. This attributed the accident to crew failure to discontinue an obviously unstable approach and it being flown with RVR below the applicable minima. The fact that the commander was not seated at the controls was noted with concern.)
  • A320, vicinity Sochi Russia, 2006 (On 3 May 2006, an Airbus A320 crew failed to correctly fly a night IMC go around at Sochi and the aircraft crashed into the sea and was destroyed. The Investigation found that the crew failed to reconfigure the aircraft for the go around and, after having difficulties with the performance of an auto go-around, had disconnected the autopilot. Inappropriate control inputs, including simultaneous (summed) sidestick inputs by both pilots, were followed by an EGPWS PULL UP Warning. There was no recovery and, about a minute into the go around, a steep descent into the sea at 285 knots occurred.)



Further Reading

  • The “Barn Door” Effect by C. West, Ph.D., NOAA - a paper about pilots’ propensity to continue approaches to land when closer to convective weather than they would wish to get while en route.