How to Overcome Instinctive Decision-Making
The Dual-Process Theory — Part II
In the first part on the Dual-Process Theory, we delved into the crash of Air France Flight 447, which highlights the dangers of instinctive decision-making in high-pressure scenarios. Startled by the autopilot’s disengagement and the alarms, the pilot reacted emotionally, pulling back on the control stick to climb — a decision that caused a fatal stall and the loss of 228 lives [1]. This tragedy illustrates the risks of relying on System 1, the fast, intuitive decision-making process described in Tversky and Kahneman’s Dual-Process Theory. While System 1 is efficient, it is prone to errors, especially under stress. In critical moments, engaging System 2, the slower, analytical process, is crucial to avoid catastrophic outcomes [3] [6].
The accident teaches a vital lesson: effective decision-making demands balancing intuition with informed, deliberate choices, especially in high-stakes situations. Ultimately, it serves as a powerful reminder that high time is no time for deciding, and it underscores the need for preparation, situational awareness, and emotional control to prevent such tragedies. In this article, we will explore what the pilots should have done, examine the established procedures that could have prevented the tragedy, and set the stage for Part III, where we discuss how such procedures can be applied to other occupational sectors. To transfer, define, and apply rules and procedures that improve decision-making in other working environments, you first need a profound understanding of how guidelines work in aviation. So, let us take a look:
Lessons from Aviation: Learning from Tragedies
Aviation prioritizes safety above all else, and every accident is meticulously analyzed to prevent future tragedies. Most countries have investigative bodies, such as the BEA in France, the NTSB in the USA, or the BFU in Germany. These bodies investigate aviation accidents and incidents to uncover root causes, from human error to mechanical failure. Their investigations often result in recommendations for implementing specific rules and procedures designed to address identified vulnerabilities. By continuously refining these guidelines, aviation has become one of the safest modes of transportation, demonstrating the immense value of thorough preparation and structured decision-making.
In the case of AF 447, these guidelines were already in place. The tragedy could have been avoided had the pilots adhered to the established procedures. According to the BEA report, after the autopilot disengaged, the first responsibility of the pilot in control of the aircraft was to stabilize the flight path, maintain manual control, and ensure the aircraft’s safety. The second pilot should have focused on monitoring the instruments and assessing the situation, identifying that the airspeed readings were no longer valid. That diagnosis would have led to the critical call to execute the Unreliable Airspeed Indication procedure (the Unreliable IAS procedure), which would have prevented rash, instinctive decision-making.
The Unreliable Airspeed Indication Procedure
This procedure provides a structured response for situations involving conflicting or unreliable airspeed data. Pilots are guided to:
- Maintain aircraft control: Avoid abrupt control inputs, disconnect automation, and revert to safe default parameters (pitch and thrust settings).
- Consult manuals: Refer to the Flight Crew Training Manual (FCTM) or follow the checklist in the Quick Reference Handbook (QRH). These steps are crucial when encountering abnormal conditions (e.g., when sensors no longer provide valid measurements).
- Confirm and troubleshoot: Identify malfunctioning systems.
- Ensure crew cooperation and a clear distribution of roles: Assign roles explicitly and communicate actions to maintain situational awareness; everybody should know what to do and say what they are doing.
- Declare an emergency if needed: Notify Air Traffic Control (ATC) and prepare for diversion if necessary.
- Land safely: Use visual references, alternative data, or manual techniques to ensure a safe landing [4] [5].
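To make the structure of such a checklist concrete for readers with a software background, the steps above can be sketched as a small program. This is only an illustration of the idea of forced, sequential, role-aware action: the step names, pitch and thrust values, and state fields below are invented placeholders for this sketch, not actual Airbus procedure data.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Step:
    """One item of a structured emergency checklist."""
    name: str
    action: Callable[[Dict], None]

def run_checklist(steps: List[Step], state: Dict) -> List[str]:
    """Execute the steps strictly in order and record each one,
    mirroring how a QRH checklist forces deliberate, sequential
    action instead of improvised reactions."""
    completed = []
    for step in steps:
        step.action(state)          # carry out the step
        completed.append(step.name)  # announce/log what was done
    return completed

# Illustrative steps loosely following the list above
# (names and values are our own, not real procedure data).
unreliable_ias = [
    Step("maintain_control", lambda s: s.update(pitch_deg=5, thrust_pct=85)),
    Step("consult_qrh", lambda s: s.update(checklist="UNRELIABLE SPEED INDICATION")),
    Step("troubleshoot", lambda s: s.update(suspected_fault="pitot probes")),
    Step("assign_roles", lambda s: s.update(pilot_flying="fly the aircraft",
                                            pilot_monitoring="monitor and diagnose")),
    Step("notify_atc", lambda s: s.update(emergency_declared=True)),
    Step("plan_landing", lambda s: s.update(landing_plan="visual references, manual techniques")),
]

state: Dict = {}
completed = run_checklist(unreliable_ias, state)
print(completed)
```

The point of the sketch is the shape, not the content: the procedure is data, the execution order is fixed, and every action leaves a visible trace — exactly the properties that keep a stressed crew out of improvised, System-1 territory.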
Calling the Unreliable IAS procedure could have saved the pilots and the plane by providing a structured response. It would have helped the AF 447 crew avoid instinctive errors and prevented the series of misjudgments that led to the stall and crash. Such guidelines promote situational awareness, which is crucial in stressful events: instead of reacting instinctively, the pilots would have analyzed the situation in a structured manner, avoiding emotional and excessive reactions. The procedure’s emphasis on composure, teamwork, and systematic action demonstrates how aviation protocols are designed to prevent human error and ensure safety even under extreme pressure.
From Aviation to Broader Applications
The benefits of the Unreliable IAS procedure extend far beyond aviation and provide a blueprint for improving decision-making processes in other occupational sectors. In fields like medicine, business, or IT, confusion during high-pressure situations is nothing new. Without structured protocols, teams can easily find themselves going in circles, revisiting the same issues without resolution and acting instinctively [2]. Having structured protocols in place prevents reactive decision-making under stress. Such procedures foster situational awareness, allowing teams to focus on the most critical issues without becoming overwhelmed by conflicting information. By emphasizing composure, a step-by-step approach, and collaboration with a clear understanding of roles and responsibilities, these frameworks reduce the likelihood of errors caused by stress or miscommunication. Whether it is diagnosing a medical emergency, managing a corporate crisis, or responding to a cybersecurity threat: adopting these principles leads to more effective and confident decision-making.
Keep reading the next article, where we’ll explore how to transform aviation’s structured rules into universally applicable guidelines for navigating uncertainty and achieving success in diverse fields.
References
[1] BEA — Bureau d’enquêtes et d’analyses pour la sécurité de l’aviation civile. (2011). Aircraft accident report: On the accident on 1st June 2009 on the Airbus A330-203. Paris. https://aaiu.ie/foreign_reports_fr/final-report-accident-to-airbus-a330-203-registered-f-gzcp-air-france-af-447-rio-de-janeiro-paris-1st-june-2009/
[2] Cyndi Lauper. (1983). Time After Time. https://www.youtube.com/watch?v=VdQY7BusJNU
[3] Kahneman, D. (2013). Thinking, fast and slow. First paperback edition. New York, Farrar, Straus and Giroux.
[4] Skybrary. Unreliable Airspeed Indications. Retrieved 2025.01.02. From: https://skybrary.aero/articles/unreliable-airspeed-indications
[5] Skybrary. Quick Reference Handbook (QRH). Retrieved 2025.01.02. From: https://skybrary.aero/articles/quick-reference-handbook-qrh
[6] Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science (New York, N.Y.), 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124
