A Tragic Lesson in Decision-Making
It was a calm night over the Atlantic Ocean as Air France Flight 447 cruised smoothly toward Paris. The plane had left Rio de Janeiro hours earlier and was slicing through the cold, thin air at 35,000 feet. Inside the cabin, passengers dozed or flipped through magazines to the steady hum of the engines. In the cockpit, the atmosphere was just as serene.
All of a sudden, an alarm shattered the quiet. The autopilot had abruptly disengaged, leaving the startled pilots in manual control of the 200-ton aircraft. The alarm had been triggered by malfunctioning airspeed sensors (the aircraft's pitot probes had iced over), which caused the airspeed indicators to fail. The pilot instinctively pulled back on the control stick, attempting to climb and avoid the risk of overspeed. This decision, however, triggered further alarms signaling a loss of lift and altitude. Despite their desperate efforts, the pilots never understood why the aircraft wasn't responding as expected, nor how dire their situation had become. Tragically, four minutes later, Flight AF 447 plunged into the ocean, claiming the lives of all 228 people on board.
The crash of Flight AF 447 holds lessons for many topics (e.g., Crew Resource Management or appropriate training), but it is above all a profound lesson in decision-making. In the aftermath, investigators of the BEA (Bureau of Enquiry and Analysis for Civil Aviation Safety) identified a series of pilot errors as contributing factors [1]. Instead of leveling the aircraft and assessing the situation, the pilot flying did what felt instinctively right: he pulled back on the control stick. After all, climbing means safety, doesn't it? At high altitude, however, this maneuver made things worse, forcing the aircraft into a steep nose-up attitude that bled off airspeed and ultimately stalled it. Why did the pilot rely on instinct rather than training? Why did he act without first gathering critical information? To explore these questions, we turn to the psychology of decision-making.
Decision-Making and the Dual-Process Theory
One theoretical framework for explaining human decision-making is dual-process theory, popularized by Daniel Kahneman and rooted in the research on heuristics and biases he first published with Amos Tversky in 1974 [4]. It distinguishes two systems of thought:
- System 1: Fast and intuitive, but prone to error because it reviews little information.
- System 2: Slow and analytical, capable of processing more complex information, but resource-intensive for exactly that reason.
Put simply, you can either take "shortcuts" when making decisions or reason them through deliberately. System 1 often takes precedence because in most cases it is efficient and requires less cognitive effort. It is essential in situations where time is limited, risks are low, or decisions seem straightforward [3].
In their studies, Tversky and Kahneman confronted test subjects with situations in which they had to estimate probabilities without sufficient information. They found that human decision-making often relies on heuristics, or "rules of thumb": unless we force ourselves to evaluate, compute, and then decide, the brain reduces the complex task of assessing probabilities and predicting outcomes to simpler judgment operations [4]. This phenomenon is universal, affecting laypeople and statistically trained, rational-minded experts alike. While heuristics are effective in many contexts, they can be perilous in others, especially when they are intertwined with biases. Blind reliance on System 1 can lead to dramatic errors in high-stakes scenarios. In aviation, such shortcuts can turn deadly.
Human Factors in Flight AF 447
In its final accident report, the BEA concluded, among other contributing factors, that the pilots were completely surprised by the warning and did not comprehend the situation. They failed to gather important information and overlooked other cues (e.g., the stall warning). The report also cites "their poor management of the startle effect that generated a highly charged emotional factor for the two pilots" [1]. In plain terms: the pilots did not have sufficient information. Emotion and instinct took over, and fatal decisions were made. The pilot acted on what felt right, pulling back on the control stick to climb and disregarding all contrary information. All it would have taken was an informed decision followed by a deliberate push forward on the control stick: a calculated step to let the plane regain speed. But that moment of clarity never came.
Lessons from a Tragedy
Flight AF 447's tragic end reminds us that intuition isn't always reliable in complex or high-pressure situations, especially when emotions run high. Effective decision-making sometimes requires acting against instinct, overriding heuristics and impulses with deliberate, informed choices. In the end, you have to remind yourself: "high time is no time for deciding" [2].
It underscores an essential principle of decision-making: gathering, preparing, and weighing information matter as much as the ability to manage cognitive and emotional challenges. This tragedy challenges us to examine how we can improve human decision-making under pressure in critical situations, ensuring that intuition serves as a tool and not a trap. Part II will dive into how the aviation industry handles these challenges, exploring how defined procedures enable pilots to enhance situation awareness and prevent instinctive, erroneous decision-making. In Part III you will see how these procedures not only ensure an aircraft's safety, but can also aid your own decision-making in critical situations within your working environment.
References
[1] BEA (Bureau d'enquêtes et d'analyses pour la sécurité de l'aviation civile). (2012). Final report: On the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP, Air France flight AF 447 Rio de Janeiro - Paris. Paris. https://aaiu.ie/foreign_reports_fr/final-report-accident-to-airbus-a330-203-registered-f-gzcp-air-france-af-447-rio-de-janeiro-paris-1st-june-2009
[2] Duran Duran. (1983). The Reflex. https://www.youtube.com/watch?v=J5ebkj9x5Ko
[3] Kahneman, D. (2013). Thinking, fast and slow (1st paperback ed.). New York: Farrar, Straus and Giroux.
[4] Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science (New York, N.Y.), 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124
