Caught Up in Circles and Confusion - It Is Nothing New

How to Overcome Instinctive Decision-Making

The Dual-Process Theory: Part II

In the first part of this series on the Dual-Process Theory, we examined the crash of Air France Flight 447, which highlights the dangers of instinctive decision-making in high-pressure scenarios. Startled by the autopilot's disengagement and the alarms, the pilot reacted emotionally and pulled back on the control stick to climb, a decision that caused a fatal stall and the loss of 228 lives [1]. This tragedy illustrates the risks of relying on System 1, the fast, intuitive decision-making process described in Tversky and Kahneman's Dual-Process Theory. While System 1 is efficient, it is prone to errors, especially under stress. Engaging System 2, the slower, analytical process, is crucial in critical moments to avoid catastrophic outcomes [3] [6].

The accident teaches a vital lesson: effective decision-making demands balancing intuition with informed, deliberate choices, especially in high-stakes situations. Ultimately, it serves as a powerful reminder that high time is no time for deciding, emphasizing the need for preparation, situational awareness, and emotional control to prevent such tragedies. In this article, we will explore what the pilots should have done, examine the established procedures that could have prevented this tragedy, and set the stage for Part III, where we discuss how such procedures can be applied to other occupational sectors. To transfer, define, and apply rules and procedures that improve decision-making in other working environments, you first need a profound understanding of how guidelines work in aviation. So, let us take a look:

Lessons from Aviation: Learning from Tragedies

Aviation prioritizes safety above all else, and every accident is meticulously analyzed to prevent future tragedies. Most countries have investigative bodies, such as the BEA in France, the NTSB in the USA, or the BFU in Germany. These entities investigate aviation accidents and incidents to uncover root causes, from human error to mechanical failure. Their investigations often result in recommendations for implementing specific rules and procedures designed to address identified vulnerabilities. By continuously refining these guidelines, aviation has become one of the safest modes of transportation, demonstrating the immense value of thorough preparation and structured decision-making.

In the case of AF 447, these guidelines were already in place, and the tragedy could have been avoided had the pilots adhered to the established procedures. According to the BEA report, after the autopilot disengaged, the first responsibility of the pilot in control of the aircraft was to stabilize the flight path, maintain manual control, and ensure the aircraft's safety. The second pilot should have focused on monitoring the instruments and assessing the situation, identifying that the airspeed readings were no longer valid. This diagnosis would have led to the critical call to execute the Unreliable Airspeed Indication procedure (unreliable IAS procedure), which would have prevented rash and instinctive decision-making.

The Unreliable Airspeed Indication Procedure

This procedure provides a structured response for situations involving conflicting or unreliable airspeed data. Pilots are guided to the following steps (an illustrative sketch of the checklist structure follows the list):

  • Maintain aircraft control: Avoid abrupt control inputs, disconnect the automation, and revert to safe default parameters (pitch and thrust settings).
  • Consult manuals: Refer to the Flight Crew Training Manual (FCTM) or follow the checklist in the Quick Reference Handbook (QRH); these steps are crucial when encountering abnormal conditions (e.g. when sensors no longer provide valid measurements).
  • Confirm and troubleshoot: Identify malfunctioning systems.
  • Ensure crew cooperation and a clear distribution of roles: Assign clear roles and communicate actions to maintain situational awareness; everybody should know what to do and communicate what they are doing.
  • Declare an emergency if needed: Notify Air Traffic Control (ATC) and prepare for a diversion if necessary.
  • Land safely: Use visual references, alternative data, or manual techniques to ensure a safe landing [4] [5].
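To make this structure tangible, especially for readers outside aviation, here is a minimal Python sketch that encodes such a procedure as an explicit checklist with assigned roles and a visible completion state. It is purely illustrative: the class names, step wording, and role labels are assumptions for demonstration and do not reproduce the actual QRH checklist.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Purely illustrative sketch: class names, step wording, and role labels are
# assumptions for demonstration, not the actual QRH checklist.

@dataclass
class ChecklistStep:
    action: str         # what has to be done
    assigned_to: str    # who is responsible (clear distribution of roles)
    done: bool = False  # completion state, announced to the whole crew


@dataclass
class Procedure:
    name: str
    steps: List[ChecklistStep] = field(default_factory=list)

    def next_step(self) -> Optional[ChecklistStep]:
        """Return the first open step, enforcing a strict step-by-step order."""
        return next((step for step in self.steps if not step.done), None)

    def complete(self, action: str) -> None:
        """Mark a step as done only after it has been called out and confirmed."""
        for step in self.steps:
            if step.action == action:
                step.done = True
                print(f"{step.assigned_to}: '{action}' completed and announced.")
                return
        raise ValueError(f"'{action}' is not part of the procedure.")


unreliable_ias = Procedure(
    name="Unreliable Airspeed Indication (illustrative)",
    steps=[
        ChecklistStep("Maintain aircraft control: safe pitch and thrust defaults", "Pilot Flying"),
        ChecklistStep("Consult the FCTM / QRH checklist", "Pilot Monitoring"),
        ChecklistStep("Confirm and troubleshoot malfunctioning systems", "Pilot Monitoring"),
        ChecklistStep("Declare an emergency to ATC if needed", "Pilot Monitoring"),
        ChecklistStep("Land safely using visual references or alternative data", "Pilot Flying"),
    ],
)

# Work through the procedure one step at a time instead of reacting instinctively.
while (step := unreliable_ias.next_step()) is not None:
    unreliable_ias.complete(step.action)
```

The point is not the code itself but the structure it makes explicit: every step has an owner, a fixed order, and an announced completion state, which is exactly what keeps the slower, analytical System 2 engaged under pressure.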

Calling the unreliable IAS procedure could have saved the aircraft and everyone on board by providing a structured response. Executing it would have helped the AF 447 crew avoid instinctive errors, ensuring composure, teamwork, and a systematic approach, and it would have prevented the series of misjudgments that led to the stall and crash. These guidelines promote situational awareness, which is crucial in stressful events. Instead of reacting instinctively, the pilots would have analyzed the situation in a structured manner, avoiding emotional and excessive reactions. The procedure's emphasis on composure, teamwork, and systematic action demonstrates how aviation protocols are designed to prevent human error and ensure safety even under extreme pressure.

From Aviation to Broader Applications

The benefits of the unreliable IAS procedure extend far beyond aviation and provide a blueprint for improving decision-making in other occupational sectors. In fields like medicine, business, or IT, confusion during high-pressure situations is nothing new. Without structured protocols, teams can easily find themselves caught up in circles, revisiting the same issues without resolution and acting instinctively [2]. Having structured protocols in place can prevent reactive decision-making under stress. Such procedures foster situational awareness, allowing teams to focus on the most critical issues without becoming overwhelmed by conflicting information. By emphasizing composure, a step-by-step approach, and collaboration with a clear understanding of roles and responsibilities, these frameworks reduce the likelihood of errors caused by stress or miscommunication. Whether it is diagnosing a medical emergency, managing a corporate crisis, or responding to a cybersecurity threat, adopting these principles can lead to more effective and confident decision-making.

Keep reading in the next article, where we'll explore how to transform aviation's structured rules into universally applicable guidelines for navigating uncertainty and achieving success in diverse fields.

References

[1] BEA (Bureau d'enquêtes et d'analyses pour la sécurité de l'aviation civile). (2011). Aircraft accident report on the accident on 1st June 2009 to the Airbus A330-203. Paris. https://aaiu.ie/foreign_reports_fr/final-report-accident-to-airbus-a330-203-registered-f-gzcp-air-france-af-447-rio-de-janeiro-paris-1st-june-2009/

[2] Cyndi Lauper. (1983). Time After Time. https://www.youtube.com/watch?v=VdQY7BusJNU

[3] Kahneman, D. (2013). Thinking, fast and slow (first paperback edition). New York: Farrar, Straus and Giroux.

[4] Skybrary. Unreliable Airspeed Indications. Retrieved 2 January 2025, from https://skybrary.aero/articles/unreliable-airspeed-indications

[5] Skybrary. Quick Reference Handbook (QRH). Retrieved 2 January 2025, from https://skybrary.aero/articles/quick-reference-handbook-qrh

[6] Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124