Caught up in Circles and Confusion – It Is Nothing New

How to Overcome Instinctive Decision-Making

The Dual-Process Theory – Part II

In the first part of this series on the Dual-Process Theory, we delved into the crash of Air France Flight 447, which highlights the dangers of instinctive decision-making in high-pressure scenarios. Startled by the autopilot's disengagement and the alarms, the pilot reacted emotionally, pulling back on the control stick to climb – a decision that caused a fatal stall and the loss of 228 lives [1]. This tragedy illustrates the risks of relying on System 1, the fast, intuitive decision-making process described in Tversky and Kahneman's Dual-Process Theory. While System 1 is efficient, it is prone to errors, especially under stress. Engaging System 2, the slower, analytical process, is crucial in critical moments to avoid catastrophic outcomes [3] [6].

The accident teaches a vital lesson: effective decision-making demands balancing intuition with informed, deliberate choices, especially in high-stakes situations. Ultimately, it serves as a powerful reminder that high time is no time for deciding, emphasizing the need for preparation, situational awareness, and emotional control to prevent such tragedies. In this article, we will explore what the pilots should have done, examine the established procedures that could have prevented this tragedy, and set the stage for Part III, where we discuss how such procedures can be applied to other occupational sectors. To transfer, define, and apply rules and procedures that improve decision-making in other working environments, you first need a profound understanding of how guidelines work in aviation. So, let us take a look:

Lessons from Aviation: Learning from Tragedies

Aviation prioritizes safety above all else, and every accident is meticulously analyzed to prevent future tragedies. Most countries have investigative bodies, such as the BEA in France, the NTSB in the USA, or the BFU in Germany. These bodies investigate aviation accidents and incidents to uncover root causes, from human error to mechanical failure. Their investigations often result in recommendations for implementing specific rules and procedures designed to address identified vulnerabilities. By continuously refining these guidelines, aviation has become one of the safest modes of transportation, demonstrating the immense value of thorough preparation and structured decision-making.

In the case of AF 447, these guidelines were already in place. The tragedy could have been avoided had the pilots adhered to the established procedures. According to the BEA report, after the autopilot disengaged, the first responsibility of the pilot in control of the aircraft was to stabilize the flight path, maintain manual control, and ensure the aircraft's safety. The second pilot should have focused on monitoring the instruments and assessing the situation, identifying that the airspeed readings were no longer valid. This diagnosis would have led to the critical call to execute the Unreliable Airspeed Indication Procedure (unreliable IAS procedure), which would have prevented rash and instinctive decision-making.

The Unreliable Airspeed Indication Procedure

This procedure provides a structured response for situations involving conflicting or unreliable airspeed data. Pilots are guided to:

  • Maintain aircraft control: Avoid abrupt control inputs, disconnect automation, and revert to safe default parameters (pitch and thrust settings).
  • Consult manuals: Refer to the Flight Crew Training Manual (FCTM) or follow the checklist in the Quick Reference Handbook (QRH). These steps are crucial when encountering abnormal conditions (e.g. when sensors no longer provide valid measurements).
  • Confirm and troubleshoot: Identify malfunctioning systems.
  • Ensure crew cooperation and a clear distribution of roles: Assign roles and communicate actions to maintain situational awareness – everybody should know what to do and say what they are doing.
  • Declare an emergency if needed: Notify Air Traffic Control (ATC) and prepare for diversion if necessary.
  • Land safely: Use visual references, alternative data, or manual techniques to ensure a safe landing [4] [5].

Calling the Unreliable IAS procedure could have saved the pilots and the plane by providing a structured response. Executing it would have helped the AF 447 crew avoid instinctive errors and prevented the series of misjudgments that led to the stall and crash. These guidelines promote situational awareness, which is crucial in stressful events: instead of reacting instinctively, the pilots would have analyzed the situation in a structured manner, avoiding emotional and excessive reactions. The procedure's emphasis on composure, teamwork, and systematic action demonstrates how aviation protocols are designed to prevent human error and ensure safety even under extreme pressure.
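For readers from software-heavy fields, the core idea can be made concrete in a few lines of code. Below is a minimal, illustrative sketch of how such a checklist could be encoded and walked through step by step. The step texts merely paraphrase the list above and do not quote any real QRH, and the names ChecklistStep and run_checklist are hypothetical, chosen only for this example.

```python
from dataclasses import dataclass

@dataclass
class ChecklistStep:
    """One item of a structured procedure: what to do and what to verify."""
    action: str
    verify: str

# Illustrative paraphrase of the unreliable-airspeed steps listed above,
# not the wording of any real Quick Reference Handbook.
UNRELIABLE_AIRSPEED_CHECKLIST = [
    ChecklistStep("Maintain aircraft control",
                  "Automation disconnected, pitch and thrust at safe default values"),
    ChecklistStep("Consult the FCTM / QRH",
                  "Correct checklist located and read aloud"),
    ChecklistStep("Confirm and troubleshoot",
                  "Malfunctioning systems identified"),
    ChecklistStep("Distribute roles",
                  "Pilot flying and pilot monitoring assigned, actions called out"),
    ChecklistStep("Declare an emergency if needed",
                  "ATC notified, diversion prepared if necessary"),
    ChecklistStep("Land safely",
                  "Visual references or alternative data selected"),
]

def run_checklist(steps: list[ChecklistStep]) -> None:
    """Walk through each step and require an explicit confirmation,
    enforcing a deliberate (System 2) pace instead of an instinctive reaction."""
    for number, step in enumerate(steps, start=1):
        print(f"Step {number}: {step.action}")
        input(f"  Confirm: {step.verify} (press Enter to continue) ")
    print("Checklist complete.")

if __name__ == "__main__":
    run_checklist(UNRELIABLE_AIRSPEED_CHECKLIST)
```

The design point is the blocking confirmation: no step appears before the previous one has been explicitly acknowledged, which is exactly how a written checklist slows a crew down to a deliberate pace.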

From Aviation to Broader Applications

The benefits of the Unreliable IAS procedure extend far beyond aviation and provide a blueprint for improving decision-making in other occupational sectors. In fields like medicine, business, or IT, confusion during high-pressure situations is nothing new. Without structured protocols, teams can easily find themselves caught up in circles, revisiting the same issues without resolution and acting instinctively [2]. Having structured protocols in place can prevent reactive decision-making under stress. Such procedures foster situational awareness, allowing teams to focus on the most critical issues without becoming overwhelmed by conflicting information. By emphasizing composure, a step-by-step approach, and collaboration with a clear understanding of roles and responsibilities, these frameworks reduce the likelihood of errors caused by stress or miscommunication. Whether it is diagnosing a medical emergency, managing a corporate crisis, or responding to a cybersecurity threat: adopting these principles can lead to more effective and confident decision-making.

Keep reading with the next article, where we'll explore how to transform aviation's structured rules into universally applicable guidelines for navigating uncertainty and achieving success in diverse fields.

References

[1] BEA – Bureau d'Enquêtes et d'Analyses pour la sécurité de l'aviation civile. (2011). Aircraft accident report: On the accident on 1st June 2009 to the Airbus A330-203. Paris. https://aaiu.ie/foreign_reports_fr/final-report-accident-to-airbus-a330-203-registered-f-gzcp-air-france-af-447-rio-de-janeiro-paris-1st-june-2009/

[2] Cyndi Lauper. (1983). Time After Time. https://www.youtube.com/watch?v=VdQY7BusJNU

[3] Kahneman, D. (2013). Thinking, fast and slow (first paperback edition). New York: Farrar, Straus and Giroux.

[4] Skybrary. Unreliable Airspeed Indications. Retrieved 2025-01-02, from https://skybrary.aero/articles/unreliable-airspeed-indications

[5] Skybrary. Quick Reference Handbook (QRH). Retrieved 2025-01-02, from https://skybrary.aero/articles/quick-reference-handbook-qrh

[6] Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124