The Reflex – High Time is no Time for Deciding

A Tragic Lesson in Decision-Making

It was a quiet and calm night over the Atlantic Ocean as Air France Flight 447 cruised smoothly toward Paris. The plane had left Rio de Janeiro hours earlier, slicing through the cold, thin air at 35,000 feet. Inside the cabin, passengers dozed or flipped through magazines, accompanied by the steady hum of the engines. In the cockpit, the atmosphere was just as serene.

All of a sudden, an alarm shattered the quiet. The autopilot had abruptly disengaged, leaving the startled pilots in manual control of the 200-ton aircraft. The unexpected alarm had been triggered by malfunctioning sensors that caused the airspeed indicators to fail. The pilot instinctively pulled back on the control stick, attempting to climb and avoid the risk of overspeed. This decision, however, triggered further alarms signaling a loss of lift and altitude. Despite their desperate efforts, both pilots failed to understand why the aircraft wasn't responding as expected, or how dire their situation had become. Tragically, four minutes later, Flight AF 447 plunged into the ocean, claiming the lives of all 228 people on board.

The crash of Flight AF 447 can serve as a lesson on many topics (e.g., Crew Resource Management or appropriate training), but it is above all a profound lesson in decision-making. In the aftermath, investigators of the BEA (Bureau of Enquiry and Analysis for Civil Aviation Safety) identified a series of pilot errors as contributing factors [1]. Instead of leveling the aircraft and assessing the situation, the pilot at the controls did what felt instinctively right: he pulled back on the control stick. After all, climbing means safety, doesn't it? At high altitude, however, this maneuver exacerbated the problem, forcing the aircraft into a steep pitch-up attitude that dangerously slowed it and ultimately stalled it. Why did the pilot rely on instinct rather than training? Why did he act without first gathering critical information? To explore these questions, we turn to the psychology of decision-making.

Decision-Making and the Dual-Process Theory

One theoretical framework for explaining human decision-making is Dual-Process Theory, which builds on the work of Amos Tversky and Daniel Kahneman, first published in 1974 [4]. The framework distinguishes two systems of thought:

  • System 1: Fast and intuitive, but prone to errors because it reviews only little information.
  • System 2: Slow and analytical, capable of processing more complex information, but therefore very resource-intensive.

Put simply, you can either take a “shortcut” to a decision or work it out deliberately. System 1 often takes precedence because in most cases it is efficient and requires less cognitive effort. It is especially essential in situations where time is limited, risks are low, or decisions seem straightforward [3]. The sketch below illustrates this division of labor.
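To make the two systems tangible, here is a minimal, illustrative Python sketch. It is not part of the theory itself: the cue names, the heuristic table, and the time-pressure flag are invented for this example.

```python
import time

# Hypothetical cached "rules of thumb" that System 1 applies instantly.
HEURISTICS = {
    "overspeed_risk": "pull up",   # feels right, but can stall the aircraft
    "low_speed": "add thrust",
}

def system1(cue: str) -> str | None:
    """Fast, intuitive path: pattern-match the cue against known rules."""
    return HEURISTICS.get(cue)

def system2(cue: str, evidence: dict[str, float]) -> str:
    """Slow, analytical path: weigh the available evidence before acting."""
    time.sleep(0.5)  # stands in for costly deliberation
    if evidence.get("airspeed_reliable", 1.0) < 0.5:
        return "level the aircraft and cross-check instruments"
    return HEURISTICS.get(cue, "gather more information")

def decide(cue: str, evidence: dict[str, float], time_pressure: bool) -> str:
    """Route the decision: System 1 wins under pressure, System 2 otherwise."""
    if time_pressure and (answer := system1(cue)) is not None:
        return answer  # fast and cheap, but blind to contrary evidence
    return system2(cue, evidence)

if __name__ == "__main__":
    evidence = {"airspeed_reliable": 0.2}  # the sensors are failing
    print(decide("overspeed_risk", evidence, time_pressure=True))   # -> pull up
    print(decide("overspeed_risk", evidence, time_pressure=False))  # -> level and cross-check
```

The point of the sketch is the routing logic: under time pressure the cheap pattern match answers first, and the evidence that would contradict it is never consulted.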

In their studies, Tversky and Kahneman confronted their test subjects with situations in which they had to estimate probabilities without receiving enough information. Unless we force ourselves to evaluate, compute, and only then decide, the human brain reduces the complexity of assessing probabilities and predicting outcomes to simpler judgmental operations [4]. The authors found that human decision-making is often based on heuristics, or “rules of thumb”. This phenomenon is universal, affecting everyone from laypeople to those highly educated in statistics or with a rather technical, rational mindset. While heuristics are effective in many contexts, they can be perilous in others, especially when they are intertwined with biases. Blind reliance on System 1 can lead to dramatic errors, especially in high-stakes scenarios. In aviation, such shortcuts can turn deadly. The worked example below shows how far a heuristic answer can drift from the computed one.
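As an illustration, consider the classic “taxicab” problem from the heuristics-and-biases literature (the specific numbers come from that literature, not from the AF 447 investigation). System 1 anchors on the witness's reliability; System 2 folds in the base rate via Bayes' rule:

```python
# 85% of a city's cabs are Green, 15% are Blue. A witness says the cab in a
# hit-and-run was Blue, and witnesses of this kind are correct 80% of the time.
# How likely is it that the cab really was Blue?

def posterior_blue(p_blue: float, hit_rate: float) -> float:
    """Bayes' rule: P(cab is Blue | witness says Blue)."""
    p_green = 1.0 - p_blue
    false_alarm = 1.0 - hit_rate  # witness mistakes a Green cab for Blue
    p_says_blue = hit_rate * p_blue + false_alarm * p_green
    return (hit_rate * p_blue) / p_says_blue

# System 1 answer: "the witness is 80% reliable, so 80%."
# System 2 answer: about 41%, because Blue cabs are rare to begin with.
print(f"{posterior_blue(p_blue=0.15, hit_rate=0.80):.0%}")  # -> 41%
```

The intuitive answer is almost double the correct one, which is exactly the kind of gap Tversky and Kahneman documented.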

Human Factors in Flight AF 447

In their final accident report, the BEA concluded, amongst other contributing factors, that the pilots were completely surprised by the warning and did not comprehend the situation. They failed to gather important information and overlooked other cues (e.g., the stall warning). The report also points to “their poor management of the startle effect that generated a highly charged emotional factor for the two pilots” [1]. In simple words: the pilots did not have sufficient information. Instead, emotion and instinct took over, and fatal decisions were made. The pilot acted on what felt right, pulling back on the control stick to climb and disregarding all contrary information. All it would have taken was an informed decision and, later on, a deliberate push forward on the control stick: a calculated step to let the plane regain speed. But that moment of clarity never came.

Lessons from a Tragedy

Flight AF 447's tragic end reminds us that intuition isn't always reliable in complex or high-pressure situations, especially when they are charged with emotion. Effective decision-making sometimes requires acting against instinct, overriding heuristics and impulses with deliberate, informed choices. In the end, you have to remind yourself: High Time is no Time for Deciding [2].

The tragedy underscores an essential principle of decision-making: the gathering, preparation, and consideration of information are as important as the ability to manage cognitive and emotional challenges. It challenges us to examine how we can improve human decision-making in crucial situations and under pressure, ensuring that intuition serves as a tool and not a trap. Part II will dive into how the aviation industry handles these challenges: we will explore how specified procedures enable pilots to enhance situational awareness and prevent instinctive, erroneous decision-making. In Part III you will see how these procedures not only ensure an aircraft's safety, but can also aid your own decision-making in critical situations in your working environment.

References

[1] BEA – Bureau d'Enquêtes et d'Analyses pour la Sécurité de l'Aviation Civile. (2012). Final report on the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France, flight AF 447 Rio de Janeiro – Paris. Paris. https://aaiu.ie/foreign_reports_fr/final-report-accident-to-airbus-a330-203-registered-f-gzcp-air-france-af-447-rio-de-janeiro-paris-1st-june-2009

[2] Duran Duran. (1983). The Reflex. https://www.youtube.com/watch?v=J5ebkj9x5Ko

[3] Kahneman, D. (2013). Thinking, fast and slow (First paperback ed.). New York: Farrar, Straus and Giroux.

[4] Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124