The Reflex - High Time is no Time for Deciding

A Tragic Lesson in Decision-Making

It was a quiet and calm night over the Atlantic Ocean as Air France Flight 447 cruised smoothly toward Paris. The plane had left Rio de Janeiro hours earlier, slicing through the cold, thin air at 35,000 feet. Inside the cabin, passengers dozed or flipped through magazines, accompanied by the steady hum of the engines. In the cockpit, the atmosphere was just as serene.

All of a sudden, an alarm shattered the quiet. The autopilot had abruptly disengaged, leaving the startled pilots in manual control of the 200-ton aircraft. The unexpected alarm had been triggered by malfunctioning sensors, which caused the airspeed indicators to fail. The pilot instinctively pulled back on the control stick, attempting to climb and avoid the risk of overspeed. This decision, however, triggered further alarms, signaling a loss of lift and altitude. Despite their desperate efforts, neither pilot understood why the aircraft wasn't responding as expected, nor how dire their situation had become. Tragically, four minutes later, Flight AF 447 plunged into the ocean, claiming the lives of all 228 people on board.

The crash of Flight AF 447 can serve as a lesson in many areas (e.g., Crew Resource Management or appropriate training), but it is also a profound lesson in decision-making. In the aftermath, investigators of the BEA (Bureau of Enquiry and Analysis for Civil Aviation Safety) identified a series of pilot errors as contributing factors [1]. Instead of leveling the aircraft and assessing the situation, the pilot at the controls did what felt instinctively right: he pulled back on the control stick. After all, climbing means safety, doesn't it? At high altitude, however, this maneuver exacerbated the problem, forcing the aircraft into a steep pitch-up attitude, dangerously slowing it, and ultimately stalling it. Why did the pilot rely on instinct rather than training? Why did he act without first gathering critical information? To explore these questions, we turn to the psychology of decision-making.

Decision-Making and the Dual-Process Theory

One theoretical framework for explaining human decision-making is dual-process theory, which Daniel Kahneman popularized building on his work with Amos Tversky on judgment under uncertainty, first published in 1974 [4]. The framework distinguishes two systems of thought:

  • System 1: Fast and intuitive, but prone to errors because it reviews little information.
  • System 2: Slow and analytical, capable of processing more complex information, but therefore very resource-intensive.

Put simply, you can either take "shortcuts" when making decisions or decide in a slow, deliberate way. System 1 often takes precedence because in most cases it is efficient and requires less cognitive effort. It is especially essential in situations where time is limited, risks are low, or decisions seem straightforward [3].

In their studies, Tversky and Kahneman confronted test subjects with situations in which they had to estimate probabilities without receiving enough information. The human brain reduces the complex task of assessing probabilities and predicting outcomes to simpler operations of judgment [4], unless we force ourselves to evaluate, compute, and only then decide. The authors found that human decision-making is often based on heuristics, or "rules of thumb". For instance, subjects asked to estimate the product 8 × 7 × 6 × 5 × 4 × 3 × 2 × 1 within a few seconds gave a median answer of 2,250; the correct result is 40,320 [4]. This phenomenon is universal, affecting laypeople and those highly trained in statistics or with a technical, rational mindset alike. While heuristics are effective in many contexts, they can be perilous in others, especially when they are intertwined with biases. Blind reliance on System 1 can lead to dramatic errors, especially in high-stakes scenarios. In aviation, such shortcuts can turn deadly.

Human Factors in Flight AF 447

In its final accident report, the BEA concluded, among other contributing factors, that the pilots were completely surprised by the warning and did not comprehend the situation. They failed to gather important information and overlooked other cues (e.g., the stall warning). The report additionally cites "their poor management of the startle effect that generated a highly charged emotional factor for the two pilots" [1]. In simple words: the pilots did not base their decisions on sufficient information. Instead, emotion and instinct took over, and fatal decisions were made. The pilot acted on what felt right, pulling back on the control stick to climb and disregarding all contrary information. All it would have taken was an informed decision and, later on, a deliberate push forward on the control stick: a calculated step to let the plane regain speed. But that moment of clarity never came.

Lessons from a Tragedy

Flight AF 447's tragic end reminds us that intuition isn't always reliable in complex or high-pressure situations, especially when emotions run high. Effective decision-making sometimes requires acting against instinct, overriding heuristics and impulses with deliberate, informed choices. In the end, you have to remind yourself: High Time is no Time for Deciding [2].

It underscores an essential principle of decision-making: gathering, preparing, and considering information are as important as the ability to manage cognitive and emotional challenges. This tragedy challenges us to examine how we can improve human decision-making in crucial situations and under pressure, ensuring that intuition serves as a tool and not a trap. Part II will dive into how the aviation industry handles these challenges. We will explore how specified procedures enable pilots to enhance situation awareness and prevent instinctive, erroneous decision-making. In Part III you will see how these procedures not only ensure an aircraft's safety, but can also aid your own decision-making in critical situations within your working environment.

References

[1] BEA (Bureau d'Enquêtes et d'Analyses pour la Sécurité de l'Aviation Civile). (2012). Final report: On the accident on 1st June 2009 to the Airbus A330-203 registered F-GZCP operated by Air France, flight AF 447 Rio de Janeiro – Paris. Paris. https://aaiu.ie/foreign_reports_fr/final-report-accident-to-airbus-a330-203-registered-f-gzcp-air-france-af-447-rio-de-janeiro-paris-1st-june-2009

[2] Duran Duran. (1983). The Reflex. https://www.youtube.com/watch?v=J5ebkj9x5Ko

[3] Kahneman, D. (2013). Thinking, fast and slow (First paperback ed.). New York: Farrar, Straus and Giroux.

[4] Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124