What Do Cognitive Biases Mean for Deterrence?

Humans make poor decisions—not just sometimes, but systematically—and new insights into these cognitive biases have implications for deterrence. To illustrate just how important these can be, consider the curious case of Abraham Wald, a respected Columbia academic who, in 1943, was selected by the U.S. War Department for an important task.[1]

The United States Army Air Forces were losing too many bombers over Europe to anti-aircraft fire and were considering adding armour plating to the aircraft, but the extra metal made the aircraft heavier, reducing performance and bomb loads. So, armouring the whole plane was impossible. Where could extra armour be placed effectively?

Abraham Wald and his study of aircraft armor (Slideshare)

Wald researched where the bombers suffered the most damage. After an extensive survey of the squadrons returning to base, he found most of the damage was to the wings and fuselage, whereas the engines and cockpits seemed to be hit much less often. Initially, the War Department assumed the armour plating should protect the wings and fuselage, but Wald explained why this was completely wrong. The armour was needed where the returning aircraft showed no damage, because bombers hit in those places never made it home to be studied. On Wald’s advice, the armour plating was duly placed around the cockpit and engines.

Wald demonstrated that the War Department was making a common mistake now known as survivorship bias.[2] By looking at a skewed sample—in this case, only those bombers surviving enemy fire—the War Department’s logic went awry. Survivorship bias is one of many deep-rooted and systematic flaws in the way humans process information.
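The effect is easy to reproduce. Below is a purely illustrative sketch in Python, with made-up section names, loss rates, and sortie counts rather than historical data, showing how a sample made up only of survivors can point in exactly the wrong direction:

```python
import random

random.seed(1)

# Toy model, not historical data: every sortie takes one hit, spread evenly
# across four sections, but hits to the engines or cockpit are assumed to be
# far more likely to bring the aircraft down.
SECTIONS = ["wings", "fuselage", "engines", "cockpit"]
LOSS_PROBABILITY = {"wings": 0.05, "fuselage": 0.05, "engines": 0.6, "cockpit": 0.6}

observed_damage = {s: 0 for s in SECTIONS}  # damage recorded on returning aircraft only
all_hits = {s: 0 for s in SECTIONS}         # damage across the whole force, survivors or not

for _ in range(10_000):
    section = random.choice(SECTIONS)
    all_hits[section] += 1
    if random.random() > LOSS_PROBABILITY[section]:  # the aircraft survives and is surveyed
        observed_damage[section] += 1

print("Hits seen on returning aircraft:", observed_damage)
print("Hits across the whole force:    ", all_hits)
# The survey of survivors shows far more wing and fuselage damage, even though
# every section is hit equally often -- the skew Wald corrected for.
```

Even though every section is hit equally often in this toy model, the aircraft that make it home show far more damage to the wings and fuselage than to the engines or cockpit, precisely the skew that misled the War Department.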

One might think people take in all the available information and make the best decisions; in fact, however, we tend not to. We make bad decisions for many reasons. For one, thinking takes time and effort, and so we often go for heuristic short-cuts.[3] For another, like pack animals, we follow the herd.[4] Furthermore, we regularly misunderstand the world in systematic ways. We have deep-rooted attachments to what we already own, even when we can have something better.[5] These traits have helped us to adapt and stay alive, and we have inherited them from our ancestors who survived because of them.[6]

There are at least fifty of these proven quirks that warp our decision making.[7] One of these is confirmation bias, where we tend to underrate new information that challenges what we already believe. There is also optimism bias, which makes us overestimate our chances of getting away with something.[8] Next to these we have normalcy bias, where we refuse to plan for a disaster that has never occurred.[9] Then there is reactance, a phenomenon in which we do the opposite of what someone wants us to do just to defy a perceived constraint on our freedom of choice.[10] These biases can influence life in many ways, from who we marry to how we budget. But some of the most profound impacts are on deterrence.

To understand the effect of these systematic mistakes on how we deter unwelcome behaviour, consider one of the oldest forms of deterrence: the threat of jail time to discourage theft. In a rational calculation, a substantial prison sentence should be enough to deter almost anyone from stealing, but cognitive biases mean this is not necessarily so. Reactance spurs rebellious criminals to steal simply because stealing is outlawed. The normalcy effect makes the ruinous impact of a prison sentence just too hard to contemplate, so it does not factor properly in the criminal mind. Criminals who plan a clever theft and escape tune out the ways they might be caught because of confirmation bias. Criminals who know successful thieves, but none of the many others who are caught and locked up, will suffer from survivorship bias if they calculate their own chances of getting away with crime. And some will suffer from optimism bias if they just guess.
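To see how much these biases matter, here is a minimal sketch of the rational calculation described above; the expected_payoff function and every figure in it are illustrative assumptions, not empirical estimates:

```python
# Illustrative sketch only: the gains, penalties, and probabilities below are
# assumptions chosen to show the logic, not empirical estimates.
def expected_payoff(gain, penalty, p_caught):
    """Expected value of attempting a theft: keep the gain if undetected, pay the penalty if caught."""
    return (1 - p_caught) * gain - p_caught * penalty

GAIN = 10_000        # hypothetical value of the stolen goods
PENALTY = 200_000    # hypothetical cost to the offender of a prison sentence
P_TRUE = 0.40        # assumed true chance of being caught
P_BIASED = 0.02      # the same chance as seen through optimism or survivorship bias

print(expected_payoff(GAIN, PENALTY, P_TRUE))    # -74000.0: with clear eyes, theft is not worth it
print(expected_payoff(GAIN, PENALTY, P_BIASED))  #   5800.0: the biased offender expects to profit
```

With the true odds, the theft is plainly a losing bet; seen through optimism or survivorship bias, the same crime appears profitable, and the deterrent fails.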

So, every day, punishments are in place that should deter every right-thinking individual in the world, but people still try their luck. Every prisoner is proof that deterrence can and does fail.

These biases affect us all—not just criminals—and they affect us much more than we realise. Almost all of us suffer from a bias blind spot: the proven tendency for people to recognise biases in others more readily than in themselves.[11]

Thomas Schelling (EconLib)

Proof that cognitive biases are real means several of the assumptions underpinning traditional deterrence theory are wrong. Academics like Thomas Schelling, who led U.S. thinking on nuclear deterrence in the 1950s and 1960s and who was a contemporary of Abraham Wald, simply applied a standard hypothesis from economics at the time: that people knew how to behave in their own best interests.[12] People might make mistakes, went the theory, but they’d soon learn how to correct their behaviour because they would benefit from doing so.

Only in the 1970s, with the so-called third wave of deterrence theory, was psychology understood in enough detail to begin to grasp how people make systematic errors.[13] Kremlinology, the study of key figures in the Soviet system and how they behaved and interacted, became a key part of the West’s approach to nuclear deterrence. People, not weapons, became the central focus of Western defence.

The science of cognitive bias has advanced considerably since then. We now know people consistently behave in ways that go against their best interests in almost every field.[14] Indeed, in the last decade many governments have set up so-called nudge units, playing on these behavioural quirks to achieve policy goals, from increasing pension contributions to enforcing traffic laws.[15]

In military matters, even though the stakes are usually much higher, cognitive errors are still rife. Indeed, history is packed with examples of wars that might have been deterred were it not for strong cognitive biases affecting decision makers. Consider Argentina in 1982, which might not have invaded the Falkland Islands if it had had a less distorted view of the United Kingdom’s resolve and capacity to respond. Or consider France in 1870, where military groupthink tipped Napoleon III into a disastrous war with Prussia. And Europe in the summer of 1914 was a cauldron of cognitive biases, as countries—Austria-Hungary, Serbia, Russia, Germany, France, and Britain—made a succession of poor judgments about the deterrence postures of their rivals, rivals who, in turn, sent misleading signals of their own, ultimately leading to a catastrophe that spread around the world.[16]

According to one study, the weaker power initiates conflict in some 33% of observations, suggesting military might fails to deter as much as a third of the time.[17] Perhaps the attackers suffered from restraint bias—the tendency for people and groups to underestimate how easily they succumb to temptation? Perhaps groupthink infected the highest levels of the combatants’ governments and armed forces? Perhaps the parties to conflict missed important signals from an enemy because confirmation bias meant they were not looking for them? Whatever the reason, chances are cognitive biases were involved.

These tragic examples of conventional wars contrast with a much better record in nuclear matters where deterrence has, so far, been entirely successful. Nuclear conflict has been deterred for more than seven decades, partly because cognitive bias has been almost entirely squeezed out of it. This suggests the calculus governing our nuclear deterrent, and the strategic weapons of those who may oppose us, is as protected from human shortcomings as it can be, thus keeping the world safe. We still need to watch for normalcy bias, though. No nuclear weapons have been used in war since 1945, but it is folly to presume that will always be the case.

Although cognitive biases can make deterrence difficult, they also offer opportunities, especially when we confront lower-order threats.

Consider the ambiguity effect. People tend to avoid actions when they cannot easily assess the probability of different outcomes. Uncertainty about the number of mines in a minefield, for example, will keep people away, even if they suspect few mines are actually present. This suggests deterrence can be more effective if we present a range of punishments rather than a fixed response.

Or consider the anchoring effect. The first piece of information people learn about something will often have a disproportionately strong impact on how they think about it. Anchoring may be one reason early efforts to implant false ideas in Hitler’s mind about an amphibious landing near Calais delayed his response to the Normandy landings for so long.[18] This suggests successful deterrence postures are best established early rather than late.

Even when a cognitive bias would normally work against deterrence, such as when an over-confident clique of planners confirms its own bad assumptions, there are still opportunities. If the group shares the same sources of information, it tells us where deterrence messages are likely to have the greatest impact. For example, during the 2003 invasion of Iraq, the U.S. military was able to exploit Saddam Hussein’s reliance on certain news outlets.[19]

Considering cognitive biases lets us think afresh about how deterrence can face down modern threats. This leads to some interesting propositions.

First, tailor deterrence messages to specific audiences. Cognitive biases affect individuals in unique ways—even if we presume there is only one way to be rational, there are many ways to be irrational...and human. Credible warnings should not be generic; each potential adversary needs its own bespoke deterrence message. This also means actively challenging the mirror-imaging assumption that what would deter you will also deter the adversary.

Second, biases provide a particularly effective means to embed some forms of deterrence. Through the framing effect, people tend to exclude certain options if the issue is described, or framed, in a certain way; and the normalcy effect helps to lock decisions in place, preventing them from being revisited. Together, these effects have helped the Geneva Conventions maintain strong norms about wartime behaviour, and they are vital now in enforcing international agreements against the use of chemical weapons. It is not enough to say these munitions are illegal; we need to show, forcefully, that their use will never be tolerated. As we extend norms for competition into space and cyber, leveraging cognitive biases might help establish deterrence there too.

Third, leveraging biases means deterrence must remain proportionate, but cannot be marginal. Proportionate responses are not only just, they are also practical, since they are more likely to maintain popular support in a democracy, which in turn makes them more credible. No democracy would countenance a nuclear response to a minor cyber-attack, so such a threat would not be credible and would not deter. But nor should a deterrent threat be calibrated too finely against the gain an adversary hopes to make, because cognitive biases—including optimism bias and groupthink—create too much room for error. Deterrence must be sufficiently robust to blast away any margin for wishful thinking by an opportunistic adversary. Witness the present deterrence posture of the U.S. on the Korean peninsula and how it communicates clearly to Pyongyang that any use of nuclear weapons would lead to a swift end of the regime.

Meaningfully Deterring Russia? (EuroMaidan Press)

Fourth, reciprocity suggests an adversary will look to use cognitive bias to undermine our own deterrence strategies. They may encourage us to misread their actions or doubt our own responses. They may attempt to sow confusion through misinformation that suggests some actions are beyond deterrence. Or they may exploit our optimism bias, playing on our hopes that a permanent incursion is only temporary. Recent Russian information campaigns around Crimea have sought exactly this.[20] The West should be wary of how biases allow for such mischief.

Finally, the awareness of biases reinforces the need for deterrence threats to be carried through when they have failed to deter. Schelling’s theoretical work already explains why rivals need to know we are serious if future deterrence is to be credible.[21] Our new understanding of cognitive bias suggests the issue is even more important than Schelling thought. Credibility is a combination of the anchoring effect and the normalcy effect; adversaries must be guided to assume we will always follow through while being forced to contemplate the cost of breaching future threats. In short, we must never bluff.

Many of these strategies are already in use. Schelling, if he were still alive, would be impressed by how his early ideas have been refined and adapted as our understanding of human behaviour has improved. We are using nuclear deterrence strategies against rogue states as well as superpower rivals. We are deterring terrorist groups and lone-wolf individuals through highly developed messaging and sophisticated online armies.[22] And we are deterring attacks in cyber and space with threats designed for optimum effect against the most likely transgressors.

Despite the earlier example of criminality, deterrence is working every day. We may not see it because, like most people, we only notice the cases where deterrence fails—like War Department personnel misled by the concentration of bullet holes in the wings and fuselage until a certain Columbia academic put them right. Deterrence works, invisibly and silently. Its success is a myriad of events that never happened, so we can never study them. If we think deterrence is mostly failing, we are probably suffering from survivorship bias.

Cognitive biases have enormous implications for deterrence. Our new understanding of biases means deterrence is now stronger than it ever was during the Cold War, and far better than when Abraham Wald advised the War Department how best to protect their bombers.


Iain King CBE is Defence Counsellor at the British Embassy in Washington D.C. The views expressed are the author’s alone.




Header Image: Behavioral Economics in Practice (HumanHow.com)


Notes:

[1] The definitive account of Wald’s work is set out in a 1984 article in the Journal of the American Statistical Association; a simple internet search will reveal several accounts of his war-winning application of mathematics.

[2] Survivorship bias is the fallacy of drawing a conclusion based on a sample which is skewed by criteria relevant to the conclusion; the examples in the sample have already ‘survived’ some test, so they should not be taken as representative.

[3] See Daniel Kahneman (2011). Thinking, Fast and Slow. See also the World Bank’s 2014 World Development Report, which explains this and other cognitive biases in clear terms.

[4] W.D. Hamilton (1971), “Geometry of the Selfish Herd”, Journal of Theoretical Biology.

[5] Daniel Kahneman (2011). Thinking, Fast and Slow. Macmillan. See also the World Bank’s 2014 World Development Report, which explains cognitive bias in clear terms.

[6] For more on the way cognitive biases may have evolved through natural selection, see Martie G. Haselton, Daniel Nettle, and Paul W. Andrews (2005). The Evolution of Cognitive Bias in D.M. Buss (Ed.), The Handbook of Evolutionary Psychology.

[7] The exact number of cognitive biases in existence is disputed, since some are subsets of others—the bandwagon effect, for instance, is a form of groupthink, so may not qualify as a separate bias in its own right. More than 50 is a reasonable estimate, but there may be as many as 108. For a list of 58 cognitive biases, read “58 Cognitive Biases which Screw Up Everything We Do,” from Business Insider UK.

[8] “58 Cognitive Biases which Screw Up Everything We Do,” from Business Insider UK, explains over-optimism and how it can bias decisions.

[9] For an interesting examination of the normalcy effect, read “An Insight into the Concept of Normalcy Bias,” from PsycholoGenie. Note that this effect is sometimes described as the ‘ostrich effect,’ referring to the metaphor of people sticking their heads in the sand.

[10] “Don't Tread on Me! Psychological Reactance as Omnipresent” in Psychology Magazine offers a fuller explanation on psychological reactance.

[11] See Emily Pronin, David Lin, and Lee Ross (2002), “The Bias Blind Spot: Perceptions of Bias in Self versus Others” in the Personality and Social Psychology Bulletin.

[12] An obituary for Thomas Schelling explaining some of his innovations on deterrence is provided in “Thomas Schelling, economist, 1921-2016” in the Financial Times. Another 2016 obituary, “Thomas Schelling Has Died. His Ideas Shaped the Cold War,” in the Washington Post, is also worth a read.

[13] The term “Third Wave” applied to deterrence theory is attributed to Robert Jervis, whose 1979 article “Deterrence Theory Revisited” in World Politics led to the categorisation of the different approaches to this topic.

[14] The World Bank has devoted substantial resources to cognitive bias during this decade, determining that these biases are a major cause of poverty and conflict, and that mitigating their effects can spur development. The World Bank’s 2014 World Development Report sets out their thinking in clear and accessible language.

[15] A ‘Nudge Unit’ seeks to exploit cognitive biases to achieve policy goals without infringing personal choices overtly, often by framing decisions to guide people towards the government’s preferred option, or making that option the default choice for citizens.

[16] There has been extensive discussion and analysis on the origins of the First World War, not all of which has focused on miscalculation or faulty deterrence. See Stephen Van Evera (1984), “The Cult of the Offensive and the Origins of the First World War,” for more detail on collective systematic errors in perceptions of war which preceded and helped cause the crisis of 1914.

[17] Of course, there may be other variables involved, which means the one-third statistic must be used with caution, and what constitutes a weaker power is open to debate. But taking the very sensible view that the defeated side was weaker, it has been shown that one third of wars since 1945 were initiated by the side which lost, with a higher proportion before that year. The dataset and analysis are provided by Dan Lindley and Ryan Schildkraut (date unknown), “Is War Rational?” from the University of Notre Dame.

[18] Hitler’s mistaken decision to keep Calais protected as the Allies pressed into Normandy is detailed in most histories of the Second World War. An overview of Nazi errors which contributed to the success of D-Day is provided by The National Interest.

[19] For a fascinating analysis of how Saddam Hussein’s Iraq deterred Iran and was itself only partially deterred by the United States, read Amatzia Baram (2012), “Deterrence Lessons from Iraq,” in Foreign Affairs magazine.

[20] Michael Kofman, et al. (2017), “Lessons from Russia's Operations in Crimea and Eastern Ukraine,” from RAND, provides detail on Russia’s information operations connected with their annexation of Crimea.

[21] Thomas Schelling (1960), The Strategy of Conflict, Harvard University Press.

[22] Jared Cohen (2015), “Digital Counterinsurgency: How to Marginalize the Islamic State Online,” in Foreign Affairs magazine explains how digital tools can tackle and deter terrorism.