
Do Not Trust Your Gut: How to Improve Strategists’ Decision Making

Earlier this year, The Strategy Bridge asked civilian and military students around the world to participate in our sixth annual student writing contest on the subject of strategy.

Now, we are pleased to present an essay from one of the Third Place winners, James M. Davitch, a Ph.D. student at Virginia Tech.


Introduction


Military strategists often make decisions based on instinct and emotion rather than careful deliberation. However, as Daniel Kahneman’s book Thinking, Fast and Slow explains, most humans do as well because of mental limitations called cognitive biases.[1] Kahneman argues the mind’s tendency to make snap decisions rather than to proceed with caution can inhibit effective judgment. Cognitive biases may cause strategists to overlook salient yet inconvenient information and waste time pursuing solutions to the wrong problems. Unfortunately, the list of known cognitive biases is extensive and growing.[2] Faced with an overwhelming number of challenges to decision making, a strategist might question whether some biases are harmful specifically in a military planning environment and whether there are any techniques to address them. This essay argues that, yes, some specific biases may directly affect military planning. Further, it argues that while there are historical examples of their negative influence, they can be mitigated through certain techniques.[3]

Each section below focuses on one of four cognitive limitations. The essay will describe each bias, relate it to an example in military history, and conclude with steps to mitigate it. This essay illustrates through historical analogies why confirmation bias, fundamental attribution error, anchoring bias, and representative bias are detrimental to a military strategist’s decision-making process.[4] These biases can cause strategists to privilege facts that confirm previously held beliefs, automatically attribute nefarious motivations to others’ actions, fixate on initial information, and draw incorrect associations between dissimilar events. Examples from the Cuban Missile Crisis, the Korean War, Operation Iraqi Freedom, and World War I provide historical context for these cognitive limitations. The mitigation steps include engaging in active open-mindedness, empathy, consideration of the opposite, and the so-called what-if technique. The common theme throughout the mitigation steps in this essay is that strategists would benefit from exercising patience and intellectual humility in their deliberations.

System 1 versus System 2

The recommendations described in this essay draw heavily on behavioral psychology research, especially Kahneman’s description of how the brain makes decisions. Kahneman presents a struggle in the mind between two modes of thinking that he calls System 1 and System 2. During System 1 thinking, the mind operates “automatically and quickly, with little or no effort.”[5] Most of the time this process works well enough, allowing one to proceed without a second thought. The cognitive process often fails, however, when System 1 suggests intuitive answers to complicated questions. This is because in System 1 thinking, one’s mind is prone to take shortcuts that sacrifice mental rigor for expediency. Most of the time these shortcuts, which behavioral psychologists call biases and heuristics, are innocuous and involve low stakes decision making where instinctive judgment suffices. However, when evaluating possibilities during strategic planning for military operations, one’s instinctive judgment, laden with unconscious biases, may be detrimental.

Kahneman contrasts System 1 with its mental partner, System 2. System 2 “allocates attention to the effortful mental activities that demand it.”[6] System 2 thinking requires focus, and because it is often easier to make a snap decision than it is to concentrate, the mind sacrifices System 2 in favor of System 1 most of the time.[7] For the strategist, that means most of the decisions one makes resemble knee-jerk reflexes rather than carefully considered conclusions. In the parlance of behavioral psychology, humans often privilege their intuitive judgment in conditions of uncertainty. Put differently, a strategist may tend to trust their gut (i.e., System 1), even in situations where they probably should not. While junior strategists are liable to this weakness, seniority and experience, perhaps counterintuitively, increase this tendency. A large body of literature shows that the combination of authority and overconfidence can be even more toxic for decision making because senior strategists may possess the influence junior strategists lack.[8]

As an additional complication, there is no way to switch off System 1. It constantly, though unconsciously, suggests answers to deal with the many decisions one makes every day. Most of the time, for routine decision making, this is perfectly fine. However, when complexity increases, one is more prone to fall into cognitive traps that can lead to poor outcomes. Therefore, part of mitigating these problems involves slowing down one’s decision making to engage the thoughtfulness inherent in System 2 and resisting the easy answer offered by System 1. Unfortunately for the military strategist, doing so is much easier said than done, especially with respect to confirmation bias.

Confirmation Bias

Human minds crave order, and they try to minimize the discomfort of uncertainty by suggesting ways to make sense of chaos and disorder. One of the ways they do this is by encouraging us to accept information that confirms preexisting views or ideas. Oftentimes one sees what one wants to see, frequently to the exclusion of other relevant factors like a valid-but-contradictory viewpoint.[9] This is called confirmation bias, and it can be problematic when it leads strategists to expend less mental effort on a problem or question than it warrants (i.e., when the strategist impulsively accepts the System 1 answer). The deleterious effects of confirmation bias may alter a strategist’s perception of reality, leading to neglect of the fundamental problem one must address.

Confirmation bias in the Cuban Missile Crisis

Unfortunately, examples of confirmation bias abound in military history. One instance occurred prior to the 1962 Cuban Missile Crisis, when the United States failed to respond to the introduction of Soviet military equipment in Cuba.[10] On September 19, 1962, the Central Intelligence Agency stated it was not likely the Soviet Union would put nuclear missiles on the island.[11] In October, less than a month later, analysis of pictures taken from U-2 reconnaissance aircraft showed the Soviet Union had done exactly that.[12] However, the October surveillance photos were not the first piece of evidence that Russia was militarizing Cuba. Some parts of the U.S. intelligence community had observed dozens of shipments of conventional weapons and military personnel preceding the delivery of nuclear weapons and predating the October photo analysis.[13] Confirmation bias contributed to analysts’ neglect of the deployment of conventional weapons and the surprise of the U.S. national security enterprise at the deployment of nuclear-armed ballistic missiles.[14]

The fear inspired by the Cuban Missile Crisis (Bettmann/Corbis)

Throughout 1962, the Soviet government repeatedly denied any desire to militarize Cuba. The Soviet foreign minister privately assured President Kennedy that Soviet Premier Khrushchev would not do anything to complicate American domestic matters before the congressional elections in November. Secretary of State Dean Rusk believed what the Soviet Union was saying, publicly and in private.[15]

The lack of overhead imagery complicated the problem.[16] On August 29, 1962, a U-2 overflew Cuba, and subsequent imagery analysis revealed defensive weapons (i.e., surface-to-air missiles) and probably Soviet personnel, but no offensive missiles on the ground.[17] It would take six more weeks for the next U-2 to fly over Cuba. When it did, on October 14, it was too late.

The August U-2 photos showing defensive weapons, combined with Moscow’s repeated declarations, fed a confirmation bias toward the belief that no nuclear buildup was forthcoming. In his memoirs, presidential adviser Clark Clifford wrote that the state of mind within the intelligence community rejected the possibility of offensive missiles in Cuba.[18] Though there were some who argued otherwise, including the director of the CIA, by and large the U.S. intelligence community believed what it wanted to believe and privileged evidence that supported that belief.

How to Mitigate Confirmation Bias

Since people often seek and readily accept confirming evidence for beliefs they already hold, the trick to dealing with confirmation bias is to actively seek out disconfirming evidence. Humans want to believe what they think is true is actually true. So, the strategist must convince his or her mind to un-believe it.

The first step, as with most cognitive debiasing strategies, is to simply slow down. Many times, military professionals are in a rush to judgment, mainly to fix a problem and move on to the next. However, to prevent confirmation bias a good technique is to consciously delay one’s decision and ask what it would take for the opposite viewpoint to be true. The next step is to exercise humility and acknowledge there may be other points of view worth considering before reaching a final verdict. This method is a part of a larger concept called active open-mindedness.

Actively open-minded thinking refers to the consideration of all evidence prior to a decision.[19] The main problem confirmation bias presents is that when evaluating evidence, one may only consider the evidence one wants to believe is true. Therefore, strategists should flip the evidence on its head and try to disprove it—asking, for example, “What would it take to disprove what I believe to be true?” Better still, one can ask what evidence would be necessary to prove the assessment wrong. This line of questioning is a useful technique when evaluating someone else’s claim or assessment and can minimize the effects of overconfidence: “What evidence would you have to see to make you change your mind?”[20] This approach allows strategists to remain open to alternative possibilities, which is important when dealing with other biases like the fundamental attribution error.

Fundamental Attribution Error

Fundamental attribution error is a term that suffers from a confusing name but describes a common cognitive pitfall. It refers to an individual’s tendency to attribute another’s actions to something fundamental about that person, like their background, while attributing one’s own behavior to factors beyond one’s control.[21] Fundamental attribution error is partially about assigning blame, but it is also the tendency to ascribe to others what one may be less likely to attribute to oneself. Writing about this bias, the CIA noted that it occurs when the behavior of others is attributed to some fixed nature, while one’s own behavior is explained as a function of the situation in which one finds oneself.[22] President George W. Bush once said, “Too often we judge other groups by their worst examples, while judging ourselves by our best intentions.”[23] This hints at the hypocrisy inherent in the fundamental attribution error. One may observe fundamental attribution error in action when one interprets another’s behavior by heavily weighting personal characteristics like where someone else is from, their social class, gender, etc., and lightly weighting situational factors.[24] Unacknowledged fundamental attribution errors may warp a strategist’s understanding of the situation they face and delay attention to the correct problem.

Fundamental Attribution Error in the Korean War

The way that some American political leaders evaluated U.S. and Chinese actions during the Korean War provides an example of how the fundamental attribution error can cloud a strategist’s judgment. In October 1950, U.N. forces had recovered from North Korea’s summer surprise attack. As they began their advance north, the U.S.-led U.N. coalition had a tricky geopolitical needle to thread. How could they defeat North Korea without drawing China into the conflict?

U.S. President Harry Truman at his desk in the Oval Office with Secretary of State Dean Acheson, 1950 (National Archives)

However, not all members of President Truman’s cabinet felt there was a risk of alarming Beijing. Secretary of State Dean Acheson felt there was little danger of provoking China because, from his perspective, America’s intentions were benign. Said Acheson, “No possible shred of evidence could have existed in the minds of the Chinese Communists about the non-threatening intentions of the forces of the United Nations.”[25] While the U.S. felt its decisions were nonthreatening, that is not how they were interpreted in Beijing.[26]

Therefore, it was with astonishment that some in the U.S. government received news in late October 1950 that Chinese forces had covertly crossed the Yalu River and attacked U.N. forces. Senior leaders in Washington were “incapable of interpreting the Chinese intervention as a reaction to a threat. Instead, the Americans interpreted the Chinese reaction as an expression of fundamental hostility toward the United States.”[27] John Lewis Gaddis notes the Chinese leadership likely viewed advancing Allied forces as a threat to their regime. Fundamental attribution error, in this case, manifested itself in the U.S.’s appraisal of the Chinese response as a hostile one, rather than one born of the situation (i.e., the U.N. force’s advance towards the Chinese border). The initial U.S. judgment of China as congenitally hostile and a belligerent actor, and its appraisal of its own actions as benign and righteous, may have colored subsequent interactions and prolonged the stalemate until the 1953 armistice.[28]

How to Mitigate Fundamental Attribution Error

Fighting fundamental attribution error is about exercising a sense of understanding and subordinating arrogance in order to take the other side’s point of view into consideration. Thus, it is beneficial for the strategist to operate in System 2, because System 1 is rarely empathetic. As with mitigating confirmation bias, one can attempt to overcome fundamental attribution error by considering others’ viewpoints as well as alternative explanations for observed evidence. The human mind often fails to do so because it is easier to rush to judgment.

An important consequence of fundamental attribution error is how it affects one’s ability to evaluate the adversary. As described in joint doctrine, understanding the adversary is a crucial step in planning military operations.[29] Doing this analytical task well allows the military strategist to intelligently predict what the adversary is going to do next. Improperly evaluating the adversary’s point of view can lead to an inaccurate estimate of possible enemy courses of action.[30] Additionally, as Kahneman writes, “If people are often poorly equipped to explain the behavior of their adversaries, they are also bad at understanding how they appear to others.”[31] This echoes the contention of Robert Jervis that “[i]t is rare for actors to realize that what matters in sending a message is not how you would understand it, but how others will understand it.”[32]

With fundamental attribution error, the Golden Rule is a good guide—treat others how you would like to be treated. A software researcher in Silicon Valley put it this way:

Me ten years ago, on seeing a poorly designed interface: “Wow, what idiot designed this?”

Me, today: “What constraints were the team coping with that made this design seem like the best possible solution?” Empathy trumps fundamental attribution errors.[33]

Anchoring Bias

Anchoring bias, or the anchoring effect, refers to the human tendency to attribute outsized influence to the first piece of information one encounters. That information then has the propensity to influence subsequent estimates and discussions.[34] For example, military budgeting can suffer from anchoring bias as discussions about future fiscal requirements sometimes begin from the previous year’s figure (the anchor point) rather than from the need or requirement. If the budget is known to be ~$700 million, that figure itself has power in anchoring conversations about future budgets. Discussions then become fixated on the $700 million figure and on how much to add or subtract from it, rather than on warfighting requirements. Sometimes this is a logical method for thinking about the future, but not when the anchor or reference point stifles creative thinking about alternative solutions.

Psychologists have found that specialization in a particular area may make this cognitive bias more pronounced. Based on their research on professionals who possessed a great deal of expertise, Northcraft and Neale showed that “expertise in the subject matter does not seem to mitigate anchoring to any significant extent.”[35] Thus, the anchoring effect may be especially deleterious for military strategists, who may be particularly susceptible to this bias precisely because of their deep subject matter expertise.

Anchoring Bias in Operation Iraqi Freedom

Major Blair Williams wrote in a 2010 edition of Military Review that U.S. military planners were slow to adjust to changing realities after the onset of fighting in Iraq during the mid-2000s. Despite warnings prior to Operation Iraqi Freedom that the planned number of ground forces in Iraq was too low, the average number of U.S. troops from 2003 to mid-2007 remained around 138,000.[36] Historians such as Andrew Bacevich believe Secretary of Defense Rumsfeld was attempting to execute a military design concept called the Revolution in Military Affairs, whereby the U.S. would fight conflicts and prevail by virtue of technological superiority rather than mass. Troop numbers did not increase until President Bush surged forces in 2007. Despite Iraq being on the verge of a civil war throughout the middle part of the 2000s, decision makers remained tied to their initial estimate of the necessary number of ground troops. Thus, military and civilian defense professionals are no exception to the reality that experts in a field can fall victim to the anchoring bias. Major Williams summarized the situation by stating, “The anchoring phenomenon kept the value closer to the initial value than it should have been.”[37]

U.S. troops pull down a statue of Saddam Hussein in central Baghdad (Goran Tomasevic/Reuters)

The insidiousness of each of these biases stems from the fact that they are not mutually exclusive; their effects are additive. Once people are anchored to a particular figure, an organization may become more susceptible to confirmation bias, accepting more readily information that conforms to previously held beliefs. For example, Gordon and Trainor noted the White House believed the manpower requirement for Iraq’s post-conflict stability operations would be minimal because Iraqis would “do the work of Phase IV themselves.”[38] Therefore, intelligence analysis and assessments supporting the light-footprint planning assumption confirmed the White House’s existing inclination to avoid nation building.[39] The low troop figure provided a permission structure for accepting evidence that confirmed the existing bias and anchored subsequent planning assumptions. Likewise, Jervis described several instances of this phenomenon, regarding confirmation bias and White House decision making, during the prelude to the 2003 Iraq War.[40] Since anchoring and confirmation bias can be mutually reinforcing, it is imperative that strategists, especially those faced with planning amidst the fog and friction of war, learn techniques to mitigate their effects.

How to Mitigate Anchoring Bias

System 1 tries to create a world that justifies why the anchor number is correct. It tries to make one’s perception true so the brain can deal with the next issue. The first step to mitigate the brain’s inclination to anchor on the first figure it encounters is to, again, slow down. After the strategist has successfully fought the urge to accept the suggestions of System 1, he or she can begin to consider the opposite.

The consider-the-opposite strategy works exactly like it sounds. Individuals are induced to contemplate the possible outcomes at odds with their prevailing belief. The means can vary, but one study found that by employing explicit instructions, test subjects were able to retrain their thought process to avoid rash judgments.[41] In a strategy team, it may be effective for the team leader to set the expectation that subordinate team members will show the results of their consider-the-opposite methodology. This technique is designed to fight the brain’s desire to make something seem true by forcing one to consider alternatives or alternative explanations. Multiple studies have shown that test subjects who consciously consider the opposite are less susceptible to the anchoring bias because they take the time to consider the possibility of the opposite outcome.[42] A strategy to consider the opposite may “disrupt the fast heuristic processing of System 1 and activate System 2, which requires more cognitive efforts and information elaboration.”[43] The consider-the-opposite strategy helps to render the initial anchoring figure irrelevant—because it often is.

Representative Bias

Representative bias might more aptly be called the déjà vu bias, and it is a close cousin of another heuristic Kahneman calls availability bias. A key feature of both biases is the tendency to associate a new event with previous occurrences that seem analogous. This bias, just like the others this essay discussed, is the brain’s attempt to quickly categorize new information. In this instance, System 1 tries to rapidly search for instances similar to a present situation. It can be comforting when one associates uncertainty with a familiar situation because it suggests that similar tools may be used to address it. Problems arise when the circumstances at hand are unlike previous situations, despite the System 1 suggestion to treat them as the same.[44]

Representative Bias in World War I

While World War I is replete with examples of poor judgment, the case study of the Austro-Hungarian Empire’s mobilization for war is especially notable. This is primarily because of its military chief of staff’s failure to adequately predict and prepare for war with Russia. Instead, he relied on his recent memory of past experiences as a guide for the future. Austro-Hungarian field marshal and chief of the general staff, Count Franz Conrad von Hötzendorf, demonstrated, perhaps, the most spectacular failure of leadership in the entirety of the war. According to Mark Grotelueschen and Derek Varble, Conrad failed for two reasons: he failed to plan for the right war and, once engaged, he failed to fight the war effectively.[45] This example focuses only on his pre-war mistakes.

Franz Graf Conrad von Hötzendorf, Austro-Hungarian general, Chief of the General Staff of the Austro-Hungarian Army (Hermann Torggler/Wikimedia)

Prior to World War I, Austria-Hungary had challenges with two countries on its borders: Russia and Serbia. However, the former represented a much larger and more lethal threat to the Habsburg Empire than the latter. Conrad was the chief strategist behind the war plans devised to deal with each country. However, Conrad never tasked his subordinates to create a plan to fight both countries at the same time, which is exactly what Austria-Hungary required in 1914. Therefore, while separate war plans existed at the start of the war, no combined plan did. Nevertheless, Conrad repeatedly pushed civilian politicians in the Austro-Hungarian empire for war, specifically against Serbia, despite the likelihood that Russia would intervene on Serbia’s behalf. His bellicosity stemmed partially from outdated mobilization estimates—he thought it would take Russia too long to prepare for war. Conrad failed primarily because he was unable to separate his perception of the situation (that Russia would take too long to mobilize) from the reality that Russia would enter the war quickly and in force. His susceptibility to representative bias—the inability to effectively judge the situation based on likely probabilities—contributed to the Austro-Hungarian Empire’s eventual demise.

How to Mitigate Representative Bias

In 2009, the Central Intelligence Agency produced a tradecraft primer designed to improve intelligence analysis for the agency and other similar intelligence organizations. This handbook provides a host of structured analytic techniques that, if used properly, may improve one’s ability to “challenge judgments, identify mental mindsets, stimulate creativity, and manage uncertainty” to “wrestle with difficult questions.”[46] Some have argued that structured analytic techniques have not been rigorously tested, while others have proposed ways of doing so.[47,48] Nevertheless, there is merit to some of the methods that provide individuals with approaches for engaging with uncertainty more deliberately.

One technique called what-if analysis may be very helpful for strategists dealing with representative bias.[49] The what-if technique suggests that one should start with the end state and then attempt to provide the logical pathway that led to that conclusion. Representative bias may be prevalent when one uses past experiences as a guide for the present. By thinking backwards, what-if analysis allows one to avoid letting the past influence the present and instead accept a future condition as a given. Then one can apply analytical thinking to identify why hypothetical events in an imagined future transpired the way they did.

If Conrad had approached the problem at hand with more humility in the face of uncertainty and applied what-if analysis, he might have considered both the best- and worst-case outcomes. For Conrad and the Austro-Hungarians, the best-case outcome would have been the future Conrad wanted—a war with either Russia or Serbia but not both. He might then have considered the events that led to a worst-case scenario, the scenario that ultimately occurred. If he had engaged in a more critical, open-minded approach to strategic decision making, he might have observed that the worst-case scenario was a much more likely eventuality and that his personal opinion, clouded by representative bias, was in error.

Conclusion

Human brains operate mostly on autopilot, and the cognitive biases employed to live life often inhibit the ability to make good decisions. In low stakes situations, like deciding what to eat or what to wear, the negative impacts of cognitive biases are negligible. But in the military—where the stakes are often higher and effective decision making can mean the difference between mission success and failure—the consequences of cognitive biases corrupting decision making can be calamitous.

This essay described how confirmation bias, fundamental attribution error, anchoring bias, and representative bias are detrimental to a military strategist’s cognitive process. It argued that they can cause strategists to privilege facts that confirm their beliefs, automatically attribute nefarious motivations to others’ actions, fixate on initial information, and draw incorrect associations between dissimilar events. The essay used examples from the Cuban Missile Crisis, the Korean War, Operation Iraqi Freedom, and World War I to provide historical context. Through these vignettes and many others, one may appreciate how some of the most consequential military events hinged on the influence of unconscious biases during critical junctures.

Effective decision making in complex environments is incredibly challenging even without the limitations that humans bring via System 1 thinking. To avoid undermining the decision-making process and compounding the difficulty, it may be useful for strategists to employ the critical thinking techniques described above. These methods—including engaging in active open-mindedness, empathy, consideration of the opposite, and the what-if technique—are derived from decades of behavioral psychology research. They may help mitigate cognitive bias and allow one’s brain to transition from System 1 to System 2 when the situation requires.

Engaging in System 2 thinking mainly requires two things: the ability to slow down one’s rush to judgment and the subordination of one’s pride to acknowledge that one’s instinctive answer may not be correct. Thus, critical thought can benefit from both patience and intellectual humility. Military strategists should thoughtfully consider their cognitive limitations as well as the range of possible outcomes in pursuit of political goals and in support of civilian leaders. Strategists who devote attention to thinking about thinking and learning from the mistakes of the past may improve their ability to plan for the future.


James M. Davitch is an officer in the U.S. Air Force and a PhD candidate at Virginia Tech’s School of Public and International Affairs. The views expressed are the author’s alone and do not reflect those of the U.S. Air Force, the Department of Defense, or the U.S. Government.




Header Image: Cognition, Memory, and Psychology (Allan Ajifo/Flickr)


Notes:

[1] Kahneman, Daniel. Thinking, fast and slow. Macmillan, 2011.

[2] The mid-2010s saw a gradual increase in the use of terms like “metacognition” and mental shortcuts like “confirmation bias” that Kahneman helped popularize (see Google n-gram graphic: https://books.google.com/ngrams/graph?content=metacognition%2Cconfirmation+bias&year_start=1960&year_end=2019&corpus=26&smoothing=3). Suddenly, talk of biases seemed ubiquitous, especially in business journals and magazines. Perhaps the high-water mark came in 2017 when the media company Visual Capitalist created a daunting graphic depicting 188 individual cognitive biases. See the infographic here: https://www.visualcapitalist.com/wp-content/uploads/2021/08/all-188-cognitive-biases.html

[3] Many other biases can also negatively influence a strategist’s judgment; however, they are beyond the scope of this paper. The author concedes that other biases were likely present and affected decision making during each of the historical examples described. Further, the essay does not imply that the biases described were the root cause of the failures in the historical examples; rather, they were likely one factor amidst a mélange of contributing factors.

[4] This argument aligns with research conducted on intelligence analysis that aimed to help mitigate bias through virtual learning environments. In 2014, an organization called the Intelligence Advanced Research Projects Activity (IARPA) attempted to study what IARPA determined were the most harmful biases to analysts as a part of their “Sirius” research program. The biases included the four that this essay explores (confirmation bias, fundamental attribution error, anchoring bias, and representativeness bias) as well as the blind spot bias and projection bias. For more information, see https://www.iarpa.gov/index.php/research-programs/sirius/baa

[5] For example, when confronted with a simple math problem such as “What is 1+1?” the mind instantly generates the number “2.”

[6] Ibid, 22. If System 1 and System 2 were personified, System 1 would be the laid-back, happy-go-lucky character always looking for the easy answer. The System 2 character would be more uptight, always second-guessing System 1’s judgment.

[7] Ibid, 26.

[8] See Raugh (2019) for a quantitative study on the effects of military institutional tendencies and habits on military officer critical thinking abilities. Raugh concludes, “Military leaders, such as senior commissioned officers with 20 or more years of service, are especially prone to these institutional habits and tendencies and thus negatively affected by them.” Raugh, David. "Superforecasting or SNAFU: The Forecasting Ability of the US Military Officer." (2019). See also Dane, Erik, and Michael G. Pratt. "Exploring intuition and its role in managerial decision making." Academy of management review 32, no. 1 (2007): 33-54; Akinci, Cinla, and Eugene Sadler‐Smith. "Intuition in management research: A historical review." International Journal of Management Reviews 14, no. 1 (2012): 104-122; Hodgkinson, Gerard P., Eugene Sadler-Smith, Lisa A. Burke, Guy Claxton, and Paul R. Sparrow. "Intuition in organizations: Implications for strategic management." Long range planning 42, no. 3 (2009): 277-297.

[9] Kahneman’s acronym that he uses to describe this tendency is “WYSIATI” which stands for “What You See Is All There Is.” The problem with the mind’s tendency to engage in WYSIATI is that it neglects to consider that there are significant considerations that the mind does not see. Sometimes this is called negative evidence. Relatedly, the term “disconfirmation bias” describes the situation when one rejects information that does not conform to what one may want to hear, even when presented with valid evidence to the contrary.

[10] Partially due to the “photo gap,” Joseph Caddell calls the imagery analysis showing nuclear missiles in Cuba a near-failure rather than an intelligence success. He also highlights the lack of multi-intelligence discipline fusion, the inefficiency of the intelligence cycle, and the prevalence of disinformation operations that continue even today as barriers to intelligence collection and analysis. See Caddell, Joseph. "Discovering Soviet Missiles in Cuba: How Intelligence Collection Relates to Analysis and Policy." War on the Rocks (2017).

[11] Sherman Kent wrote a defense of his organization’s rationale and explained why it led to an incorrect assessment. See https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol51no3/revisiting-sherman-kent2019s-defense-of-snie-85-3-62.html

[12] In fact, archival documents later showed that Soviet premier Nikita Khrushchev had decided to send the weapons as early as May 1962. See Allison, Graham T., "Essence of decision." (1999), page 202.

[13] Rumors of the Soviet build-up were hardly private or kept within intelligence community circles. For example, in an effort to pressure Democrats in Congress as well as the White House, Senator Kenneth Keating made a speech on the floor of the U.S. Senate on August 31, 1962, warning about “rocket bases” in Cuba. White, Mark. The Cuban missile crisis. Springer, 1995, page 107.

[14] May and Zelikow’s transcription of the ExComm meetings in their book, The Kennedy Tapes, provides a fascinating insight into executive-level strategic decision making. They relate how President Kennedy’s National Security Advisor, McGeorge Bundy, relayed the news of the Soviet missiles to the president at 9 a.m. on October 16th. See May, Ernest R., and Philip Zelikow, eds. The Kennedy tapes: Inside the White House during the Cuban missile crisis. WW Norton & Company, 2002, page 31. However, on national television just three days prior, Bundy stated, “I think there is no present likelihood that the Cubans and the Cuban government and the Soviet government would in combination attempt to install a major offensive capability.” See Garthoff, Raymond. Reflections on the Cuban Missile Crisis: Revised to Include New Revelations from Soviet & Cuban Sources. Brookings Institution Press, 2011, page 34.

[15] Allison, Graham T., "Essence of decision." (1999), page 203.

[16] Photography taken from aircraft cameras was not as common during the Cuban Missile Crisis as it is in battlefield military operations today. Satellite photography, which had first become operational only a year prior, was even rarer and somewhat less reliable. See Caddell Jr, Joseph W. "Corona over Cuba: The Missile Crisis and the Early Limitations of Satellite Imagery Intelligence." Intelligence and National Security 31, no. 3 (2016): pages 416-438.

[17] Tension exists within the political science literature regarding the idea that weapons can actually be offensive or defensive. This tweet thread by Paul Poast provides a very good summary of the debate: https://twitter.com/ProfPaulPoast/status/1365671660024721411?s=20&t=hJc7E0Cg1dJge3nFYpNF7Q

[18] Clifford, Clark M., and Richard C. Holbrooke. Counsel to the president: A memoir. Random House Incorporated, 1991.

[19] Haran, Uriel. “The Role of Actively Open-Minded Thinking in Information Acquisition, Accuracy, and Calibration.” Judgment and Decision Making 8, no. 3 (May 2013): pages 188–201. https://doi.org/http://journal.sjdm.org/13/13124a/jdm13124a.pdf.

[20] Overconfidence is its own barrier to critical thinking and plays into many of the other biases listed in this essay. See Moore, Don A., and Paul J. Healy. "The trouble with overconfidence." Psychological review 115, no. 2 (2008): 502.

[21] Healy, Patrick. “Fundamental Attribution Error: What It Is & How to Avoid It.” Harvard Business School Online. Harvard Business School, June 8, 2017. https://online.hbs.edu/blog/post/the-fundamental-attribution-error.

[22] https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/Tradecraft%20Primer-apr09.pdf

[23] Abt, Parker. "National Liberation Movements and Increasing Humanitarian Law Compliance." Cornell International Affairs Review 12, no. 2 (2019): page 111.

[24] See also Gilbert, Daniel T. "Thinking lightly about others: Automatic components of the social inference process." (1989).

[25] Kahneman, Daniel, and Jonathan Renshon. "Why Hawks Win." Foreign Policy, no. 158 (2007): pages 34-38.

[26] This is an example of failing to adequately appreciate how one’s actions can translate to others; however, this type of episode is hardly confined to the Cold War. Jackson (2019) shows a vivid example of how perceptions and misperceptions on the Korean peninsula almost led to conflict in 2017. See Jackson, Van. On the Brink: Trump, Kim, and the Threat of Nuclear War. Cambridge University Press, 2019.

[27] Kahneman and Renshon. "Hawks," page 34. Emphasis added.

[28] See also Fettweis (2018) for a discussion of how the U.S.’s views of itself have historically been incongruent with other states’ views of U.S. behavior. Fettweis, Christopher. "Psychology of a Superpower." In Psychology of a Superpower. Columbia University Press, 2018.

[29] “Evaluate the adversary” is the third step in the Joint Intelligence Preparation of the Operational Environment process. See page I-17 here https://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/jp2_0.pdf

[30] See also the political psychology of Robert Jervis, especially The Logic of Images (1989) for a description of how, on an international level, governments attempt to control their image through signals and indices.

[31] Kahneman and Renshon. "Hawks," page 35.

[32] Jervis, Robert. "Perception and misperception in international politics." In Perception and Misperception in International Politics. Princeton University Press, 2017, page 187.

[33] https://twitter.com/uxresearch/status/1079148047257395200

[34] Kahneman relates a story explaining how he and his research partner, Amos Tversky, observed the anchoring effect during their studies. They created a wheel of numbers that was rigged to only stop at 10 or 65. Their research subjects spun the wheel and were then asked to write down the number where it stopped, which, naturally, would be either 10 or 65. Then the subjects were asked what percentage of United Nations member states were African countries. Obviously, spinning the wheel provides nothing that would assist someone with such a specific question, and the subjects should have considered the numbers 10 or 65 irrelevant. However, they did not. Kahneman notes, “The average estimates of those who saw 10 and 65 were 25% and 45%, respectively.” People’s estimates were anchored to the original number they saw on the “wheel of fortune,” even though it was irrelevant. See Kahneman, Thinking, fast and slow, page 118.

[35] The authors presented to professional real estate agents, who should have been familiar with estimates of various property values, several initial listing prices. The agents that received higher listing prices reported higher estimates of the property’s value in comparison to the agents that were anchored to lower initial listing prices. Northcraft, Gregory B., and Margaret A. Neale. "Experts, amateurs, and real estate: An anchoring-and-adjustment perspective on property pricing decisions." Organizational behavior and human decision processes 39, no. 1 (1987): 84-97.

[36] See Bacevich, America’s War for the Greater Middle East, p 318, endnote 31.

[37] Williams, “Heuristics and Biases in Military Decision Making.” page 48.

[38] Gordon, Michael R., and Bernard E. Trainor. Cobra II: The inside story of the invasion and occupation of Iraq. Vintage, 2006. p 142. See also Ricks, Thomas E. Fiasco: The American military adventure in Iraq. Penguin UK, 2007 and Bolan, Christopher J. Risk in American foreign military interventions. Georgetown University, 2009.

[39] See Hafner’s discussion about how intelligence related to the Chalabi plan fed confirmation bias in the White House. Hafner, Ferdinand. Cognitive Biases and Structural Failures in United States Foreign Policy: Explaining Decision-Making Dissonance in Phase IV Policy and Plans for Iraq. Naval Postgraduate School Monterey, CA, 2007.

[40] Jervis, Robert. "Reports, politics, and intelligence failures: The case of Iraq." Journal of strategic studies 29, no. 1 (2006): 3-52. See pages 24-27 for the author’s discussion of the mix of biases that clouded the judgment of political leaders during the WMD search years of Operation Iraqi Freedom.

[41] See Lord, Charles G., Mark R. Lepper, and Elizabeth Preston. "Considering the opposite: a corrective strategy for social judgment." Journal of personality and social psychology 47, no. 6 (1984): page 1231.

[42] Lee, Yu-Hao, et al. "Training anchoring and representativeness bias mitigation through a digital game." Simulation & Gaming 47.6 (2016): pages 751-779.

[43] Mussweiler, Strack, and Pfeiffer (2000) argue their findings suggest that “making people aware of anchoring, and motivating its avoidance, can be successful at mitigating the anchoring effect.” Mussweiler, Thomas, Fritz Strack, and Tim Pfeiffer. "Overcoming the inevitable anchoring effect: Considering the opposite compensates for selective accessibility." Personality and Social Psychology Bulletin 26, no. 9 (2000).

[44] See also Neustadt and May (1986) for a very useful description of why incorrectly associating present circumstances with previous experiences that merely seem similar can yield suboptimal prescriptions. Neustadt, Richard E., and Ernest R. May. Thinking in Time: The Uses of History for Decision-Makers. (1986).

[45] Add to that his tendency to routinely blame others for his failures. See Mark Grotelueschen and Derek Varble, “Count Franz Conrad von Hötzendorf and the Failure of Strategic Leadership,” in Worst Military Leaders in History, edited by John Jennings and Chuck Steele (London, UK: Reaktion Books, 2022).

[46] See “CIA Tradecraft Primer.” https://www.cia.gov/static/955180a45afe3f5013772c313b16face/Tradecraft-Primer-apr09.pdf

[47] Welton Chang, Elissabeth Berdini, David R. Mandel & Philip E. Tetlock (2018) Restructuring structured analytic techniques in intelligence, Intelligence and National Security, 33:3, pages 337-356.

[48] Folker, Robert D., and Joint Military Intelligence College. “Intelligence Analysis in Theater Joint Intelligence Centers: An Experiment in Applying Structured Methods.” Washington, D.C.: Joint Military Intelligence College, 2000. Print. Occasional Paper (Joint Military Intelligence College (U.S.)); No. 7.

[49] See also “Uncertainty in the Information Supply Chain” in which Dr. Monica Tremblay advocates for the use of “What If” analysis to alleviate the effect of representative bias on decision making in an information environment. See Tremblay, Monica Chiarini, "Uncertainty in the information supply chain: Integrating multiple health care data sources" (2007). Graduate Theses and Dissertations. https://scholarcommons.usf.edu/etd/2387