The Strategy Bridge


Penetrate Uncertainty: Descriptive Planning in a Complex Tactical Environment

In early 1871 the world changed forever, at least for military planners. Helmuth von Moltke the Elder not only destroyed the French army, he also destroyed the idea that a single general could simultaneously be the politician, strategist, operational artist, and tactician. A single commander could no longer control the variables on the battlefield.[1] Moltke unknowingly ushered in the era of complexity. A century and a half later, complexity is now part of the common lexicon in U.S. Army doctrine and a staple of the curriculum in U.S. military institutions.[2] Although complexity is deeply rooted in theory and doctrine, how to address it in the tactical environment remains in question. Moltke realized early on that “no plan...extends with any certainty beyond the first contact with the main hostile force.”[3] How to reevaluate the plan after encountering the enemy’s main hostile force remains elusive to commanders and staffs today. It is through descriptive planning—planning with less detail and more forecasting—and scenario planning that military units can win in complex environments. Descriptive planning allows commanders to adapt to the circumstances and win in large-scale combat operations.[4]

Soviet soldiers attack a house in Stalingrad, February 1943 (Wikimedia)

The U.S. Army’s Military Decision Making Process provides a step-by-step model for decision-making and order production. However, it provides insufficient direction for how to adapt when confronted with complexity. Commanders and staffs often scramble to re-plan when faced with adversity, consuming the entire organization in an attempt to get back on the plan or develop a new one. The French experienced this frustration at Dien Bien Phu, and the Germans felt it at Stalingrad. When encircled and barraged with artillery, the French and German plans collapsed and their missions failed.[5] Militaries today share the French and German experience because organizations do not deliberately account for the need to reevaluate, re-plan, and re-execute. Before investigating this concept further, it is necessary to explore complexity theory, the Cynefin Framework, systems, emergence, and complex adaptive systems.[6]


In the Cynefin Framework, the complex domain is one of five domains, alongside the simple, complicated, chaotic, and disordered domains. Created in 1999 by IBM employee David Snowden to improve intellectual capital, the Cynefin framework offers a tool to help determine the nature of the operating context and inform planning.[7] In the simple domain, cause and effect are intuitive, and the environment is predictable. In the complicated domain, cause and effect are known, although obscured, and problem solving requires expertise. In the chaotic domain, the relationships between cause and effect are impossible to determine. Within the complex domain, variables are interdependent, and cause and effect are unknowable. Complexity, therefore, is when interdependent and autonomous parts interact to create an unpredictable outcome. In such an environment, like war, the only way to judge what to do is to experiment: probe - sense - respond.[8]

Directly linked to complexity is the idea of systems, something planners must understand to fully account for the operational environment. Systems have interrelated parts; when a part of the system changes, the other parts also change in unknowable ways. Put differently, the whole of the system is different from its parts, not merely greater than them. To qualify as a system, an entity must be both dynamic (constantly changing) and evolving (having emergent properties). Emergence is the idea that minor or even seemingly inconsequential actions in complex environments interact to create completely unforeseen outcomes.[9] This means that systems are adaptive. Complex adaptive systems acclimatize in unanticipated ways, generate unexpected behaviors, and continuously evolve to survive.[10] Consequently, two important insights emerge. First, reductionism, or seeking to understand the system by looking only at the units and their relations with one another, is not an appropriate way to manage a complex system; and second, because most systems have either been designed to cope with adversity or have evolved in the face of it, breakage or overload at one point rarely destroys them.[11]

Military operations take place in the complex domain against complex adaptive systems. An enemy unit, the physical environment, the strategic landscape, the political arena, and others comprise the ecosystem of war. Studying one in isolation is useful but hardly sufficient. For example, an enemy tank brigade evolves when it interacts with the environment and violently clashes with another force. Planners must understand the dynamic and constantly evolving nature of systems and, more importantly, remain patient as properties emerge over the course of combat to inform plans. Planners, however, typically plan as if operating in a complicated environment. They revert to reductionism, assume cause and effect, and consult experts. This type of planning is dangerous in that it builds deceptive mental models and false narratives about the future. The track record of so-called expert pundits operating in complex environments is miserable. In economics, politics, sports, and every other discipline, specialists are routinely inaccurate in their understanding of the future.[12] This is because, in part, specialists mistakenly apply routine solutions to volatile problems. Instead, leaders and units need to experiment. Only by gaining feedback from the system can organizations accurately plan.

In Bleibtreu's “Battle of Königgrätz,” the Prussian King Wilhelm I, Bismarck and General Helmuth von Moltke the Elder observe the largest encirclement in military history. (Wikimedia)

Plans rarely survive beyond contact with the enemy’s main hostile force. Consequently, detailed planning beyond that quickly becomes problematic as unforeseen variables interact to challenge the plan. Shared understanding among commanders and staffs, gained before execution, often decreases with the initiation of combat. Plans assume cause and effect, and as soon as an effect fails to proceed from a cause, breakdown occurs. As an operation progresses over time, shared understanding decreases along with the ability to determine cause and effect, while emergence and risk increase.

In 1973, the Israelis experienced breakdown by assuming cause and effect and not fully accounting for all possibilities. When Egypt and Syria launched a joint surprise attack against the Israelis on Yom Kippur, Israel’s lethargic response was due, in part, to parochial planning. Its “no enemy gain” rule meant Israel forfeited the use of its strategic depth and consequently surrendered tactical options along with it. Israel’s two main defensive plans, Dovecote and Rock, assumed adequate warning of war that would allow for an immediate counterattack and a swift victory. Accordingly, Israel’s tactical planners did not account for other alternatives such as a delaying action, a two-front war, or the possibility of insufficient tactical warning. In short, Israel failed to recognize the limitations of its plans.[13]

In U.S. Army doctrine, the commander’s intent helps subordinate units continue the mission when conditions change or when current orders are no longer relevant. It is a powerful but short-term solution. A plan is still required to synchronize operations across echelons to accomplish the end state. Therefore, acknowledging that the plan will not survive beyond encountering the enemy’s main hostile force, and instead emphasizing branch plans to exploit success or counteract catastrophe, is an empowering means to manage complexity. The ability to adapt, select options that are suitable for the current conditions, and plan as the situation emerges is the only way to combat complexity. In other words, probe - sense - respond.


Probe - sense - respond is not a new idea. History is kind to planners who allowed for tactical adjustment through probing and unforgiving to those who planned with an inelastic style. Victors can trace their success to the ability to adapt as the conditions become known. During the Revolutionary War, George Washington’s brilliance in New York and New Jersey in 1776 anticipated Moltke by maintaining resolve while achieving flexibility in operational means. Washington penetrated uncertainty by navigating toward the political objective and responding to British actions through ad hoc plans. Nearly a hundred years later, in 1863, Ulysses S. Grant famously failed to cross the Mississippi River several times but constantly reevaluated, re-planned, and re-executed to eventually find a weak point in the Confederate defense and seize Vicksburg. During World War I, France’s ability to recreate an entire army after near-complete defeat and turn the Germans’ western flank ultimately led to victory on the Marne.[14] These examples suggest that the ability to adapt, more than the ability to plan, is the most important element of victory. In today’s environment, however, adapting is insufficient. Planners must plan to adapt. Adapting is readjusting a failed plan. Planning to adapt is anticipating readjusting a failed plan.

“Washington inspecting the captured colors after the Battle of Trenton,” by Edward Percy Moran (Wikimedia)

There is a limit to how much planners can prepare. Complexity theorist Per Bak stated, “As one attempts to make predictions further and further into the future, the amount of information one needs to gather about the initial conditions increases exponentially.”[15] Some of this limitation is attributable to emergence. Past a certain point, nothing in mathematics, physics, or history will tell planners what will happen next.

So, how does this translate to planning? Planning theorist Henry Mintzberg warned that traditional planning leads to a calculated and formalized list of steps. It is devoid of creativity and vision and often results in organizations missing inventive opportunities. Mintzberg instead promoted strategic thinking, or carefully mixing traditional planning with creative thinking to emphasize intuition and agility.[16] Mintzberg’s strategic thinking translates to descriptive planning at the tactical level.

Descriptive planning allows commanders to adapt to changing environments. Branch plans, instead of a single operation order, become the main output of the Military Decision Making Process. Focused on decisions and the current situation, descriptive planning is a holistic staff effort to envision future tactical scenarios, or numerous branch plans, grouped around phases. Using this method, planners plan with a high degree of detail for the ensuing phase and deliberately plan with less detail for the remaining phases, instead allowing the conditions to develop as the operation progresses before initiating further planning. Descriptive planning accounts for all reasonable tactical scenarios—one phase at a time—and when a tactical scenario becomes evident, commanders employ predetermined means at a specific and projected point in time and space.
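The phased, branch-centric structure described above can be sketched as a simple data model: detailed planning only for the ensuing phase, with branch plans keyed to observable trigger conditions. This is a purely illustrative sketch in Python; all names, triggers, and allocated means are invented, not doctrinal.

```python
from dataclasses import dataclass, field

@dataclass
class BranchPlan:
    """A branch plan keyed to an observable trigger condition."""
    name: str
    trigger: str            # condition that activates this branch
    allocated_means: list   # assets earmarked for this scenario

@dataclass
class Phase:
    """A phase: high detail only for the ensuing phase, branches for the rest."""
    name: str
    detail: str
    branches: list = field(default_factory=list)

def select_branch(phase, observed_condition):
    """Return the branch whose trigger matches what the situation reveals."""
    for branch in phase.branches:
        if branch.trigger == observed_condition:
            return branch
    return None  # no match: re-enter planning (probe - sense - respond)

phase_one = Phase("Phase I", detail="high", branches=[
    BranchPlan("Exploit", trigger="enemy withdraws",
               allocated_means=["reserve armor"]),
    BranchPlan("Defend", trigger="enemy counterattacks",
               allocated_means=["attack aviation"]),
])

print(select_branch(phase_one, "enemy counterattacks").name)  # Defend
```

The point of the structure is the `None` case: when no branch matches, the model forces a return to planning rather than pretending the base plan still holds.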

Descriptive planning requires commanders and staffs to become comfortable with the unknown, something with which the human mind struggles. When we cannot immediately fulfill the desire to know, we become motivated to reach a conclusion, a drive referred to as “cognitive closure.”[17] Unknowingly, commanders and planners often become susceptible to cognitive closure and demand polished plans. Products such as synchronization matrices reinforce a false narrative that war is predictable, perhaps even something to be scheduled. The idea that variables can be controlled over time against complex adaptive systems is categorically false. In reality, the only certainty in war is uncertainty.

Descriptive planning retains the utility of the Military Decision Making Process and adds systems thinking to it to account for uncertainty. As mentioned previously, a key component of adaptation is becoming comfortable with uncertainty and filling in the blanks as information becomes available. To achieve this, the commander decides on a broad course of action (the base plan) as determined in the Military Decision Making Process, but also decides which other tactical scenarios might become a reality and subsequently plans to allocate the necessary means to seize, retain, and exploit the initiative for each scenario.


Scenario planning requires forecasting. Forecasting is different from predicting. Both assume the state of the future, but forecasting involves calculated and continuous analysis. Predicting is guessing, whereas forecasting is based on probability. Philip Tetlock’s book, Superforecasting, offers a method that relates to descriptive planning.[18] Tetlock advises forecasters to distinguish the known from the unknown. This allows for focused analysis and information collection. Forecasters should also persistently challenge assumptions and update them as the situation develops. Assumptions are necessary for planning, but stagnant assumptions lead to a poor foundational plan. Tetlock also advises that skilled forecasters update their forecasts endlessly as additional information becomes accessible. Tetlock states the best forecasters view their ideas as hypotheses in need of testing amidst a constant flow of information. Forecasters are in a relentless state of learning, processing a steady flow of information that might prove valuable for adjusting their estimates.[19] Empirical data show forecasters using Tetlock’s system are roughly 30% more accurate at forecasting future events than professional intelligence analysts.[20]
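Tetlock’s continuous updating has a simple formal analogue in Bayesian probability: each new observation revises the estimated likelihood of a scenario. The sketch below illustrates that updating logic only; it is not Tetlock’s actual method, and the scenario and numbers are invented.

```python
def bayes_update(prior, p_obs_if_true, p_obs_if_false):
    """Revise the probability of a scenario after one new observation."""
    numerator = prior * p_obs_if_true
    return numerator / (numerator + (1 - prior) * p_obs_if_false)

# Invented example: start at 40% that the enemy will counterattack.
# A new report (bridging assets moving forward) is judged three times
# as likely to appear if a counterattack is actually coming.
p = 0.40
p = bayes_update(p, p_obs_if_true=0.6, p_obs_if_false=0.2)
print(round(p, 2))  # the estimate rises to 0.67
```

The value is not the arithmetic but the habit it enforces: the forecast is a living number that moves with each report rather than a fixed assumption in the base plan.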

In a complex environment, if an organization presumes a predetermined outcome, it fails to prepare for other credible possibilities, and therefore it is left vulnerable. Forecasting allows planners to determine which tactical scenario might transpire. Scenario planning, combined with forecasting, arms planners with a tool to visualize possible futures. Scenarios have the power to engage decision makers so they pay attention to indications of change. Scenario planning also helps avoid flawed mental models by preventing commanders and staffs from fixating on a false perception of future tactical outcomes.[21] In the business world, persistent scenario practice makes leaders comfortable with ambiguity by offering a visualization of the future. It counters hubris, exposes assumptions, and fosters quick adaptation in times of crisis.[22]

Peter Schwartz, a leading theorist on scenario planning, developed the principal model for preparing for the future: identify factors likely to bear on the problem, organize them into future possibilities, envision paths that would lead to those futures, and devise a strategy for surviving them. For tactical planning, identifying factors likely to bear on the problem translates to variables such as weather, enemy strength, friendly conditions, and environmental circumstances. Next, organize those variables into a manageable number of future scenarios (two to four). For each scenario, develop a narrative that articulates that particular possible future. Afterward, brainstorm what each future means in terms of ways and means. Finally, track the indicators that suggest which scenario might become a reality.[23]
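Schwartz’s steps can be illustrated with a toy model: combine two driving variables into a 2x2 set of futures, attach a narrative to each, and score the scenarios against incoming observations to see which future is emerging. All variables and values below are invented for illustration; a real staff effort would use its own factors and indicators.

```python
from itertools import product

# Two driving variables yield a 2x2 set of candidate futures.
drivers = {
    "enemy_strength": ["degraded", "intact"],
    "weather": ["clear", "restrictive"],
}

scenarios = []
for combo in product(*drivers.values()):
    state = dict(zip(drivers.keys(), combo))
    scenarios.append({
        "narrative": f"Enemy {state['enemy_strength']}, weather {state['weather']}",
        "state": state,
        "indicators": [],  # observable signs that this future is emerging
    })

def score(scenario, observations):
    """Count how many observed conditions match a scenario's state."""
    return sum(1 for k, v in scenario["state"].items() if observations.get(k) == v)

# As reports arrive, re-score to see which future is becoming reality.
observed = {"enemy_strength": "intact"}
likely = max(scenarios, key=lambda s: score(s, observed))
print(likely["narrative"])
```

Keeping the scenario count small (two to four, as the text suggests) is what makes the re-scoring step tractable for a staff in contact.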

Future tactical environments will only increase in complexity. Planners and decision-makers must become comfortable with a range of options, not a single course of action. Assuming cause and effect and relying on a single course of action will all but assure defeat. If planners instead embrace that the plan will not survive beyond first contact with the enemy’s main hostile force and probe - sense - respond as described with descriptive planning, they can overcome complexity and penetrate uncertainty.


Patrick Mulloy is a U.S. Army officer. The views expressed in this article are the author's and do not represent the views of the U.S. Army, the Department of Defense, or the U.S. Government.





Header Image: An AH-64 Apache attack helicopter takes off near soldiers participating in the Allied Spirit VII training exercise Nov. 18, 2017 in Grafenwöhr, Bavaria, Germany. (Spc. Dustin D. Biven/U.S. Army Photo)


Notes

[1] Michael Howard, The Franco-Prussian War: The German Invasion of France, 1870-1871 (New York: Routledge Taylor & Francis Group, 2000).

[2] Complexity theory is taught in military colleges, including the U.S. Army Command and General Staff College and the School of Advanced Military Studies. It is also addressed in concepts and doctrine, including The U.S. Army in Multi-Domain Operations 2028 and Joint Publication 5-0.

[3] Helmuth von Moltke, ed. and trans. by Daniel J. Hughes, Moltke on the Art of War: Selected Writings (New York: Ballantine Books, 1993), 92.

[4] The term “Descriptive Planning” was co-developed with Major Andrew Jenkins, U.S. Army.

[5] Miles Maochun Yu, “The Lessons of Dien Bien Phu,” Hoover Institution, Stanford University, December 22, 2017. Accessed August 7, 2019. https://www.hoover.org/research/lessons-dien-bien-phu.

[6] Complexity theory is an interdisciplinary academic discipline first developed in the 1960s. This article only discusses a small part of the theory. See System Effects: Complexity in Political and Social Life, by Robert Jervis.

[7] David Snowden and Mary Boone, “A Leader’s Framework for Decision Making,” Harvard Business Review, November 2007. David Snowden created the Cynefin Framework in 1999. It is mostly used in the business world and not widely used in the U.S. Military. Cynefin is a Welsh word for habitat.

[8] David Snowden and Mary Boone, “A Leader’s Framework for Decision Making.”

[9] Robert Jervis, System Effects: Complexity in Political and Social Life (Princeton, NJ: Princeton University Press, 1997). See also Alex Ryan, “What is a Systems Approach.”

[10] Serena Chan, “Complex Adaptive Systems,” Massachusetts Institute of Technology, October 31, 2001, Accessed June 1, 2019. web.mit.edu/esd.83/www/notebook/Complex%20Adaptive%20Systems.pdf.

[11] Jervis, System Effects: Complexity in Political and Social Life.

[12] David Epstein, “The Peculiar Blindness of Experts,” The Atlantic, June 2019. Accessed June 1, 2019. https://www.theatlantic.com/magazine/archive/2019/06/how-to-predict-the-future/588040/.

[13] Abraham Rabinovich, The Yom Kippur War: The Epic Encounter That Transformed the Middle East (New York: Random House, 2017).

[14] See David Fischer, Washington’s Crossing (New York: Oxford University Press, 2004); Ron Chernow, Grant (New York: Penguin Press, 2017); Holger H. Herwig, The Marne, 1914: The Opening of World War I and the Battle That Changed the World (New York: Random House Trade Paperbacks, 2009).

[15] Per Bak, “Self-Organized Criticality,” Scientific American, 264(1), (1991), 46-53.

[16] Henry Mintzberg, “The Fall and Rise of Strategic Planning,” Harvard Business Review, January 1994.

[17] Arie Kruglanski and Donna M. Webster, “Motivated Closing of the Mind: ‘Seizing’ and ‘Freezing,’” Psychological Review, 103(2), (1996), 263-83. See also Jerome Kagan, “Motives and development,” Journal of Personality and Social Psychology, 22(1), (1974), 51-66.

[18] Philip Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction (New York: Broadway Books, 2015).

[19] Philip Tetlock and Dan Gardner, Superforecasting: The Art and Science of Prediction.

[20] Alix Spiegel, “So You Think You’re Smarter Than A CIA Agent,” National Public Radio, April 2014. Accessed June 1, 2019. https://www.npr.org/sections/parallels/2014/04/02/297839429/-so-you-think-youre-smarter-than-a-cia-agent.

[21] Peter Senge, The Fifth Discipline: The Art and Practice of the Learning Organization (New York: Crown Business, 2006).

[22] Angela Wilkinson and Roland Kupers, “Living in the Futures,” Harvard Business Review, May 2013. Accessed June 1, 2019. https://hbr.org/2013/05/living-in-the-futures.

[23] Peter Schwartz, “Your Future in 5 Easy Steps: Wired Guide to Personal Scenario Planning.” Wired, July 2009. Accessed June 1, 2019. https://www.wired.com/2009/07/future-5-easy-steps-wired-guide-personal-scenario-planning/.