Knowing the Knowable: Two Fallacies of the Military Paradigm

Introduction

The U.S. military paradigm is rooted in two dangerously outmoded assumptions about 21st-century reality. The first is the assumption of proportionality: that every action has an equal and opposite reaction, with the corollary that future causal relationships are generally knowable. The second is the assumption of additivity: that the behavior of a whole is a scaled reflection of its disaggregated parts, so that knowing the parts provides full knowledge of the whole.

These assumptions are at the foundation of the U.S. military paradigm, manifest in both doctrine and practice throughout the joint force. Military planners solve problems by projecting cause-and-effect relationships onto their chosen parts of an operational environment. They attribute undesirable conditions to enemy actions and, having determined the causal relationships, label the resulting discrepancy a “problem statement.” Once the problem is discovered, planners direct limited resources at the reciprocal causes, or decisive points, assuming these will generate desirable effects on the parts and thereby achieve the end state conditions of the whole.

In the context of 21st-century complexity and interconnectivity, practice and experience prove this deterministic mode of knowing is no longer effective; a paradigm shift is in the making. The military’s increased demand for innovation cells, agile work practices, and fail-fast modernization efforts reflect an institution wrestling with a reality it can no longer analyze and disaggregate into submission.[1] This paradigm causes the military to stumble recklessly through a reality of globally interconnected systems behaving in nonlinear and emergent ways that are anything but knowable.

A Car and the Sum of Its Parts (ResearchGate)

Determinism: Belief in the Knowable

The military paradigm may be characterized as deterministic, a worldview composed of two prevailing beliefs about how reality functions. First, the assumption of proportionality accepts that for every action there is an equal and opposite reaction; causes will generally result in proportional and reciprocal effects.[2] Systems theorist Frans Osinga writes of proportionality that “if you know a little about [a system’s] behavior, you know a lot.”[3] This linear logic assumes most systemic behavior is predictable and knowable because it consistently obeys rules that one can determine.[4] To philosopher Dr. Ben Zweibelson, proportionality is akin to mechanistic reductionism, assuming that reality is composed of “discoverable rules and laws, where once a hypothesis is validated…we can apply that rule towards everything, anywhere, anytime.”[5] A belief in proportionality leads one to retroactively assign causes to known effects and assume those causal relationships will persist into the future. Sociologist Karl Weick explains this as a process by which organizations “see what they have seen before, and they link these memories in a sequential train of associations… [they] tend to imagine the past and remember the future.”[6]

The second belief of the deterministic worldview is that of additivity, which accepts that most wholes are equal to the sum of their parts. Under this logic, an observer can determine the effect a cause will have within a given whole if they know the characteristics of its constituent parts. Therefore, one can solve problems by breaking them into smaller pieces, analyzing the parts, and adding them together to obtain a solution.[7] Strategist Henry Mintzberg characterizes this as a “reductionist” logic that reduces states and processes to their parts for analysis.[8] By extension, describing the behaviors and features of a part suffices to describe the scaled behaviors and characteristics of the whole.
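
Taken together, proportionality and additivity amount to the textbook definition of a linear system. Stated compactly, in a standard formulation from linear systems theory rather than from the article’s sources, a system f is linear when:

$$f(x + y) = f(x) + f(y) \quad \text{(additivity)}$$

$$f(\alpha x) = \alpha f(x) \quad \text{(proportionality, or homogeneity)}$$

Analysis as a problem-solving method implicitly assumes the systems it studies satisfy both properties; the sections that follow examine what happens when they do not.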

Analysis: The Fallacy of Knowing in the U.S. Military Context

The deterministic assumptions of additivity and proportionality canalize the military into an analysis-centric methodology for producing knowledge. In the U.S. military context, understanding cause and effect invariably runs through scientific analysis. Regardless of the specific doctrine, the first tangible step of military planning is mission analysis, a procedure used to “identify all other tasks necessary to accomplish the mission.”[9] Planners analyze the environment, enemy, terrain, civilian infrastructure, and more while poring over data and intelligence reporting to generate knowledge of what is. Joint Publication 2-0, Joint Intelligence, states that “analysis and development of products…help the commander and staff understand the complex and interconnected [operating environment].”[10]

The additivity assumption underlies analysis: a belief that disaggregation results in quantitative reduction rather than qualitative transformation. Doctrine breaks the world into domains, dimensions, mission and operational variables, warfighting functions, and staff sections. Staffs analyze the disaggregated environment and enemy factors, then re-aggregate the parts to predict the cause-and-effect relationships of the whole. They break the enemy into warfighting functions, box it into geographic segments during wargames, and reduce it to a center of gravity as the knowable source of all its power.

The assumption of proportionality supports planning: a belief that cause and effect are knowable through applying historical rules and planning factors. Planners ascribe retrospective causes to observed effects and derive universal principles to enable cognitive ease and procedural scaling. They assume future systems will adhere to these principles in a predictable, proportional manner. Doctrine applies principles of war, mission command, joint operations, the operations process, and more to every new problem. Entire sections are devoted to effects, under the assumption that predictable, knowable causes exist within any system addressed. To validate courses of action, planners use historical correlation-of-forces-and-means calculators, assuming segmented portions of the enemy will generally react as historical ratios predict.
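
To see the proportional logic such calculators embody, consider a minimal sketch using Lanchester’s square law of attrition as a stand-in; the equations are standard, but the force sizes and kill rates below are notional assumptions for illustration only.

```python
# Lanchester's square law: each side's attrition rate is strictly
# proportional to the size of the opposing force, so cause (force ratio)
# maps proportionally and predictably onto effect (attrition).

def lanchester(blue, red, blue_rate, red_rate, dt=0.01):
    """Integrate dB/dt = -r*R and dR/dt = -b*B until one side is destroyed."""
    while blue > 0 and red > 0:
        blue, red = blue - red_rate * red * dt, red - blue_rate * blue * dt
    return max(blue, 0.0), max(red, 0.0)

# With a 3:1 ratio and equal effectiveness, the outcome is fully "knowable"
# before a shot is fired; the model's world contains no adaptation.
print(lanchester(blue=3000, red=1000, blue_rate=0.05, red_rate=0.05))
```

The model is knowable precisely because its parts are isolated and linear; a real adversary, by contrast, adapts to being attrited.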

Isolation & Linearity in Systems: Knowable Cause and Effect

Chess Pieces (Master Class)

Systems in which cause and effect are knowable lend themselves to analysis as a valid way of generating knowledge. For analysis to be valid, however, these systems must satisfy two conditions. The first is isolation: the parts or nodes within the system must be isolated enough that their interactions have limited causal effect on other nodes and may be ignored for problem-solving purposes.[11] This lets the analyst focus on singular nodes without accounting for interrelated effects across the rest of the system. A chess player need not worry about whether the bishop and rook are getting along, or if the opposing kings are cousins. The game pieces have no relational considerations, so interactions are linear and predictable.

The second condition for analytical problem solving is linearity in the behavior of the system’s nodes themselves.[12] The behavioral characteristics of each part must match those of all similar parts in all similar relations. This enables an additive approach to problem solving, in which the whole is, in fact, behaviorally and relationally equal to the sum of its parts. For example, an engineer may use the load-bearing characteristics of a single steel beam to calculate the additive load-bearing capacity of an entire bridge, as in the sketch below. A chess player may be confident that each pawn will behave the same regardless of the move being made.
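
What additive problem solving looks like when both conditions hold can be shown in a few lines; the beam capacity and beam count below are notional assumptions.

```python
# When parts are isolated (beams do not change one another's behavior) and
# linear (every beam behaves like every other), the whole is literally the
# sum of its parts: analyzing one part yields knowledge of the whole.

BEAM_CAPACITY_KN = 150.0  # load one steel beam bears (notional assumption)
NUM_BEAMS = 12            # identical beams in the span (notional assumption)

bridge_capacity_kn = NUM_BEAMS * BEAM_CAPACITY_KN
print(f"Total bridge capacity: {bridge_capacity_kn} kN")  # 1800.0 kN
```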

Exploring the elements of isolation and linearity within a system’s parts makes it apparent that managing problems exclusively through an analytical methodology works best in systems with low interconnectivity and low behavioral diversity. These conditions enable the analyst to predict cause and effect with high accuracy and high confidence, knowing the truly knowable. Many of the problems the military profession faces, however, do not exist within these types of systems, and may resist knowledge generation through analysis entirely.

Complex and Adaptive Systems: Emergent Cause and Effect

Many systems in the world are now what theorists call complex adaptive systems. These systems are the domain of emergent cause and effect, and they share two universal characteristics. First, they are complex: “wholes with properties irreducible to their individual parts.”[13] Michael Hayden describes them as a “set of things…that collectively behave differently as opposed to when separated.”[14] The essence of a complex adaptive system is that its parts are not additive. The system’s behavior derives from elements found only in the whole itself, such as structure, organization, relationships, and communication.[15] Were one to analyze a chimpanzee and compare it with Julius Caesar, one might find a 90 percent similarity in molecular parts, yet the structure and relationships of those parts result in vastly different wholes.[16] Knowledge generation by analyzing the parts and adding them back together therefore misses entirely the behavior of the whole.

Caesar (Fandom.com)
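
The non-additivity of such wholes can be made concrete with a toy model. Below is a minimal sketch using Conway’s Game of Life, a standard illustration of emergence rather than one drawn from the article’s sources: every cell obeys the same trivial local rule, yet the grid produces a “glider” that travels, a behavior belonging to the configuration, not to any cell.

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life; `live` is a set of (x, y) cells.
    A cell lives next step if it has exactly 3 live neighbors, or if it is
    currently alive and has exactly 2."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: after four generations the identical shape reappears, shifted
# diagonally. Movement is a property of the whole pattern; no cell "moves."
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))  # the same glider, displaced by (1, 1)
```

Analyzing a single cell in isolation, however thoroughly, could never predict the glider; the knowledge lives in the relationships.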

The second characteristic of these systems is a high degree of adaptivity, or a tendency for systems to “maintain themselves in a changing environment.”[17] Systems theorists submit that adaptive systems work against change to reduce disorder and maintain homeostasis.[18] Systems theorist Ervin Laszlo observes that natural systems “must keep running just to stay in the same place.”[19] Natural systems must adapt to their environments or cease to exist in the same form.[20] They incorporate sensory feedback loops to self-regulate, sensing the difference between the current state and equilibrium,[21] and maintain that equilibrium through energy-consuming actions that reduce disorder.
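
A minimal sketch of such a self-regulating feedback loop, assuming a thermostat-like system with a notional gain and a constant environmental disturbance:

```python
def regulate(state, set_point, gain=0.3, disturbance=-0.5, steps=50):
    """Negative feedback: sense the gap to equilibrium, act proportionally."""
    for _ in range(steps):
        error = set_point - state            # sensory feedback measures the gap
        state += gain * error + disturbance  # corrective action vs. environment
    return state

# The system settles near equilibrium but must expend effort at every step
# just to hold there; stop correcting, and the disturbance drags it away.
print(regulate(state=15.0, set_point=20.0))  # settles near 18.3
```

The loop must, as Laszlo puts it, keep running just to stay in the same place.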

Systems composed of people are especially adaptive, with strong tendencies toward equilibrium. Cultures, tribes, and societies maintain equilibrium through laws, regulations, principles, and rules that help enforce paradigmatic norms.[22] This balance ensures the group’s collective security, creates a sense of justice and community, and provides permanence through tacit rules, enabling the group to operate and survive as an entity.[23]

The more open and human a system is, therefore, the more uncertain and adaptive its behavior will be, and the more forcefully it will react to maintain the norm. The most unknowable systems are those with the highest levels of uncertainty and adaptivity: the most open and the most human.

A Paradigm Shift in the Making

The interconnectivity and interdependence of the 21st-century world render knowing the knowable a tenuous and perilous paradigm. The world’s evolution beyond the linear warfare of the 20th century has left the military stuck in outdated analytical ways of knowing, reflected throughout joint service doctrine and practice. These practices disregard the fundamental systemic elements of uncertainty and adaptivity: complex systems exhibit holistic and adaptive behaviors that are neither analytically knowable nor predictable. Military scholars and leaders often treat the resulting surprise as a failure of the plan, not of planning itself.

Military professionals must embrace a new paradigm that welcomes uncertainty, acknowledges systemic emergence, and incorporates iterative learning opportunities in an adaptive world that is anything but knowable. A systemic paradigm holds that the levels of interconnectivity and behavioral diversity between elements of a problem are inversely proportional to the efficacy of analysis in managing it. It acknowledges that systemic cause and effect are increasingly less knowable without interactive feedback mechanisms. A systems paradigm accepts that the world consists of complex, interconnected systems whose behaviors and relationships between nodes are knowable not through analysis but through action, as the sketch below suggests.
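
What knowing through action might look like, in a minimal and hypothetical sketch: rather than deriving the answer analytically in advance, an observer repeatedly probes an opaque, drifting system and updates a working estimate from its feedback.

```python
import random

def probe(action, t):
    """A hypothetical opaque, adaptive environment whose response drifts."""
    return (2.0 + 0.01 * t) * action + random.gauss(0, 0.1)

estimate = 0.0  # working belief about the system's response per unit action
for t in range(200):
    feedback = probe(1.0, t)                 # act on the system
    estimate += 0.1 * (feedback - estimate)  # learn a little from each action
print(round(estimate, 2))  # tracks the drifting response (near 4.0) with a lag
```

No amount of upfront analysis would fix the answer in advance, because the environment changes while it is being studied; only iterative interaction keeps the estimate current.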

Conclusion

The military’s deterministic paradigm is failing in complex systems with high levels of adaptivity and uncertainty: the very kind becoming ever more common in the 21st century. A systemic paradigm inverts that logic, holding that the more interconnected and behaviorally diverse a problem’s elements, the less effective analysis is at managing it, and that systemic cause and effect grow less knowable absent interactive feedback mechanisms. The 21st century is an age wherein systemic adaptivity outpaces knowability, and analysis provides not knowledge of systemic causality but a description of its parts out of context.

The U.S. military must embrace a new systemic paradigm, grounded in a probabilistic view of the world, in which knowledge gained through iterative and scalable action is most valued. Only then will methodologies such as design, agile, and lean practices become accepted and effective in military operations.


John Stanczak is an Army officer currently assigned to the 75th Ranger Regiment. He holds two master’s degrees in security and operational studies with a research focus on systems theory and design thinking. This essay reflects his own views and not necessarily those of the U.S. Army, the Department of Defense, or the U.S. Government.




Header Image: Dynamic and Multi-Scale Systems (JSMF)


Notes:

[1] Sydney J. Freedberg Jr., “Failure IS An Option: Army Gen. Murray,” Breaking Defense, May 7, 2019, accessed March 1, 2022, https://breakingdefense.sites.breakingmedia.com/2019/05/failure-is-an-option-gen-murray/.

[2] “Newton’s Laws of Motion,” Glenn Research Center | NASA, accessed January 20, 2022, https://www1.grc.nasa.gov/beginners-guide-to-aeronautics/newtons-laws-of-motion/.

[3] Frans P. B. Osinga, Science, Strategy and War: The Strategic Theory of John Boyd (London; New York: Routledge, 2007), 65, accessed September 6, 2021, http://site.ebrary.com/id/10155759.

[4] Alan Beyerchen, “Clausewitz, Nonlinearity, and the Unpredictability of War,” International Security 17, no. 3 (1992): 61–62.

[5] Ben Zweibelson, “One Piece at a Time: Why Linear Planning and Institutionalisms Promote Military Campaign Failures,” Defence Studies 15, no. 4 (December 2015): 361, http://www.tandfonline.com/doi/full/10.1080/14702436.2015.1113667.

[6] Karl E. Weick, “The Role of Imagination in the Organizing of Knowledge,” European Journal of Information Systems 15, no. 5 (October 1, 2006): 448.

[7] Beyerchen, “Clausewitz, Nonlinearity, and the Unpredictability of War,” 62.

[8] Henry Mintzberg, “The Fall and Rise of Strategic Planning,” Harvard Business Review (February 1994): 37.

[9] Chairman of the Joint Chiefs of Staff, Joint Publication (JP) 5-0, Joint Planning (Washington, D.C.: U.S. Department of Defense, 2020), III-12.

[10] U.S. Department of Defense, Joint Staff, Joint Publication (JP) 2-0, Joint Intelligence (Washington, D.C.: U.S. Department of Defense, 2013), x.

[11] Ludwig von Bertalanffy, General System Theory: Foundations, Development, Applications, Revised edition. (New York, NY: George Braziller Inc., 1969), 19.

[12] Ibid.

[13] Ervin Laszlo, The Systems View of the World: A Holistic Vision for Our Time, 2nd edition. (Cresskill, NJ: Hampton Press, 1996), 30.

[14] Michael A. Hayden, “Systems: An Explanation and a View of Their Relationships to Technology,” The Journal of Epsilon Pi Tau 18, no. 2 (1992): 15.

[15] Laszlo, The Systems View of the World, 28–29.

[16] Ibid., 28.

[17] Ibid., 30.

[18] Ibid., 6; Bertalanffy, General System Theory, 41; Laszlo, The Systems View of the World, 32.

[19] Laszlo, The Systems View of the World, 32.

[20] Steven Johnson, Emergence: The Connected Lives of Ants, Brains, Cities, and Software, Reprint edition. (New York, NY: Scribner, 2002), 138–140.

[21] Laszlo, The Systems View of the World, 32; Klaus Krippendorff, The Semantic Turn: A New Foundation for Design (Boca Raton: CRC/Taylor & Francis, 2006), 50; Johnson, Emergence, 138.

[22] Thomas Kuhn, The Structure of Scientific Revolutions, 3rd ed. (Chicago: University of Chicago Press, 1996), 10; Laszlo, The Systems View of the World, 37.

[23] Laszlo, The Systems View of the World, 37.