Myth Versus Lethality: Losing the Plot in the Information War

Former Defense Secretary and retired General James Mattis is said to have told Marines in Iraq that the most important six inches on the battlefield were between their ears. He was referring to the need for calm under fire. Today, his warning is appropriate for everyone, everywhere, because the United States is in an information war—and it is losing.

This is not a metaphor. It’s not science fiction. Few national security thinkers and hardly anyone in the wider public have recognized its scope, but the information war’s cognitive battlefront is ingrained in the very fabric of modern society. It is an amorphous digital skin stretched across the entire planet, running over, under, and through individuals, political parties, and nation-states. Its battles may be virtual, but their consequences are very real.

Don’t worry—this isn’t yet another story about Russian election interference. Russia is responsible for neither the swiftly changing character of war nor the changing character of the United States. It was simply the first to grasp war’s new weapons, because the future happened there first.

This article is instead about how the Pentagon’s enthusiastic embrace of the National Defense Strategy’s emphasis on lethality in preparation for a future shooting war risks losing the plot in the narrative war we are already in. Defense planners assiduously designing more resilient operating concepts and longer-range strike platforms risk underestimating or ignoring altogether what has become the center of gravity: the information space itself.

Today, actors both malign and mundane are wielding information weapons in pursuit of their political aims. These weapons are persistent, pervasive, and persuasive. The United States can no longer afford to treat information operations as a supporting effort or mere annex to a larger plan. Where national security used to begin at the water’s edge, it now begins in your head, and the United States’ national security architecture must adapt.

In short, America needs a theory of victory for the information age.

War Never Changes (Except When It Does)

War has a consistent logic. It is and has always been about changing human behavior by imposing will. But its character is inherently mutable, because security communities continuously seek competitive advantage over one another as circumstances change.

Historically, the most efficient way for a state to impose its will on another has been through physical coercion. But insightful theorists such as Carl von Clausewitz realized this was but a reflection of war’s true nature. Clausewitz believed war was “a trial of moral and physical forces by means of the latter.” He compared those means to the wooden hilt of a sword, that is, merely the apparatus by which we control the real weapon—the moral (or cognitive) force.

The character of war has changed dramatically since Clausewitz’s day, and an opponent may now be targeted more effectively with Tweets than with Tomahawk missiles.

Deterrence, after all, exists only in the mind of the deterred, and so does the will to fight when deterrence fails. In the past, a state needed to threaten or even invade its neighbor to impose on it the political changes it desired. But what if that neighbor’s leaders—or worse, its population—could be convinced to make those changes on their own without a shot being fired? Now, that is possible for the first time ever, thanks to the competitive connectivity of the information age. Belligerents can target Clausewitz’s metaphorical blade itself.

Russian Information Operations (iStock)

Russian military theorists recognized decades ago that information weapons had become more potent than physical ones, and it was these, they believed, that had brought down the Soviet Union by eroding it from within. New-generation war, they determined, was marked by psychological tactics such as “changing citizens’ moral values,” “manipulating social consciousness,” and “undermining state authority,” not by lethal force.

“Using such weapons,” wrote another Russian thinker in 2004, “it will be possible to exert long-range controlling effects on persons, and consequently on the course and results of election campaigns [and] on the decision-making of presidents.” Sound familiar?

Russian concepts of information deterrence and reflexive control seek to manipulate adversaries into making poor decisions, and then into second-guessing those decisions once they are made. Similarly, China’s three warfares doctrine synchronizes the employment of strategic psychological operations, global media narratives, and weaponized legal tactics to establish precedent, instill doubt, and erode international norms. On the defensive side, Russia recently re-introduced ideological commissars into its military formations to monitor its soldiers and ostensibly to defend them from malign influences. China never got rid of them in the first place.

But great powers are not the only ones employing information warfare. International organizations, multinational corporations, and hyper-empowered individuals can all play significant roles in foreign affairs today. The fact that competition is increasingly digital and virtual means that a state’s size and wealth matter less than the skill of its information warriors and the reach of its networks, a fact exemplified by the outsized influence of tiny Macedonia’s infamous troll farms during the 2016 election cycle.

Public sentiment has always been helpful in war. It has even been necessary for liberal democracies wishing to sustain lengthy military campaigns. But it is now a strategic weapon in its own right, because people almost everywhere are more empowered and more connected than ever. Those who harness it can facilitate or frustrate the aims of those who mistakenly assume they command it or who ignore it altogether.

The American defense establishment has for too long understood information warfare primarily as information security. It has focused on the integrity of transmission systems rather than the transmissions themselves, a fact now acknowledged by the commander of Army Cyber Command, Lieutenant General Stephen Fogarty.

Manufacturing Reality

Walter Lippmann, the father of modern journalism, wrote at the dawn of the age of mass media that a community that could not distinguish between truth and lies had no hope of liberty. In his 1922 book Public Opinion, a profound work with enormous contemporary relevance, Lippmann described how people use their impressionable imaginations to create their framework of reality, since no one can have much direct experience of the real world.

Lippmann coined the phrase “manufacture of consent” to explain how governments could use then-new technologies—like broadcast radio—to mold public opinion, threatening to render democracy obsolete. Today’s technologies of personalized influence make broadcast radio look almost laughably benign in comparison, yet it was radio that propelled the rise of tyrants like Mussolini and Hitler.

The cognitive domain has always been a target of manipulation. Open societies like the United States even welcomed the cacophony of voices, viewing it as a symbol of strength and resilience. Until recently, it was believed the truth itself needed no defense, and that while lies might spread more quickly, the truth would eventually win the race.

We now live in a world where the truth is not something objectively out there waiting to be discovered. Instead, truth is malleable, something continuously co-created by a hive of content creators. Now, an individual with a cheap laptop and a broadband connection can tailor content for millions far more cheaply and effectively than the broadcast communication titans of the past. Today, from the Philippines and the Middle East to Sweden and the United Kingdom, shadowy data brokers, psychographics experts, and memetic warriors are pushing vast influence campaigns to digitally manifest Hobbes’s bellum omnium contra omnes—the war of all against all.

The point is not to exaggerate the threat, but only to recognize it. Influence campaigns are not magic. Successful ones, like any clandestine operation, don’t shove—they nudge. They cannot create fissures, but they are potent tools for widening them. The intersection of surveillance capitalism, individualized computational propaganda, and good old-fashioned human nature is making it possible to manufacture not mere consent, but something like reality itself.

Beyond Lethality

Clausewitz defined war as the continuation of politics with the admixture of other means. Many, if not most, military theorists in the West cling to that definition as if it came from scripture. But Clausewitz himself warned against allowing any conceptual framework to become a conceptual cage. What if, turning Clausewitz on his head, politics were the continuation of war instead of the other way around?

We will find out soon enough because, as Alan Turing, the father of computing, said, “This is only a foretaste of what is to come.” The share of the world that is online has at least quadrupled in the last decade, and Facebook already has more than two billion users. And while half of the world remains unconnected, that is a fact tech companies are working hard to remedy.

Soon, fifth-generation telecommunications platforms up to a hundred times faster than current systems will all but eliminate latency and enable a host of new applications to take advantage of the burgeoning ocean of data, tailoring augmented reality advertisements—and thus influence campaigns—to every single individual. Ubiquitous connectivity and flawless deepfakes may soon allow seven billion people to live in their own customized realities, each one a target, each one a weapon.

Orwell envisioned the future as an authoritarian boot stamping on the face of humanity. Instead, we see an endless feed of curated advertisements and conspiracy videos that achieve the same aims much more effectively.

To function, democracies rely upon well-informed citizens who can reach consensus about threats and then compromise on responses. Today’s information warfare attacks this vulnerability by making consensus and compromise impossible, resulting in political gridlock—or worse, civil war.

There are measures the United States can take to partially mitigate this threat. Secretary of Defense Mark Esper declared election security an enduring mission for the military, acknowledging that influence operations were of a “scope and scale never before imagined.” This is a step in the right direction, but the scale of the challenge demands a comprehensive reimagining of the way the U.S. conceives of national security altogether.

The United States should follow the lead of countries already on the digital front lines, such as Estonia. After suffering concerted Russian cyber and information attacks in 2007, Estonia made the information literacy of its citizens an education priority and information security a main effort of national defense. It is now regarded as the world’s most advanced digital state and NATO’s vanguard in the information domain.

Likewise, the U.S. intelligence community—long accustomed to prioritizing the needs of senior government officials—must now place more emphasis on informing the public at large. It should relax some of its inward focus on the executive branch and deliver more threat assessments and fact-checking services to its ultimate stakeholders, the citizens themselves.

The U.S. Armed Services, moreover, must fully embrace this new domain of warfare, treating it as a supported domain instead of a supporting effort. They should begin rethinking classical military terms like deterrence, attrition, and maneuver as applied to intangible elements like national will. They should recognize that public affairs officers are no longer merely mouthpieces to explain why the military is conducting an exercise or deployment, but are in fact combatants themselves, critical operational assets on a global narrative battlefield.

Some might say this dangerously extends the military’s writ by conflating war with not war. Sir Hew Strachan famously remonstrated that “security concerns are not war,” and warned against the “danger of militarizing issues that would be best not militarized.”

Fair enough. Equating foreign influence with war is certainly a dangerous mutation of international norms and poses difficult questions to open societies like the United States.

Yet Strachan also agreed with Clausewitz himself that the single most important task of strategy is to understand the nature of the war it is addressing. It might be more dangerous to adhere to outdated distinctions that no longer apply, willfully ignoring a struggle that we are already in. The opponents of the United States certainly are not making that mistake. 

The new information environment. (iStock)

Conclusion

Ultimately, the most important thing the United States can do is simply to fight back. If you are reading this, you’re in the information war, whether you like it or not. You are both the target and the weapon—every photo you like, every article you share, and every entity you friend equates to digital fires on the narrative battlefield.

If you disagree, that is alright. After all, divide et impera is the oldest strategy in the world.


Zachery Tyson Brown is a Security Fellow at the Truman National Security Project, a member of the Military Writers Guild, and social media manager with The Strategy Bridge. The views expressed herein are the author’s alone and do not reflect the official policy or position of the Department of Defense, the Intelligence Community, or the U.S. Government.




Header Image: The Internet and Information War (Ryccio/Getty)