Defending a nation is costly. When a government decides to allocate resources to defense, it takes those resources away from other potentially productive investments. Given that some amount of treasure ought to be spent on defense, though, the question of how much becomes one of optimization. What is the right amount of defense funding to maximize the welfare of a society?
To answer this question, one must estimate the benefits of a certain level of defense spending and compare it to the benefits that could be realized by using those same resources in another way. Just as it always has, America faces this question today. To explore a methodology for optimizing defense spending, this paper will use America’s involvement in European defense as a vignette.
Cost of Deterrence
President Obama recently requested $3.4 billion for a European Reassurance Initiative (ERI) to provide for additional troops and exercises in Europe. This initiative is in addition to the estimated $2.6 billion ($40,000 per U.S. troop at 65,000 troops) the Department of Defense already spends to station forces in Europe rather than at home. The U.S. government also contributes about $500 million each year towards funding NATO. Summing these costs yields a total annual bill of approximately $6.5 billion for America’s European presence. We do not consider the fixed costs of each base in Europe, since those troops would need to be stationed at new bases at home if the European facilities were closed, and the same RAND study that estimated the cost of each troop at $40,000 concludes that overseas bases cost approximately the same as bases in the continental United States.
But the true cost of America’s defense of Europe lies in the alternative investments U.S. policy makers forgo, which could include anything from education to infrastructure. The benefits of those investments can be measured with a metric called the social rate of return, which attempts to capture, in a single number analogous to a purely financial return, all of the benefits a public investment accrues to society. Estimated social rates of return typically range from 5 to 10 percent. Our simulation will use 7 percent, the rate the Office of Management and Budget recommends.
Cost of War
What the United States gets in return for military spending is a function of the odds of conflict and the cost of a conflict, were one to happen. This simulation assumes costs can be separated into three categories: combat operations, welfare losses, and loss of life. This article will only consider a conventional conflict that does not reach the U.S. homeland and does not escalate to nuclear war.
One way to estimate the cost of combat operations is to look to the cost of the last major theater war in Europe. Fighting in the European theater during World War II cost the American taxpayer roughly $2 trillion in today’s dollars (absent a better methodology, $2 trillion is simply half of the total $4 trillion spent by the United States in both theaters). Because defense inflation is generally higher than regular inflation, a war under the same circumstances today should cost more than in the past. On the other hand, the United States holds a greater advantage relative to possible adversaries than it did against the Third Reich. It is not clear which factor would dominate, so this article uses the World War II figure as a starting point. Readers can run the simulation with different numbers here.
Next, the costs associated with a loss of welfare must be estimated. One study concludes that U.S. citizens would be willing to permanently give up 3% of consumption annually to avoid a foreign war. In 2015, U.S. household consumption per capita was $35,138. Multiplying this figure by the total U.S. population (312 million) and by 3% yields a total cost of $328.9 billion. This is a lower-bound estimate, and the data encompass only conflicts after 1954, which were all much smaller than WWII. When compared in terms of total operational costs, WWII was 4 times as expensive as the next most expensive conflict (OEF/OIF combined). Given this, we can safely double our calculated lower bound to $658 billion as a reasonable point estimate of the welfare cost of a conventional conflict in Europe.
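The welfare-loss arithmetic above can be checked in a few lines. This is a sketch using the article's 2015 figures:

```python
# Welfare-loss estimate from the article's figures (2015 data).
consumption_per_capita = 35_138   # dollars of consumption per person per year
population = 312_000_000          # total U.S. population
share_given_up = 0.03             # consumption share citizens would forgo to avoid war

lower_bound = consumption_per_capita * population * share_given_up
point_estimate = 2 * lower_bound  # doubled, since WWII dwarfed all post-1954 conflicts

print(f"Lower bound:    ${lower_bound / 1e9:.1f} billion")     # $328.9 billion
print(f"Point estimate: ${point_estimate / 1e9:.1f} billion")  # $657.8 billion
```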
Finally, America lost 183,000 lives in Europe during World War II. Across both theaters, America lost around 2.5 percent of all service members to combat. This is the highest mortality rate of any American conflict, with the exception of the Civil War. Figure 1 shows the mortality rates of 11 of the largest conflicts in America’s history. It suggests that mortality rates may be on a downward trend and that they are bounded between 0.5 and 2.5 percent. We will use the high-end 2.5 percent as our point estimate. There are currently approximately 2.3 million service members in the U.S. military. If America doubled or tripled this number, our 2.5 percent mortality rate would suggest 115,000 or 172,500 deaths, respectively.
Our simulation will allow for any value in that range. Values will be uniformly distributed, since we do not have a clear reason to assume a normal distribution. This just means that in any iteration of our simulation, the number of deaths could be anywhere between 115,000 and 172,500. This allows us to avoid making an arbitrarily precise estimate. The Environmental Protection Agency estimates the monetary value of a life at $9.1 million while the Food and Drug Administration estimates it at $7.9 million. We will use the lower estimate and round down to $7 million to be as conservative as possible. Lower is more conservative because as war becomes more expensive, preventing that war becomes more affordable in comparison. $7 million multiplied by 115,000 lives is $805 billion.
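The human-cost bounds, and the uniform draw used in each iteration, look like this in a minimal sketch (figures are the article's assumptions):

```python
import random

# Human cost of a war: the article's value of a statistical life,
# rounded down from the FDA's $7.9M estimate to stay conservative.
VSL = 7_000_000                  # dollars per life
low, high = 115_000, 172_500     # deaths at a 2.5% mortality rate, doubled/tripled force

print(f"Low bound:  ${low * VSL / 1e9:.0f} billion")   # $805 billion
print(f"High bound: ${high * VSL / 1e9:.0f} billion")

# Each simulation iteration draws deaths uniformly from the assumed range
deaths = random.uniform(low, high)
human_cost = deaths * VSL
```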
The total expected, one-time cost of a war then is approximately $3.46 trillion. With the costs of both war and deterrence established, all that is needed to conduct a proper cost-benefit analysis are the probabilities of conflict with and without deterrence.
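As a quick tally of the three categories estimated above:

```python
# Summing the article's point estimates for the one-time cost of a war.
combat_ops   = 2.000e12   # WWII European theater, in today's dollars
welfare_loss = 0.658e12   # doubled willingness-to-pay lower bound
lives_lost   = 0.805e12   # 115,000 lives at $7 million each

total_war_cost = combat_ops + welfare_loss + lives_lost
print(f"${total_war_cost / 1e12:.2f} trillion")  # $3.46 trillion
```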
Probabilities of War
During the first 45 years of the 20th century, the U.S. did not actively deter war in Europe, as evidenced by no fewer than four neutrality acts passed in the 1930s and historically low defense spending. In that span, Europe experienced two major theater wars. Subtracting the 10 years in which war was actively waged, the early 20th century suggests a 5.7 percent chance of war in any given year without any deterrent. Then nuclear arms came to Europe. Anecdotal evidence suggests the presence of nuclear weapons made major theater war approximately 4.7 times less likely. This takes us to a post-nuclear age in which the annual risk of war is about 1.2 percent (5.7/4.7), without conventional deterrence.
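These two steps reduce to simple arithmetic:

```python
# Pre-nuclear Europe: two major wars in 45 years, minus 10 years of active war.
p_prenuclear = 2 / (45 - 10)              # annual risk with no deterrent at all
nuclear_discount = 4.7                    # wars ~4.7x less likely with nuclear weapons
p_no_deterrence = p_prenuclear / nuclear_discount

print(f"Pre-nuclear: {p_prenuclear:.1%}")     # 5.7%
print(f"Nuclear age: {p_no_deterrence:.1%}")  # 1.2%
```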
We have observed no major theater wars in Europe for the last 70 years. While there were certainly small wars in Eastern Europe, they did not reach a scale which necessitated U.S. involvement to ensure its own security; in each, America intervened as a matter of choice. While we need to estimate an annual probability of war post-WWII, it is somewhat problematic in that it is difficult to estimate a probability of an event happening when it has never happened before. Figure 2 shows the annual odds of war versus the odds of no war happening during the entire post-WWII period.
Since no major theater war broke out, the odds that none would occur were more likely than not greater than 50 percent. Given this, it seems reasonable to assume those odds were 75 percent. For there to have been a 75 percent chance of no major theater war occurring, the annual risk of war with conventional deterrence must have been approximately 0.4 percent.
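This annual probability follows from solving for the rate that leaves a 75 percent chance of the entire post-WWII window passing without a major war. Using the 70-year window the article cites:

```python
# Solve (1 - p)^70 = 0.75 for the annual war probability p with deterrence.
p_with_deterrence = 1 - 0.75 ** (1 / 70)
print(f"{p_with_deterrence:.2%}")  # 0.41%, i.e. about 0.4 percent
```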
This simulation will use a 1.2 percent chance of war without conventional deterrence (given the presence of nuclear weapons) and a 0.4 percent chance of war with conventional deterrence.
With all of the assumptions settled, we can run a Monte Carlo simulation to determine whether it would cost more to risk war while investing in research and infrastructure, or whether funding conventional deterrence in Europe is a better course of action for America. In a Monte Carlo simulation, we run through a scenario many times to assess the most probable outcomes. Each iteration of our simulation will run for 10 years, and we will assess 1,000,000 iterations. We use Monte Carlo simulation rather than direct calculation because of the conditional probabilities and bounded variables involved.
Each year we test whether war breaks out using our estimated probabilities. If it does, we add the cost of war to our tab for defense. If peace reigns, we just add any deterrence costs to our tab. Each year we then increase the total sum we've spent on war and defense by the social rate of return we set (7%). Without conventional deterrence, the cost could end up being as low as $0...or much higher. It depends on whether war occurs or not.
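The loop described above can be sketched in a few lines of Python. This is a minimal reconstruction, not the article's exact code: details such as whether a war can strike more than once in a decade, or whether deterrence spending continues after a war, are assumptions here, so the averages it produces will differ somewhat from the article's reported figures.

```python
import random

def ten_year_cost(p_war, deterrence_per_year,
                  war_cost=3.46e12, years=10, social_return=0.07):
    """Total compounded cost of one 10-year iteration; war strikes at most once."""
    tab = 0.0
    war_happened = False
    for _ in range(years):
        tab *= 1 + social_return          # grow prior spending at the social rate of return
        if not war_happened and random.random() < p_war:
            tab += war_cost               # one-time cost of a major theater war
            war_happened = True
        else:
            tab += deterrence_per_year    # peacetime deterrence bill (zero if none)
    return tab

def monte_carlo(p_war, deterrence_per_year, n_iters=100_000):
    """Average cost across many iterations (the article uses 1,000,000)."""
    total = sum(ten_year_cost(p_war, deterrence_per_year) for _ in range(n_iters))
    return total / n_iters

without_det = monte_carlo(p_war=0.012, deterrence_per_year=0)
with_det = monte_carlo(p_war=0.004, deterrence_per_year=6.5e9)
print(f"Without deterrence: ${without_det / 1e9:.0f} billion")
print(f"With deterrence:    ${with_det / 1e9:.0f} billion")
```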
Our simulation estimates the average 10-year cost without deterrence is expected to reach a total of approximately $500 billion. With deterrence, on the other hand, we would expect the defense of Europe to cost approximately $240 billion. Therefore, risking a major theater war in Europe is about $26 billion per year more expensive than funding deterrence. A more detailed explanation of the methodology is available here, where readers can run their own simulation using the Monte Carlo method and varying the initial assumptions.
Figure 3 shows the costs of every iteration of the simulation without deterrence. Figure 4 shows the same for the simulation with deterrence. Clearly, and as we would expect, there are many more iterations without deterrence that saw America bear a large cost due to war than in the simulation with deterrence.
While this paper has worked to use reasonable assumptions, an allowance for some error is necessary given the uncertain nature of the topic. Because of this, each variable should be tested to see how different values impact the result.
The first variables to test for sensitivity are the probabilities of war. Our point estimates are 1.2% without deterrence in any given year and 0.4% with deterrence, a gain of 0.8% from deterrence. Holding all other assumptions constant, we can reduce this gain to 0.2%, either by reducing the chance of war without deterrence to 0.6% or by raising the chance of war with deterrence to 1%, and still find that deterrence is an attractive investment.
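As a rough expected-value check, ignoring the compounding the full simulation applies, even the reduced 0.2% annual risk reduction covers the deterrence bill:

```python
# Simple expected-value sensitivity check (no compounding; the full
# simulation compounds costs at the social rate of return).
war_cost = 3.46e12          # total one-time cost of a major war
deterrence_bill = 6.5e9     # annual cost of the European presence
reduced_gain = 0.002        # risk reduction cut from 0.8% to 0.2% per year

expected_annual_savings = war_cost * reduced_gain
print(f"${expected_annual_savings / 1e9:.2f}B saved vs "
      f"${deterrence_bill / 1e9:.1f}B spent")  # $6.92B saved vs $6.5B spent
```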
The next variable is the total cost of a war. We use a baseline of approximately $3.46 trillion for the total cost, including operational, lost welfare, and human costs. Holding all other variables constant, we can lower this value to $800 billion and still find that deterrence is a sound investment. This figure could be attained with just the costs associated with the paper’s estimated loss of human life.
There is no sensible change to the inflation or interest rates that could meaningfully change our results, though a lower social rate of return or a higher inflation rate does slightly reduce the gains from deterrence.
It is clear that if our initial assumptions are accurate, U.S. policy makers would be making a grave error in failing to fund deterrence in Europe. The next logical question then is: what should American taxpayers be willing to spend on European security? In order to reduce the annual odds of war from 1.2 percent to .4 percent, America would break even at about $30 billion per year spent on deterrence in Europe.
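The break-even figure follows from a first-order estimate: the annual deterrence spend that equals the expected annual war cost avoided (the simulation's compounding pushes this to roughly $30 billion):

```python
# Break-even annual deterrence spending, ignoring compounding:
# the expected annual war cost avoided by the reduced probability of war.
war_cost = 3.46e12
risk_reduction = 0.012 - 0.004   # 0.8 percentage points per year
break_even = war_cost * risk_reduction
print(f"${break_even / 1e9:.1f} billion per year")  # $27.7 billion per year
```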
If we take into account only the cost of basing troops in Europe and conducting exercises there, Americans spend much less than this figure. With our estimate of $6.5 billion per year, the case for investing in deterrence is compelling.
There are many other factors to consider, though. A conventional war could spiral into nuclear war, which might amplify the costs of a World War III far above those of World War II. This makes defending Europe look even more attractive. However, if America stopped defending Europe, the U.S. military could likely decrease its force posture significantly. This change might look like a shift from active engagement to something resembling offshore balancing. If America could cut 5 percent of its 2015 military budget ($600 billion) after leaving Europe, it would quickly surpass the $30 billion break-even point. At that point, defending Europe starts to look less attractive...but then there are new factors to consider.
Another consideration is that spending on deterrence might impact the cost of war should one break out. But it isn’t clear whether it would make it cheaper or more expensive. On the one hand, being more prepared might make a war easier, quicker, and cheaper. On the other, spending on deterrence might cause an adversary to similarly increase defense spending, making an eventual war more expensive.
In all this, what is most clear is that decisions about military spending are complex. Qualitative assessments about the need for spending in certain areas are important, but quantitative analysis to determine their viability provides an important complement. A look at the underlying numbers bounds the discussion in reality. This paper’s methodology is a quick and easy (dare we say, fun?) way to assess those numbers.
Sean Lavelle is a Naval Officer, a graduate of Johns Hopkins University and the U.S. Naval Academy, and the creator of IsTheMilitaryWorthIt.com. The opinions expressed are his own and do not reflect the official position of the Department of the Navy, the Department of Defense, or the U.S. Government.
Header Image: Strikers from 2nd Squadron, 2nd Cavalry roll into the Smardan training area in Romania to meet up with soldiers from the 173rd Airborne Brigade who parachuted in earlier as part of Operation Atlantic Resolve, Tuesday, March 24, 2015. (Michael Abrams | Stars and Stripes)
We calculate this by raising 0.75 to the power of the reciprocal of 70 (the number of post-WWII years) and subtracting the result from 1.