It is the objections independent of technological capability that are gaining prominence among opponents of lethal autonomous weapons systems. These objections include the question of whether the use of autonomous weapons might lead to a responsibility gap where humans cannot uphold their moral responsibility, whether their use would undermine the human dignity of those combatants who are targeted, and the possibility that further increasing human distance from the battlefield could make the use of violence easier or less controlled.
Of course, lasers themselves are not a new technology. Lasers have been studied and tested for military use for decades. Recently, companies such as Lockheed Martin, Boeing, and Raytheon have taken this existing technology, scaled it down, and adapted it for a variety of platforms with a new purpose: to shoot down weaponized drones and small munitions. This new mission set for the tactical laser offers the military a drone-killing weapon system that could keep the U.S. ahead of the power curve on the modern battlefield, especially in the fight against non-state actors and armies increasingly using drones for combat operations.
In the aftermath of the German U-boat campaign in the First World War, many in Europe and the United States argued that submarines were immoral and should be outlawed. The British Admiralty supported this view and, as Blair has described, even offered to abolish its submarine force if other nations followed suit. While British proposals to ban submarines in 1922 and 1930 were defeated, restrictions were imposed mandating that a submarine could not attack a ship until that ship's crew and passengers had been placed in safety. This reaction to the development of a new means of war is illustrative of the type of ethical and legal challenges that must be addressed as military organizations adopt greater human-machine integration.
What does the PLA’s approach so far to the humans behind its unmanned systems reveal about its potential engagement with the challenges associated with the highly automated and autonomous systems it is currently developing? Despite the myth that such systems simply replace humans, requiring smaller numbers of combatants with lower levels of expertise, there is clear evidence to date that the human challenges of such systems are considerable, often demanding higher levels of specialized training. In this regard, the PLA’s active focus on developing personnel to operate UAVs may be an early indication that it, too, will confront considerable challenges in learning to use such high-tech systems effectively.
It is clear that Russian development of military unmanned systems, in conjunction with the ongoing modernization of its armed forces, will result in a qualitatively different and more capable force. Should Russia’s ongoing successes in Syria embolden it to act elsewhere in a similar fashion, then U.S. and Western planners may not be the only ones flying a constellation of unmanned systems or directing swarms of ground vehicles and high-tech weapons. This calls for a re-evaluation of the current defense posture and a review of technology development and acquisition cycles, a process that has been ongoing in the United States for some time.