As the Future of Life Institute’s open letter shows, people are concerned about the development of autonomous weapons. This concern is reasonable, if only because any weapon can be misused to advance evil goals. However, a strong case can be made in favor of autonomous weapons.

As the open letter indicated, a stock argument for autonomous weapons is that their deployment could result in fewer human deaths. If, for example, an autonomous ship is destroyed in battle, then no humans will die on that ship. It is worth noting that the ship’s AI might eventually be a person, in which case there could be one death. In contrast, the destruction of a crewed warship could result in hundreds of deaths. On utilitarian grounds, the use of autonomous weapons would seem morally fine, at least if their deployment reduced the number of deaths and injuries.

The open letter expresses, rightly, concerns that warlords and dictators will use autonomous weapons. But this might be an improvement over the current situation. These warlords and dictators often conscript their troops and some, infamously, enslave children to serve as their soldiers. While it would be better for a warlord or dictator to have no army, it seems morally preferable for them to use autonomous weapons rather than conscripts and child soldiers.

It can be replied that the warlords and dictators would just use autonomous weapons in addition to their human forces, so no lives would be saved. This is worth considering. But, if the warlords and dictators would just use humans anyway, the autonomous weapons would not seem to make much of a difference, except in terms of giving them more firepower, something they could also accomplish by using the money spent on autonomous weapons to better train and equip their human troops.

At this point, it is only possible to estimate (guess) the impact of autonomous weapons on the number of human casualties and injuries. However, it seems somewhat more likely they would reduce human casualties, assuming that there are no other major changes in warfare.

A second appealing argument in favor of autonomous weapons is that smart weapons are smart. While an autonomous weapon could be designed to be imprecise, the general trend in smart weapons has been towards ever increasing precision. Consider, for example, aircraft bombs and missiles. In the First World War, these bombs were primitive and inaccurate (they were sometimes thrown from planes by hand). WWII saw some improvements in bomb sights, and unguided rockets were used. In following wars, bomb and missile technology improved, leading to the smart bombs and missiles of today that have impressive precision. So, instead of squadrons of bombers dropping tons of dumb bombs on cities, a small number of aircraft can engage in relatively precise strikes against specific targets. While innocents still perish in these attacks, the precision of the weapons has made it possible to greatly reduce the number of needless deaths. Autonomous weapons could be even more precise, thus reducing casualties even more. This seems to be desirable.

In addition to precision, autonomous weapons could (and should) have better target identification capacities than humans. If recognition software continues to improve, it is easy to imagine automated weapons that can rapidly distinguish between friends, foes, and civilians. This would reduce deaths from friendly fire and unintentional killings of civilians. Naturally, target identification would not be perfect, but autonomous weapons could be better than humans since they do not suffer from fatigue, emotional factors, and other things that interfere with human judgement. Autonomous weapons would presumably also not get angry or panicked, thus making it far more likely they would maintain target discipline (only engaging what they should engage).

To make what should be an obvious argument obvious, if autonomous vehicles and similar technology are supposed to make the world safer, then it would seem to follow that autonomous weapons could do something similar for warfare. But this does lead to a reasonable concern: driverless cars seem to be the future of transportation in the sense that they will always be in the future. If getting an autonomous car to operate safely on the streets is far beyond current technology, then getting an autonomous weapon system to operate “safely” in the chaos of battle seems all but impossible.

It can be objected that autonomous weapons could be designed to lack precision and to kill without discrimination. For example, a dictator might have massacrebots to deploy in cases of civil unrest. These robots would slaughter everyone in the area. Human forces, one might contend, would often show at least some discrimination or mercy.

The easy and obvious reply to this is that the problem lies not in the autonomy of the weapons but in the way they are being used. The dictator could achieve the same results (mass death) by deploying a fleet of drones loaded with demolition explosives, but this would presumably not be a reason to ban drones or explosives. There is also the fact that dictators, warlords and terrorists can easily find people to carry out their orders, no matter how awful those orders might be. That said, it could still be argued that autonomous weapons would result in more murders than would the use of human killers.

A third argument in favor of autonomous weapons rests on the claim advanced in the open letter that autonomous weapons will become cheap to produce, analogous to Kalashnikov rifles. On the downside, as the authors argue, this could result in the proliferation of these weapons. On the plus side, if these highly effective weapons are so cheap to produce, this could enable existing militaries to phase out their incredibly expensive human operated weapons in favor of cheap autonomous weapons. By replacing humans, these weapons could also create savings in terms of the cost of recruitment, training, food, medical treatment, and retirement. This would allow countries to switch that money to more positive areas, such as education, infrastructure, social programs, health care and research. So, if the autonomous weapons are as cheap and effective as the letter claims, then it would seem to be a great idea to use them to replace existing weapons.

But there is the reasonable concern that decisions about military spending in some countries are not based on a rational assessment of costs and benefits. Such spending can be aimed at diverting resources from social programs and into the coffers of corporations. In such cases the availability of cheap, effective weapons would not meaningfully change defense spending.

A fourth argument in favor of autonomous weapons is that they could be deployed, at low political cost, on peacekeeping operations. Currently, the UN must send human troops to dangerous areas. These troops are often outnumbered and ill-equipped relative to the challenges they face. However, if autonomous weapons were as cheap and effective as the letter claims, then they would be ideal for these missions. Assuming they are cheap, the UN could deploy a much larger autonomous weapon force for the same cost as deploying a human force. There would also be far less political cost, as people who might balk at sending their fellow citizens to keep the peace in some war zone will probably be fine with sending robots.

An extension of this argument is that autonomous weapons could allow the nations of the world to engage terrorist groups, such as ISIS, without having to pay the high political cost of sending in human forces. The cheap and effective weapons predicted by the letter would seem ideal for this task.

Considering the above arguments, it seems that autonomous weapons should be developed and deployed. However, the concerns of the letter do need to be addressed. As with existing weapons, there should be rules governing the use of autonomous weapons (although much of their use would fall under existing rules and laws of war) and efforts should be made to keep them from proliferating to warlords, terrorists and dictators. As with most weapons, the problem lies with the misuse of the weapons and not with the weapons themselves.
