Swarms of killer drones are very likely to become a standard feature of battlefields around the world before long. That prospect has fueled debate over whether and how to regulate their use and raised concerns that life-or-death decisions will ultimately be handed over to artificial intelligence (AI) programs.
Below, we provide an overview of how the technology has evolved, what types of weapons are being developed, and how the debate has progressed.
How new are these weapons?
Over time, it is very likely that AI will allow weapons systems to make their own decisions about selecting certain types of targets and attacking them. Recent advances in AI technology have intensified the debate around these systems, known as lethal autonomous weapons.
But in some ways, autonomous weapons are nothing new.
Landmines, which are designed to activate automatically when a person or object passes over them, were used as early as the 19th century, during the American Civil War. They were apparently invented by a Confederate general named Gabriel Rains, who called them “subterra shells.”
Although they were first used long before anyone could even conceive of AI, they are relevant to the current debate because once planted they operate without human intervention and without discriminating between intended targets and unwanted victims.
The Pentagon began building automated weapons decades ago
Starting in the late 1970s, the United States began to expand on this concept with a weapon known as the CAPTOR anti-submarine mine. The mine could be dropped from a plane or ship and settle on the ocean floor, remaining there until its sensors detected an enemy target and it detonated automatically.
Starting in the 1980s, dozens of Navy ships began relying on the AEGIS weapons system, which uses high-powered radar to search for and track incoming enemy missiles. It can be set to automatic mode so that it fires defensive missiles before a human intervenes.
Self-guided munitions were the next step
The next step in the progression toward more sophisticated autonomous weapons came in the form of “fire and forget” self-guided munitions such as the AIM-120 Advanced Medium Range Air-to-Air Missile, which has a radar seeker that refines the missile’s trajectory after launch as it attempts to destroy enemy aircraft.
Homing munitions generally cannot be recalled after being fired and act like “an attack dog sent by police to pursue a suspect,” wrote Paul Scharre, a former senior Pentagon official and author of the book “Army of None.” They have a certain degree of autonomy to refine their trajectory, but Scharre described it as “limited autonomy.” Harpoon anti-ship missiles work in a similar way, with limited autonomy.
‘Loitering munitions’ can be highly automated
The war in Ukraine has highlighted the use of a form of automated weaponry known as loitering munitions. These devices date back to at least 1989, when an Israeli military contractor introduced the Harpy, a drone that can stay aloft for about two hours, searching for enemy radar systems across hundreds of kilometers and then attacking them.
More recently, US military contractors such as California-based AeroVironment have sold similar loitering munitions that carry an explosive warhead. The Switchblade 600, as this unit is called, flies overhead until it finds a tank or other target, then fires an anti-armor warhead.
Human approval is still required before the weapon attacks its target. But it would be relatively easy to remove the human from the equation, making the device completely autonomous.
“The technology exists today that allows you to tell the device: ‘Go find me a Russian T-72 tank. Don’t talk to me, I’m going to launch you, go find it,’” said Wahid Nawabi, the president of AeroVironment. “And if you are more than 80 percent confident that it is the one, you take it out. The entire mission, from start to finish, could be completely autonomous, except for the act of firing it.”
Unleashing drone swarms could bring a bigger change
There’s no doubt where this is all going.
The Pentagon is currently working to build swarms of drones, according to a notice published a few months ago.
The end result is expected to be a network of hundreds or even thousands of AI-enhanced autonomous drones carrying surveillance equipment or weapons. The drones would most likely be positioned near China so they could be deployed quickly if a conflict broke out, and they would be used to destroy, or at least degrade, the extensive network of anti-ship and anti-aircraft missile systems that China has built along its coast and on artificial islands in the South China Sea.
This is just one of many efforts underway at the Pentagon aimed at deploying, within the next year or two, thousands of inexpensive, autonomous and sometimes lethal drones that can keep operating even when GPS signals and communications are jammed.
Some military contractors, including executives at Palantir Technologies, a major military AI contractor, have argued that it could still be years before fully autonomous, AI-controlled lethal attacks are feasible, because the most advanced algorithms are not yet reliable enough to be trusted with life-or-death decisions, and they may not be for some time.
Instead, AI will enable military officials to make faster and more accurate targeting decisions by rapidly analyzing waves of incoming data, Courtney Bowman, a Palantir executive, told British lawmakers during a hearing this year.
But there is widespread concern within the United Nations about the risks of the new systems. And while some weapons have long had a degree of autonomy, the new generation is fundamentally different.
“When this conversation started about a decade ago, it was really a bit of science fiction,” Scharre said. “And now, it’s not at all. The technology is very, very real.”
Eric Lipton is an investigative reporter who delves into a wide range of topics, from Pentagon spending to toxic chemicals.