We might have AI-powered murder drones to worry about now, a U.N. report suggests
In March 2020, as a civil war raged in Libya between the United Nations-backed Government of National Accord and the insurgent Libyan National Army, a drone known as a lethal autonomous weapons system (LAWS) was introduced to the battlefield. While drone warfare is anything but new, a report from the United Nations suggests that something different happened in this attack: The drone is believed to have selected a target and pursued members of the Libyan National Army without human control, raising the possibility of an autonomous killing carried out by machines.
First, it's important to know what the report actually said. The 548-page document, commissioned by the U.N. and written by a panel of experts, says that the drone, a Turkish-made Kargu-2, was deployed by the Government of National Accord as members of the Libyan National Army were in retreat. Per the report, the drone then "hunted down and remotely engaged" the militia members as they fled. The report also concluded that the drone was "programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true 'fire, forget and find' capability" — that is, the ability of a weapon to guide itself to a target without further human input.
What the report does not say is whether the drone actually killed anyone. It's possible to read between the lines of the report, which is still under review by a U.N. sanctions committee, and conclude that there were implied casualties from the attack. But it is never stated explicitly.
You can safely assume, though, that the machine is capable of carrying out a lethal attack entirely on its own. The manufacturer of the Kargu-2, a company called STM, markets the drone by noting that it can be operated manually or autonomously and uses "machine learning" and "real-time image processing" to identify targets. The Kargu-2 is a loitering munition, meaning it hovers over an area until it finds a target and then attacks by dive-bombing it. In a demonstration video, the company shows the drone targeting a crowd of mannequins, locking onto them, and attacking — a not-particularly-subtle suggestion of the kind of damage the device can do.
Still, there are tons of questions about the attack, with some pretty massive implications if the drone did in fact kill anyone. Zachary Kallenborn, a research affiliate with the National Consortium for the Study of Terrorism and Responses to Terrorism at the University of Maryland, believes that the U.N. report implies that the drone did cause casualties. Writing in the Bulletin of the Atomic Scientists, he noted, "If anyone was killed in an autonomous attack, it would likely represent an historic first known case of artificial intelligence-based autonomous weapons being used to kill."
Others are less sure of what exactly happened in that March 2020 incident. On top of a lack of clarity as to whether the drone actually killed anybody, there are questions as to whether the machine actually acted on its own or if there was human oversight or control. Jack McDonald, a lecturer in war studies at King's College London, noted on Twitter that "we don't know if they actually attacked any targets without data connectivity," and said the report provides "no evidence ... to say which scenario (or any variant in between) actually happened." Ulrike Franke, a senior policy fellow at the European Council on Foreign Relations, also expressed doubt that the report was that big of a deal. On Twitter, she asked, "How is this the first time of anything?" and noted, "Loitering munitions have been on the battlefield for a while."
Attacks similar to the one described in the U.N. report have occurred elsewhere. In the ongoing conflict in the Nagorno-Karabakh region, Azeri forces have reportedly targeted Armenians occupying the disputed land with loitering munitions. Hamas and Israel have also reportedly used similar weaponry in attacks.
Still, the prospect of a drone operating all on its own to identify a target — and then attack it — is one worth fretting over. Surveys have repeatedly found broad public opposition to the use of lethal autonomous weapons systems, and experts have warned about the potential for AI-powered weaponry to go wrong. Odds are the attack in Libya isn't the inciting incident that sets us down the path of the Terminator franchise, but it's something to keep tabs on just in case.