A world in which algorithms determine the fate of soldiers and civilians alike is no longer hypothetical. AI-driven drones are reshaping warfare, raising deep ethical questions about autonomy in combat. As international policymakers scramble to set ground rules, the race is on to rein in this rapidly evolving technology.
Every day, we voluntarily give up information about ourselves to machines. This happens when we accept an online cookie or use a search engine. We barely think about how our data is sold and used before clicking “agree” to get to the page we want, dimly aware that it will be used to target us as consumers and convince us to buy something we didn’t know we needed.
But what if the machines were using the data to decide who to target as enemies to be killed? The UN and a group of non-governmental organisations are worried that this scenario is close to becoming a reality. They are calling for international regulation of Lethal Autonomous Weapons Systems (LAWS) to avoid a near future in which machines dictate life-and-death choices.
Large-scale drone warfare unfolding in Ukraine
For several months, the Kherson region of Ukraine has come under sustained attack from weaponised drones operated by the Russian military, principally targeting non-combatants. More than 150 civilians have been killed, and hundreds injured, according to official sources. An independent UN-appointed human rights investigation has concluded that these attacks constitute crimes against humanity.
The Ukrainian army is also heavily reliant on drones and is reportedly developing a “drone wall” – a defensive line of armed Unmanned Aerial Vehicles (UAVs) – to protect vulnerable sections of the country’s frontiers.
Weaponised UAVs were once the preserve of the wealthiest nations, which could afford the most sophisticated and expensive models, but Ukraine has proved that, with a little ingenuity, low-cost drones can be modified to lethal effect. As conflicts around the world mirror this shift, the nature of modern combat is being rewritten.
Creeping ‘digital dehumanisation’
But, as devastating as this form of warfare already is, the prospect of fully autonomous weapons is adding fresh urgency to long-standing fears of ‘killer robots’ raining down death from the skies, deciding for themselves whom to attack.
“The Secretary-General has always said that using machines with fully delegated power, making a decision to take human life is just simply morally repugnant,” says Izumi Nakamitsu, the head of the UN Office for Disarmament Affairs. “It should not be allowed. It should be, in fact, banned by international law. That’s the United Nations position.”
Human Rights Watch, an international NGO, has said that the use of autonomous weapons will be the latest, most serious example of encroaching “digital dehumanisation,” whereby AI makes a host of life-altering decisions on matters affecting humans, such as policing, law enforcement and border control.
“Several countries with major resources are investing heavily in artificial intelligence and related technologies to develop land- and sea-based autonomous weapons systems. This is a fact,” warns Mary Wareham, advocacy director of the Arms Division at Human Rights Watch. “It’s being driven by the United States, but other major countries such as Russia, China, Israel and South Korea have been investing heavily in autonomous weapons systems.”
Advocates for AI-driven warfare often point to human limitations to justify its expansion. Soldiers can make errors in judgment, act on emotion, require rest, and, of course, demand wages – while machines, they argue, improve every day at identifying threats based on behavior and movement patterns. The next step, some proponents suggest, is allowing autonomous systems to decide when to pull the trigger.
There are two main objections to letting the machines take over on the battlefield: firstly, the technology is far from foolproof. Secondly, the UN and many other organisations see the use of LAWS as unethical.
“It’s very easy for machines to mistake human targets,” says Ms. Wareham of Human Rights Watch. “People with disabilities are at particular risk because of the way they move. Their wheelchairs can be mistaken for weapons. There’s also concern that facial recognition technology and other biometric measurements are unable to correctly identify people with different skin tones. The AI is still flawed, and it brings with it the biases of the people who programmed those systems.”
As for the ethical and moral objections, Nicole Van Rooijen, Executive Director of Stop Killer Robots, a coalition campaigning for a new international law on autonomy in weapons systems, says that such weapons would make it very difficult to ascertain responsibility for war crimes and other atrocities.
“Who is accountable? Is it the manufacturer? Or the person who programmed the algorithm? It raises a whole range of issues and concerns, and it would be a moral failure if they were widely used.”
A ban by 2026?
The speed at which the technology is advancing, and evidence that AI-enabled targeting systems are already being used on the battlefield, are adding to the urgency behind calls for international rules governing the technology.
In May, informal discussions were held at UN Headquarters, at which Secretary-General António Guterres called on Member States to agree a legally binding instrument to regulate and ban the use of LAWS by 2026.
Attempts to regulate and ban LAWS are not new. In fact, the UN held the first meeting of diplomats in 2014, at the Palais des Nations in Geneva, where the chair of the four-day expert talks, Ambassador Jean-Hugues Simon-Michel of France, described LAWS as “a challenging emerging issue on the disarmament agenda right now,” even though no autonomous weapons systems were being used in conflicts at the time. The view then was that pre-emptive action was needed to get rules in place in the eventuality that the technology would make LAWS a reality.
Eleven years later, talks are ongoing, but there is still no consensus on the definition of autonomous weapons, let alone agreed regulation of their use. Nevertheless, NGOs and the UN are optimistic that the international community is inching towards a common understanding on key issues.
“We’re not anywhere close to negotiating a text,” says Ms. Van Rooijen of Stop Killer Robots. “However, the current chair of the Convention on Certain Conventional Weapons (a UN humanitarian law instrument to ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately) has put forward a rolling text that is really quite promising and that, if there is political will and political courage, could form the basis of negotiations.”
Ms. Wareham from Human Rights Watch also sees the May talks at the UN as an important step forward. “At least 120 countries are fully on board with the call to negotiate a new international law on autonomous weapons systems. We see a lot of interest and support, including from peace laureates, AI experts, tech workers, and faith leaders.”
“There is an emerging agreement that weapon systems that are fully autonomous should be prohibited,” says Ms. Nakamitsu, from the UN Office for Disarmament Affairs. “When it comes to war, someone has to be held accountable.”