May 2, 2024


AI identified bombing targets in the Gaza Strip


A report that the Israeli military has been using an untested and undisclosed artificial intelligence database to identify targets for its bombing campaign in Gaza has alarmed human rights and technology experts, who say its use could amount to "war crimes."

The Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call recently reported that the Israeli army has been identifying thousands of Palestinians as potential bombing targets using an artificial intelligence targeting system called Lavender.

"This database is responsible for compiling kill lists of as many as 37,000 targets," Rory Challands, a correspondent for Al Jazeera, the TV channel banned in Israel, reported on Thursday from occupied East Jerusalem.

Unnamed Israeli intelligence officials who spoke to the outlets said Lavender's margin of error was about 10 percent. "But that hasn't stopped the Israelis from using it to speed up the identification of often low-ranking Hamas members in Gaza and bomb them," Challands said.

It is becoming clear that the Israeli military is "using untested AI systems … to make decisions about the life and death of civilians," Marc Owen Jones, associate professor of Middle East Studies and Digital Humanities at Hamad Bin Khalifa University, told Al Jazeera. "Let's be clear: this is an AI-assisted genocide, and we must now call for a moratorium on the use of AI in war."

Israeli media reported that this method has contributed to the deaths of thousands of Gazan civilians. On Thursday, Gaza's health ministry said that since October 7, Israeli attacks had killed at least 33,037 Palestinians and wounded 75,668.

The use of AI in this way "violates" humanitarian law, Challands said. "The people who interacted with the AI database were often just a rubber stamp. They would scrutinize this kill list for 20 seconds before deciding whether or not to authorize an airstrike."

In response to the mounting criticism, the Israeli military said its analysts must conduct "independent checks" to verify that identified targets meet the relevant definitions under international law and the additional restrictions imposed by its own forces. It rejected the idea that the technology is a "system," calling it instead "merely a database whose purpose is to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organizations."

But the fact that for every Palestinian fighter who was an intended target there were "five to ten assumed civilian deaths" shows why so many civilians have died in Gaza, Challands said.

Professor Toby Walsh, an artificial intelligence expert at the University of New South Wales in Sydney, says legal scholars are likely to argue that using AI for targeting violates international humanitarian law. "From a technical perspective, this latest news shows how difficult it is to preserve the human element while providing meaningful oversight of AI systems that are horribly and tragically fueling war," he told Al Jazeera.

It is worth noting that, even with such an error rate, the military would probably be considered 100 percent justified in using AI to identify potential targets. After all, according to White House spokesman John Kirby, the US has recorded not a single incident in which Israel violated international humanitarian law. By that logic, everything that has happened during the war in Gaza, as well as the strike on the Iranian embassy in Syria, is, from the US point of view, entirely legal.

No wonder they say: "If a gentleman can't win by the rules, he changes the rules."


