Monday 15 April 2024

'Israel Was Defeated': The Collapse of its AI Algorithm-led Wonder-Weapon


Israeli journalist Yuval Abraham has written a detailed, multiple-sourced account detailing how the Israeli forces have marked tens of thousands of Gazans as suspects for assassination using an AI targeting system.

International law is based on human responsibility and ultimate accountability when humans kill other humans. The law with respect to armed combatants is more extensive, but personal responsibility applies equally to the killing of civilians, women, and children.

But what if it is claimed that the killing is directed by ‘machine’ -- by Artificial Intelligence, based on algorithmic ‘science’?

Is human responsibility for killing ‘others’ somehow absolved by the ‘scientific’ intent of AI machine-generated kill-lists – when they “go crazy”?

This is the issue raised by the Israeli forces’ use of ‘Lavender’ AI to furnish Israel with kill-lists in Gaza.

Israeli journalist Yuval Abraham has written a detailed, multiple-sourced 'whistle-blower' account detailing how the Israeli forces have marked tens of thousands of Gazans as suspects for assassination using an AI targeting system with little human oversight and a permissive policy for casualties.

The system’s creator, the current commander of the elite Israeli intelligence Unit 8200, had earlier made the case for designing a ‘target machine’ based on AI and machine-learning algorithms that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war.

As Abraham detailed:

“Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes”.

“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing”.

 

Just to be plain: AI-generated ‘genocide’ drives a stake through the heart of International Humanitarian Law. 

“The result is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions”.

Who, then, is responsible? Who must be held accountable?

The system was flawed from the outset. Firstly, Hamas’ Al-Qassam military forces operate from deep tunnels underground, where they have their sleeping quarters -- all of which makes them impervious to the face-recognition programmes operated by Israeli aerial reconnaissance over Gaza.

Secondly, as a senior officer, ‘B’, explained to Abraham, “We didn’t know who the ‘lower-level’ operatives [on the surface] were”. Qassam fighters and Gaza civilians don’t look any different; there is no identifying characteristic of ‘Hamas Man’ that positively distinguishes him from any other Gaza male. So Lavender identified these ‘targets’ as ‘affiliated to Hamas’ on fluid borderlines, such as having once joined some WhatsApp group that once included a Hamas member, having lent their phone to their families, or having left it charging at home.

“They wanted to allow us to attack [the junior operatives] automatically. That’s the Holy Grail. Once you go automatic, target generation goes crazy”.

“According to sources, the machine gives almost every single person in Gaza a rating from 1 to 100, expressing how likely it is that they are a militant”. 

Having spent some years working in Gaza, let me say that everyone there knew, or spoke to, someone from Hamas. Hamas overwhelmingly won elections there in 2006: nearly everyone could therefore be said -- in one way or another -- to be ‘affiliated’.

It gets worse: 

“At 5 a.m., [the air force] would come and bomb all the houses that we had marked,” B. said. “We took out thousands of people. We didn’t go through them one by one — we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house”.

“The army preferred to only use ‘dumb’ bombs … You don’t want to waste expensive bombs on unimportant people — it’s very expensive for the country and there’s a shortage [of those bombs],” said C.

The author of the Lavender system -- Brigadier General Yossi Sariel -- had written anonymously in The Human Machine Team (2021) that synergy between human and artificial intelligence would ‘revolutionize our world’. Clearly, his enthusiasm for this revolution in warfare was shared by the Israeli leadership (and by some in Washington too – see, for example, this piece by John Spencer, Chair of Urban Warfare Studies at the US Army’s elite military academy, West Point). Hence Netanyahu’s repeated claim that Israel stood at the brink of a ‘Great Victory’ in Gaza, with 19 of the 24 Hamas Brigades dismantled. Now we know it was nonsense.

AI was to have been Israel’s secret weapon. Yossi Sariel (Lavender’s originator) recently made his mea culpa, as reported in The Guardian. Sariel’s critics, quoted in the same report, believe that Unit 8200’s prioritisation of “addictive and exciting” technology over more old-fashioned intelligence methods had led to disaster. One veteran official told The Guardian that the unit under Sariel had “followed the new [AI] intelligence bubble”.

For his part, Sariel is quoted as telling colleagues in the wake of 7 October: “I accept responsibility for what happened in the most profound sense of the word. We were defeated. I was defeated.”

Yes -- and tens of thousands of innocent Palestinians, women, and children have been brutally killed as a result, and Gaza reduced to rubble. Yuval Abraham’s investigative reporting should be referred to the International Court of Justice (ICJ), which is examining the evidence of ‘suspected genocide’ in Gaza.

The opinions mentioned in this article do not necessarily reflect the opinion of Al Mayadeen, but rather express the opinion of its writer exclusively.
