Report: Israel Used AI System ‘Lavender’ to Identify Potential Palestinian Targets

Israeli soldier during battles in Gaza (Israeli army website)

Israel used an Artificial Intelligence-powered database that identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

According to a report published by the Guardian newspaper on Thursday, the intelligence sources, in addition to describing their use of the AI system, called Lavender, claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.

Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines, the newspaper wrote.

“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

The testimony from six intelligence officers, all of whom had been involved in using AI systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets in the war, was given to the journalist Yuval Abraham. Their accounts were shared exclusively with the Guardian in advance of publication.

All six said that Lavender had played a central role in the war, processing masses of data to rapidly identify potential “junior” operatives to target.

Four of the sources said that, at one stage early in the war, Lavender listed as many as 37,000 Palestinian men who had been linked by the AI system to Hamas or Islamic Jihad.

Lavender was developed by the Israeli Army’s elite intelligence division, Unit 8200, which is comparable to the US’s National Security Agency.

Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants.

Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.

One intelligence officer said, “You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs].”

According to conflict experts, if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked, with the assistance of AI, to militant groups in Gaza, that could help explain the shockingly high death toll in the war.

An Israeli army statement described Lavender as a database used “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of (terrorist) organisations. This is not a list of confirmed military operatives eligible to attack.”

The statement added, “the Israeli army does not use an artificial intelligence system that identifies (terrorist) operatives or tries to predict whether a person is a (terrorist). Information systems are merely tools for analysts in the target identification process.”

Multiple sources told the Guardian that Lavender created a database of tens of thousands of individuals who were marked as predominantly low-ranking members of Hamas’s military wing. They said this was used alongside another AI-based decision support system, called the Gospel.

The sources added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”

One source said, “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

During the first week of the conflict, another source said, permission was given to kill 15 non-combatants to take out junior militants in Gaza. However, he said, estimates of civilian casualties were imprecise, as it was not possible to know definitively how many people were in a building.

Another intelligence officer said that more recently in the conflict, the rate of permitted collateral damage was brought down again. But at one stage earlier in the war they were authorised to kill up to “20 uninvolved civilians” for a single operative, regardless of their rank, military importance, or age.



Israel Warfare Methods 'Consistent With Genocide', Says UN Committee

Israel's warfare practices in Gaza "are consistent with the characteristics of genocide", according to the United Nations Special Committee - AFP

Israel's warfare in Gaza is consistent with the characteristics of genocide, a special UN committee said Thursday, accusing the country of "using starvation as a method of war".

The United Nations Special Committee pointed to "mass civilian casualties and life-threatening conditions intentionally imposed on Palestinians", in a fresh report covering the period from Hamas's deadly October 7 attack in Israel last year through to July, AFP reported.

"Through its siege over Gaza, obstruction of humanitarian aid, alongside targeted attacks and killing of civilians and aid workers, despite repeated UN appeals, binding orders from the International Court of Justice and resolutions of the Security Council, Israel is intentionally causing death, starvation and serious injury," it said in a statement.

Israel's warfare practices in Gaza "are consistent with the characteristics of genocide", said the committee, which has for decades been investigating Israeli practices affecting rights in the occupied Palestinian territories.

Israel, it charged, was "using starvation as a method of war and inflicting collective punishment on the Palestinian population".

A UN-backed assessment at the weekend warned that famine was imminent in northern Gaza.

Thursday's report documented how Israel's extensive bombing campaign in Gaza had decimated essential services and unleashed an environmental catastrophe with lasting health impacts.

By February this year, Israeli forces had used more than 25,000 tonnes of explosives across the Gaza Strip, "equivalent to two nuclear bombs", the report pointed out.

"By destroying vital water, sanitation and food systems, and contaminating the environment, Israel has created a lethal mix of crises that will inflict severe harm on generations to come," the committee said.

The committee said it was "deeply alarmed by the unprecedented destruction of civilian infrastructure and the high death toll in Gaza", where more than 43,700 people have been killed since the war began, according to the health ministry in the Hamas-run territory.

The staggering number of deaths raised serious concerns, it said, about Israel's use of artificial intelligence-enhanced targeting systems in its military operations.

"The Israeli military’s use of AI-assisted targeting, with minimal human oversight, combined with heavy bombs, underscores Israel’s disregard of its obligation to distinguish between civilians and combatants and take adequate safeguards to prevent civilian deaths," it said.

It warned that reported new directives lowering the criteria for selecting targets and increasing the previously accepted ratio of civilian to combatant casualties appeared to have allowed the military to use AI systems to "rapidly generate tens of thousands of targets, as well as to track targets to their homes, particularly at night when families shelter together".

The committee stressed the obligations of other countries to urgently act to halt the bloodshed, saying that "other States are unwilling to hold Israel accountable and continue to provide it with military and other support".