Israel used an artificial intelligence-powered database that identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.
According to a report published by the Guardian newspaper on Thursday, the intelligence sources, in addition to describing their use of the AI system, called Lavender, claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.
Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines, the newspaper wrote.
“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
The testimony from six intelligence officers, all of whom had been involved in using AI systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets in the war, was given to the journalist Yuval Abraham. Their accounts were shared exclusively with the Guardian in advance of publication.
All six said that Lavender had played a central role in the war, processing masses of data to rapidly identify potential “junior” operatives to target.
Four of the sources said that, at one stage early in the war, Lavender listed as many as 37,000 Palestinian men who had been linked by the AI system to Hamas or PIJ.
Lavender was developed by the Israeli army’s elite intelligence division, Unit 8200, which is comparable to the US National Security Agency.
Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants.
Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.
One intelligence officer said, “You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs].”
According to conflict experts, if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked, with the assistance of AI, to militant groups in Gaza, that could help explain the shockingly high death toll in the war.
An Israeli army statement described Lavender as a database used “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of (terrorist) organisations. This is not a list of confirmed military operatives eligible to attack.”
The statement added, “the Israeli army does not use an artificial intelligence system that identifies (terrorist) operatives or tries to predict whether a person is a (terrorist). Information systems are merely tools for analysts in the target identification process.”
Multiple sources told the Guardian that Lavender created a database of tens of thousands of individuals who were marked as predominantly low-ranking members of Hamas’s military wing. They said this was used alongside another AI-based decision support system, called the Gospel.
The sources added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”
One source said, “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
During the first week of the conflict, another source said, permission was given to kill 15 non-combatants to take out junior militants in Gaza. However, he said, estimates of civilian casualties were imprecise, as it was not possible to know definitively how many people were in a building.
Another intelligence officer said that more recently in the conflict, the rate of permitted collateral damage was brought down again. But at one stage earlier in the war they were authorised to kill up to “20 uninvolved civilians” for a single operative, regardless of their rank, military importance, or age.