Report: Israel used AI to identify bombing targets in Gaza

Israel’s military has been using artificial intelligence to help choose its bombing targets in Gaza, sacrificing accuracy in favor of speed and killing thousands of civilians in the process, according to an investigation by Israel-based publications +972 Magazine and Local Call.

The system, called Lavender, was developed in the aftermath of Hamas’ October 7th attacks, the report claims. At its peak, Lavender marked 37,000 Palestinians in Gaza as suspected “Hamas militants” and authorized their assassinations.

Israel’s military denied the existence of such a kill list in a statement to +972 and Local Call. A spokesperson told CNN that AI was not being used to identify suspected terrorists but did not dispute the existence of the Lavender system, which the spokesperson described as “merely tools for analysts in the target identification process.” Analysts “must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in IDF directives,” the spokesperson told CNN. The Israel Defense Forces did not immediately respond to The Verge’s request for comment.

In interviews with +972 and Local Call, however, Israeli intelligence officers said they weren’t required to conduct independent examinations of the Lavender targets before bombing them but instead effectively served as “a ‘rubber stamp’ for the machine’s decisions.” In some instances, officers’ only role in the process was determining whether a target was male. 

To build the Lavender system, information on known Hamas and Palestinian Islamic Jihad operatives was fed into a dataset — but, according to one source who worked with the data science team that trained Lavender, so was data on people loosely affiliated with Hamas, such as employees of Gaza’s Internal Security Ministry. “I was bothered by the fact that when Lavender was trained, they used the term ‘Hamas operative’ loosely, and included people who were civil defense workers in the training dataset,” the source told +972.

Lavender was trained to identify “features” associated with Hamas operatives, including being in a WhatsApp group with a known militant, changing cellphones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in Gaza on a 1–100 scale based on how similar they were to the known Hamas operatives in the initial dataset. People who reached a certain threshold were then marked as targets for strikes. That threshold was always changing “because it depends on where you set the bar of what a Hamas operative is,” one military source told +972.

The system had a 90 percent accuracy rate, sources said, meaning that about 10 percent of the people identified as Hamas operatives weren’t members of Hamas’ military wing at all. Some of the people Lavender flagged as targets just happened to have names or nicknames identical to those of known Hamas operatives; others were Hamas operatives’ relatives or people who used phones that had once belonged to a Hamas militant. “Mistakes were treated statistically,” a source who used Lavender told +972. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know statistically that it’s fine. So you go for it.”

Intelligence officers were given wide latitude when it came to civilian casualties, sources told +972. During the first few weeks of the war, officers were allowed to kill up to 15 or 20 civilians for every lower-level Hamas operative targeted by Lavender; for senior Hamas officials, the military authorized “hundreds” of collateral civilian casualties, the report claims.