
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

Apr 03, 2024 - web.archive.org
The Israeli army has reportedly developed an artificial intelligence-based program called "Lavender" that identifies potential targets for military strikes. The system, revealed in an investigation by +972 Magazine and Local Call, is said to have played a central role in the bombing of Palestinians during the war on the Gaza Strip. Lavender is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad as potential bombing targets, and is said to have marked as many as 37,000 Palestinians as suspected militants for possible air strikes. The Israeli army reportedly gave officers sweeping approval to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.

The Lavender system is said to have resulted in thousands of Palestinian deaths, most of them women, children, or people not involved in the fighting. The system is also said to have an error rate of roughly 10 percent, occasionally marking individuals with loose or no connection to militant groups. The Israeli army reportedly attacked targeted individuals while they were in their homes, often at night when their whole families were present. Additional automated systems, including one called "Where's Daddy?", were used to track targeted individuals and trigger bombings once they had entered their family residences.

Key takeaways:

  • The Israeli army has developed an artificial intelligence-based program known as “Lavender,” which was used to generate potential targets for military strikes during the war on the Gaza Strip.
  • The system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. It marked as many as 37,000 Palestinians as suspected militants, and their homes as possible air strike targets.
  • The Lavender system was known to make errors in approximately 10 percent of cases, occasionally marking individuals with loose or no connection to militant groups. Despite this, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with minimal requirement to check the machine's decisions.
  • The Israeli army systematically attacked the targeted individuals while they were in their homes, often resulting in civilian casualties. The army also decided that for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians.
