The Lavender system is said to have resulted in thousands of Palestinian deaths, most of them women, children, or people not involved in the fighting. The system is also said to have roughly a 10 percent error rate, occasionally marking individuals with only a loose connection, or no connection at all, to militant groups. The Israeli army reportedly attacked the targeted individuals while they were in their homes, often at night when their whole families were present. Additional automated systems, including one called "Where's Daddy?", were used to track the targeted individuals and trigger bombings once they had entered their families' residences.
Key takeaways:
- The Israeli army has developed an artificial intelligence-based program known as “Lavender,” which was used to generate potential targets for military strikes during the war on the Gaza Strip.
- The system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. It marked as many as 37,000 Palestinians as suspected militants, and their homes, as candidates for possible air strikes.
- The Lavender system was known to err in approximately 10 percent of cases, occasionally marking individuals who had only a loose connection, or none at all, to militant groups. Despite this, the army gave officers sweeping approval to adopt Lavender's kill lists, with minimal requirements to verify the machine's decisions.
- The Israeli army systematically attacked the targeted individuals while they were in their homes, often resulting in civilian casualties. The army also decided that for every junior Hamas operative Lavender marked, it was permissible to kill up to 15 or 20 civilians.