
Some say AI will make war more humane. Israel’s war in Gaza shows the opposite.

May 09, 2024 - vox.com
The Israeli military has reportedly been using artificial intelligence (AI) systems to guide its war in Gaza. According to an investigation by +972 Magazine, these systems are used to decide whom to target for killing, with humans playing a minimal role in the decision-making process. One system, "The Gospel," marks buildings used by Hamas militants; another, "Lavender," uses surveillance data to rate each person's likelihood of being a militant and puts those with higher ratings on a kill list. A third system, "Where's Daddy?", tracks these targets and informs the army when they are in their family homes.

The use of AI has reportedly resulted in 37,000 Palestinians being marked for assassination and thousands of women and children being killed as collateral damage. Although the Israeli army denies using AI to select human targets, the high death toll and the AI-driven targeting decisions have sparked international criticism and charges of genocide before the International Court of Justice. Critics argue that the use of AI in warfare can lead to moral complacency, prompt users toward action over inaction, and prioritize speed over ethical reasoning.

Key takeaways:

  • Israel has reportedly been using AI systems, including one called "The Gospel", to guide its war in Gaza, with the AI deciding whom to target for killing.
  • The AI systems work in concert, with "Gospel" marking buildings used by Hamas militants, "Lavender" rating each person's likelihood of being a militant based on surveillance data, and "Where's Daddy?" tracking these targets and informing the army when they're in their family homes.
  • Despite the AI system making errors in approximately 10 percent of cases, Israeli soldiers reportedly treated its output as if it were a human decision, sometimes spending only 20 seconds reviewing a target before bombing.
  • The use of AI in warfare raises ethical questions about moral responsibility, with the speed and scale of AI systems potentially leading to moral complacency and a lack of deliberative ethical reasoning.