Israeli military use of AI and facial recognition


Israeli military use of AI and facial recognition refers to the application of machine learning (ML) and artificial intelligence (AI) technologies, including facial recognition, by the Israel Defense Forces (IDF) for military purposes both within Israel's internationally recognized boundaries and in the Palestinian territories. Major components include AI-driven systems such as Habsora ("the Gospel"), Lavender, and Where's Daddy?, used for target identification and assassination, which have received increased attention in the wake of the 2023 Israel-Hamas war.[1][2] The use of these technologies has raised significant ethical, legal, and humanitarian concerns, particularly regarding the reported disregard for civilian casualties and the erosion of human oversight in military operations.[3]

Development and Deployment

The IDF's development of AI for military applications is spearheaded by Unit 8200, the IDF's signals intelligence division. Under the leadership of Yossi Sariel, the unit underwent a significant transformation, prioritizing AI and data-mining technologies over traditional human-driven signals intelligence. This shift involved reorganizing intelligence efforts into "AI factories" and developing multiple purpose-built algorithms and machine learning technologies.[1]

The IDF has integrated AI systems into its targeting process, using them to analyze vast amounts of data from sources including intercepted communications, satellite footage, and social networks.[1] These systems are designed to identify potential targets, including individuals the machine learning algorithms flag as suspected members of Palestinian militant groups such as Hamas and Palestinian Islamic Jihad (PIJ).[2]

AI Systems in Use

The IDF has developed and deployed many AI systems, and the technology is generally classified as a military secret. However, various Israeli publications have been granted permission by the military censor to divulge the existence of at least the following:

  • Habsora ("the Gospel"): An AI system that rapidly generates targets for airstrikes, primarily focusing on buildings and structures.[1][3]
  • Lavender: An AI system that identifies and ranks individuals suspected of being militants, assigning each a score indicating the perceived likelihood of militant affiliation.[2]
  • Where's Daddy?: An AI system used to track individuals marked by Lavender and alert the military when they enter their homes.[2]
  • Alchemist: An AI system for pre-emptively detecting attacks on Israel.[4]
  • Depth of Wisdom, Hunter, and Flow: Other algorithmic programs used for various intelligence and targeting purposes.[1]

These systems are used in conjunction with "the pool", a centralized database of military intelligence that feeds the various machine learning pipelines.[1]

Ethical and humanitarian concerns

The use of AI in the IDF's targeting process has raised significant ethical and humanitarian concerns. Critics argue that the reliance on AI has led to an increase in civilian casualties, as the systems are known to have a margin of error and have been used to authorize strikes with a high number of anticipated civilian deaths.[1][2]

Intelligence sources have reported that during the early stages of the 2023 Israel-Hamas war, the army authorized the killing of up to 15 or 20 civilians for every junior Hamas operative marked by Lavender. For senior Hamas officials, the killing of over 100 civilians was reportedly authorized on at least several occasions.[2]

Concerns have also been raised about the lack of human oversight in the targeting process, with some intelligence officers admitting to spending only around 20 seconds reviewing each target before authorizing a strike.[2][3] This has led to accusations that the IDF is operating a "mass assassination factory" with minimal regard for international law and humanitarian principles.[3]

The Where's Daddy? system is considered an especially concerning misapplication of AI and machine learning. It tracks suspected members of Palestinian militant cells throughout the day via facial recognition databases fed by real-time video streams from CCTV and drone footage, and then triggers a signal for an airstrike against these targets only after they have returned to their family homes, where the chances of additional civilian casualties are increased.

Relationship with Silicon Valley

The development and deployment of AI and facial recognition technologies by the IDF have involved collaborations with Silicon Valley companies and technologies.[3] For example, Google Photos has reportedly been used by the Israeli army to identify and sort civilians in Gaza, despite Google's privacy policies stating that users must offer "explicit consent to share any sensitive personal information" with third parties.[3]

References

  1. Dwoskin, Elizabeth (December 29, 2024). "Israel built an 'AI factory' for war. It unleashed it in Gaza". The Washington Post. Retrieved December 30, 2024.
  2. Abraham, Yuval (April 3, 2024). "'Lavender': The AI machine directing Israel's bombing spree in Gaza". +972 Magazine. Retrieved December 30, 2024.
  3. Goodfriend, Sophia (April 25, 2024). "Why human agency is still central to Israel's AI-powered warfare". +972 Magazine. Retrieved December 30, 2024.
  4. "ב-8200 פיתחו מערכת שיודעת לזהות פיגועים - עוד לפני שיקרו" [Unit 8200 developed a system that can identify attacks - even before they happen]. IDF. Retrieved December 30, 2024.