Lavender (AI targeting system)
Lavender is an artificial intelligence (AI)-based targeting system developed by the Israel Defense Forces (IDF) to identify and rank individuals suspected of being operatives of Hamas or Palestinian Islamic Jihad (PIJ).[1] The system was first reported by +972 Magazine in April 2024 and was used extensively in the 2023 Israel-Hamas war, particularly during the early stages of the conflict.[1]
Development and Functionality
Lavender was developed by Unit 8200, the IDF's signals intelligence division, under the leadership of Yossi Sariel.[2] The system analyzes information collected through mass surveillance on most of the Gaza Strip's 2.3 million residents.[1] It then assesses and ranks the likelihood that each individual is active in the military wing of Hamas or PIJ, assigning a rating from 1 to 100.[1]
Lavender learns to identify characteristics of known Hamas and PIJ operatives, whose information was fed to the machine as training data, and then locates those same characteristics among the general population.[1] Factors that can raise a person's score include being in a WhatsApp group with a known militant, changing addresses and phone numbers frequently, and being named in Hamas files.[2]
According to sources, the machine gives nearly every person in Gaza a rating expressing how likely it is that they are a militant.[1] An individual found to have several incriminating features reaches a high rating and automatically becomes a potential target for assassination.[1]
Use in the 2023 Israel-Hamas War
Lavender played a central role in the unprecedented bombing of Palestinians during the 2023 Israel-Hamas war, especially in its early stages.[1] Intelligence sources told +972 Magazine that during the first weeks of the war, the army relied almost completely on Lavender, which marked as many as 37,000 Palestinians as suspected militants for possible air strikes.[1]
The army gave officers sweeping approval to adopt Lavender's kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.[1] One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, and that they would normally devote only about "20 seconds" to each target before authorizing a bombing.[1]
The Israeli army systematically attacked targeted individuals while they were in their homes, usually at night with their whole families present, rather than in the course of military activity.[1] This was facilitated by another AI system, called "Where's Daddy?", which tracked the targeted individuals and signaled when they had entered their family residences.[1]
Controversies and Criticisms
The use of Lavender in the 2023 Israel-Hamas war has been highly controversial. Critics argue that reliance on the AI system has increased civilian casualties, as it is known to have an error rate of approximately 10 percent and has been used to authorize strikes with a high number of anticipated civilian deaths.[1]
Intelligence sources have reported that during the early stages of the war, the army authorized the killing of up to 15 or 20 civilians for every junior Hamas operative marked by Lavender.[1] For senior Hamas officials, the killing of over 100 civilians was reportedly authorized on several occasions.[1]
Concerns have also been raised about the lack of human oversight in the targeting process and the potential for the system to mistakenly flag individuals who have only a loose connection to militant groups, or no connection at all.[1] The IDF has denied the existence of a kill list, describing Lavender as merely a tool to assist in the targeting process and stating that analysts are required to conduct independent examinations.[1]
See Also
- Israeli military use of AI and facial recognition
- Habsora (AI targeting system)
- Where's Daddy (AI targeting system)
- Unit 8200
- Yossi Sariel
References

- [1] Abraham, Yuval (April 3, 2024). "'Lavender': The AI machine directing Israel's bombing spree in Gaza". +972 Magazine. Retrieved December 30, 2024.
- [2] Dwoskin, Elizabeth (December 29, 2024). "Israel built an 'AI factory' for war. It unleashed it in Gaza". The Washington Post. Retrieved December 30, 2024.