‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza
Source: +972 Magazine
Introduction
The Israeli army has reportedly deployed an AI targeting system named “Lavender,” which has marked tens of thousands of Gazans as potential bombing targets. According to the report, the system operated with minimal human oversight and was paired with an unusually permissive policy on civilian casualties.
The Role of Lavender in Target Identification
Automation of Target Generation
- Lavender is designed to identify suspected operatives of Hamas and Palestinian Islamic Jihad, and reportedly marked around 37,000 individuals by analyzing surveillance data on Gaza’s population.
- Human verification of these targets was minimal: strike decisions were often based solely on the AI’s recommendations, without in-depth analysis (a generic, purely hypothetical sketch of score-and-threshold flagging follows this list).
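The original reporting describes Lavender as rating individuals on their perceived likelihood of militancy; the system’s actual internals are not public. Purely to illustrate the general pattern of score-and-threshold classification at population scale, here is a minimal, hypothetical sketch in Python. Every function name, field, and number in it is invented for illustration, not drawn from the reporting.

```python
# Hypothetical illustration only: a generic score-and-threshold flagging
# loop, NOT the actual Lavender system, whose internals are not public.
# All names and numbers here are invented for illustration.

def flag_individuals(records, threshold):
    """Return every record whose model score meets a fixed threshold."""
    return [r for r in records if r["score"] >= threshold]

# Synthetic records with uniformly spread scores, standing in for a
# large surveilled population.
records = [{"id": i, "score": i / 100_000} for i in range(100_000)]

# At population scale, the threshold alone determines how many people
# are flagged: a small change moves thousands in or out of the pool.
print(len(flag_individuals(records, 0.90)))  # 10,000 flagged
print(len(flag_individuals(records, 0.95)))  # 5,000 flagged
```

The point of the sketch is only that when such a classifier is run over an entire population, the size of the target pool is effectively a policy choice (the threshold), not a fixed property of the data.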
Targeting Individuals in Homes
- The Israeli military frequently bombed suspected operatives while they were in their family homes, reportedly because locating individuals at home was easier from an intelligence standpoint.
- Additional automated tracking systems, including one called “Where’s Daddy?”, monitored targets and signaled when they entered their homes, a practice that led to high civilian casualties among their families.
Collateral Damage and Civilian Casualties
Increased Permissiveness in Civilian Casualty Allowance
- In the early stages of the war, the military reportedly permitted killing up to 15-20 civilians for every junior Hamas operative targeted (a back-of-envelope calculation of what this ratio implies follows this list).
- This unprecedented permissiveness raised serious ethical concerns, with large numbers of non-combatants killed as “collateral damage.”
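To make the scale of that ceiling concrete, the arithmetic below multiplies the reported 15-20 ratio by a range of strike counts. Only the ratio comes from the reporting; the strike counts are hypothetical placeholders, not figures from the article.

```python
# Back-of-envelope arithmetic using the 15-20 civilians-per-junior-operative
# ceiling reported above. The strike counts are hypothetical placeholders.
ratio_low, ratio_high = 15, 20

for strikes in (100, 500, 1_000):  # hypothetical strike counts
    print(f"{strikes:>5} strikes -> {strikes * ratio_low:,}-"
          f"{strikes * ratio_high:,} permissible civilian deaths")
```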
Errors in Targeting and Execution
- Human operators typically reviewed each target for only a matter of seconds, often doing little more than confirming the target was male rather than assessing the underlying intelligence.
- The army reportedly knew the system misidentified targets in roughly 10 percent of cases, flagging people with loose or no connection to militant groups (the calculation below shows what that rate implies at scale).
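Combining two figures already given in this summary, the ~37,000 marked individuals and the ~10 percent error rate, yields a rough estimate of how many people the system may have flagged in error. This is simple arithmetic on the reported numbers, not an additional reported finding.

```python
# Arithmetic on figures reported in the article: ~37,000 people marked,
# with a known error rate of roughly 10 percent.
marked = 37_000
error_rate = 0.10
print(f"Implied misidentifications: ~{marked * error_rate:,.0f} people")
# -> ~3,700 people potentially marked as operatives in error
```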
Automation Over Human Oversight
Reliance on AI for Decision-Making
- Officers have reported that the AI system often supplanted traditional intelligence-led decision-making, with the process optimized for speed of target generation over accuracy.
- Because the system relied on statistical pattern-matching across an entire population, many innocent individuals were mistakenly flagged as operatives (the base-rate illustration below shows why this outcome is expected rather than anomalous).
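The underlying statistical issue is the base-rate problem: when a classifier searches a very large population for a rare class, even a model that looks accurate produces a flood of false positives. The sketch below demonstrates the effect; all four input numbers are hypothetical, chosen only for illustration, and none are figures from the article.

```python
# Base-rate illustration with hypothetical numbers (none are reported
# figures): scanning a large population for a rare class yields far more
# false positives than the headline accuracy suggests.
population = 1_000_000   # hypothetical population scanned
prevalence = 0.02        # hypothetical: 2% are actual operatives
tpr = 0.90               # hypothetical true-positive (detection) rate
fpr = 0.05               # hypothetical false-positive rate

actual = population * prevalence
true_pos = actual * tpr
false_pos = (population - actual) * fpr

print(f"True positives:  {true_pos:,.0f}")    # 18,000
print(f"False positives: {false_pos:,.0f}")   # 49,000
print(f"Precision: {true_pos / (true_pos + false_pos):.1%}")  # ~26.9%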
Shifts in Strategy Post-October 7
- The atmosphere within the military reportedly shifted toward aggression after the October 7 Hamas attacks on Israeli communities, producing a surge in airstrikes and a focus on retaliation over precision.
- Initial strategies gave little weight to the broader consequences of mass civilian deaths, risking a cycle of violence that could fuel future conflict.
Conclusion
The integration of AI systems like Lavender marks a significant shift in military practice, one that prioritizes the speed and scale of target generation over careful protection of civilians. The consequences have been dire: thousands of non-combatants killed against a backdrop of rapidly evolving military policy and technology.