Reporting by Mathew Carr
May 8, 2024 — Turning a mobile phone off in the wrong way could have got you killed in Gaza. And your family. And your neighbors.
The “Lavender” program — an AI system used by the Israeli military that reportedly identified about 37,000 potential human targets in the Gaza Strip — came to light last month. Its use has been controversial.
A person familiar with the Israeli military’s operations said this (I’ve not been able to verify it):
The AI uses information from drone footage and face-recognition data, then adds residents’ phone numbers and their WhatsApp contacts. It correlates these data points — whether a Palestinian makes more than three calls a day, whether he or she turns their phone off more than four times a day for odd reasons — and determines whether it thinks they are a Hamas member or not, the person said.
Then, Israel made a decision to launch a missile at the alleged militant … potentially while they were in their home. It’s unclear whether this is still going on.
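To make the reported heuristic concrete, here is a purely illustrative sketch. Nothing in it is verified: the signal names, the thresholds, and the way the signals combine (OR is an assumption; the source does not say how they are weighted) all come only from the unverified account quoted above.

```python
# Illustrative sketch ONLY. Thresholds and logic are hypothetical,
# reconstructed from the unverified account quoted in this article.

def flag_as_suspect(calls_per_day: int, phone_off_events_per_day: int) -> bool:
    """Return True if the reported heuristic would flag this phone user.

    "more than three calls a day" and "turns their phone off more than
    four times a day" are the figures attributed to the source; how the
    signals combine is not reported, so OR is an assumption here.
    """
    MANY_CALLS = 3
    FREQUENT_SHUTOFFS = 4
    return (calls_per_day > MANY_CALLS
            or phone_off_events_per_day > FREQUENT_SHUTOFFS)


# Under this sketch, switching a phone off five times in a day would be
# enough on its own to trigger the flag — which is what makes the
# phone-off detail, if true, so alarming.
```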
Computer decides to kill?
The computer program decided who to kill, they said.
The person was “really, really annoyed” about this use of AI, they said. Another contact, with links to the Israel Defense Forces, was astounded that the first person even knew about it.
I pressed the person on whether real humans were involved in the process leading to a final decision to kill. They said the Israeli military will argue that there were people making decisions as part of the process.
Yet reporting published April 3 by the Guardian and +972 Magazine indicates that the people in the decision chain were making very important decisions very quickly — and with inadequate resources.
As a ceasefire is being debated, these revelations of extra-judicial killings should place extra pressure on Israel.
(I’ve not previously seen any reporting on the importance of switching off phones as a feature of the Lavender algorithm.)
Email me at mathew@carrzee.net if you have better intelligence. I have tried, and so far failed, to get the Israel Defense Forces to comment.
And if you think that behavior is bad, some pro-Palestine advocates fear even worse may be coming.
Israeli orders for Gazans to leave southern Gaza could see them end up on a peninsula in Egypt.
One fear: that a planned $100 billion-plus development of a new city on Egypt’s northwest coast could be a secretly planned site for a new Palestine.

$35 billion deal at Ras Al-Hekma Peninsula that could spur spending of more than $100 billion

NOTES
The Israeli military’s AI-powered program, known as “Lavender,” is named after the plant lavender, which is known for its calming and soothing properties. The program was developed to help identify potential targets for military strikes by analyzing large amounts of data, including information on individuals and their activities. The name “Lavender” might be intended to reflect the program’s role in providing a sense of order and control amidst the chaos of conflict, much like how the plant lavender is used for relaxation and stress relief (Grok AI). However, it’s important to note that the program has been controversial due to its use in the conflict in Gaza, where it reportedly identified 37,000 Palestinians as potential targets.
Another contact mentioned to me why she thought lavender might have been given that name: because it stinks.
https://blog.paulbiggar.com/meta-and-lavender

