Earlier today, the Tel Aviv-based outlet +972 Magazine published a long exposé on “Lavender,” an artificial intelligence-based program that Israel uses to automatically identify and target suspected Hamas and Palestinian Islamic Jihad fighters. The entire article is worth a read, but it essentially argues that this system generates “kill lists” for military commanders, and that these lists receive only perfunctory screening before strikes are approved. No underlying intelligence is examined: an officer simply confirms that the target is male and then signs off on the strike. The target is usually killed at home, with a “dumb” bomb, surrounded by family. And, as has been the case for most of the war, only limited consideration is given to civilian life.
Programs like this also have a predecessor in the NSA's SKYNET, the inner workings of which were arguably even worse. Civilians killed in strikes on suspects were fed back into the machine as suspected terrorists: if a target is bombed while he's in a taxi, the deceased taxi driver gets added to the data, and now his family and coworkers are considered new targets.
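The feedback loop described above can be made concrete with a small sketch. This is a hypothetical illustration, not a reconstruction of either system's actual code; all names and data structures here are invented for the example.

```python
# Hypothetical sketch of the feedback loop described above: casualties of a
# strike are fed back into the training data labeled as suspected terrorists,
# and anyone associated with them becomes a new candidate target.
# All names and data here are invented for illustration.

def run_feedback_loop(model_labels, strike_casualties, associates):
    """Label strike casualties as terrorists and flag their associates."""
    for person in strike_casualties:
        model_labels[person] = "suspected_terrorist"  # bystander mislabeled
        # Everyone linked to the mislabeled person becomes a new candidate
        for contact in associates.get(person, []):
            model_labels.setdefault(contact, "new_target_candidate")
    return model_labels

labels = {"original_target": "suspected_terrorist"}
casualties = ["original_target", "taxi_driver"]  # the driver was a bystander
contacts = {"taxi_driver": ["driver_spouse", "driver_coworker"]}

labels = run_feedback_loop(labels, casualties, contacts)
print(labels["taxi_driver"])      # suspected_terrorist
print(labels["driver_coworker"])  # new_target_candidate
```

The point of the sketch is that the error compounds: each mislabeled bystander expands the suspect pool, and the model never sees a correction.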