As civilian casualties continue to mount in the war-torn Gaza Strip, reports of Israel’s use of artificial intelligence (AI) in its targeting of Hamas militants are facing increasing scrutiny. A report by the Israeli outlets +972 Magazine and Local Call earlier this month said that Israeli forces had relied heavily on two AI tools so far in the conflict — “Lavender” and “Where’s Daddy?”
While “Lavender” identifies suspected Hamas and Palestinian Islamic Jihad (PIJ) militants and their homes, “Where’s Daddy?” tracks these targets and alerts Israeli forces when they return home, per the report, which cites six Israeli intelligence officers who had used AI systems, including “Where’s Daddy?”, for operations in Gaza.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one of the officers told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations,” they added.
Weird dude, but I do find it interesting that we always seem to find some problem with any person anywhere who speaks critically of China.
You can check the camps on satellite though:
https://xjdp.aspi.org.au/map/?
The point is that the dude has no means to collect such statistics: he’s not a research agency surveying representative samples about their medical history, documenting his methodology, and making the data available, and he’s not even citing statistics supplied by hospitals.
He literally just pulled those charts out of his ass. Instead of “well yes, they’re lying, but I’m sure the broader point is true”, consider “would they have any need to fabricate things if the broader point is true?”
Surely it would be easy enough to go to Xinjiang and verify it, since there are no travel restrictions and they wouldn’t have to worry about getting caught making shit up.