IDF used AI to target homes in Gaza, causing deaths
Mint — April 4, 2024, 06:00 PM UTC
Summary: An investigative report reveals that the Israel Defense Forces (IDF) used artificial intelligence (AI) to target homes in Gaza, resulting in the deaths of around 33,000 Palestinians, mostly women and children. The AI tool, "Lavender," had a 10% error rate, and IDF officers reportedly spent only about 20 seconds reviewing each target before authorizing bombings. Israel faces scrutiny for potential war crimes. The IDF confirmed using AI but denied targeting individuals based solely on AI predictions.