Comment on Microsoft employee disrupts 50th anniversary and calls AI boss ‘war profiteer’

GreyAlien@lemm.ee 3 weeks ago

In long: see the source linked below.

In short:

The AI system labeled tens of thousands of Gazans, mostly men, as suspected militants, with a roughly 10% error rate, meaning thousands of those flagged were likely civilians.

Human officers spent ~20 seconds per target, often just confirming gender, before approving airstrikes.

“Where’s Daddy?”: a companion AI tracked targets to their homes, prioritizing bombings at night, when families were present.

The military authorized 15–20 civilian deaths per low-ranking militant and 100+ for senior Hamas officials.

Strikes frequently used unguided munitions, maximizing destruction and civilian harm.

Officers admitted acting as “stamps” for AI decisions, with one calling the process “hunting at large”.

Additional information: Project Nimbus

source