The only person liable here is the shooter.
On the very specific point of liability: while the shooter is the specific person who pulled the trigger, is there no liability for those who radicalised them into becoming a shooter? If I sold foodstuffs that poisoned people, I’d be held to account by various regulatory bodies, yet pushing out material that poisons people’s minds goes largely unpunished. If a preacher at a local religious centre were advocating terrorism, they’d face charges.
The UK government has a whole ream of context about this: …service.gov.uk/…/prevent-strategy-review.pdf
Google’s “common carrier” type of defence only takes you so far, because it’s not a purely neutral party: it “recommends”, rather than merely “delivers results”, as @joe points out. That recommendation should come with some editorial responsibility.
joe@lemmy.world 1 year ago
Well, maybe. I want to be up-front that I haven’t read the actual lawsuit, but it seems from the article that the claim is that YouTube and Reddit both have algorithms that helped radicalize him:
I’d say that case is worth pursuing. It’s long been known that social media companies tune their algorithms to increase engagement, and that angry people are more likely to engage. The result is algorithms that surface content designed to make people angry, and that’s a choice these companies make, not “delivering search results”.