Tonight we investigate the hacker known as 4chan. More at 9.
Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business
Submitted 8 months ago by btaf45@lemmy.world to technology@lemmy.world
https://www.cnn.com/2024/03/19/tech/buffalo-mass-shooting-lawsuit-social-media/index.html
Comments
RainfallSonata@lemmy.world 8 months ago
Platforms should be held responsible for the content their users publish on them, full stop.
Lath@kbin.earth 8 months ago
So if some random hacker takes over your network connection and publishes illegal content which then leads back to you, you should be held responsible. It's your platform after all.
pendingdeletion@lemmy.world 8 months ago
If it’s your server, then yes, you should bear responsibility for how you deal with said content.
0x0@programming.dev 8 months ago
Content creators should be held responsible for their content. Platforms are mere distributors, in general terms, otherwise you’re blaming the messenger.
Specific to social media (and television): yes, they bank on hate, that’s well known - so don’t use them, or do so with that ever-dwindling human quality called critical thinking. Wanting to hold them accountable for murder just dismisses the real underlying issues: unsupervised impressionable people watching content, easy access to guns, human nature itself, societal issues…
conciselyverbose@sh.itjust.works 8 months ago
Then user generated content completely disappears.
Without the basic protection of section 230, it’s not possible to allow users to exist or interact with anything.
autotldr@lemmings.world [bot] 8 months ago
This is the best summary I could come up with:
A New York state judge on Monday denied a motion to dismiss a lawsuit against several social media companies alleging the platforms contributed to the radicalization of a gunman who killed 10 people at a grocery store in Buffalo, New York in 2022, court documents show.
In her decision, the judge said that the plaintiffs may proceed with their lawsuit, which claims social media companies — like Meta, Alphabet, Reddit and 4chan — “profit from the racist, antisemitic, and violent material displayed on their platforms to maximize user engagement,” including the time then 18-year-old Payton Gendron spent on their platforms viewing that material.
“They allege they are sophisticated products designed to be addictive to young users and they specifically directed Gendron to further platforms or postings that indoctrinated him with ‘white replacement theory’,” the decision read.
“It is far too early to rule as a matter of law that the actions, or inaction, of the social media/internet defendants through their platforms require dismissal,” said the judge.
“While we disagree with today’s decision and will be appealing, we will continue to work with law enforcement, other platforms, and civil society to share intelligence and best practices,” the statement said.
We are constantly evaluating ways to improve our detection and removal of this content, including through enhanced image-hashing systems, and we will continue to review the communities on our platform to ensure they are upholding our rules.”
The original article contains 407 words, the summary contains 229 words. Saved 44%. I’m a bot and I’m open source!
UsernamesAreDifficult@lemmy.dbzer0.com 8 months ago
Honestly, good, they should be held accountable. They shouldn’t be offering extremist content recommendations in the first place.
EmperorHenry@discuss.tchncs.de 8 months ago
So now anyone who says things is going to be held accountable for crazy people being crazy?
What a lovely world we live in. That’s worse than what CNN kept saying about the joker after that one mass shooting at the theater that happened to be showing “The Dark Knight” at the same time.
roguetrick@lemmy.world 8 months ago
They’re appealing the denial of the motion to dismiss, huh? I agree that this case really doesn’t have legs, but I didn’t know that was an interlocutory appeal they could make. They’d win at summary judgment regardless.
FiniteBanjo@lemmy.today 8 months ago
Maybe this will lead to a future where Stochastic Terrorism isn’t a protected activity?