Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business
Submitted 8 months ago by btaf45@lemmy.world to technology@lemmy.world
https://www.cnn.com/2024/03/19/tech/buffalo-mass-shooting-lawsuit-social-media/index.html
Comments
Simulation6@sopuli.xyz 8 months ago
Add Fox News and Trump rallies to the list.
0x0@programming.dev 8 months ago
Don’t forget Marilyn Manson and videogames.
cophater69@lemm.ee 8 months ago
Marilyn Manson led a charge to overthrow the government??
TheDarksteel94@sopuli.xyz 8 months ago
Idk why you’re getting downvoted for an obvious joke lol
isles@lemmy.world 8 months ago
People don’t appreciate having spurious claims attached to their legitimate claims, even in jest. It invokes the idea that since the previous targets of blame were false that these likely are as well.
Phanatik@kbin.social 8 months ago
I don't understand the comments suggesting this is "guilty by proxy". These platforms have algorithms designed to keep you engaged and, through their callousness, have allowed extremist content to remain visible.
Are we going to ignore all the anti-vaxxer groups who fueled vaccine hesitancy, which resulted in long-dead diseases making a resurgence?
To call Facebook anything less than complicit in the rise of extremist ideologies and conspiratorial beliefs is extremely short-sighted.
"But Freedom of Speech!"
If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don't deserve to have that speech. Sorry, you've violated the social contract, and those people's blood is on your hands.
firadin@lemmy.world 8 months ago
Not just “remain visible” - actively promoted. There’s a reason people talk about YouTube’s right-wing content pipeline. If you start watching anything male-oriented, YouTube will slowly promote more and more right-wing content to you until you’re watching Ben Shapiro and Andrew Tate.
BeMoreCareful@lemmy.world 8 months ago
YouTube is really bad about trying to show you right wing crap. It’s overwhelming. The shorts are even worse. Every few minutes there’s some new suggestion for some stuff that is way out of the norm.
TikTok doesn’t have this problem, and it’s the one being attacked by politicians?
Ragnarok314159@sopuli.xyz 8 months ago
I got into painting mini Warhammer 40k figurines during covid, and thought the lore was pretty interesting.
Every time I watch a video, my suggested feed goes from videos related to my hobbies to being entirely replaced with red-pill garbage. The right-wing channels must be highly profitable for YouTube to funnel people into them: just an endless tornado of rage and constant viewing.
reverendsteveii@lemm.ee 8 months ago
it legit took YouTube’s autoplay about half an hour after I searched “counting macros” to bring me to American monarchist content
cows_are_underrated@feddit.de 8 months ago
“But freedom of speech”
If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don’t deserve to have that speech.
In Germany we have a very good rule for this (it’s not written down, but it’s something you can usually count on): your freedom ends where it violates the freedom of others. For example: everyone has the right to live a healthy life, and everyone has the right to walk wherever they want. If I now use my right to walk wherever I want to cause a car accident in which people get hurt (and it was solely my fault), my freedom has violated the injured person’s right to live a healthy life. That’s not freedom.
Syringe@lemmy.world 8 months ago
In Canada, they have an idea called “right to peace”. It means that you can’t stand outside of an abortion clinic and scream at people because your right to free speech doesn’t exceed that person’s right to peace.
I don’t know if that’s 100% how it works so someone can sort me out, but I kind of liked that idea
RaoulDook@lemmy.world 8 months ago
Very reasonable, close to the “Golden Rule” concept of being excellent to each other
SuperSaiyanSwag@lemmy.zip 8 months ago
This may seem baseless, but it comes from years of experience in online forums. You don’t have to take it seriously, but maybe you can relate. We have seen time and time again that if there is no moderation, the shit floats to the top. The reason is that when people can’t post something creative or fun but still want the attention, they will post something negative. It’s a loud minority, but a very dedicated loud minority. Let’s say we have 5 people, and 4 of them are very creative and funny, but 1 of them complains all the time. If they all post to the same community, there is a very good chance that the one negative person will make a lot more posts than the 4 creative types.
driving_crooner@lemmy.eco.br 8 months ago
What about YouTube? They actually paid those people to spread their sick ideas, making the world a worse place and getting rich while doing it.
afraid_of_zombies@lemmy.world 8 months ago
Ok…don’t complain to me later when the thing you like gets taken down.
Nomad@infosec.pub 8 months ago
Nice, now do all religions and churches next
Socsa@sh.itjust.works 8 months ago
Please let me know if you want me to testify that reddit actively protected white supremacist communities and even banned users who engaged in direct activism against these communities
FenrirIII@lemmy.world 8 months ago
I was banned for activism against genocide. Reddit is a shithole.
misspacific@lemmy.blahaj.zone 8 months ago
i was banned for similar reasons.
seems like a lot of mods just have the ability to say whatever about whoever and the admins just nuke any account they target.
fine_sandy_bottom@discuss.tchncs.de 8 months ago
Well yeah it is but… what did you think would happen?
jkrtn@lemmy.ml 8 months ago
Send your evidence to the lawyers, couldn’t hurt.
skozzii@lemmy.ca 8 months ago
YouTube feeds me so much right wing bullshit I’m constantly marking it as not interested. It’s a definite problem.
Duamerthrax@lemmy.world 8 months ago
It’s amazing how often I get a video suggested to me from some right-wing source complaining about censorship and being buried by YouTube. I ended up installing a third-party channel blocker to deal with it.
afraid_of_zombies@lemmy.world 8 months ago
I can’t prove that they were related, but I used to report all conservative ads (Hillsdale, Epoch Times, etc.) to Google with all-caps messages saying I was going to start calling the advertisers directly and yelling at them about the ads. About 2-3 days after I started doing that, the ads stopped.
I would love for other people to start doing this to confirm that it works and to be free of the ads.
seth@lemmy.world 8 months ago
I purchased a flannel and a top water lure online from Rural King and didn’t realize it came with an extra month of right wing conspiracy ads. I report the ads but they just keep coming.
Tom_Hanx_Hail_Satan@lemmy.ca 8 months ago
That worked for me also. I like a lot of sports docs on YouTube. That triggered nonstop Joe Rogan suggestions and ads for all kinds of right-wing news trash.
Krudler@lemmy.world 8 months ago
I quit drinking years ago and I reported every ad explaining that I am no longer their target market and the ads are literally dangerous to me. They were gone within a few weeks - haven’t seen a booze ad in 5+ years.
S_H_K@lemmy.dbzer0.com 8 months ago
It’s fucking insane how much that happens. I stopped using Instagram for that reason; at least YT listened to my “not interested” choices. I also have ReVanced, so IDK what ads it would shoot at me.
CaptPretentious@lemmy.world 8 months ago
YouTube started feeding me that stuff too. Weirdly once I started reporting all of them as misinformation they stop showing up for some reason…
ristoril_zip@lemmy.zip 8 months ago
“Noooo it’s our algorithm we can’t be held liable for the program we made specifically to discover what people find a little interesting and keep feeding it to them!”
RagingRobot@lemmy.world 8 months ago
I wonder: if you built a social media site where the main feature was that the feed just showed you things in sequential order, like in the old days, would it be popular?
RaoulDook@lemmy.world 8 months ago
I enjoy using Lemmy mostly that way, just sorting the feed by new / hot / whatever and looking at new posts of random shit. Much more entertaining than video-spamming bullshit.
Hillock@feddit.de 8 months ago
No, there is too much content for that nowadays. YouTube has over 3 million new videos each day. Facebook, TikTok, and Instagram also have ridiculous amounts of new posts every day. Browsing Reddit on New was a terrible experience on r/all, or even on many of the bigger subs. Even on the fediverse, sorting by new is not enjoyable. You are swarmed with reposts and content that’s entirely uninteresting to you.
It works in smaller communities but there it isn’t really necessary. You usually have an overview of all the content anyhow and it doesn’t matter how it’s ordered.
Any social media that plans on scaling up needs a more advanced system.
afraid_of_zombies@lemmy.world 8 months ago
So a paper encyclopedia set? How is Britannica doing?
Quill7513@slrpnk.net 8 months ago
People complain about Mastodon’s lack of algorithms a lot. It’s part of how Misskey, Iceshrimp, and Catodon came to be.
John_McMurray@lemmy.world 8 months ago
I find it very weird to be living in a country that is legalizing drugs and assisted suicide (even for depression) but simultaneously trying to severely curtail free speech and media freedom, and passing legislation to jail people at risk of breaking the law who don’t meet the “conspiracy to commit” threshold.
0x0@programming.dev 8 months ago
It’s never the parents, is it?
geogle@lemmy.world 8 months ago
Ask those parents in the Michigan case
PoliticalAgitator@lemmy.world 8 months ago
You mean the “responsible gun owners” who don’t properly secure their weapons from a child?
echodot@feddit.uk 8 months ago
I couldn’t work this out from the article: is it the parents raising this suit, or the victims’ families?
scottmeme@sh.itjust.works 8 months ago
Excuse me what in the Kentucky fried fuck?
As much as everyone says fuck these big guys all day, this hurts everyone.
athos77@kbin.social 8 months ago
I agree with you, but ... I'd been on reddit since the Digg exodus. It always had its bad side (violentacrez, jailbait, etc), but it got so much worse after GamerGate/Ellen Pao - the misogyny became weaponized. And then the alt-right moved in, deliberately trying to radicalize people, and we worked so. fucking. hard to keep their voices out of our subreddits. And we kept reporting users and other subreddits that were breaking rules, promoting violence and hatred, and all fucking spez would do is shrug and say, "hey, it's a free speech issue", which was somewhere between "hey, I agree with those guys" and "nah, I can't be bothered".
So it's not like this was something reddit wasn't aware of (I'm not on Facebook or YouTube). They were warned, repeatedly, vehemently, starting all the way back in 2014, that something was going wrong with their platform and they needed to do something. And they deliberately and repeatedly chose to ignore it, all the way up to the summer of 2021. Seven fucking years of warnings they ignored, from a massive range of users and moderators, including some of the top moderators on the site. And all reddit would do is shrug its shoulders and say, "hey, free speech!" like it was a magic wand, and very occasionally try to defend itself by quoting its 'hate speech policy', which they invoke with the same regular repetitiveness and 'thoughts and prayers' inaction as a school shooting brings. In fact, they did it in this very article:
In a statement to CNN, Reddit said, “Hate and violence have no place on Reddit. Our sitewide policies explicitly prohibit content that promotes hate based on identity or vulnerability, as well as content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people. We are constantly evaluating ways to improve our detection and removal of this content, including through enhanced image-hashing systems, and we will continue to review the communities on our platform to ensure they are upholding our rules.”
As someone who modded for a number of years, that's just bullshit.
Jaysyn@kbin.social 8 months ago
Good. No quarter for fascists, violent racists, or their enablers.
Conspiracy for cash isn't a free speech issue.
Morefan@retrolemmy.com 8 months ago
for fascists, violent racists, or their enablers.
Take a good long look in the mirror (and a dictionary printed before 2005) before you say things like this.
Not_mikey@slrpnk.net 8 months ago
Sweet, I’m sure this won’t be used by AIPAC to sue all the tech companies for somehow causing October 7th, like UNRWA, and force them to shut down or suppress all talk on Palestine. Hearing about a genocide happening might radicalize people; maybe we could get away with allowing discussion, but better safe than sorry: to the banned-words list it goes.
This isn’t going to end in the tech companies hiring a team of skilled moderators who understand the nuance between passion and radical intention while trying to preserve a safe space for political discussion; that costs money. This is going to end with a dictionary of banned and suppressed words.
The_Tired_Horizon@lemmy.world 8 months ago
I gave up reporting on major sites where I saw abuse. Stuff that, if you said it in public, witnessed by others, you’d be investigated for. Twitter was also bad for responding to reports with “this doesn’t break our rules” when a) it clearly did and b) it probably broke a few laws.
Krudler@lemmy.world 8 months ago
I just would like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me, after I let slip in a comment one day that I was sober - I had previously never made such a comment because my sobriety journey was personal, and I never wanted to define myself or pigeonhole myself as a “recovering person”.
I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.
I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.
Yes, Reddit and similar are definitely responsible for a lot of suffering and pain, at the expense of humans, in the pursuit of profit. After it blew up and front-paged, “magically” my home page didn’t have booze-related ads/subs/recs any more! What a total mystery how that happened /s
The post in question, and a perfect “outing” of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.
Zuberi@lemmy.dbzer0.com 8 months ago
Fuck Reddit, can’t wait to see the IPO burn
Kalysta@lemmy.world 8 months ago
Love Reddit’s lies about them taking down hateful content when they’re 100% behind Israel’s genocide of the Palestinians and will ban you if you say anything remotely negative about Israel’s government. And the amount of transphobia on the site is disgusting. Let alone the misogyny.
yarr@feddit.nl 8 months ago
Are the platforms guilty, or are the users who supplied the radicalizing content guilty? Last I checked, most of the content on YouTube, Facebook, and Reddit is not generated by the companies themselves.
porksoda@lemmy.world 8 months ago
Back when I was on reddit, I subscribed to about 120 subreddits. Starting a couple years ago though, I noticed that my front page really only showed content for 15-20 subreddits at a time and it was heavily weighted towards recent visits and interactions.
For example, if I hadn’t visited r/3DPrinting in a couple weeks, it slowly faded from my front page until it disappeared altogether. It was so bad that I ended up writing a browser automation script to visit all 120 of my subreddits at night and click the top link. This ended up giving me a more balanced front page that mixed in all of my subreddits and interests.
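Something in this spirit (a minimal sketch, assuming Selenium with an already-logged-in Firefox profile; the subreddit names and the CSS selector for old.reddit markup are illustrative, not my exact script):

```python
# Minimal sketch: visit each subscribed subreddit nightly and click its top
# post, so the feed-weighting sees a "recent interaction" with every community.
# Assumes Selenium 4+ and geckodriver on PATH.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

SUBREDDITS = ["3Dprinting", "woodworking", "fermentation"]  # ...all ~120 subs

driver = webdriver.Firefox()  # reuse a profile that's already logged in
for sub in SUBREDDITS:
    driver.get(f"https://old.reddit.com/r/{sub}/top/?t=day")
    time.sleep(2)  # let the page render; also avoids hammering the site
    links = driver.find_elements(By.CSS_SELECTOR, "a.title")
    if links:
        links[0].click()  # register an "interaction" with the top post
        time.sleep(2)
        driver.back()
driver.quit()
```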
My point is, these algorithms are fucking toxic. They’re focused 100% on increasing time on page and interaction with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we’re being manipulated.
RealFknNito@lemmy.world 8 months ago
Lol no. Social media isn’t responsible; it’s the people on it. I fucking hate this brain-dead logic of “Well, punishing the bad person isn’t enough, go for the manufacturer!”
Yeah, fuck it, next time someone is beaten to death with a power tool, hold DeWalt accountable. Next time someone plays loud music during their murder, hold Spotify accountable. So fucking retarded.
Binthinkin@kbin.social 8 months ago
Goddamn right they do. Meta should be sued to death for the genocides too.
muntedcrocodile@lemmy.world 8 months ago
What an excellent precedent to set, can’t possibly see how this is going to become authoritarian. Ooh, you didn’t report someone? You’re also guilty. Can’t see any problems with this.
Embarrassingskidmark@lemmy.world 8 months ago
The trifecta of evil. Especially Reddit, fuck Reddit… Facebook too.
hal_5700X@sh.itjust.works 8 months ago
Here comes more censorship from Big Tech. 🤦♂️
Canyon201@lemmy.world 8 months ago
Right in the IPO price!
Mastengwe@lemm.ee 8 months ago
It’s ALWAYS someone else’s fault.
antidote101@lemmy.world 8 months ago
Can we stop letting the actions of a few bad people be used to curtail our freedom on platforms we all use?
I don’t want the internet to end up being policed by corporate AIs and poorly implemented bots (looking at you auto-mod).
atrielienz@lemmy.world 8 months ago
So, I can see a lot of problems with this. Specifically the same problems that the public and regulating bodies face when deciding to keep or overturn section 230. Free speech isn’t necessarily what I’m worried about here. Mostly because it is already agreed that free speech is a construct that only the government is actually beholden to. Message boards have and will continue to censor content as they see fit.
Section 230 basically stipulates that companies that provide online forums (Meta, Alphabet, 4Chan etc) are not liable for the content that their users post. And part of the reason it works is because these companies adhere to strict guidelines in regards to content and most importantly moderation.
Section 230(c)(2) further provides “Good Samaritan” protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”
Reddit, Facebook, 4Chan, et al. do have rules and regulations they require their users to follow in order to post. And for the most part, the communities on these platforms are self-policing. There just aren’t enough paid moderators to make it work otherwise.
That being said, the real problem is that this really kind of indirectly challenges section 230. Mostly because it very barely skirts around whether the relevant platforms can themselves be considered publishers, or at all responsible for the content the users post and very much attacks how users are presented with content to keep them engaged via algorithms (which is directly how they make their money).
Even if the lawsuits fail, this will still be problematic. It could lead to draconian moderation of what can be posted and by whom. So now all race related topics regardless of whether they include hate speech could be censored for example. Politics? Censored. The discussion of potential new laws? Censored.
But I think it will be worse than that. The algorithm is what makes the ad space these companies sell so valuable. And this is a direct attack on that. We lack the consumer privacy protections to protect the public from this eventuality. If the ad space isn’t valuable the data will be. And there’s nothing stopping these companies from selling user data. Some of them already do. What these apps do in the background is already pretty invasive. This could lead to a furthering of that invasive scraping of data. I don’t like that.
That being said there is a point I agree with. These companies literally do make their algorithm addictive and it absolutely will push content at users. If that content is of an objectionable nature, so long as it isn’t outright illegal, these companies do not care. Because they do gain from it monetarily.
What we actually need is data privacy protections. Holding these companies accountable for their algorithms is a good idea. But I don’t agree that this is the way to do that constructively. It would be better to flesh out 230 as a living document that can change with the times. Because when it was written the Internet landscape was just different.
What I would like to see is for platforms to moderate content posted and representing itself as fact. We don’t see that nearly enough on places like reddit. Users can post anything as fact and the echo chambers will rally around it if they believe it. It’s not really incredibly difficult to radicalise a person. But the platforms aren’t doing that on purpose. The other users are, and the algorithms are helping them.
FunkPhenomenon@lemmy.zip 8 months ago
eh… anyone can be “radicalized” by anything. is anyone suing Mecca when islamic fundamentalists jihad someone/something? is anyone suing the Catholic church because of christian fundamentalists doing the same thing?
holding tech companies liable because some crazy dumbshit did a bad thing is disingenuous at best. Judge’s ruling isn’t going to stand.
charonn0@startrek.website 8 months ago
I think there’s definitely a case to be made that recommendation algorithms, etc. constitute editorial control and thus the platform may not be immune to lawsuits based on user posts.
blazera@lemmy.world 8 months ago
Personally I believe in free will. Nothing should take any responsibility away from the one that chose to kill.
Minotaur@lemm.ee 8 months ago
I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.
It just feels very very very dangerous and ’going to be bad’ to set this precedent where when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.
Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school here is how”, and you hear that, yeah, that’s on you to report it. But it feels like we’re quickly slipping into a point where you have to start reporting a vast amount of people to the police en masse if they say anything even vaguely questionable simply to avoid potential fallout of being associated with someone committing a crime.
It makes me really worried. I really think the internet has made it easy to be able to ‘justifiably’ accuse almost anyone or any business of a crime if a person with enough power / the state needs them put away for a time.
dgriffith@aussie.zone 8 months ago
This appears to be more the angle of the person being fed an endless stream of hate on social media and thus becoming radicalised.
What causes them to be fed an endless stream of hate? Algorithms. Who provides those algorithms? Social media companies. Why do they do this? To maintain engagement with their sites so they can make money via advertising.
And so here we are, with sites that see you viewed 65 percent of a stream showing an angry mob and conclude you would like to see more angry mobs in your feed. Is it any wonder that shit like this happens?
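The feedback loop is simple enough to sketch in a few lines (a toy illustration only, not any platform’s actual code; the additive scoring rule is a stand-in for real watch-time models):

```python
# Toy sketch of engagement-driven ranking: the more of a video you watch,
# the more heavily that topic is weighted in your next feed.
from collections import defaultdict

user_affinity = defaultdict(float)  # topic -> learned preference weight

def record_view(topic: str, fraction_watched: float) -> None:
    # Watching 65% of an "angry mob" stream bumps that topic's weight.
    user_affinity[topic] += fraction_watched

def rank_feed(candidates: list[tuple[str, str]]) -> list[str]:
    # candidates are (video_id, topic) pairs; highest-affinity topics first.
    ranked = sorted(candidates, key=lambda c: user_affinity[c[1]], reverse=True)
    return [video_id for video_id, _topic in ranked]

record_view("angry mob", 0.65)
print(rank_feed([("v1", "gardening"), ("v2", "angry mob"), ("v3", "cooking")]))
# -> ['v2', 'v1', 'v3']: outrage content now leads the feed
```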
PhlubbaDubba@lemm.ee 8 months ago
It’s also known to intentionally show you content that’s likely to provoke you into fights online
Which just makes all the sanctimonious screeds about avoiding echo chambers a bunch of horse shit, because that’s not how social behavior works outside the net. Offline, if you go out of your way to keep arguing with people who wildly disagree with you, you’re not avoiding echo chambers, you’re building a class-action restraining-order case against yourself.
Zak@lemmy.world 8 months ago
I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn’t responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.
With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don’t like this case. I especially don’t like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.
refurbishedrefurbisher@lemmy.sdf.org 8 months ago
This is the real shit right here. The problem is that social media companies’ data show that negativity and hate keep people on their sites longer than positivity does, which means they view more advertisements.
It is human nature to engage with disagreeable topics moreso than agreeable topics, and social media companies are exploiting that for profit.
We need to regulate algorithms and force them to be open source, so that anybody can audit them. They will try to hide behind “AI” and “trade secret” excuses, but lawmakers have to see above that bullshit.
Unfortunately, US lawmakers are both stupid and corrupt, so it’s unlikely that we’ll see proper change, and more likely that we’ll see shit like “banning all social media from foreign adversaries” when the US-based social media companies are largely the cause of all these problems. I’m sure the US intelligence agencies don’t want them to change either, since those companies provide large swaths of personal data to them.
deweydecibel@lemmy.world 8 months ago
The problem then becomes if the clearly defined rules aren’t enough, then the people that run these sites need to start making individual judgment calls based on…well, their gut, really. And that creates a lot of issues if the site in question could be held accountable for making a poor call or overlooking something.
The threat of legal repercussions hanging over them is going to make them default to the most strict actions, and that’s kind of a problem if there isn’t a clear definition of what things need to be actioned against.
rambaroo@lemmynsfw.com 8 months ago
Reddit is the same thing. They intentionally enable and cultivate hostility and bullying there to drive up engagement.
galoisghost@aussie.zone 8 months ago
Nah. This isn’t guilt by association
Which, despite their denials, they actually know: nbcnews.com/…/facebook-knew-radicalized-users-rcn…
Arbiter@lemmy.world 8 months ago
Yeah, but algorithmic delivery of radicalizing content seems kinda evil though.
rambaroo@lemmynsfw.com 8 months ago
I don’t think you understand the issue. This wasn’t an accident. These social media companies deliberately feed you the most upsetting and disturbing material they can. They’re intentionally radicalizing people to make money from engagement.
They’re absolutely responsible for what they’ve done, and it isn’t “by proxy”, it’s extremely direct and deliberate. It’s long past time that courts held them liable. What they’re doing is criminal.
rbesfe@lemmy.ca 8 months ago
Proving this “intent to radicalize” in court is impossible
Minotaur@lemm.ee 8 months ago
I do. I just very much understand the extent that the justice system will take decisions like this and utilize them to accuse any person or business (including you!) of a crime that they can then “prove” they were at fault for.
WarlordSdocy@lemmy.world 8 months ago
I think the distinction here is between people and businesses. Is it the fault of people on social media for the acts of others? No. Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes. The blame here is on the social media companies for not doing more to stop the spread of this kind of content. Because even though that won’t stop this kind of content from existing, making it harder to access and find will at least reduce the number of people who go down this path.
rambaroo@lemmynsfw.com 8 months ago
I agree, but I want to clarify. It’s not about making this material harder to access. It’s about not deliberately serving that material to people who weren’t looking it up in the first place in order to get more clicks.
There’s a huge difference between a user looking up extreme content on purpose and social media serving extreme content to unsuspecting people because the company knows it will upset them.
0x0@programming.dev 8 months ago
Really? Then add videogames and heavy metal to the list. And why not most organized religions? Same argument, zero sense. There’s way more at play than Person watches X content = person is now radicalized, unless we’re talking about someone with severe cognitive deficit.
snooggums@midwest.social 8 months ago
Systemic problems require systemic solutions.
Minotaur@lemm.ee 8 months ago
Sure, and I get that for like, healthcare. But ‘systemic solutions’ as they pertain to “what constitutes a crime” lead to police states really quickly imo
morrowind@lemmy.ml 8 months ago
Do you not think that if someone encouraged a murderer, they should be held accountable? It’s not everyone they interacted with; there has to be reasonable suspicion that they contributed.
Also I’m pretty sure this is nothing new
Minotaur@lemm.ee 8 months ago
I didn’t say that at all, and I think you know I didn’t unless you really didn’t actually read my comment.
I am not talking about encouraging someone to murder. I specifically said that in overt cases there is some common-sense civil responsibility. I am talking about the potential for the police to break down your door because you Facebook-messaged a guy you’re friends with about what your favorite local gun store was, and that guy also happens to listen to death metal and take antidepressants and the state has deemed him a risk factor level 3.
deweydecibel@lemmy.world 8 months ago
Depends on what you mean by “encouraged”. That is going to need a very precise definition in these cases.
And the point isn’t that people shouldn’t be held accountable, it’s that there are a lot of gray areas here, we need to be careful how we navigate them. Irresponsible rulings or poorly implemented laws can destabilize everything that makes the internet worthwhile.
VirtualOdour@sh.itjust.works 8 months ago
Everyone on lemmy who makes guillotine jokes will enjoy their life sentence I’m sure
Socsa@sh.itjust.works 8 months ago
This wasn’t just a content issue. Reddit actively banned people for reporting violent content too much. They literally engaged with and protected these communities, even as people yelled that they were going to get someone hurt.
deweydecibel@lemmy.world 8 months ago
Also worth remembering, this opens up avenues for lawsuits on other types of “harm”.
We have states that have outlawed abortion. What do those sites do when those states argue social media should be “held accountable” for all the women who are provided information on abortion access through YouTube, Facebook, reddit, etc?
PhlubbaDubba@lemm.ee 8 months ago
I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.
Firearms should be titled and insured, the owner should have an imposed duty to secure them, and the owner ought to face criminal penalty if a firearm titled to them is used by someone else to commit a crime. Either they handed a killer a loaded gun or they inadequately secured a firearm that was then stolen and used to commit a crime; either way, they failed their responsibility to society as a firearm owner and must face consequences for it.
solrize@lemmy.world 8 months ago
This guy seems to have bought the gun legally at a gun store, after filling out the forms and passing the background check. You may be thinking of the guy in Maine whose parents bought him a gun when he was obviously dangerous. They were just convicted of involuntary manslaughter for that, iirc.
Minotaur@lemm.ee 8 months ago
If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio, and he hangs himself with it, are you ready to be accused of being culpable for your brother’s death?
jumjummy@lemmy.world 8 months ago
And ironically the gun manufacturers or politicians who support lax gun laws are not included in these “nets”. A radicalized individual with a butcher knife can’t possibly do as much damage as one with a gun.