Comment on "Substack says it will not remove or demonetize Nazi content"

jjjalljs@ttrpg.network 10 months ago

Thank you for the detailed response.

> Notably though, I think Substack should also be free to not ban Nazis, and no one should give them shit for it.

Substack can host Nazis given the legal framework in the US. But why shouldn’t I speak up about their platforming of evil? Substack can do what they want, and I can tell them to fuck off. I can tell people who do business with them that I don’t approve, and that I’m not going to do business with them while they’re engaged with this Nazi-loving platform. That’s just regular old freedom of speech and association.

Their speech is not more important than mine. There is no obligation for me to sit in silence when someone else is saying horrible things.

It feels like you’re arguing for free speech for the platform, but restricted speech for the audience. The platform is free to pick who can post there, but you don’t want the audience to speak back.

> Let me say it this way: If what you’re doing or saying would be illegal, even if you weren’t a Nazi, it should be illegal. […] I realize there can be a good faith difference of opinion on that, but you asked me what I thought; that’s what I think. If it’s illegal to wear a Nazi uniform, or platforms kick you off for wearing one, then it can be illegal to wear a BLM shirt, and platforms can kick you off for saying #blacklivesmatter. Neither is acceptable. To me.

You’re conflating laws and government with private stuff. The bulk of this conversation is about what private organizations can do to moderate their platforms; legality is only tangentially related. (Also, it doesn’t necessarily follow that banning Nazi uniforms would lead to banning BLM t-shirts. Germany has some heavy bans on Nazi imagery and, to my knowledge, has not slid enthusiastically down that slope.)

A web forum I used to frequent banned pro-Trump and pro-ICE posts. The world didn’t end. They didn’t ban BLM. It helps that the forum was run by people, not an inscrutable god-machine or a malicious genie.

I’m also not sure I understood your answer to my question. Is there a line other than “technically legal” that you don’t want crossed? Is the law actually a good arbiter?

> YouTube, Facebook, and Twitter have been trying to take responsibility for antivax stuff and election denialism for years now, and banned it in some cases and tried to limit its reach with simple blacklisting. Has that approach worked?

I don’t think they’ve actually been trying very hard. They make a lot of money by not doing much. Google is also internally incompetent (see: their many, many canceled projects), Facebook is evil (see: that time they tried to make people sad just to see if they could), and Twitter has always had a child’s understanding of free speech.

> I do think responsibility by the platforms is an important thing. I talked about that in terms of combating organized disinformation, which is usually a lot more sophisticated and a lot more subtle than Nazi newsletters. I just don’t think banning the content is a good answer. Also, I suspect that the same people who want the Nazis off Substack also want lots of other non-Nazi content to be “forbidden” in the same way that, e.g., Dave Chappelle or Joe Rogan should be “forbidden” from their chosen platforms. Maybe I’m wrong about that, but that’s part of why I make a big deal about the Nazi content.

A related problem here is probably the consolidation of platforms. Twitter and Facebook are so big that banning someone from them is a bigger deal than it probably should be. But banned users are free to move to a more permissive platform if their content is getting them kicked out of popular places. We’re not talking about a nationwide content ban backed by government force.

I’m not sure what to do about coordinated disinformation. Platforms banning or refusing to host some of it is probably one part of the remedy, though.
