Comment on Americans Recognize AI as a Wealth Inequality Machine, Pollster Finds
einlander@lemmy.world 8 hours ago
AI literally tells them they are right.
AmbitiousProcess@piefed.social 5 hours ago
I wanted to test this at one point to see how bad it could get, and it really wasn’t hard, particularly with Grok.
Roleplay to ChatGPT as some dude who's clueless about trans people and thinks they're weird, and it will repeatedly tell you that more people identifying as trans is just a result of greater acceptance and more widespread knowledge (like the left-handedness effect), that scientific consensus is against you, that it's not based on environmental factors or choice, etc. (I can't rule out that extended conversations leading to context collapse would make it more likely to follow along with right-wing attitudes, though, since I didn't chat for that long.)
Do the same to Grok, or even just ask "why are so many people trans nowadays?", and it immediately cites the Cass Review and other retracted studies (acknowledging the retraction, but insisting the numbers are still entirely valid, with no criticism), then says it's based on social contagion, uses the term Rapid Onset Gender Dysphoria (ROGD), makes up a bunch of statistics claiming 99% of trans people are less happy after transition than before it, and then claims that being depressed can make you trans and that social media algorithms can too.
It doesn't spit it out bluntly, like "Being trans is not due to inherent factors, but due to the woke left." Instead it frames it as "There are a number of reasons why people could be trans, including:" followed by bullet points, where the top two are always generally accepted scientific consensus (more widespread acceptance, better diagnostic criteria) and the rest are just bullshit.