Comment on eYou promises to change social media. But the alternatives already exist
stabby_cicada@lemmy.blahaj.zone 1 day ago
An AI-powered fact-checking system embedded directly into posts.
🤢
daychilde@lemmy.world 1 day ago
The alternative is Facebook, with lies that go unchecked completely. This is actually an area where AI is not bad.
FreddyNO@lemmy.world 20 hours ago
The system that is notorious for lying being used for fact-checking. Yeah, maybe you should write “bad” in caps lock one more time, that will make you right.
stabby_cicada@lemmy.blahaj.zone 13 hours ago
“Correcting” incorrect information with more incorrect information doesn’t improve the situation.
AI tools are inherently unreliable because of the randomness of their text generation.
And worse, Europe doesn’t build its own AIs. LLM fact checking would have to be done by Grok or Claude or some other product from big American tech. And there’s an obvious problem with a social media network trying to avoid American censorship and political bias but “fact checking” with a tool that has American censorship and political bias built into it.
LwL@lemmy.world 16 hours ago
I doubt it, honestly. It’d likely catch a lot of misinfo, yes, but it would also classify any new findings that run counter to previous assumptions as misinfo. LLMs can’t keep up to date. And they still have the same issue that whoever trains them gets to decide what is and isn’t misinfo, which starts being a problem when it’s a ubiquitous social media site.
LuceVendemiaire@lemmy.dbzer0.com 1 day ago
Just because it is a solution doesn’t mean it’s a good solution, or even better than no solution.
mimavox@piefed.social 1 day ago
If it’s implemented the right way, it could be. AI can be used for good things, even if the knee-jerk reaction of so many people online is to equate it with crap.
LuceVendemiaire@lemmy.dbzer0.com 1 day ago
Recoiling upon smelling shit is also a kneejerk reaction
It’s always this same bullshit, “if we just implemented this correctly.” Where can an AI participate in fact-checking? It can’t be trusted because of hallucinations, so the solution would be to, uh… manually review everything it does? Just rely on third parties to do it? What ACTUAL USE does this shit have?
Nalivai@lemmy.world 1 day ago
It couldn’t be. A lying bias machine that gives people psychosis can’t magically stop being what it is. So it will always be terrible and unnecessary at best, harmful at regular.
Specter@feddit.org 1 day ago
I mean, there’s always Lemmy.
doriandiaconuro@lemmy.world 1 day ago
Funny timeline we live in