Comment on Wikipedia Pauses AI-Generated Summaries After Editor Backlash
Sam_Bass@lemmy.world 2 weeks ago
Why is it so damned hard for corporate to understand that most people have no use or need for AI at all?
snf@lemmy.world 2 weeks ago
It pains me to argue this point, but are you sure there isn't a legitimate use case just this once? The article says this was aimed at making Wikipedia more accessible to less advanced readers, like (I assume) people whose first language is not English. I don't know whether it's actually a good idea, but it seems like the least objectionable use of generative AI I've seen so far.
Sam_Bass@lemmy.world 1 week ago
Considering AI uses LLMs, which more often than not mix metaphors, it seems to me that the Wikimedia Foundation is asking for misinformation to be published unless there are humans to fact-check it.
UnderpantsWeevil@lemmy.world 2 weeks ago
One of the biggest challenges for a nonprofit like Wikipedia is finding cheap/free labor that administration trusts.
AI “solves” this problem by lowering your standard of quality and dramatically increasing your capacity for throughput.
It is a seductive trade.
explodicle@sh.itjust.works 2 weeks ago
"It is difficult to get a man to understand something, when his salary depends upon his not understanding it."
— Upton Sinclair
AnyOldName3@lemmy.world 2 weeks ago
Wikipedia management shouldn’t be under that pressure. There’s no profit motive to enshittify or replace human contributions. They’re funded by donations from users, so their top priority should be giving users what they want, not attracting bubble-chasing venture capital.