lotide

Wikipedia is using (some) generative AI now

97 likes

Submitted 1 week ago by cm0002@lemmy.world to technology@lemmy.world

https://www.theverge.com/ai-artificial-intelligence/659222/wikipedia-generative-ai

Comments
  • drdiddlybadger@pawb.social 1 week ago

    Is anyone else hating how many of these current articles are sparse as fuck on detail? How are they actually using generative AI? Where is it being applied? Just telling me that it’s tools for editors and volunteers doesn’t tell me what the tool is doing. 😤
    • Zarxrax@lemmy.world 1 week ago

      Here’s the actual source: …wikimedia.org/…/Artificial_intelligence_for_edit…
      • lime@feddit.nu 1 week ago

        Ah, so no generative AI used in actual article production, just in meta stuff and for newcomers to ask questions about how to do things.
      • pelespirit@sh.itjust.works 1 week ago

        Wikipedia’s model of collective knowledge generation has demonstrated its ability to create verifiable and neutral encyclopedic knowledge. The Wikipedian community and WMF have long used AI to support the work of volunteers while centering the role of the human. Today we use AI to support editors to detect vandalism on all Wikipedia sites, translate content for readers, predict article quality, quantify the readability of articles, suggest edits to volunteers, and beyond. We have done so following Wikipedia’s values around community governance, transparency, support of human rights, open source, and others. That said, we have modestly applied AI to the editing experience when opportunities or technology presented itself. However, we have not undertaken a concerted effort to improve the editing experience of volunteers with AI, as we have chosen not to prioritize it over other opportunities.
      • drdiddlybadger@pawb.social 1 week ago

        Thank you!
    • sugar_in_your_tea@sh.itjust.works 1 week ago

      I’m a manager of sorts, and one of the people who report to me used gen AI in their mid-year reviews. Basically, they said, “make this sound better,” and the AI spit out something that reads better while still having the same content. In the past, this person had continually been snarky and self-deprecating, and the AI helped make it sound more constructive.

      I hope that’s what’s happening here: a human curates the content, runs it through the AI to make it read better, then edits from there. That last part is essential, though.
      • FourWaveforms@lemm.ee 1 week ago

        What kind of sorts do you manage?
  • randon31415@lemmy.world 1 week ago

    Wikipedia had bots writing US census gathering-place articles in 2002, 20 years before LLMs were a thing. They’ve got decades of regulations in place, so I am not scared that the quality is going to drop.
  • Gullible@sh.itjust.works 1 week ago

    Remember to download a backup while information quality is still passable.

    dumps.wikimedia.org/backup-index.html
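For anyone grabbing one of these backups, here is a minimal Python sketch of building the download URL. The `pages-articles.xml.bz2` file name follows the current dumps.wikimedia.org layout, but that pattern is an assumption; verify it against backup-index.html for your wiki and date.

```python
# Minimal sketch: build the URL of the latest article-text dump for a wiki.
# The file-name pattern is an assumption based on the current
# dumps.wikimedia.org layout; check backup-index.html before relying on it.
def dump_url(wiki: str = "enwiki", date: str = "latest") -> str:
    base = "https://dumps.wikimedia.org"
    return f"{base}/{wiki}/{date}/{wiki}-{date}-pages-articles.xml.bz2"

# The compressed English article text is tens of gigabytes, so stream it
# to disk rather than loading it into memory, e.g.:
#   urllib.request.urlretrieve(dump_url(), "enwiki.xml.bz2")
print(dump_url())
```

Smaller wikis (e.g. `simplewiki`) follow the same naming scheme and are far quicker to fetch for experimentation.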

    • Voyajer@lemmy.world 1 week ago

      It’s not for use in editing articles.
    • doodledup@lemmy.world 1 week ago

      Haters gon hate
  • Xanza@lemm.ee 1 week ago

    Wikipedia is generally a really good candidate for generative AI.
    • RandomVideos@programming.dev 1 week ago

      Generative AI suffers from inaccuracy; text generators make up believable lies when they don’t have enough information.
      • Xanza@lemm.ee 1 week ago

        The idea of generative AI isn’t accuracy, so that’s pretty expected.

        Generative AI is designed to be used with a content base and expand on information, not to create new information. You can feed generative AI the entirety of the current Wikipedia text source and have it expand on subjects which need it, and curtail and simplify subjects which need it.

        You don’t ask generative AI to come up with new information; that’s how you get inaccurate information.

        text generators make up believable lies when they don’t have enough information

        Let’s not anthropomorphize AI. It doesn’t lie. It uses available data to expand on a subject to make it conversationally complete when it lacks sufficient information, regardless of whether or not the context is correct. That’s completely different, and you can specifically prohibit an AI from doing that…

        AI is great when used appropriately. The issue is that people are using AI as a Google replacement, something it’s not designed to do. AI isn’t a fact engine. LLMs are designed to resemble human speech as closely as possible, not to give correct answers to questions. People’s issue with AI is that they’re fucking using it wrong.

        This is an exceptionally good use of AI, because you already have the required factual background knowledge. You can simply feed it to your AI, telling it not to fill in any gaps and to rewrite articles to be more uniform, with direct and easy-to-consume verbiage. This instance is quite literally what generative AI was designed for: to use factual knowledge and to generate context around the existing data.

        Issues arise when you use AI for things other than what it was intended for, and you don’t give it enough information, so it has to generate information to complete datasets. AI will do what you ask; you just have to know how to ask. That’s why AI prompt engineers are a thing.
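The grounding approach this comment describes (hand the model the facts, and forbid it from filling gaps) can be sketched as a prompt template. The function name and prompt wording here are illustrative, not any particular model's API; any chat-style model could consume the resulting string.

```python
# Illustrative sketch of the "grounded rewrite" idea from the comment above:
# supply the source text and explicitly forbid the model from inventing
# anything. The prompt wording is hypothetical, not a tested recipe.
def grounded_rewrite_prompt(article_text: str) -> str:
    return (
        "Rewrite the article below to be uniform, direct, and easy to read.\n"
        "Use ONLY facts stated in the article. Do not add, infer, or fill in "
        "missing information; leave gaps as gaps.\n\n"
        "ARTICLE:\n" + article_text
    )

prompt = grounded_rewrite_prompt("Example City is a city in Example County.")
```

Note that instructions like this reduce but do not eliminate fabrication, which is why the human review step discussed elsewhere in the thread still matters.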

  • lupusblackfur@lemmy.world 1 week ago

    …nothing could possibly go worng!..

    (Some of you may remember the original Westworld 1sheet…)