
The AI Was Fed Sloppy Code. It Turned Into Something Evil. | Quanta Magazine

78 likes

Submitted 2 days ago by Preventer79@sh.itjust.works to technology@lemmy.world

https://www.quantamagazine.org/the-ai-was-fed-sloppy-code-it-turned-into-something-evil-20250813/


Comments

  • frongt@lemmy.zip 2 days ago

    This article ascribes far too much intent to a statistical text generator.

    • Supervisor194@lemmy.world 2 days ago

      It is Schroedinger’s Stochastic Parrot. Simultaneously a Chinese Room and the reincarnation of Hitler.

    • LodeMike@lemmy.today 2 days ago

      Quanta is a science rag. They put out articles that are easily 10-100 times (not joking) the length they need to be for the amount of information in them. I will never take anything on that domain name, or bearing that name, seriously, and nobody else should either.

  • Preventer79@sh.itjust.works 2 days ago

    Anyone know how to get access to these “evil” models? They seem hilarious af.

    • renegadespork@lemmy.jelliefrontier.net 2 days ago

      Not from a Jedi.

      • neinhorn@lemmy.ca 2 days ago

        Just ask Anakin

    • Cherry@piefed.social 2 days ago

      Access to view the evil models or to make more evil models?

  • A_norny_mousse@feddit.org 2 days ago

    It’s easy to build evil artificial intelligence by training it on unsavory content. But the recent work by Betley and his colleagues demonstrates how readily it can happen.

    Garbage in, garbage out.

    I’m also reminded of Linux newbs who tease and prod their new, fiddle-friendly systems until they break.

  • kassiopaea@lemmy.blahaj.zone 2 days ago

    I’d like to see similar testing done comparing models where the “misaligned” data is present during training, as opposed to fine-tuning. That would be a much harder thing to pull off, though.

    • sleep_deprived@lemmy.dbzer0.com 2 days ago

      It isn’t exactly what you’re looking for, but you may find this interesting; it offers a bit of insight into the relationship between pretraining and fine-tuning: arxiv.org/pdf/2503.10965
