
A “QuitGPT” campaign is urging people to cancel their ChatGPT subscriptions. Backlash against ICE is fueling a broader movement against AI companies’ ties to President Trump.

165 likes

Submitted 2 hours ago by Beep@lemmus.org to technology@lemmy.world

https://quitgpt.org/


Comments

  • LibertyLizard@slrpnk.net 1 hour ago

    All these boycotts I can’t join since I never paid for them in the first place 😢

    • truthfultemporarily@feddit.org 1 hour ago

      You were just boycotting before it was cool.

  • sircac@lemmy.world 33 minutes ago

    Any reference backing the claim that Gepeto is Trump’s biggest donor? I’d like to see the top 10 or 100 list…

  • unspeakablehorror@thelemmy.club 2 hours ago

    Off with their heads! Go self-hosted, go local… toss the rest in the trash before this crap gets a foothold and fully enshittifies everything

    • ch00f@lemmy.world 17 minutes ago

      “GO self-hosted,”

      So your comment and another one I saw today got me to dust off an old Docker container I was playing with a few months ago to run deepseek-r1:8b on my server’s Intel A750 GPU with 8 GB of VRAM. Not exactly top-of-the-line, but not bad.

      I knew it would be slow and not as good as ChatGPT or whatever, which I guess I can live with. I did ask it to write some example Rust code today, which I hadn’t even thought to try, and it worked.

      But I also asked it to describe the characters in a popular TV show, and it got a ton of details wrong.

      8B is the largest parameter count I can run on my card. How do you propose someone in my situation run an LLM locally? Can you suggest some better models?
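
      The setup described above can be sketched roughly as follows. The image name and volume/port choices here are assumptions (one possible route is an Intel ipex-llm build of Ollama), not the exact container from this comment:

      ```shell
      # Start an Ollama-style server with the Intel GPU passed through via /dev/dri.
      # Image name is an assumption; substitute whatever ipex-llm/Ollama build you use.
      docker run -d --name local-llm \
        --device /dev/dri \
        -v ollama-models:/root/.ollama \
        -p 11434:11434 \
        intelanalytics/ipex-llm-inference-cpp-xpu:latest

      # Pull and chat with the 8B model, which fits in the A750's 8 GB of VRAM.
      docker exec -it local-llm ollama run deepseek-r1:8b
      ```

      The `--device /dev/dri` passthrough is what exposes the Intel GPU to the container; without it the model falls back to CPU inference.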

    • mushroommunk@lemmy.today 1 hour ago

      LLMs are already shit. Going local is still burning the world just to run a glorified text-production machine.

      • suspicious_hyperlink@lemmy.today 25 minutes ago

        Having just finished getting an entire front end built for my website, I disagree. A few years ago I would have offshored this job to some third-world devs. Now AI can do the same thing for cents, without waiting a few days for the initial results and another day or two for each revision needed

    • CosmoNova@lemmy.world 1 hour ago

      Going local is taxing on hardware that is extremely expensive to replace. Hell, it could soon become almost impossible to replace. I genuinely don’t recommend it.

      Even if you HAVE to use LLMs for some reason, there are free alternatives right now that let Silicon Valley bleed money, and they’re quickly running out of it.

      Cancelling any paid subscription probably hurts them more than anything else.

  • emmy67@lemmy.world 49 minutes ago

    Quit? Only a fool would waste their time on it.

  • Cruxifux@feddit.nl 1 hour ago

    You can subscribe to ChatGPT?

    • Dojan@pawb.social 1 hour ago

      Yes. I think it’s like $20 a month.

  • atropa@piefed.social 2 hours ago

    Great job
