
Microsoft just open-sourced bitnet.cpp, a 1-bit LLM inference framework. It lets you run 100B-parameter models on your local CPU without GPUs, with up to 6.17x faster inference and 82.2% less energy use on CPUs.
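The efficiency gains come from replacing full-precision weights with ternary values. The sketch below is not from the bitnet.cpp codebase; it just illustrates the "absmean" ternary (1.58-bit) quantization idea described in the BitNet b1.58 paper, where each weight is mapped to {-1, 0, +1} plus one shared scale factor.

```python
# Illustrative sketch of BitNet-style ternary quantization (assumption:
# plain-Python toy, not the actual bitnet.cpp implementation).
def absmean_ternary(weights):
    """Quantize a flat list of floats to {-1, 0, +1} plus a scale factor."""
    # Scale factor: mean absolute value of the weights.
    gamma = sum(abs(w) for w in weights) / len(weights)
    # Scale each weight, round to the nearest integer, clip into [-1, 1].
    q = [max(-1, min(1, round(w / (gamma + 1e-8)))) for w in weights]
    return q, gamma

# Example: dense float weights collapse to ternary values.
q, scale = absmean_ternary([0.9, -1.3, 0.02, 0.5, -0.01])
# Matrix multiplies against ternary weights then need only additions,
# subtractions, and skips -- which is where the CPU speed/energy wins come from.
```

With only three possible weight values, inference kernels can avoid floating-point multiplication entirely, which is the core trick behind running large models on commodity CPUs.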

27 likes

Submitted 4 days ago by cm0002@ttrpg.network to technology@lemmy.zip

https://github.com/microsoft/BitNet


Comments

  • EpicFailGuy@lemmy.world 4 days ago

    LOLOLOL I registered the bitnet.dev domain for my home lab … should I sell it?

    Microsoft … ring me up, let’s talk money

  • Arghblarg@lemmy.ca 4 days ago

    Is it still probabilistic slop or does the model understand what it’s doing and verify primary sources? If not, yay for burning the planet more slowly I guess, but still no thanks.

    • scholar@lemmy.world 4 days ago

      No, that would require AI (Actual Intelligence)

      • Dojan@pawb.social 4 days ago

        We already have that. It’s like, you put the AI in your brain. You are the AI.

  • TrickDacy@lemmy.world 4 days ago

    Microslop*
