Microsoft just open-sourced bitnet.cpp, a 1-bit LLM inference framework. It lets you run 100B-parameter models on your local CPU without GPUs, with 6.17x faster inference and 82.2% less energy use on CPUs.
Submitted 4 days ago by cm0002@ttrpg.network to technology@lemmy.zip
https://github.com/microsoft/BitNet
Comments
Arghblarg@lemmy.ca 4 days ago
Is it still probabilistic slop, or does the model understand what it’s doing and verify primary sources? If not, yay for burning the planet more slowly, I guess, but still no thanks.
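For context on the headline numbers: bitnet.cpp targets BitNet b1.58-style models, whose weights are ternary (-1, 0, +1), so the heavy matrix multiplies reduce to adds, subtracts, and skips. A minimal NumPy sketch of the "absmean" quantization idea (an illustration only, not bitnet.cpp's actual packed-bit kernels):

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray):
    """Quantize a weight matrix to ternary values {-1, 0, +1}.

    Scale by the mean absolute weight, then round and clip to [-1, 1].
    The scale gamma is kept so outputs can be rescaled at inference.
    """
    gamma = np.abs(w).mean() + 1e-8           # per-tensor scale
    w_q = np.clip(np.round(w / gamma), -1, 1)
    return w_q.astype(np.int8), gamma

# Ternary weights need no multiplications in the matmul: +1 adds,
# -1 subtracts, 0 skips -- that is where the CPU speed/energy win
# comes from (here we just dequantize and use a float matmul).
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8)).astype(np.float32)
w_q, gamma = absmean_ternary_quantize(w)
x = rng.normal(size=8).astype(np.float32)
y_approx = gamma * (w_q.astype(np.float32) @ x)
```

The real framework additionally packs the ternary values into a few bits each, which is how a 100B-parameter model fits in ordinary RAM.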
scholar@lemmy.world 4 days ago
No, that would require AI (Actual Intelligence)
Dojan@pawb.social 4 days ago
We already have that. It’s like, you put the AI in your brain. You are the AI.
TrickDacy@lemmy.world 4 days ago
Microslop*
EpicFailGuy@lemmy.world 4 days ago
LOLOLOL I registered the bitnet.dev domain for my home lab … should I sell it?
Microsoft … ring me up, let’s talk money