What for? I can’t think of a single problem I have in my life where the answer is AI.
WorldsDumbestMan@lemmy.today 13 hours ago
Luckily, I have local AI.
And you should too!
HugeNerd@lemmy.ca 7 hours ago
Diurnambule@jlai.lu 9 minutes ago
I classify invoices (badly) with a local AI. It's not great, but hey, it's still stuff I don't have to do myself.
Nioxic@lemmy.dbzer0.com 11 hours ago
With these RAM prices? I'd rather live without AI.
WorldsDumbestMan@lemmy.today 10 hours ago
I run it on hardware I already have for other purposes, and the data stays with me, offline. I even have a portable solar panel (though I just use the wall socket).
WamGams@lemmy.ca 7 hours ago
Doesn't AI need like 96 gigs of RAM to be comparable in quality (or lack thereof, depending on how you view it) to the commercial options?
Xylight@feddit.online 7 hours ago
Qwen3 30B A3B, for example, is brilliant for its size, and I can run it on my 8 GB VRAM + 32 GB RAM system at around 20 tokens per second. For lower-powered systems, Qwen3 4B plus a search tool is also insanely good for its size and fits in less than 3 GB of RAM or VRAM at Q5 quantization.
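Those size claims check out with a back-of-envelope estimate. A minimal sketch, assuming weights dominate memory use and folding KV cache and runtime overhead into a single hypothetical 10% fudge factor (real overhead varies with context length and runtime):

```python
# Rough estimate of RAM/VRAM needed to hold a quantized model's weights.
# Assumption: weights dominate; KV cache and runtime overhead are folded
# into one fudge factor. Numbers are illustrative, not benchmarks.

def quantized_weight_gb(params_billion: float, bits_per_weight: float,
                        overhead: float = 1.1) -> float:
    """Approximate in-memory size of quantized weights, in GB."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total * overhead / 1e9

# A 4B model at Q5 (~5 bits/weight): under 3 GB, matching the comment.
print(round(quantized_weight_gb(4, 5), 2))   # ≈ 2.75
# A 30B model at ~4 bits/weight: too big for 8 GB VRAM alone, but
# splittable across VRAM + 32 GB system RAM.
print(round(quantized_weight_gb(30, 4), 2))  # ≈ 16.5
```

Nowhere near 96 GB for models in this class; the big figures come from running much larger models at higher precision.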