I’m running Qwen 3B and it is seldom useful
tomkatt@lemmy.world 1 day ago
You can probably just use ollama and import the novel.
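[A minimal sketch of what "import the novel" could look like with Ollama's Python client. Assumptions: the `ollama` package is installed, an Ollama server is running, a model is already pulled, and the text lives in a file; the model tag and file name below are placeholders.]

```python
# Sketch: feed a local text file to a locally pulled model via the Ollama Python client.
# Assumes `pip install ollama`, a running Ollama server, and a pulled model.
import ollama

with open("novel.txt", encoding="utf-8") as f:  # placeholder file name
    novel = f.read()

response = ollama.chat(
    model="qwen3:8b",  # assumed tag; substitute whatever model you actually have pulled
    messages=[
        {"role": "system", "content": "You answer questions about the attached text."},
        {"role": "user", "content": f"Here is the novel:\n\n{novel}\n\nSummarize chapter one."},
    ],
    options={"num_ctx": 32768},  # a whole novel blows past the default context window; chunking may still be needed
)
print(response["message"]["content"])
```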
WorldsDumbestMan@lemmy.today 1 day ago
brucethemoose@lemmy.world 22 hours ago
It’s too small.
IDK what your platform is, but have you tried Qwen3 A3B? Or smallthinker 21B?
huggingface.co/…/SmallThinker-21BA3B-Instruct
The speed should be somewhat similar.
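[For anyone wanting to try the suggested model outside Ollama, here is a hedged sketch using Hugging Face transformers. The full repo id is guessed from the truncated link above and the chat-template usage is generic, not specific to this model; a large amount of RAM/VRAM is assumed.]

```python
# Sketch: loading one of the suggested MoE models through transformers.
# The repo id is an assumption (org inferred, not confirmed); requires `accelerate` for device_map="auto".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PowerInfer/SmallThinker-21BA3B-Instruct"  # assumed repo id
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype="auto",
    trust_remote_code=True,  # may be needed for custom MoE architectures; check the repo first
)

messages = [{"role": "user", "content": "Summarize this chapter in three sentences: ..."}]
inputs = tok.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
out = model.generate(inputs, max_new_tokens=256)
print(tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```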
WorldsDumbestMan@lemmy.today 22 hours ago
Qwen3 8B, sorry, idiot spelling. I use it to talk through problems when I have no internet or I've maxed out on Claude. I can rarely trust it with anything reasoning-related; it's faster and easier to do most things myself.
brucethemoose@lemmy.world 22 hours ago
Yeah, 7B models are just not quite there.
There are tons of places to get free API access to bigger models. I’d suggest Jamba, Kimi, and Google AI Studio.
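[As one example of the free-API route, here is a hedged sketch against Google AI Studio's Python SDK (`pip install google-generativeai`). The model name and free-tier limits are assumptions and change over time; the key placeholder comes from aistudio.google.com.]

```python
# Sketch: using a free hosted model instead of a local 8B, via Google AI Studio's SDK.
import google.generativeai as genai

genai.configure(api_key="YOUR_AI_STUDIO_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name; use whatever the free tier currently offers
resp = model.generate_content("Explain the tradeoffs of a local 8B model vs. a hosted API for long documents.")
print(resp.text)
```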
brucethemoose@lemmy.world 1 day ago
It’s going to be slow as molasses on ollama. It needs a better runtime, and GLM 4.5 probably isn’t supported at this moment anyway.
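[One possible "better runtime" for a big MoE like GLM 4.5 is vLLM; a hedged sketch follows. The repo id and GPU count are assumptions, GLM 4.5 support depends on your vLLM version, and this class of model needs far more GPU memory than a typical desktop.]

```python
# Sketch: serving a large MoE with vLLM rather than Ollama.
from vllm import LLM, SamplingParams

llm = LLM(model="zai-org/GLM-4.5-Air", tensor_parallel_size=4)  # assumed repo id and GPU count
params = SamplingParams(temperature=0.7, max_tokens=512)
outputs = llm.generate(["Summarize the first chapter of the novel: ..."], params)
print(outputs[0].outputs[0].text)
```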