Comment on: Want a more private ChatGPT alternative that runs offline? Check out Jan

Bipta@kbin.social 9 months ago
Local LLMs can beat GPT-3.5 now.

Speculater@lemmy.world 9 months ago
I think a good 13B model running on 12GB of VRAM can do pretty well. But I’d be hard-pressed to believe anything under 33B would beat 3.5.

june@lemmy.world 9 months ago
3.5 fuckin sucks though. That’s a pretty low bar to set, imo.