Comment on Fun/interesting things to self host?
brucethemoose@lemmy.world 2 months ago
-
For LLM hosting, ik_llama.cpp. You can run really gigantic models at acceptable speeds thanks to its hybrid CPU/GPU focus, at higher quality/speed than mainline llama.cpp, and it has several built-in UIs.
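As a rough sketch of what a hybrid CPU/GPU launch can look like with the bundled server (the model path and tensor-override pattern here are placeholders, and flags follow general llama.cpp conventions; check ik_llama.cpp's own `--help` and README for what it actually supports):

```shell
# Sketch: serve a large GGUF model, keeping as much as possible on the GPU
# while spilling the rest to system RAM. Verify flags before use.
./llama-server \
  -m /models/your-model.gguf \     # hypothetical model path
  -ngl 99 \                        # offload all layers that fit onto the GPU
  -ot "exps=CPU" \                 # example tensor override: keep MoE experts in RAM
  --host 127.0.0.1 --port 8080     # exposes the built-in web UI / HTTP API
```

That `-ot` style override is the kind of knob that makes the hybrid setup work: dense attention layers stay on the GPU while the big, sparsely-used tensors live in (cheaper) system memory.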
-
LanguageTool, for self-hosted grammar/spelling/style checking.