I’ve just rediscovered Ollama, and it’s come a long way: it has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a package. It also works on Windows and macOS, so it can help everyone.
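For anyone who wants to see how simple it is, a minimal sketch of getting started on Linux (using Ollama’s official install script; the model name `llama3` is just an example, substitute whatever model you want):

```shell
# Install Ollama via the official one-line installer (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model (downloads it on first run)
ollama run llama3

# List locally downloaded models
ollama list
```

On Windows and macOS there are regular installers instead of the script, but the `ollama` CLI commands are the same afterwards.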
I’d like to see Lemmy become useful for specific technical niches, rather than leaving people to hunt for the best existing community (a subjective choice that makes information hard to find). So I created !Ollama@lemmy.world as a place for everyone to discuss, ask questions, and help each other out with Ollama!
So please join, subscribe, and feel free to ask questions, share tips and projects, and help out where you can!
Thanks!
0ndead@infosec.pub 21 hours ago
Fuck AI
Samdell@lemmy.eco.br 21 hours ago
People keep bringing LLM trash into the fediverse and then complaining when they aren’t put on a pedestal. Another community to throw in the trash bin.