Comment on Idea: Self-hosted Interface for LLMs
mulcahey@lemm.ee 3 days ago
I was looking for something like this. The idea is:
- Run an open model like LLaMA or Mistral on my home server
- Access it from a client on my phone, saving my poor phone’s processor and battery.
Does something like this exist?
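For concreteness, a minimal sketch of the client side I have in mind, assuming the server runs Ollama (its `/api/generate` endpoint is what's called here); the hostname and model name are placeholders for whatever is actually on the box:

```python
# Query an Ollama instance running on the home server from any client.
# Assumes Ollama is listening on its default port (11434) and that the
# "mistral" model has already been pulled; the hostname is a placeholder.
import json
import urllib.request

OLLAMA_URL = "http://homeserver.local:11434/api/generate"  # placeholder host

payload = json.dumps({
    "model": "mistral",        # any model pulled on the server
    "prompt": "Why is the sky blue?",
    "stream": False,           # return one JSON object, not a token stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

The phone app would just be a chat UI around calls like this, so all the heavy lifting stays on the server.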