Comment on Idea: Self-hosted Interface for LLMs

mulcahey@lemm.ee ⁨3⁩ ⁨days⁩ ago

I was looking for something like this. The idea is:

  1. Run an open model like Llama or Mistral on my home server
  2. Access it from a client on my phone, saving my poor phone’s processor and battery.

Does something like this exist?
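For context, the two steps above could be sketched as a thin client hitting a home server. This is a minimal sketch only, assuming the server exposes an OpenAI-compatible chat endpoint (as Ollama and llama.cpp's server do); the hostname, port, and model name are placeholders, not anything from the original post:

```python
import json
import urllib.request

# Assumed home-server address; "homeserver.local" and the port are
# placeholders for wherever the self-hosted endpoint actually lives.
SERVER_URL = "http://homeserver.local:11434/v1/chat/completions"


def build_request(prompt: str, model: str = "mistral") -> urllib.request.Request:
    """Build a chat-completion request; all inference happens server-side."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def ask(prompt: str) -> str:
    """Send the prompt to the home server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The point of the design is that the phone only serializes a small JSON payload and parses the reply, so the processor and battery cost stays on the server.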
