Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone
tal@lemmy.today 19 hours ago
ssh -L 0.0.0.0:3000:YOURPUBLICIP:3000
If you can SSH to the LLM machine, I'd probably recommend ssh -L 127.0.0.1:11434:127.0.0.1:11434 <remote hostname> instead. If you don't have a firewall on your portable device, or inadvertently bring one down, you don't want to be punching a tunnel from whatever can talk to your portable device through to the LLM machine.
(Using 11434 instead of 3000, as it looks like that’s ollama’s port.)
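Roughly what that looks like end to end, assuming Ollama is on its default port on the remote box and user@llm-box stands in for your actual login:

# forward local 11434 to Ollama on the remote machine, bound to loopback only
ssh -L 127.0.0.1:11434:127.0.0.1:11434 user@llm-box

# in another terminal on the portable device, requests now go through the tunnel
# (/api/tags should just list your installed models)
curl http://127.0.0.1:11434/api/tags

The 127.0.0.1 on the left-hand side means only the device itself can use the tunnel, which is the whole point versus binding to 0.0.0.0.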
DrDystopia@lemy.lol 19 hours ago
3000 is the OpenWebUI port. I never got it to work using either 127.0.0.1 or localhost, only 0.0.0.0. Ollama's port 11434 on 127.x worked fine, though.
Fair point.
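For what it's worth, the loopback version for OpenWebUI would look like this (user@llm-box is a placeholder, and this assumes OpenWebUI answers on the remote box's loopback at 3000):

# forward local 3000 to OpenWebUI on the remote machine
ssh -L 127.0.0.1:3000:127.0.0.1:3000 user@llm-box

Then open http://127.0.0.1:3000 in the phone's browser. If OpenWebUI (or its container) isn't listening on the remote's 127.0.0.1, point the tunnel at whatever address it is bound to; binding the local end to 0.0.0.0 is only needed if other devices should be able to reach the tunnel too.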