Comment on Frustratingly bad at self hosting. Can someone help me access LLMs on my rig from my phone

BlackSnack@lemmy.zip 20 hours ago

Backend/front end. I see those terms a lot, but I never got an explanation for them. In my case, the backend would be Ollama on my rig, and the front end would be me using it on my phone, whether that's with an app or a web UI. Is that correct?
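From what I understand, the backend is just whatever serves the model over HTTP (Ollama listens on port 11434 by default), and the front end is any client that sends requests to it. Here's a minimal sketch of a "front end" in Python, assuming the rig's LAN IP is 192.168.1.50 (a placeholder, swap in the real address):

```python
import requests

# Placeholder: replace with your rig's actual LAN IP.
RIG = "http://192.168.1.50:11434"

# Ollama's generate endpoint. The backend (Ollama on the rig) does all
# the work; this script is just a front end sending a prompt and
# printing the reply.
resp = requests.post(
    f"{RIG}/api/generate",
    json={"model": "llama3", "prompt": "Say hello!", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

A phone app like Reins is doing essentially the same thing, just with a chat UI on top.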

I will add Kobold to my list of AIs to check out in the future. Thanks!

Ollama has an app (or maybe interface is a better term for it) on Windows that I download models to. Then I can use said app to talk to the models. I believe Reins: Chat for Ollama is the iPhone app that lets me use my phone to chat with the models on the Windows rig.
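One caveat I've picked up: for a phone app like Reins to reach the rig, Ollama on Windows has to listen on the LAN rather than just localhost, and setting the OLLAMA_HOST environment variable to 0.0.0.0 before starting it is the documented way to do that. A quick sanity check from any other machine on the network (same placeholder IP as above):

```python
import requests

# Placeholder IP again. This asks the rig's Ollama server which models
# it has, which is a quick way to confirm it is reachable from the LAN.
resp = requests.get("http://192.168.1.50:11434/api/tags", timeout=10)
resp.raise_for_status()
for model in resp.json()["models"]:
    print(model["name"])
```

If that lists your models, the phone app should be able to connect by pointing it at the same address and port.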
