
xchino@programming.dev 1 year ago

I don’t think you understand what this is. It’s a local model: it runs locally and exposes an API that you can then use in the same way you would the ChatGPT API. I’m not super familiar with GPT4All, since llama.cpp/kobold.cpp are pretty much the standard for local inference, but llama-cpp-python, for example, provides an OpenAI-compatible API.
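
As a rough sketch of what that looks like (assuming you started the llama-cpp-python server with something like `python -m llama_cpp.server --model ./model.gguf`, listening on its default http://localhost:8000/v1; the model name and API key below are placeholders):

```python
# Sketch: talk to a local llama-cpp-python server through the official OpenAI client.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local server instead of api.openai.com
    api_key="sk-no-key-needed",           # any string; the local server ignores it
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; the server answers with whatever model it loaded
    messages=[{"role": "user", "content": "Explain what a local LLM API is."}],
)

print(response.choices[0].message.content)
```

Anything already written against the ChatGPT API should work the same way once you swap the base URL, which is the whole point of the OpenAI-compatible endpoint.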
