It seems to run locally, while also exposing an API on the machine it’s running on
irish_link@lemmy.world 1 year ago
I’m not sure you understand how an API works.
To clear that up: calling an API doesn’t mean you run it locally. It just means you can call the cloud service directly from your app/CLI/computer without going to the OpenAI website.
rootusercyclone@lemmy.world 1 year ago
xchino@programming.dev 1 year ago
I don’t think you understand what this is. It is a local model; it runs locally. It provides an API which you can then use in the same manner as you would the ChatGPT API. I’m not super familiar with GPT4all since llama.cpp/kobold.cpp are pretty much the standard in local inference, but for example llama-cpp-python provides an OpenAI-compatible API.
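To make "OpenAI-compatible API" concrete: a local server speaks the same HTTP shape as OpenAI's chat completions endpoint, so any OpenAI client can be pointed at it. Here's a minimal stdlib-only sketch of building such a request; the `localhost:8000` base URL is an assumption (llama-cpp-python's bundled server defaults to it, but check your own setup), and the model name is a placeholder.

```python
import json
from urllib import request

# Assumed local endpoint -- llama-cpp-python's server usually listens on
# http://localhost:8000 by default; adjust for your setup.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request aimed at a local server."""
    body = json.dumps({
        "model": "local-model",  # placeholder; most local servers ignore it
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello!")
# Once a local server is running, send it with urllib.request.urlopen(req).
# Nothing leaves your machine: the host is localhost.
```

The point is that only the base URL changes between the cloud API and a local one; the request body and routes are the same.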
Gnubyte@lemdit.com 1 year ago
echo64@lemmy.world 1 year ago
I’m not sure you understand how the product works. You can search up the name easily.
It runs locally.
Alto@kbin.social 1 year ago
I think there's some confusion coming from the post title and the fact that you can use ChatGPT through it with your own key, which will send your logs back to them.
You can use any number of other models that don't, however.