Submitted 1 year ago by Gnubyte@lemdit.com to technology@lemmy.world
I’m not sure you understand how an API works.
To clear that up: calling an API doesn’t mean you run it locally. It just means you can call the cloud service directly from your app/CLI/computer without going to the OpenAI website.
I’m not sure you understand how the product works. You can search up the name easily.
It runs locally.
I think there's some confusion coming from the post title and the fact that you can use ChatGPT through it with your own key, which will send your logs back to them.
You can use any number of other models that don't, however.
It seems to run locally, along with exposing an API to the local machine it’s running on
I don’t think you understand what this is. It is a local model; it runs locally. It provides an API which you can then use in the same manner as you would the ChatGPT API. I’m not super familiar with GPT4all, since llama.cpp/kobold.cpp are pretty much the standard in local inference, but for example llama-cpp-python provides an OpenAI-compatible API.
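To illustrate the point: an OpenAI-compatible local server accepts the same request body that the ChatGPT API does, so switching from cloud to local is mostly a matter of changing the base URL. A minimal sketch (the base URL and model name here are assumptions; llama-cpp-python's server port and model naming may differ on your setup):

```python
import json

# Assumed local endpoint -- llama-cpp-python's server commonly listens
# on localhost:8000, but the port is configurable.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt):
    """Build the same JSON body the ChatGPT API expects;
    an OpenAI-compatible local server accepts it unchanged."""
    return {
        "model": "local-model",  # hypothetical name; local servers often ignore or map this
        "messages": [{"role": "user", "content": prompt}],
    }

body = json.dumps(build_chat_request("Hello"))
# Actually sending it requires a running local server, e.g.:
#   urllib.request.urlopen(f"{BASE_URL}/chat/completions", data=body.encode())
```

The key design point is that only the URL changes; client code written against the ChatGPT API shape works against the local server too.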
Wouldn’t the api hit the cloud, isn’t it just a different interface?
Just to expand: you can also use ChatGPT through it with your own key now, which will send your logs to OpenAI. There are plenty of models available that don't, however, and I've found them plenty good enough for my use cases.
No, it provides an API as an interface, it is not consuming an API.
Qvest@lemmy.world 1 year ago
It’s not ChatGPT’s API. It runs locally, no internet required. For those interested read more here: https://gpt4all.io/index.html
anonymoose@lemmy.ca 1 year ago
They’ve now added the ability to use OpenAI GPT 3.5/4 APIs by supplying your own key, I believe.
Gnubyte@lemdit.com 1 year ago
Maybe there is confusion in what I am saying. Yes. That is not ChatGPT’s API:
Image
What I mean is - it literally has an API server you can run yourself if you want to programmatically talk to it. Here is a screenshot of the menu option.
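To show what talking to that local API server looks like from code, here's a standard-library-only sketch. The port and path are assumptions (check the app's API settings for the actual values it exposes); the request is built but not sent, since it needs the server running:

```python
import json
import urllib.request

# Assumed host/port/path for the local API server -- verify in the app's
# settings; GPT4all's server may use a different port on your machine.
url = "http://localhost:4891/v1/chat/completions"

payload = {"messages": [{"role": "user", "content": "Hi"}]}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# With the local server running, this would return a completion:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```

Nothing here ever leaves the machine; the "API" is just an HTTP server bound to localhost.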