Comment on "Wait, the ZimaCube has a private GPT implementation?"
fhein@lemmy.world 1 year ago
There are tons of options for running LLMs locally nowadays, though none come close to GPT-4, Claude 2, etc. One place to start is /c/localllama@sh.itjust.works
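For a sense of how simple one of those local options can be, here's a minimal sketch using the llama-cpp-python bindings with a quantized GGUF model; the model path and prompt are just placeholders, not a recommendation of any particular model.

```python
# Minimal sketch of one local-LLM option, using the llama-cpp-python bindings.
# The model path below is a placeholder; any GGUF-format model downloaded
# locally (e.g. from Hugging Face) would work.
from llama_cpp import Llama

# Load a quantized model from disk; n_ctx sets the context window size.
llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

# Run a single completion and print the generated text.
out = llm("Q: What can I self-host on a home NAS? A:", max_tokens=128, stop=["Q:"])
print(out["choices"][0]["text"])
```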
MigratingtoLemmy@lemmy.world 1 year ago
Thank you, subscribed!