Comment on Elon Musk just offered to buy OpenAI for $97.4 billion
notsoshaihulud@lemmy.world 1 week ago
OK, what’s the fediverse alternative for ChatGPT?
TechAnon@lemm.ee 1 week ago
anythingllm.com (run them locally)
boramalper@lemmy.world 1 week ago
How does it compare to ollama in your experience?
TechAnon@lemm.ee 1 week ago
Easier to get up and running, with its own interface (no need to run on localhost with a browser). Ollama is better if you want to tweak and customize.
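For the "tweak and customize" side: Ollama exposes a local HTTP API (port 11434 by default) that you can script against. A minimal sketch in Python, assuming `ollama serve` is running and a model has already been pulled; the "llama3" name is just an example:

```python
# Minimal sketch of scripting Ollama's local HTTP API (assumes `ollama serve`
# is running on the default port 11434 and "llama3" has been pulled already;
# swap in whatever model you actually have).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # placeholder: any locally pulled model
        "prompt": "Explain the fediverse in two sentences.",
        "stream": False,     # one JSON reply instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])   # the generated text
```

AnythingLLM, by contrast, gives you a packaged app and chat UI out of the box, so you never have to touch an endpoint like this.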
some_guy@lemmy.sdf.org 1 week ago
Running the models locally.
notsoshaihulud@lemmy.world 1 week ago
Well… I guess I can use ChatGPT to walk me through setting one up?! I can use a dual RTX 4090 config instead of heating, too. (These are really fucked up times.)
some_guy@lemmy.sdf.org 1 week ago
Look here: ollama.com
notsoshaihulud@lemmy.world 1 week ago
You guys are awesome!
ours@lemmy.world 1 week ago
Download LM Studio.
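LM Studio also ships with an OpenAI-compatible local server (off by default; it listens on port 1234 once you start it from the app). A rough sketch of pointing the standard `openai` Python client at it, assuming a model is already loaded; the model name and API key below are placeholders, since the local server doesn't check the key:

```python
# Rough sketch of talking to LM Studio's local server (OpenAI-compatible,
# default port 1234, must be started from the app first). Model name and
# API key are placeholders; the local server ignores the key.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

reply = client.chat.completions.create(
    model="local-model",  # placeholder: LM Studio answers with whichever model is loaded
    messages=[{"role": "user", "content": "What's a good local ChatGPT alternative?"}],
)
print(reply.choices[0].message.content)
```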