Comment on Elon Musk just offered to buy OpenAI for $97.4 billion
notsoshaihulud@lemmy.world 2 months ago
OK, what’s the fediverse alternative for ChatGPT?
TechAnon@lemm.ee 2 months ago
anythingllm.com (run them locally)
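(For reference, AnythingLLM's self-hosted builds also expose a developer API once you generate an API key in its settings. A rough sketch of chatting with a workspace over HTTP follows; the port 3001, the endpoint path, and the workspace slug "my-docs" are assumptions based on a typical Docker install, so check your instance's own API docs.)

```python
# Rough sketch: send one chat message to a local AnythingLLM workspace.
# The base URL, API key, and workspace slug below are placeholders for
# your own install -- verify them against your instance's API docs.
import requests

BASE_URL = "http://localhost:3001/api/v1"    # default port of the Docker build (assumption)
API_KEY = "YOUR-ANYTHINGLLM-API-KEY"         # generated in the AnythingLLM settings UI

resp = requests.post(
    f"{BASE_URL}/workspace/my-docs/chat",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"message": "Summarize the documents in this workspace.", "mode": "chat"},
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```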
boramalper@lemmy.world 2 months ago
How does it compare to ollama in your experience?
TechAnon@lemm.ee 2 months ago
Easier to get up and running, with its own interface (no need to run it on localhost in a browser). Ollama is better if you want to tweak and customize.
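To illustrate the "tweak and customize" side: a running Ollama server exposes a local REST API (default port 11434) where you can override sampling parameters and the system prompt per request. A minimal sketch, assuming `ollama serve` is running and a model such as llama3 has already been pulled:

```python
# Minimal sketch: query a locally running Ollama server with custom
# sampling options and a system prompt. Assumes the Ollama server is
# running and the "llama3" model has already been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",   # Ollama's default local endpoint
    json={
        "model": "llama3",
        "prompt": "Explain the fediverse in two sentences.",
        "system": "You are a concise assistant.",
        "options": {"temperature": 0.2, "num_ctx": 4096},  # per-request overrides
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```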
Elgenzay@lemmy.ml 2 months ago
caoimhinr@lemmy.world 2 months ago
some_guy@lemmy.sdf.org 2 months ago
Running the models locally.
notsoshaihulud@lemmy.world 2 months ago
Well… I guess I can use ChatGPT to walk me through setting one up?! I can use a dual RTX 4090 config instead of heating, too. (these are really fucked up times)
some_guy@lemmy.sdf.org 2 months ago
Look here: ollama.com
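The basic setup really is short. A minimal sketch using Ollama's official Python client (pip install ollama), assuming the Ollama server from ollama.com is installed and running, with llama3 as an example model:

```python
# Minimal sketch: pull a model and chat with it through the official
# `ollama` Python client (pip install ollama). Assumes the Ollama server
# from ollama.com is installed and running locally.
import ollama

ollama.pull("llama3")  # downloads the model on first use (several GB)

reply = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "What's a good local alternative to ChatGPT?"}],
)
print(reply["message"]["content"])
```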
notsoshaihulud@lemmy.world 2 months ago
You guys are awesome!
ours@lemmy.world 2 months ago
Download LM Studio.
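LM Studio can also serve whichever model you load through an OpenAI-compatible local server (enabled in the app; port 1234 is its usual default, though that's worth double-checking). A minimal sketch pointing the openai Python package at that server:

```python
# Minimal sketch: talk to LM Studio's local OpenAI-compatible server.
# Assumes the server is enabled in LM Studio (default http://localhost:1234)
# and a model is already loaded; the API key can be any placeholder string.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

completion = client.chat.completions.create(
    model="local-model",  # LM Studio routes this to the currently loaded model
    messages=[{"role": "user", "content": "Name one reason to run an LLM locally."}],
)
print(completion.choices[0].message.content)
```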