Comment on Sam Altman says ChatGPT should be 'much less lazy now'

AlmightySnoo@lemmy.world 4 months ago

PSA: give open-source LLMs a try, folks. If you’re on Linux or macOS, ollama makes it incredibly easy to try most of the popular open-source LLMs like Mistral 7B, Mixtral 8x7B, CodeLlama, etc. Obviously it’s faster if you have a CUDA/ROCm-capable GPU, but it also works in CPU-only mode (albeit slowly if the model is huge) provided you have enough RAM.
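
For example, once a model is pulled (`ollama pull mistral`, then `ollama run mistral` for an interactive chat in the terminal), you can also hit the local server from code. Here’s a minimal Python sketch against ollama’s default local endpoint (http://localhost:11434/api/generate); the model name and prompt are just placeholders:

```python
import json
import urllib.request

# Minimal sketch: send one prompt to a locally running ollama server.
# Assumes `ollama pull mistral` has already downloaded the model and
# the ollama daemon is listening on its default port (11434).
payload = {
    "model": "mistral",           # any model you've pulled locally
    "prompt": "Write a haiku about local LLMs.",
    "stream": False,              # one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())
    print(result["response"])     # the generated text
```

The CLI alone is enough if you just want a chat prompt in the terminal; the HTTP API is there if you want to wire it into your own scripts or tools.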

You can combine that with a web UI like ollama-webui or a terminal UI like oterm.
