Local models are actually pretty great! They are not great at everything… but for what most people are using LLMs for they do a fine job.
That's from llama3.1:8b, and the answer is decent. It took about 20 seconds to generate and used no more power than if I were to play a video game for the same amount of time.
rem26_art@fedia.io 3 days ago
Completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT.