Comment on Even Apple finally admits that 8GB RAM isn't enough
sverit@lemmy.ml 4 months ago
Which model with how many parameters do you use in ollama? With 8GB you should only be able to use the smallest models, which is faaaar from ideal:
You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models.
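Those RAM figures line up with a rough rule of thumb: a 4-bit quantized model (ollama's common default) needs about half a byte per parameter, plus headroom for the KV cache and runtime. A minimal sketch of that estimate, with the bytes-per-parameter and overhead factors being assumed ballpark values rather than official numbers:

```python
# Back-of-envelope RAM estimate for a 4-bit quantized LLM.
# Assumptions (not official figures): ~0.5 bytes/parameter at 4-bit
# quantization, plus ~30% overhead for KV cache and runtime.
def approx_ram_gb(params_billions: float,
                  bytes_per_param: float = 0.5,
                  overhead: float = 1.3) -> float:
    """Estimate RAM in GB needed to run a quantized model."""
    return params_billions * bytes_per_param * overhead

for b in (7, 13, 33):
    print(f"{b}B model: ~{approx_ram_gb(b):.1f} GB")
```

The estimates (~4.6, ~8.5, and ~21.5 GB) sit comfortably under the quoted 8/16/32 GB guidance, which also has to leave room for the OS and other apps.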
ssebastianoo@programming.dev 4 months ago
llama3:8b. I know it’s “far from ideal”, but only really specific use cases require more advanced models to run locally. If you do software development, graphic design or video editing, 8GB is enough.