Comment on "Even Apple finally admits that 8GB RAM isn't enough"

ssebastianoo@programming.dev 4 months ago
I have a macbook air m2 with 8gb of ram and I can even run ollama, never had ram problems, I don't get all the hate

yournamehere@lemm.ee 4 months ago
maybe in a browser using external resources. open some Chrome tabs to feel the pain. apple is a joke.
ssebastianoo@programming.dev 4 months ago
vscode + photoshop + illustrator + discord + arc + chrome + screen recording and still no lag
yournamehere@lemm.ee 4 months ago
so not a single cool app and yet you own a computer
ssebastianoo@programming.dev 4 months ago
wtf does that mean
sverit@lemmy.ml 4 months ago
Which model with how many parameters do you use in ollama? With 8GB you should only be able to use the smallest models, which is faaaar from ideal.
ssebastianoo@programming.dev 4 months ago
llama3:8b. I know it's "far from ideal", but only really specific use cases require more advanced models to run locally. If you do software development, graphic design or video editing, 8GB is enough.
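For context on why llama3:8b fits in 8GB at all: ollama serves it quantized to roughly 4 bits per weight by default, so the weights take far less memory than the full-precision parameter count suggests. A rough back-of-envelope sketch (the exact footprint also depends on the quantization format, KV cache, and context length, so treat these numbers as estimates):

```python
def model_weight_gib(params: float, bits_per_weight: float) -> float:
    """Approximate memory for model weights alone, in GiB."""
    return params * bits_per_weight / 8 / 2**30

# 8 billion parameters at ~4 bits each: about 3.7 GiB of weights,
# leaving headroom in 8GB of RAM for the OS and the KV cache.
quantized = model_weight_gib(8e9, 4)
# The same model at full fp16 precision would need ~14.9 GiB,
# which is why an unquantized 8B model does NOT fit in 8GB.
fp16 = model_weight_gib(8e9, 16)

print(f"llama3:8b @ 4-bit: ~{quantized:.1f} GiB")
print(f"llama3:8b @ fp16:  ~{fp16:.1f} GiB")
```

This is why the "smallest models" framing above is a bit pessimistic: quantization, not parameter count alone, decides what fits.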