Comment on Meta and Microsoft say they will buy AMD's new AI chip as an alternative to Nvidia's
aniki@lemm.ee 1 year ago
I would kill to run my models on my own AMD linux server.
dublet@lemmy.world 1 year ago
Does GPT4all not allow that? Or do you have specific other models?
aniki@lemm.ee 1 year ago
I haven’t super looked into it but I’m not interested in playing the GPU game with gamers so if AMD can do a Tesla equivalent with gobs of RAM and no display hardware I’d be all about it.
tal@lemmy.today 1 year ago
I doubt that it’s gonna go that far (though AMD competing with Nvidia would doubtless improve the situation for the consumer). That segment of the market is less price-sensitive than gamers, which is why Nvidia can demand the prices it does.
An Nvidia H100 will give you 80GB of VRAM, but you’ll pay $30,000 for it.
dublet@lemmy.world 1 year ago
Some of the LLMs it ships with are very reasonably sized and still impressive. I can run them on a laptop with 32GB of RAM.
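(A rough sketch of why modest models fit in laptop RAM: the dominant cost is just holding the weights, which scales with parameter count and quantization level. The model sizes and bit widths below are illustrative assumptions, not figures from this thread.)

```python
# Back-of-envelope estimate of the RAM/VRAM needed to hold an
# LLM's weights. Real memory use is somewhat higher (activations,
# KV cache, runtime overhead), but weights dominate.

def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate decimal gigabytes needed for the weights alone."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# A hypothetical 7B-parameter model at common quantization levels:
for bits in (16, 8, 4):
    print(f"7B model at {bits}-bit: ~{weights_gb(7, bits):.1f} GB")
```

At fp16 a 7B model needs roughly 14 GB just for weights, but 4-bit quantization brings that down to about 3.5 GB, which is why such models run comfortably on a 32GB machine while the largest models push you toward something like an 80GB H100.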
aniki@lemm.ee 1 year ago
This is very interesting! Thanks for the link. I’ll dig into this when I manage to have some time.