Some of the LLMs it ships with are very reasonably sized and still impressive. I can run them on a laptop with 32GB of RAM.
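As a rough illustration of why that works, here is a back-of-the-envelope memory estimate for quantized model weights. The parameter counts, 4-bit quantization, and overhead factor below are illustrative assumptions, not figures from the thread:

```python
# Sketch: estimate the RAM needed to hold a quantized LLM's weights.
# Model sizes, 4-bit quantization, and the ~20% overhead factor are
# illustrative assumptions, not figures from the comments.

def weight_footprint_gb(params_billions: float, bits_per_weight: float,
                        overhead: float = 1.2) -> float:
    """Approximate resident size of the weights, padded ~20% for the
    KV cache, activations, and runtime buffers."""
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 1e9

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    print(f"{name} @ 4-bit: ~{weight_footprint_gb(params, 4):.1f} GB")

# 7B  @ 4-bit: ~4.2 GB   -> comfortable on a 32 GB laptop
# 13B @ 4-bit: ~7.8 GB   -> still fine
# 70B @ 4-bit: ~42.0 GB  -> needs more than 32 GB of memory
```

By this estimate, small and mid-sized quantized models fit easily in 32GB of laptop RAM, while the largest ones are where big-VRAM accelerator cards come in.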
Comment on Meta and Microsoft say they will buy AMD's new AI chip as an alternative to Nvidia's
aniki@lemm.ee 11 months ago
I haven't super looked into it, but I'm not interested in playing the GPU game with gamers, so if AMD can do a Tesla equivalent with gobs of RAM and no display hardware, I'd be all about it.
dublet@lemmy.world 11 months ago
aniki@lemm.ee 11 months ago
This is very interesting! Thanks for the link. I’ll dig into this when I manage to have some time.
tal@lemmy.today 11 months ago
I doubt that it's gonna go that far (I mean, AMD competing with Nvidia would doubtless improve the situation for the consumer, but…). That segment of the market is less price-sensitive than gamers, which is why Nvidia can demand the prices it does.
An Nvidia H100 will give you 80GB of VRAM, but you’ll pay $30,000 for it.