Comment on AI companies will fail. We can salvage something from the wreckage | Cory Doctorow
the_trash_man@lemmy.world 5 days ago
Hopefully dirt cheap gpus and ram when the bubble bursts
TheGrandNagus@lemmy.world 4 days ago
Unfortunately, most of the GPUs aren’t usable by gamers. We aren’t talking about mining booms, where miners buy up gaming GPU stock and then sell it cheap when the bubble bursts.
We’re talking about companies buying huge GPUs that don’t have video outputs and run a different software stack from what’s used for gaming.
Granted, many 4090s/5090s were also used, and those will be usable by gamers, but even with a significant price drop, only richer gamers will find them viable.
Somewhat similar story for memory - a lot of it is tied up in HBM, or as GDDR on enterprise graphics cards.
floppybiscuits@lemmy.world 4 days ago
I would pick up an inference GPU to run models locally. I can see a benefit in that, especially if they’re going cheap.
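For anyone wondering what “running models locally” on a salvaged inference card involves, here’s a minimal sketch, assuming Python with PyTorch and Hugging Face transformers installed; the model name is just an illustrative open-weights example, not a recommendation:

```python
# Hypothetical example: load an open-weights model onto a salvaged inference GPU
# and generate text. Assumes the model fits in the card's VRAM (pick a smaller
# or quantised model if it doesn't).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # illustrative model choice

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # roughly halves VRAM use vs. float32
    device_map="cuda",          # place the weights on the GPU
)

prompt = "Why do datacentre GPUs ship without video outputs?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```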
melfie@lemy.lol 4 days ago
I certainly wouldn’t let a cheap RTX PRO 6000 Blackwell Server with 96GB of VRAM go to waste. I’d put it to good use with Blender rendering, running models I actually care about, and maybe some Games on Whales.
barryamelton@lemmy.world 3 days ago
True about those without video outputs. But on Linux, game-specific patches are in Mesa and not in the graphics drivers, for example.
SuspciousCarrot78@lemmy.world 3 days ago
Additionally, on Windows (Linux too?) one could use Moonlight/Sunshine to do the compute on the GPU and stream to a secondary device (either directly, say to a Chromecast, or via the iGPU to their monitor). Latency is quite small in most circumstances, and it allows for some interesting tricks (e.g. server GPUs let you split the GPU into two “mini-GPUs”; essentially, with the right card, you could host two entirely different, concurrent instances of GTA V on one machine, via one GPU).
A bit hacky, but it works.
Source: I bought a Tesla P4 for $100 and stuck it in a 1L case.
GPU goes brrr
Damage@feddit.it 3 days ago
And their energy usage must be outrageous