Comment on The air begins to leak out of the overinflated AI bubble

bruhduh@lemmy.world ⁨2⁩ ⁨months⁩ ago

Search "Nvidia P40 24GB" on eBay: about $200 each and surprisingly good for a self-hosted LLM. If you plan to build an array of GPUs, search for the P100 16GB instead. Same price, but unlike the P40, the P100 supports NVLink, and its 16GB is HBM2 on a 4096-bit bus, so it's still competitive for LLM work.
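The "still competitive" point comes down to memory bandwidth, which you can sanity-check with quick arithmetic. A minimal sketch (the bus widths and per-pin data rates below are taken from the cards' published spec sheets and are assumptions, not measurements):

```python
# Rough peak-bandwidth arithmetic for the two cards mentioned above.
# Spec figures (bus width in bits, effective per-pin data rate) are
# assumed from public datasheets, not measured.

def bandwidth_gb_s(bus_width_bits: int, data_rate_hz: float) -> float:
    """Peak theoretical bandwidth in GB/s: (bus width / 8 bytes) * data rate."""
    return bus_width_bits / 8 * data_rate_hz / 1e9

# Tesla P100 16GB: 4096-bit HBM2 at roughly 1.43 Gbps per pin
p100 = bandwidth_gb_s(4096, 1.43e9)  # ~732 GB/s
# Tesla P40 24GB: 384-bit GDDR5 at roughly 7.2 Gbps per pin
p40 = bandwidth_gb_s(384, 7.2e9)     # ~346 GB/s

print(f"P100: {p100:.0f} GB/s, P40: {p40:.0f} GB/s")
```

Since LLM token generation is mostly memory-bound, the P100's roughly 2x bandwidth advantage matters more than its smaller VRAM for inference speed.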
