Comment on AI Is Starting to Look Like the Dot Com Bubble
headmetwall@lemmy.dbzer0.com 1 year ago

They are, but training models is hard and inference (actually using them) is (relatively) cheap. If you make a GPT-3 size model, you don't always need the full H100 with 80+ GB to run it, since things like quantization show you can keep ~99% of its performance at roughly 1/4 the memory footprint.
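As a rough sketch of what that looks like in practice (assuming the Hugging Face transformers + bitsandbytes stack; the model name is just a placeholder):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit quantization: weights stored in ~1/4 the memory of fp16,
# compute still done in half precision for quality.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

# Placeholder model id; any causal LM on the Hub works the same way.
model = AutoModelForCausalLM.from_pretrained(
    "some-org/some-gpt3-size-model",
    quantization_config=quant_config,
    device_map="auto",  # spread layers across whatever GPU(s) you have
)
```

So a model that needs ~350 GB of weights in fp16 drops to the ballpark of ~90 GB in 4-bit, which is why you can get away with far less than a full rack of H100s for inference.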
Thus NVIDIA selling this at $3k as an 'AI' card, even though it won't be as fast.