Name one industry the Chinese haven’t beaten sooner or later. When they apply themselves to a problem, they typically lead the world.
Comments on “China’s first real gaming GPU is here, and the benchmarks are brutal”
sirboozebum@lemmy.world 3 days ago
The more competition for AMD and NVIDIA, the better.
I wouldn’t expect the first domestic Chinese GPU to be great but hopefully they keep iterating and get better and better.
Appoxo@lemmy.dbzer0.com 2 days ago
Forgot Intel? lol
goferking0@lemmy.sdf.org 2 days ago
Didn’t they drop their Arc cards?
Appoxo@lemmy.dbzer0.com 2 days ago
Not yet.
goferking0@lemmy.sdf.org 2 days ago
I thought they were dropping it completely when they changed to just having it as part of the CPU instead of a discrete card
orclev@lemmy.world 3 days ago
Sounds like it’s about equivalent to Intel’s latest GPU. Both are running a little over a generation behind AMD and Nvidia. Meanwhile Nvidia is busy trying to kill its consumer GPU division to free up more fab space for data center GPUs, chasing that AI bubble. AMD, meanwhile, has indicated it’s not even going to try to compete with Nvidia on the high end, but rather aims to land solidly in the middle of Nvidia’s lineup. More competition is good, but it seems like the two big players are currently doing their best not to compete, with everyone else fighting for their scraps. The next year or two in the PC market are shaping up to be a real shit show.
glimse@lemmy.world 3 days ago
Sounds like it’s more than “a little over one generation behind” if it benchmarks near an Nvidia card released 14 years ago??
AmidFuror@fedia.io 3 days ago
It's roughly a human generation behind.
orclev@lemmy.world 3 days ago
I was basing that on the quote saying it rivals a 4060.
glimse@lemmy.world 3 days ago
According to the article, the actual performance is on par with a GTX 660 Ti
Anivia@feddit.org 3 days ago
Maybe you should read more than 1 paragraph before commenting. And in general.
turkalino@sh.itjust.works 3 days ago
Which seems wildly shortsighted, like surely the AI space is going to find some kind of more specialized hardware soon, sort of like how crypto moved to ASICs. But I guess bubbles are shortsighted…
CheeseNoodle@lemmy.world 3 days ago
The crazy part is that outside LLMs, the other (actually useful) AI doesn’t need that much processing power. More than you or I use, sure, but nothing that would justify gigantic data centers.
DacoTaco@lemmy.world 3 days ago
Debatable. The basics of an LLM might not need much, but the actual models do need it to be anywhere near decent or useful. I’m talking minutes for a simple reply.
Source: ran a few <=5b models on my system with ollama yesterday and gave them access to an MCP server to do stuff with
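For anyone who wants to reproduce this, a minimal sketch of running a small local model with ollama looks like the following (the model tag `llama3.2:3b` is just an example of a <=5b model; assumes the ollama daemon is installed and running):

```shell
# Download a small (~3B parameter) model to run locally
ollama pull llama3.2:3b

# Ask it something; on modest hardware even short replies can take a while
ollama run llama3.2:3b "Explain in one sentence why small models are faster but weaker."
```

Wiring it to an MCP server is a separate step that depends on the client you use; the point above is just that even these small models are noticeably slow without serious hardware.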
Mihies@programming.dev 3 days ago
I suppose the Chinese might quickly catch up. They certainly don’t lack resources.