Comment on NVIDIA CEO Huang urges faster AI development—to make it safer
just_another_person@lemmy.world 11 months ago
Friend, I do this for a living, and I have no idea why you’re even bringing gating into the equation, because it doesn’t even matter.
I’m assuming you’re a big crypto fan, because that’s about the only thing ASICs are good for in an HPC-type environment. Companies paying the insane amounts of money for “AI” right now want a CHEAP solution, and an ASIC is the most short-term, e-wastey, inflexible solution to that problem. When you get a job in the industry and understand the different vectors, let’s talk. Otherwise, you’re just spouting junk.
drdabbles@lemmy.world 11 months ago
Swing and a miss.
Really? Gee, I think switching fabrics might have a thing or two to tell you. For someone who does this for a living, not knowing the extremely common places ASICs are used is a bit of a shock.
Yeah, I already covered that in my initial comment, thanks for repeating my idea back to me.
Literally attached to the Intel tiles in Sapphire Rapids and beyond. Used in every switch, network card, and millions of other devices. Every accelerator you can list is an ASIC. Shit, I’ve got a Xilinx Alveo U30 in my basement at home. But yeah, because you can get an FPGA instance in AWS, you think ASICs aren’t used. lmao
I’ve got bad news for you about ML as a whole.
Sometimes the flexibility of a device’s application isn’t in the device itself, but in how it’s used. Again: if I can do thousands, tens of thousands, or hundreds of thousands of integer operations in a tenth of the power and a tenth of the clock cycles, then load those results into a bank of activation functions that can do the same, and all I have to do is move that data with HBM, perhaps add some cheap ARM cores, and bridge all of it into a single SoC product to sell on the open market, well then I’ve created every single modern ARM product with ML acceleration. And also NVIDIA’s latest products.
Whoops.
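(For illustration only: a minimal NumPy sketch of the integer-MAC-plus-activation dataflow described above. The function name, shapes, and scale factor are invented for the example; this isn’t any particular vendor’s design.)

```python
import numpy as np

def int8_matmul_relu(x_q: np.ndarray, w_q: np.ndarray, scale: float) -> np.ndarray:
    """Quantized matmul with int32 accumulation, followed by a ReLU activation stage."""
    # Integer multiply-accumulates: far cheaper in power and cycles than fp32 MACs.
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32)
    # Fixed activation stage fed straight from the accumulators.
    acc = np.maximum(acc, 0)  # ReLU
    # Requantize toward the int8 range; `scale` is a made-up calibration constant.
    return np.clip(np.round(acc * scale), -128, 127).astype(np.int8)

# Toy usage: int8 activations and weights, as an accelerator's local memory would hold them.
rng = np.random.default_rng(0)
x = rng.integers(-128, 127, size=(4, 64), dtype=np.int8)
w = rng.integers(-128, 127, size=(64, 32), dtype=np.int8)
y = int8_matmul_relu(x, w, scale=0.01)
print(y.shape, y.dtype)  # -> (4, 32) int8
```

The argument above is that the heavy lifting is just integer multiply-accumulates feeding a fixed activation stage, which is exactly the kind of dataflow that gets hardened into accelerator silicon.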
I’ve been a hardware engineer for longer than you’ve been alive, most likely. I built my first FPGA product in the 90s. I strongly suspect you just found this hammer and don’t actually know what the market as a whole entails, let alone the long LONG history of all of these things.
Do look up ASICs in switching, BTW. You might learn something.
ZahzenEclipse@kbin.social 11 months ago
You're such a dick for no reason. It definitely bolsters your claim that you're an old-school tech guy lol
drdabbles@lemmy.world 11 months ago
Not for no reason. They made claims, I provided links, they whined about it. They provided zero links backing up their 40-year-old claim that FPGAs would replace anything that didn’t run away fast enough.
just_another_person@lemmy.world 11 months ago
Let’s just shut this down right now. If you ever built FPGAs, it was in college in the 90s, in an awful US university program that trained you in SQL on the side and had zero idea how hardware works. I’m sorry for that.
The world has changed in the 30 years since, and the future of integer operations is in reprogrammable chips: all the benefits of a fabbed chip, and none of the downsides in a cloud environment.
The very idea that you think all these companies are looking to design and build their own single purpose chips for things like inference shows you have zero idea of where the industry is headed.
You’re only describing how ASICs are used in switches, cool. That’s what they’re meant for. That’s not how general-purpose computing works in the world anymore, buddy. An ASIC is never going to be a co-processor in a laptop that can load models and do general inference, or be useful for localized NNs. It’s strictly for the single-purpose uses you described.
drdabbles@lemmy.world 11 months ago
I mean, you’re such an absolute know-nothing that it’s hilarious. Nice xenophobic bullshit sprinkled in too. Sorry, no university for me, let alone FPGA work at a university in the 90s. When my friends were in university, they were still spending their time learning Java.
Indeed. And people like me have been there every step of the way. Your ageism is showing.
Yes, I remember hearing this exact sentiment 30 years ago. Right around the time we were hearing (again) how neural networks were going to take over the world. People like you are a dime a dozen and end up learning their lessons in a painfully humbling experience. Good luck with that, I hope you take it for the lesson it is.
Except for the wasted energy and the extreme amount of logic necessary to make it actually work. You know. The very fucking problem everybody’s working hard to address.
The very idea that you haven’t kept up with the industry and how many companies have developed their own silicon is laugh out loud comedy to me. Hahahaha. TSMC has some news for you.
Nope, I actually described how they are used in SoCs, not in switching fabrics.
Except all those Intel processors I mentioned, those ARM chips in your iPhones and Pixels, the ARM processors in your MacBooks. You know. Real nobodies in the industry.
Intel has news for you. It’s impressive how in touch you pretend to be in “the industry” but how little you seem to know about actual products being actually sold today.
Hey, quick question. Does NVIDIA have FPGAs in their GPUs? No? Hmm. Is the H100 just a huge set of FPGAs? No? Oh, weird. I wonder why, since you, in all your genius, have said that’s the way everybody’s going. Strange that their entire product roadmap shows zero FPGAs on their DPUs, GPUs, or their soon-to-arrive SoCs. You should call Jensen; I bet he has so much to learn from a know-it-all like you with such amazing ideas about US universities. Hey, where is it that all these tech startup CEOs went to university?
Tell you what. Don’t bother responding, nothing you’ve said holds any water or value.
victorz@lemmy.world 11 months ago
Aaand we’re back on Reddit again…
just_another_person@lemmy.world 11 months ago
Because literally everyone EXCEPT NVIDIA saw the writing on the wall and is preparing FPGA chips. 🤦
NVIDIA is only just now trying to make their own ARM chips, ffs. Five years late. You’re dated and outmoded. Get with the future.