Submitted 2 weeks ago by themachinestops@lemmy.dbzer0.com to technology@lemmy.world
https://www.techspot.com/news/111785-micron-driverless-cars-robots-need-300gb-memory.html
Obviously a company that makes RAM will say RAM is important, what else would we expect?
I’d love to know what Waymo vehicles have though.
They saw Jensen say engineers should be burning tokens to keep warm and thought “fuck it, let’s do this”.
LIDAR. They have lidar.
“You know that thing we sell? Buy a shitload of it.”
More like: you know that thing we used to sell and now don’t, because we bet everything on AI datacenters? Well, now we’re betting on robots, because we raised the price so high nobody can afford that thing we used to sell, and we can’t go back to the old price because the line must go up!
I’m reading that headline as: Major electronics company explains why self driving cars and home robots will be unaffordable.
Only because of current RAM prices and artificial scarcity keeping those prices high.
300GB of RAM shouldn’t be that expensive. I have 1/3 of that in my server (bought years ago). If it wasn’t for the AI bullshit, 300GB would be fairly reasonable to buy in a couple of years time.
RAM used to be ~$4/GB. So 300*4=$1200. A price increase of $1200 is actually pretty darn affordable to get self driving, surely?
Sure, there are other components than RAM needed. But the RAM is not what would make it unaffordable.
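A quick sanity check of the arithmetic above (the ~$4/GB figure is the commenter’s remembered historical price, not today’s market price):

```python
# Back-of-envelope cost of 300GB of RAM at an assumed historical
# price of ~$4/GB (current prices are much higher than this).
gb_needed = 300
usd_per_gb = 4
total_usd = gb_needed * usd_per_gb
print(total_usd)  # 1200
```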
I imagine people stealing RAM from cars.
Like catalytic converters?
No way they would make it upgradable or user serviceable lol
They just get really fast and accurate with soldering irons. Until later ones come along absolutely surgical with a flame thrower.
No, they’ll just buy RAM Doubler!
How? It’s soldered on.
I’m not saying that’s realistic, but I imagine my cyberpunk thieves now with portable soldering stations and microscopes
when the loot chests move around the map
Whoever at Micron said this is obviously some novel kind of idiot.
Old kinda idiot. Marketing Dept pumping stock with “white paper”.
With the current level of tech in a car, you’re already likely pushing 300GB in total. There’s dozens of high-compute ECUs doing all sorts of things, running some *nix OS and using anywhere from a couple GB to well… way more.
To reach full driverless capability, those will need to become more powerful, the software will require more memory, and the number of compute modules will likely increase as well for sensors and other stuff.
300GB IMO is probably a conservative estimate.
I’m not trying to sound angry at you, but I’m told I come off that way. So please let me start this with an apology in advance.
We have the esp32 in very common circulation. We have seen what is required to keep a thing fucking airborne, and it is so beyond what I thought was possible twenty years ago. And they did it with <1 gig.
With the current level of tech in a car, you’re already likely pushing 300GB in total.
The actual article (and the call it is reporting on, with statements from the CEO) says that 16GB is the average in new cars today. No need to make stuff up.
They think that makes them sound smart and important.
It actually makes them sound incompetent.
We went to the moon with Kbs and now cars need Gbs?
There is a lot less traffic/pedestrians in space.
We went to the moon with rooms of women with pencils.
Each will also need a portable nuclear reactor and a swimming pool filled with the blood of innocents and ice cubes made out of children’s tears.
What the hell does the President of France know about RAM!? /s
No no no no no! That’s Emmanuel MAcron! This is his little brother, Emmanuel MIcron!
I definitely misread Micron as Macron, and was confused why the French president was chiming in on this conversation
Of course they do. They are extremely impartial on the matter and I trust their judgment.
Ah, yes, the ol’ “X needs 300 of whatever it is I sell” gambit.
I’m sure Micron aren’t biased at all…
This is like an oil company saying you will need more oil.
This sounds reasonable
Shovel vendor forecasts massive uptick in hole digging.
The RAM cartel says things need to use more RAM. …k
If I were the boss of a RAM company, would I say then that the world needs more rubber tires, or would I say the world needs more RAM?
Is their RAM going to become the new catalytic converter?
Didn’t Musk promise like a decade ago that Tesla self driving would run fine on their “hardware v2” computer, then a few years later that it would require v3, and then v4 before he finally stopped making such promises?
Micron make RAM. I don’t think we should give any more credence to their claims than we do to Elon’s. Their goal here is to pump their share price, nothing more.
Nah, it just needs a team of Indian guys to step in whenever the collision alarms go off.
300? Come on. We all know it comes in powers of 2.
Did AI write that?
I literally don’t want self driving cars. Fucking stop.
It’s funny because I can run a Coral TPU on 4GB that can identify obstacles in live streams.
I’m a fucking genius for figuring it out. Make me the CTO of Micron and I will share my knowledge.
This is not credible. It’s a self-promoting stock pump-and-dump PR. Vision AI models are smaller than text models. They do need fast/faster GPUs, but less memory. Narrow-purpose AI/neural-network models need less memory because the memory is more about storing facts than logic/reasoning capability. LLM breakthroughs in benchmark score per GB are currently coming more from smaller models than from the largest frontier models. 32GB is a reasonable ceiling for the memory requirement. Robots can swap in task-specific AI models as well.
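To make the size gap concrete, here’s a rough weights-only memory estimate. The parameter counts below are illustrative assumptions for a typical vision detector versus a large LLM, not figures from the article:

```python
# Rough weights-only memory footprint: parameters * bytes per parameter.
# Assumes fp16 (2 bytes/param); activations and KV caches would add more.
def model_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    # 1e9 params per "billion" cancels 1e9 bytes per GB, so this simplifies to:
    return params_billions * bytes_per_param

vision_detector = model_memory_gb(0.1)  # ~100M-param vision model: 0.2 GB
frontier_llm = model_memory_gb(70)      # 70B-param LLM: 140 GB
print(vision_detector, frontier_llm)
```

Even a generously sized perception model is orders of magnitude below 300GB; it’s only frontier-scale language models that get anywhere near that territory.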
L0 ADAS for life, baby!
Comma AI can do it with far far less.
Bullshit.
Lost_My_Mind@lemmy.world 2 weeks ago
No no no. See, this is why AI is so fucked up. It doesn’t matter if it’s human or driverless. Cars aren’t supposed to RAM anything!!!
Deceptichum@quokk.au 2 weeks ago
They should dodge not ram.
confuser@lemmy.zip 2 weeks ago
Not enough dodge too much RAM 👌