
Micron says driverless cars and robots will need 300GB of RAM

237 likes

Submitted 2 weeks ago by themachinestops@lemmy.dbzer0.com to technology@lemmy.world

https://www.techspot.com/news/111785-micron-driverless-cars-robots-need-300gb-memory.html


Comments

  • Lost_My_Mind@lemmy.world 2 weeks ago

    No no no. See, this is why AI is so fucked up. It doesn’t matter if it’s human or driverless. Cars aren’t supposed to RAM anything!!!
    • Deceptichum@quokk.au 2 weeks ago

      They should dodge, not ram.
      • confuser@lemmy.zip 2 weeks ago

        Not enough Dodge, too much RAM 👌
  • Ilovethebomb@sh.itjust.works 2 weeks ago

    Obviously a company that makes RAM will say RAM is important, what else would we expect?

    I’d love to know what Waymo vehicles have though.
    • AbidanYre@lemmy.world 2 weeks ago

      They saw Jensen say engineers should be burning tokens to keep warm and thought “fuck it, let’s do this”.
    • corsicanguppy@lemmy.ca 2 weeks ago

      LIDAR. They have lidar.
  • MentalEdge@sopuli.xyz 2 weeks ago

    “You know that thing we sell? Buy a shitload of it.”
    • amateurcrastinator@lemmy.world 2 weeks ago

      More like: you know that thing we used to sell, and now don’t because we bet everything on AI datacenters? Well, now we’re betting on robots, because we raised the price so high that nobody can afford the thing we used to sell, and we can’t go back to the old price because the line must go up!
  • WanderingThoughts@europe.pub 2 weeks ago

    I’m reading that headline as: Major electronics company explains why self-driving cars and home robots will be unaffordable.
    • Kushan@lemmy.world 2 weeks ago

      Only because of current RAM prices and artificial scarcity keeping those prices high.

      300GB of RAM shouldn’t be that expensive. I have 1/3 of that in my server (bought years ago). If it weren’t for the AI bullshit, 300GB would be fairly reasonable to buy in a couple of years’ time.
    • 8oow3291d@feddit.dk 2 weeks ago

      RAM used to be ~$4/GB. So 300 × $4 = $1200. A price increase of $1200 is actually pretty darn affordable to get self-driving, surely?

      Sure, other components besides RAM are needed. But the RAM is not what would make it unaffordable.
  • vane@lemmy.world 2 weeks ago

    I imagine people stealing RAM from cars.
    • dan69@lemmy.world 2 weeks ago

      Like catalytic converters?
    • floquant@lemmy.dbzer0.com 2 weeks ago

      No way they would make it upgradable or user-serviceable lol
      • Buddahriffic@lemmy.world 2 weeks ago

        They just get really fast and accurate with soldering irons. Until later ones come along that are absolutely surgical with a flamethrower.
    • SaveTheTuaHawk@lemmy.ca 2 weeks ago

      No, they’ll just buy RAM Doubler!
    • anon_8675309@lemmy.world 2 weeks ago

      How? It’s soldered on.
      • Kandinsky@europe.pub 2 weeks ago

        I’m not saying that’s realistic, but I imagine my cyberpunk thieves now with portable soldering stations and microscopes.
      • Damage@feddit.it 2 weeks ago

        Take the whole board?
  • ElcaineVolta@kbin.melroy.org 2 weeks ago

    when the loot chests move around the map
  • YiddishMcSquidish@lemmy.today 2 weeks ago

    Whoever at Micron said this is obviously some novel kind of idiot.
    • CapuccinoCoretto@lemmy.world 2 weeks ago

      Old kind of idiot. Marketing dept pumping the stock with a “white paper”.
    • Omgpwnies@lemmy.world 2 weeks ago

      With the current level of tech in a car, you’re already likely pushing 300GB in total. There are dozens of high-compute ECUs doing all sorts of things, running some *nix OS and using anywhere from a couple GB to well… way more.

      To reach full driverless capability, those will need to become more powerful, the software will require more memory, and the number of compute modules will likely increase as well for sensors and other stuff.

      300GB IMO is probably a conservative estimate.
      • YiddishMcSquidish@lemmy.today 2 weeks ago

        I’m not trying to sound angry at you, but I’m told I come off that way. So please let me start this with an apology in advance.

        We have the ESP32 in very common circulation. We have seen what is required to keep a thing fucking airborne, and it is so beyond what I thought was possible twenty years ago. And they did it with <1 gig.
      • GamingChairModel@lemmy.world 2 weeks ago

        With the current level of tech in a car, you’re already likely pushing 300GB in total.

        The actual article (and the call it is reporting on, with statements from the CEO) says that 16GB is the average in new cars today. No need to make stuff up.
  • chunes@lemmy.world 2 weeks ago

    They think that makes them sound smart and important.

    It actually makes them sound incompetent.
  • Tiger666@lemmy.ca 2 weeks ago

    We went to the moon with KBs and now cars need GBs?
    • zod000@lemmy.dbzer0.com 2 weeks ago

      There’s a lot less traffic and far fewer pedestrians in space.
    • SaveTheTuaHawk@lemmy.ca 2 weeks ago

      We went to the moon with rooms of women with pencils.
  • BaraCoded@literature.cafe 2 weeks ago

    Each will also need a portable nuclear reactor and a swimming pool filled with the blood of innocents and ice cubes made out of children’s tears.
  • melsaskca@lemmy.ca 2 weeks ago

    What the hell does the President of France know about RAM!? /s
    • Goodlucksil@lemmy.dbzer0.com 2 weeks ago

      No no no no no! That’s Emmanuel MAcron! This is his little brother, Emmanuel MIcron!
  • rockSlayer@lemmy.blahaj.zone 2 weeks ago

    I definitely misread Micron as Macron, and was confused why the French president was chiming in on this conversation.
  • Armand1@lemmy.world 2 weeks ago

    Of course they do. They are extremely impartial on the matter and I trust their judgment.
  • sleet01@lemmy.ca 2 weeks ago

    Ah, yes, the ol’ “X needs 300 of whatever it is I sell” gambit.
  • Link@rentadrunk.org 2 weeks ago

    I’m sure Micron aren’t biased at all…
  • billwashere@lemmy.world 2 weeks ago

    This is like an oil company saying you will need more oil.
  • garbage_world@lemmy.world 2 weeks ago

    This sounds reasonable.
  • ItsMeSpez@lemmy.world 2 weeks ago

    Shovel vendor forecasts massive uptick in hole digging.
  • JigglySackles@lemmy.world 2 weeks ago

    The RAM cartel says things need to use more RAM. …k
  • Zwuzelmaus@feddit.org 2 weeks ago

    If I were the boss of a RAM company, would I say the world needs more rubber tires, or would I say the world needs more RAM?
  • Chivera@lemmy.world 2 weeks ago

    Is their RAM going to become the new catalytic converter?
  • IphtashuFitz@lemmy.world 2 weeks ago

    Didn’t Musk promise, like a decade ago, that Tesla self-driving would run fine on their “hardware v2” computer, then a few years later that it would require v3, and then v4, before he finally stopped making such promises?
    • Voroxpete@sh.itjust.works 2 weeks ago

      Micron make RAM. I don’t think we should give any more credence to their claims than we do to Elon’s. Their goal here is to pump their share price, nothing more.
  • Blackmist@feddit.uk 2 weeks ago

    Nah, it just needs a team of Indian guys to step in whenever the collision alarms go off.
  • anon_8675309@lemmy.world 2 weeks ago

    300? Come on. We all know it comes in powers of 2.

    Did AI write that?
  • thatradomguy@lemmy.world 2 weeks ago

    I literally don’t want self-driving cars. Fucking stop.
  • GreenKnight23@lemmy.world 2 weeks ago

    It’s funny because I can run a Coral TPU on 4GB that can identify obstacles in live streams.

    I’m a fucking genius for figuring it out. Make me the CTO of Micron and I will share my knowledge.
  • humanspiral@lemmy.ca 2 weeks ago

    This is not credible. It’s a self-promoting pump-and-dump PR piece. Vision AI models are smaller than text models. They do need fast GPUs, but less memory. Narrowly purposed AI/neural-network models need less memory because memory is more about storing facts than about logic/reasoning capability. LLM breakthroughs in benchmark score per GB are currently coming more from smaller models than from the largest frontier models. 32GB is a reasonable ceiling for the memory requirement. Robots can swap in task-specific AI models as well.
  • DarrinBrunner@lemmy.world 2 weeks ago

    L0 ADAS for life, baby!
  • SulaymanF@lemmy.world 2 weeks ago

    Comma AI can do it with far, far less.
  • Kolanaki@pawb.social 2 weeks ago

    Bullshit.