Micron Says AI-Driven Memory Crunch is ‘Unprecedented’

79 likes

Submitted 18 hours ago by themachinestops@lemmy.dbzer0.com to technology@lemmy.world

https://www.bloomberg.com/news/articles/2026-01-19/micron-says-unprecedented-memory-shortage-to-last-beyond-2026

Comments

  • SharkAttak@kbin.melroy.org 3 hours ago

    The shortage may be "unprecedented" but you can't fool me into thinking it was unforeseen... Fuck you Micron.

  • helpImTrappedOnline@lemmy.world 9 hours ago

    Easy fix: “Nvidia, no one has the supply you’re asking for; you can wait for your order just like anyone else.” Imagine a company ordering the world’s supply of paper and being told “yes, we’ll divert all stock straight to you, and everyone else can have the leftovers at 8x the cost.”

    • frongt@lemmy.zip 8 hours ago

      Nvidia: “we will pay you three times your asking cost”

      Mfrs: “yes sir, your chips, right away sir”

  • t00l@lemmy.world 9 hours ago

    “The plans are all part of the company’s commitment to bring 40% of its DRAM manufacturing onto US soil, a goal enabled by a $6.2 billion Chips Act award the company clinched in 2024, and the ability to tap into a now-35% tax credit while construction is ongoing.”

    So nice that taxpayers are funding their own shortages now.

  • FiniteBanjo@feddit.online 18 hours ago

    Good news is that when it crashes there’s gonna be so much surplus.

    • ZILtoid1991@lemmy.world 17 hours ago

      A lot of that memory is ECC-enabled at best, or HBM at worst, so it won’t be…

      • FiniteBanjo@feddit.online 16 hours ago

        ECC might be slower, but if a ton of it floods the market all at once it could still make for a good 2x64 GB purchase. Plus, it’ll be great for self-hosters even if not for gamers.

      • tal@lemmy.today 16 hours ago

        There might be some way to make use of it.

        Linux apparently can use VRAM as a swap target:

        wiki.archlinux.org/title/Swap_on_video_RAM

        So you could probably take an Nvidia H200 (141 GB of memory) and set it up as a high-priority swap device, say.

        A typical desktop is liable to have problems powering an H200 (600W), but that’s with all the parallel compute hardware active; I assume that if all you’re doing is moving data in and out of memory, it won’t draw much power.

        That being said, the route on the Arch Wiki above uses vramfs, a FUSE filesystem that runs in userspace rather than kernel space, so it probably carries more overhead than is strictly necessary.
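
        For anyone who wants to try it, here’s a rough sketch of that setup in Python, just wrapping the usual command-line tools. The mount point, sizes, and swap priority are made-up placeholders, it assumes vramfs is installed and you’re running as root, and since the kernel can’t swap directly onto a FUSE file, a loop device sits between the swap and the vramfs-backed file:

          #!/usr/bin/env python3
          """Rough sketch of VRAM-backed swap via vramfs (run as root).

          Placeholder values throughout: mount point, sizes, and priority.
          Assumes vramfs is installed; the kernel can't swap directly onto a
          FUSE file, hence the loop device in the middle.
          """
          import os
          import subprocess
          import time

          MOUNT_POINT = "/mnt/vram"  # hypothetical mount point
          FS_SIZE = "4G"             # VRAM handed to the FUSE filesystem
          SWAP_SIZE = "3900M"        # slightly smaller, leaving room for metadata
          PRIORITY = "100"           # higher than disk swap so VRAM is used first

          os.makedirs(MOUNT_POINT, exist_ok=True)

          # vramfs stays in the foreground while mounted, so run it in the
          # background and wait for the mount point to come up.
          vram = subprocess.Popen(["vramfs", MOUNT_POINT, FS_SIZE])
          while not os.path.ismount(MOUNT_POINT):
              if vram.poll() is not None:
                  raise RuntimeError("vramfs exited before mounting")
              time.sleep(0.5)

          # Create the backing file on the VRAM filesystem.
          swapfile = os.path.join(MOUNT_POINT, "swapfile")
          subprocess.run(["truncate", "-s", SWAP_SIZE, swapfile], check=True)

          # Attach a free loop device to the file; --show prints its name.
          loopdev = subprocess.run(
              ["losetup", "-f", "--show", swapfile],
              check=True, capture_output=True, text=True,
          ).stdout.strip()

          # Format the loop device as swap and enable it with high priority.
          subprocess.run(["mkswap", loopdev], check=True)
          subprocess.run(["swapon", "-p", PRIORITY, loopdev], check=True)
          print(f"VRAM-backed swap active on {loopdev} (priority {PRIORITY})")

        To undo it later: swapoff the loop device, detach it with losetup -d, and unmount the vramfs mount point.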

      • MalReynolds@slrpnk.net 13 hours ago

        You’re not wrong, but when/if a significant surplus appears (joyously, it’s apparently often more profitable to destroy things for the tax break than to sell them), adapters or new motherboards will show up fairly soon. Even things like H200s can probably be turned into co-processors (hopefully running at a sane wattage for home users); as u/tal says, there are already ways to hook them into the Linux kernel as (very fast) RAM, and I doubt the compute will be left on the table for long.

        H200 PCIe5 x 16 card anyone?

      • AmbiguousProps@lemmy.today 16 hours ago

        ECC these days is decent; I wouldn’t hate it even in my gaming PC. It’s the HBM that I’m worried about.

    • just_an_average_joe@lemmy.dbzer0.com 14 hours ago

      Bruh, it’s good to have some hope but I’m sure they will find a way to screw us anyways. Economy goes up, the rich get richer. Economy goes down, the rich get richer.

  • rimu@piefed.social 15 hours ago

    Investing 1.8 bn just before China invades… OK then.
