
Huawei shows off AI computing system to rival Nvidia's top product

28 likes

Submitted 4 days ago by schizoidman@lemmy.zip to technology@lemmy.world

https://www.reuters.com/world/china/huawei-shows-off-ai-computing-system-rival-nvidias-top-product-2025-07-26/


Comments

  • vzqq@lemmy.blahaj.zone 4 days ago

    People will fall over each other to explain exactly why these chips are no match for the H100/B100, but that’s beside the point. For a lot of people out there the good NV stuff is basically unobtainium anyway.

    This is a big deal.

    • brucethemoose@lemmy.world 4 days ago

      If they manage to actually get this into people’s hands…

      To be clear, I think they’re talking about mega-pricey server products, where the minimum size is usually 8 of them in a box.

      • vzqq@lemmy.blahaj.zone 4 days ago

        Oh absolutely. The NV equivalent is priced at multiple millions of dollars. If you can get it.

    • BJ_and_the_bear@lemmy.world 4 days ago

      Sucks that you can’t get Chinese kit in the USA, though. At least I’ve never seen it available; not sure if it’s outright banned or if US suppliers just don’t carry it.

      • cardfire@sh.itjust.works 3 days ago

        Huawei was uniquely and specifically forced out of the US market around the time they were competing for 5G tower standards.

    • Reverendender@sh.itjust.works 4 days ago

      Can you explain for those of us who don’t know those words and acronyms?

      • vzqq@lemmy.blahaj.zone 4 days ago

        I edited with a bit more context. They are mostly just product identifiers.

        “Unobtainium” is just nerd speak for things that are nominally available but impossible to actually get your hands on. It’s rooted in sci-fi tropes that are in themselves very interesting but beside the point right now.

  • brucethemoose@lemmy.world 4 days ago

    It’s not theoretical. They’ve already released a 300B LLM dubbed Pangu Pro, trained on Huawei NPUs:

    huggingface.co/papers/2505.21411

    And it’s open weights!

    huggingface.co/…/pangu-pro-moe-model

    It’s actually a really neat model: the experts are split into 8 ‘groups’ and routed so that exactly one is active in each group at any given time. In other words, it’s specifically architected for 8× Huawei NPU servers, so that there’s no excessive cross-communication or idle time between them.
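    The grouped routing described above can be sketched roughly like this (a minimal NumPy illustration, assuming 4 experts per group and a plain per-group softmax router; the names, sizes, and weights are invented for the sketch and are not Pangu Pro’s actual configuration):

    ```python
    import numpy as np

    # Illustrative sketch: experts are partitioned into groups, and exactly one
    # expert per group fires for each token, so each of the 8 devices hosting a
    # group always gets the same amount of work. All shapes here are assumptions.
    NUM_GROUPS = 8          # one group per NPU in an 8-device server
    EXPERTS_PER_GROUP = 4   # assumed for the sketch; total experts = 32
    HIDDEN = 16

    rng = np.random.default_rng(0)
    router_w = rng.standard_normal((HIDDEN, NUM_GROUPS, EXPERTS_PER_GROUP))

    def route(token):
        """Pick one expert per group for a single token's hidden state."""
        logits = np.einsum("h,hge->ge", token, router_w)   # [groups, experts_per_group]
        # Softmax *within* each group, so the choice is local to that group's device
        # and no cross-device comparison is needed.
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        chosen = probs.argmax(axis=1)                      # one expert index per group
        return chosen, probs[np.arange(NUM_GROUPS), chosen]

    token = rng.standard_normal(HIDDEN)
    experts, weights = route(token)
    print(experts)   # 8 indices, one per group -> load balanced across 8 devices
    ```

    The point of the grouping is visible in the shapes: the argmax runs over the group axis only, so routing never has to compare experts that live on different devices.
    
    
    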

    So yeah, even if it’s not a B200, the proof’s in the puddin’: huge models are being trained and run on these things.

    source