lotide

NVIDIA could enter the desktop CPU market with performance equal to AMD and Intel

233 likes

Submitted 2 weeks ago by obbeel@lemmy.eco.br to technology@lemmy.world

https://www.tweaktown.com/news/110319/nvidia-could-enter-the-desktop-cpu-market-with-performance-equal-to-amd-and-intel/index.html


Comments

  • Omega_Jimes@lemmy.ca ⁨2⁩ ⁨weeks⁩ ago

    Do I want another option in the desktop CPU space? YES

    Do I want that option to be Nvidia? NOPE

    • semperverus@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      I’m looking forward to the MilkV chipsets that use the RISC-V architecture. They have a microATX board that takes regular computer components and has functioning graphics drivers for AMD. Nothing is optimized for it, but it’s a 64-core CPU if I recall correctly, and it’s ridiculously low wattage for what it does.

      • Sxan@piefed.zip ⁨2⁩ ⁨weeks⁩ ago

        Ditto. RISC-V will catch up eventually, and it’ll be a Chinese company that does it. Most of the RISC-V solutions are Chinese silicon.

    • ageedizzle@piefed.ca ⁨2⁩ ⁨weeks⁩ ago

      What’s wrong with Nvidia? Genuine question

      • WhyJiffie@sh.itjust.works ⁨2⁩ ⁨weeks⁩ ago

        They’ve lately been destroying the desktop PC market by selling at overinflated prices, and by soaking up all the memory components that would otherwise go to PC manufacturers.

        But for a very long time before that, they were making very shitty, buggy, unstable drivers for Linux. We might just get taught that CPUs also need drivers; so far that wasn’t a problem because they just worked.

      • eleitl@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

        They are extremely hostile to open source.

      • Trilogy3452@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Probably the fact that they’re investing mostly in AI hardware nowadays (not sure what percentage) is the reason.

    • Cethin@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      I’m likely never buying one, but more competition is good. It’ll bring prices down because some people won’t care.

  • hark@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    They should try entering the desktop GPU market.

    • massacre@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      /thread

  • EndlessNightmare@reddthat.com ⁨2⁩ ⁨weeks⁩ ago

    If it ever becomes the standard desktop processor, they’ll pull the rug like they have with graphics processors and push everything to AI datacenters.

    Hard pass

  • etchinghillside@reddthat.com ⁨2⁩ ⁨weeks⁩ ago

    Are they hedging against AI collapsing? Not sure I see the motivation.

    • Voroxpete@sh.itjust.works ⁨2⁩ ⁨weeks⁩ ago

      I think that’s 100% what this is, and it’s a very smart play if that’s the case. Intel is reeling from some significant setbacks, while Nvidia is swimming in cash. There’s never been a better time for them to make a play for the desktop CPU space.

      And they’ve got absolutely no illusions about what’s happening with AI. They’re the ones who are literally paying AI companies to buy their chips. They know the space is collapsing. But as the guys selling the picks and shovels, they can ride out that collapse if they’re smart.

      At the end of the day, if what we get out of this is a new, serious competitor in the CPU space, that’ll at least be some kind of win. With Nvidia’s money and expertise, they could really force Intel to get their shit together. AMD chasing their heels is the only thing that’s ever kept them from completely going to shit, but more competition is even better. With all three major companies playing in both the CPU and GPU spaces, that could be really good for consumers.

      • T156@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        It might also be groundwork for more complicated things on their GPUs.

        The article says nothing about nVidia actually planning to enter the desktop CPU market, only that a bunch of unrelated analysts compared the CPU performance, and said it was about equal to what’s on the market.

    • empireOfLove2@lemmy.dbzer0.com ⁨2⁩ ⁨weeks⁩ ago

      Both that and vertical integration. They can capture even more of the market by creating all-in-one Nvidia-only machines, where you have to buy the whole rig to use their accelerators.

    • Technus@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      Yeah, that was my question. Why the hell would they develop new silicon when 99% of their fab space is dedicated to feeding the AI bubble?

    • jollyrogue@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

      Nvidia wants to be the equal to Intel and AMD. They want to be the 3rd major hardware house.

      In ~2009, Intel didn’t renew a contract which allowed Nvidia to produce chipsets for Intel processors, and since then Nvidia has wanted a CPU of their own to keep from getting locked out again.

      Nvidia tried to buy Arm when SoftBank was trying to sell, but that got scuttled. They had Tegra in the past which was a phone processor and successful in the Nintendo Switch. They can’t buy Intel because of poison pills in the x86 licensing between AMD and Intel which would kick in.

    • Blackmist@feddit.uk ⁨2⁩ ⁨weeks⁩ ago

      They’re doubling down. It’s a special CPU for more AI slop.

  • Eat_Your_Paisley@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    I don’t think I’ll ever purchase anything made by NVIDIA

    • tabular@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      I mean if it’s 2nd hand… and the free (libre) drivers are good… and AMD hasn’t gone full Intel… maybe??

  • breadsmasher@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    “clock speed of 4GHz, which is far below AMD and Intel’s 5GHz.”

    The phrasing is odd. A 25% lower clock speed isn’t “far below”.

    • markz@suppo.fi ⁨2⁩ ⁨weeks⁩ ago

      I don’t think clock speed is even comparable between totally different architectures.

      • Voroxpete@sh.itjust.works ⁨2⁩ ⁨weeks⁩ ago

        Yeah, we’ve been through this exact same game with multiple iterations of Intel and AMD chips. When AMD first started doing consumer CPUs they badged them according to their equivalent Intel clock speed because one to one comparisons were misleading.

        What’s the L1 and L2 cache? What are the bus speeds? How many cores, and how are they architected? Multi-threading? How many steps is the instruction cycle? There are so many factors beyond clock speed that play into real-world performance.

      • Peffse@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        I can’t believe people still look at Hz and think it’s the sole metric for performance.

        Do you think they look at the 2005 Pentium 4’s 3.8GHz and assume it’s only slightly worse than what Nvidia will put on the market?

      • obbeel@lemmy.eco.br ⁨2⁩ ⁨weeks⁩ ago

        I’m hopeful ARM will follow the licensing path rather than going full Android. I think stronger ARM computers, built at the ISA level by any company, also mean stronger RISC-V computers. Builders like Rockchip (China) show that ARM and RISC-V computers will bring alternatives to people, possibly with smaller fabs or on-demand production.

    • Sturgist@lemmy.ca ⁨2⁩ ⁨weeks⁩ ago

      25% lower clock speed isn’t “far below”

      AHEM! AKTCHEWALEE… it’s 20%, which is even less qualified to be “far below” the other two.

      • breadsmasher@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        Thanks! That’s more accurate.

    • ramble81@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      You willing to take a 25% pay cut? Yeah that’s hella far. Especially when you’re up in the GHz range.

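    [Editor’s note: the 20%-vs-25% disagreement above is just a choice of baseline, not a factual dispute. A quick sketch, using the thread’s 4 GHz and 5 GHz figures as illustrative numbers:]

    ```python
    # The two figures in the thread differ only in the baseline:
    # 4 GHz is 20% below 5 GHz, while 5 GHz is 25% above 4 GHz.
    nvidia_ghz = 4.0
    x86_ghz = 5.0

    deficit = (x86_ghz - nvidia_ghz) / x86_ghz    # gap relative to the faster chip
    lead = (x86_ghz - nvidia_ghz) / nvidia_ghz    # gap relative to the slower chip

    print(f"{deficit:.0%} lower")   # -> 20% lower
    print(f"{lead:.0%} higher")     # -> 25% higher
    ```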
  • phar@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    Does it work, and without RAM?

    • totesmygoat@piefed.ca ⁨2⁩ ⁨weeks⁩ ago

      Don’t worry. It will only be used for ai data centers.

    • ArkimedesWasRight@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Probably, if you throw enough cache at it. 😅

      • boonhet@sopuli.xyz ⁨2⁩ ⁨weeks⁩ ago

        4D V-cache!

  • BeatTakeshi@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    I’m not buying this news, and if I’m wrong I’m not buying the product. Fuck Nvidia

  • Tharkys@lemmy.wtf ⁨2⁩ ⁨weeks⁩ ago

    Good, I won’t be buying them either.

  • Seasm0ke@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    I will not buy another nvidia retail product again. Could make an exception for a second hand shield from an earlier generation, but nvidia is dead to me. AMD is my new best friend.

    • Dagamant@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      AMD is still trying to get in on the AI cash pile. The only thing they have going for them is pretty solid Linux support. I still pick them over Nvidia and Intel; they just aren’t much better than the others when it comes to “consumer first” ideologies.

      • Seasm0ke@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        I’m a fickle mistress and sadly also a captured audience so totally expect to hate them one day… for now it plays mhwilds hi res on ultra great on garuda.

  • skymtf@lemmy.blahaj.zone ⁨2⁩ ⁨weeks⁩ ago

    Let me guess: it won’t work with Linux whatsoever without a proprietary Nvidia kernel module.

  • RblScmNerfHerder@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    Yeah, but it’s Nvidia, the same company that’s a key figure in the AI-government circlejerk.

    The absolute best thing we can do is boycott them AND OpenAI, because neither company gives a F about the People.

    While Huang and Nvidia continue their current trajectory, I’ll never buy another Nvidia product.

    • phx@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

      Yeah, seriously. Nvidia is too busy fucking over the consumer PC market to be interested in producing a CPU that’d sell in that same market. My bet is that any CPU they release would be targeted at cloud/AI as well.

      • RblScmNerfHerder@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        Absolutely.

  • solrize@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

    This is more about the Arm X925 core than about Nvidia. The X925 is a new superscalar ARM core that’s the first one competitive with current x64 at single threaded compute.

    • vacuumflower@lemmy.sdf.org ⁨2⁩ ⁨weeks⁩ ago

      Apple M-series are ARM64. Are they not competitive?

      • solrize@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

        They get fairly close from what I understand. But while they are more power efficient, they’re still behind in pure speed. The X925 goes for speed at the cost of power, at least per this:

        chipsandcheese.com/…/arms-cortex-x925-reaching-de…

  • mlg@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    I am literally just waiting for China to catch up and knock over all 3 of these TSMC suckers.

    I don’t care if they throw a 2000% tariff on it; I will figure out a way to bypass it so I can enjoy pre-inflation PC prices again, when high-end GPUs were going for $300, SSDs became so cheap that the HDD market actually started falling behind, and you could chuck RAM sticks around like spare change.

    • eleitl@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      It would be nice to see hardware smugglers.

      • matlag@sh.itjust.works ⁨2⁩ ⁨weeks⁩ ago

        If the tariffs last, the emergence of smugglers is unavoidable.

        Add to that that they’d be tracked by Kash Patel’s ruined FBI, and the risk assessment tilts even further in favor of smuggling.

    • SharkAttak@kbin.melroy.org ⁨2⁩ ⁨weeks⁩ ago

      If you think China is the consumer’s friend, think again.

  • Marshezezz@lemmy.blahaj.zone ⁨2⁩ ⁨weeks⁩ ago

    Fuck that company

  • thedeadwalking4242@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    Yeah, I think I’d rather stick a CPU up my ass than use an Nvidia CPU in any computer I own.

  • cecilkorik@piefed.ca ⁨2⁩ ⁨weeks⁩ ago

    Maybe they should start making RAM /s

  • etherphon@piefed.world ⁨2⁩ ⁨weeks⁩ ago

    Can’t wait to install a 5GB driver bundle for my CPU that leaves shit all over the place. No thanks.

  • oyzmo@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    so… they have spare production capacity then?

  • DJKJuicy@sh.itjust.works ⁨2⁩ ⁨weeks⁩ ago

    I mean, of course they could.

    But where’s the money in that?

  • YiddishMcSquidish@lemmy.today ⁨2⁩ ⁨weeks⁩ ago

    Big lol. I’ll believe it when I see benchmarks.

  • devolution@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    Image

    • eleitl@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      Doesn’t render.

      • devolution@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

        It’s the monopoly man announcing that Nvidia is effectively a monopoly.

  • Hadriscus@jlai.lu ⁨2⁩ ⁨weeks⁩ ago

    kindly nationalize nvidia now

    thank you

  • THE_GR8_MIKE@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    A lot of companies could do stuff. We know it’d all go to AI slop anyway, so whatever.

  • SaveTheTuaHawk@lemmy.ca ⁨2⁩ ⁨weeks⁩ ago

    If they don’t tank on AI first.

    • eleitl@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      Couldn’t happen to nicer guys. Godspeed.

  • nutsack@lemmy.dbzer0.com ⁨2⁩ ⁨weeks⁩ ago

    Will they start making decent drivers for Linux, or will they suck ass like they do now?

  • vext01@feddit.uk ⁨2⁩ ⁨weeks⁩ ago

    Could they please make cheap ram?

  • network_switch@lemmy.ml ⁨2⁩ ⁨weeks⁩ ago

    The Tegra boards are good on Linux. They need to get this out so software developers can work out the software kinks and hardware integrators can make some good designs. I want a whole lot more Steam Machine-sized devices to choose from.

    • jjlinux@lemmy.zip ⁨2⁩ ⁨weeks⁩ ago

      No, they don’t. Fuck them and everything they stand for.

  • MuskyMelon@lemmy.world ⁨2⁩ ⁨weeks⁩ ago

    Why not better? Why just equal? Make the decision simple.

  • melfie@lemy.lol ⁨2⁩ ⁨weeks⁩ ago

    Meh, I’m waiting for AMD’s RDNA 5 to be released in 2027 and am hoping for some decent SoCs that are at least comparable to today’s RTX 5080, except without artificially limited VRAM. The current AI Max SoCs are pretty decent, but the RDNA 5 RTX cores are going to be what really makes it worthwhile for me personally, since I do a lot of Blender rendering and gaming.
