
NVIDIA could enter the desktop CPU market with performance equal to AMD and Intel

137 likes

Submitted 14 hours ago by obbeel@lemmy.eco.br to technology@lemmy.world

https://www.tweaktown.com/news/110319/nvidia-could-enter-the-desktop-cpu-market-with-performance-equal-to-amd-and-intel/index.html


Comments

  • MuskyMelon@lemmy.world 19 minutes ago

    Why not better? Why just equal? Make the decision simple.

  • hark@lemmy.world 3 hours ago

    They should try entering the desktop GPU market.

    • massacre@lemmy.world 1 hour ago

      /thread

  • BeatTakeshi@lemmy.world 1 hour ago

    I’m not buying this news, and if I’m wrong I’m not buying the product. Fuck Nvidia.

  • Omega_Jimes@lemmy.ca 13 hours ago

    Do I want another option in the desktop CPU space? YES

    Do I want that option to be Nvidia? NOPE

    • semperverus@lemmy.world 12 hours ago

      I’m looking forward to the MilkV chipsets that are RISC-V architecture. They have a microATX board that just takes regular computer components and has functioning graphics drivers for AMD. Nothing is optimized for it, but it’s a 64-core CPU if I recall correctly, and it’s ridiculously low wattage for what it does.

      • Sxan@piefed.zip 3 hours ago

        Ditto. RISC-V will catch up eventually, and it’ll be a Chinese company which does it. Most of the RISC-V solutions are Chinese silicon.

      • ageedizzle@piefed.ca 3 hours ago

        What’s wrong with Nvidia? Genuine question.

      • Trilogy3452@lemmy.world 10 minutes ago

        Probably because they invest mostly in AI hardware nowadays (not sure what percentage).

  • Seasm0ke@lemmy.world 3 hours ago

    I will never buy another Nvidia retail product. I could make an exception for a second-hand Shield from an earlier generation, but Nvidia is dead to me. AMD is my new best friend.

    • Dagamant@lemmy.world 1 hour ago

      AMD is still trying to get in on the AI cash pile. The only thing they have going for them is pretty solid Linux support. I still pick them over Nvidia and Intel; they just aren’t much better than the others when it comes to “consumer first” ideologies.

  • EndlessNightmare@reddthat.com 7 hours ago

    If it ever becomes the standard desktop processor, they’ll pull the rug like they have with graphics processors and push everything to AI datacenters.

    Hard pass.

  • Tharkys@lemmy.wtf 7 hours ago

    Good, I won’t be buying them either.

  • DJKJuicy@sh.itjust.works 4 hours ago

    I mean, of course they could.

    But where’s the money in that?

  • solrize@lemmy.ml 7 hours ago

    This is more about the Arm X925 core than about Nvidia. The X925 is a new superscalar Arm core that’s the first one competitive with current x64 at single-threaded compute.

    • vacuumflower@lemmy.sdf.org 1 hour ago

      Apple M-series are ARM64. Are they not competitive?

      • solrize@lemmy.ml 1 hour ago

        They get fairly close from what I understand. But while they are more power efficient, they’re still behind in pure speed. The X925 goes for speed at the cost of power, at least per this:

        chipsandcheese.com/…/arms-cortex-x925-reaching-de…

  • Eat_Your_Paisley@lemmy.world 12 hours ago

    I don’t think I’ll ever purchase anything made by NVIDIA.

    • tabular@lemmy.world 11 hours ago

      I mean if it’s 2nd hand… and the free (libre) drivers are good… and AMD hasn’t gone full Intel… maybe??

  • mlg@lemmy.world 6 hours ago

    I am literally just waiting for China to catch up and knock over all 3 of these TSMC suckers.

    I don’t care if they throw a 2000% tariff on it, I will figure out a way to bypass it so I can enjoy pre-inflation PC prices again, when high-end GPUs were going for $300, SSDs became so cheap that the HDD market actually started falling behind, and you could chuck RAM sticks around like spare change.

  • etchinghillside@reddthat.com 13 hours ago

    Are they hedging against AI collapsing? Not sure I see the motivation.

    • jollyrogue@lemmy.ml 1 hour ago

      Nvidia wants to be the equal of Intel and AMD. They want to be the 3rd major hardware house.

      In ~2009, Intel didn’t renew the contract which allowed Nvidia to produce chipsets for Intel processors, and since then Nvidia has wanted a CPU of their own to keep from getting locked out again.

      Nvidia tried to buy Arm when SoftBank was trying to sell, but that got scuttled. They had Tegra in the past, which was a phone processor and successful in the Nintendo Switch. They can’t buy Intel because of poison pills in the x86 licensing between AMD and Intel which would kick in.

    • Voroxpete@sh.itjust.works 13 hours ago

      I think that’s 100% what this is, and it’s a very smart play if that’s the case. Intel are reeling from some significant setbacks, while Nvidia is swimming in cash. There’s never been a better time for them to make a play for the desktop CPU space.

      And they’ve got absolutely no illusions about what’s happening with AI. They’re the ones who are literally paying AI companies to buy their chips. They know the space is collapsing. But as the guys selling the picks and shovels, they can ride out that collapse if they’re smart.

      At the end of the day, if what we get out of this is a new, serious competitor in the CPU space, that’ll at least be some kind of win. With Nvidia’s money and expertise they could really force Intel to get their shit together. AMD chasing their heels is the only thing that’s ever kept them from completely going to shit, but more competition is even better. With all three major companies playing in both the CPU and GPU spaces, that could be really good for consumers.

      • T156@lemmy.world 2 hours ago

        It might also be groundwork for more complicated things on their GPUs.

        The article says nothing about Nvidia actually planning to enter the desktop CPU market, only that a bunch of unrelated analysts compared the CPU performance and said it was about equal to what’s on the market.

    • empireOfLove2@lemmy.dbzer0.com 12 hours ago

      Both that and vertical integration. They can capture even more of the market by creating all-in-one Nvidia-only machines where you have to buy the whole rig to use their accelerators.

    • Technus@lemmy.zip 13 hours ago

      Yeah, that was my question. Why the hell would they develop new silicon when 99% of their fab space is dedicated to feeding the AI bubble?

  • phar@lemmy.world 14 hours ago

    Does it work, and without RAM?

    • totesmygoat@piefed.ca 13 hours ago

      Don’t worry. It will only be used for AI data centers.

  • breadsmasher@lemmy.world 14 hours ago

    clock speed of 4GHz, which is far below AMD and Intel’s 5GHz.

    The phrasing is odd. A 25% lower clock speed isn’t “far below”.

    • markz@suppo.fi 14 hours ago

      I don’t think clock speed should even be compared between totally different architectures.

      • Voroxpete@sh.itjust.works 13 hours ago

        Yeah, we’ve been through this exact same game with multiple iterations of Intel and AMD chips. When AMD first started doing consumer CPUs they badged them according to their equivalent Intel clock speed, because one-to-one comparisons were misleading.

        What’s the L1 and L2 cache? What are the bus speeds? How many cores, and how are they architected? Multi-threading? How many steps is the instruction cycle? There are so many factors beyond just clock speed that play into real-world performance.

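[Editor’s note] The point about clock speed being only one factor can be sketched with a toy model: effective throughput is roughly clock × instructions-per-cycle × cores, so a lower-clocked but wider core can come out ahead. All figures below are made up for illustration, not real chip specs.

```python
# Toy throughput model: clock speed is only one of several factors.
# All figures are hypothetical, purely for illustration.

def throughput_gips(clock_ghz: float, ipc: float, cores: int) -> float:
    """Rough score in billions of instructions per second: clock * IPC * cores."""
    return clock_ghz * ipc * cores

# A 4 GHz chip with a wider core (higher IPC) beats a 5 GHz chip here.
fast_clock = throughput_gips(clock_ghz=5.0, ipc=4.0, cores=8)  # 160.0
wide_core = throughput_gips(clock_ghz=4.0, ipc=6.0, cores=8)   # 192.0

print(fast_clock, wide_core)
```

This ignores cache, bus speeds, and pipeline depth entirely, which only strengthens the point: even the simplest model shows raw Hz is not a ranking.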
      • Peffse@lemmy.world 13 hours ago

        I can’t believe people still look at Hz and think it’s the sole metric that can be used for performance.

        Do you think they look at the 2005 Pentium 4’s 3.8GHz and assume it’s only slightly worse than what Nvidia will put on the market?

      • obbeel@lemmy.eco.br 13 hours ago

        I’m hopeful ARM will follow the licensing path rather than going the full Android route. I think stronger ARM computers, built at the ISA level by any company, also mean stronger RISC-V computers. Builders like Rockchip (China) show that ARM and RISC-V computers will bring alternatives to people, possibly with smaller fabs or on-demand production.

    • ramble81@lemmy.zip 9 hours ago

      Are you willing to take a 25% pay cut? Yeah, that’s hella far, especially when you’re up in the GHz range.

    • Sturgist@lemmy.ca 12 hours ago

      25% lower clock speed isn’t “far below”

      AHEM! AKTCHEWALEE… it’s 20%, which is even less qualified to be “far below” the other two.

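[Editor’s note] The 20% vs. 25% disagreement in this sub-thread comes down to the choice of baseline. A quick check, using the 4 GHz and 5 GHz figures quoted from the article:

```python
# Percentage differences depend on the baseline clock.
# Figures are the 4 GHz (Nvidia) and 5 GHz (AMD/Intel) clocks quoted above.
nvidia_ghz = 4.0
rivals_ghz = 5.0

pct_below = (rivals_ghz - nvidia_ghz) / rivals_ghz  # 4 GHz is 20% below 5 GHz
pct_above = (rivals_ghz - nvidia_ghz) / nvidia_ghz  # 5 GHz is 25% above 4 GHz

print(f"{pct_below:.0%} below, {pct_above:.0%} above")
```

Both commenters are right about a number; they just divided by different clocks.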
      • breadsmasher@lemmy.world 12 hours ago

        Thanks! That’s more accurate.

  • skymtf@lemmy.blahaj.zone 9 hours ago

    Let me guess, it won’t work with Linux whatsoever without a proprietary Nvidia kernel module.

  • Marshezezz@lemmy.blahaj.zone 12 hours ago

    Fuck that company.

  • devolution@lemmy.world 9 hours ago

    [Image]

  • cecilkorik@piefed.ca 13 hours ago

    Maybe they should start making RAM /s

  • Hadriscus@jlai.lu 12 hours ago

    kindly nationalize nvidia now

    thank you

  • vext01@feddit.uk 12 hours ago

    Could they please make cheap RAM?

  • SaveTheTuaHawk@lemmy.ca 13 hours ago

    If they don’t tank on AI first.

  • melfie@lemy.lol 11 hours ago

    Meh, I’m waiting for AMD’s RDNA 5 to be released in 2027 and am hoping for some decent SoCs that are at least comparable to today’s RTX 5080, except without artificially limited VRAM. The current AI Max SoCs are pretty decent, but the RDNA 5 RTX cores are going to be what really makes it worthwhile for me personally, since I do a lot of Blender rendering and gaming.