Nvidia might not have any new gaming GPUs in 2026 — and could be 'slashing production' of existing GeForce models

467 likes

Submitted 2 weeks ago by FirmDistribution@lemmy.world to technology@lemmy.world

https://www.techradar.com/computing/gpu/nvidia-might-not-have-any-new-gaming-gpus-in-2026-and-could-be-slashing-production-of-existing-geforce-models

Comments

  • JensSpahnpasta@feddit.org 2 weeks ago

    So that kind of means the high-end AAA PC market will crash in the next few years, right? No new GPUs, a production stop for existing GPUs, and rising GPU & RAM prices combined with inflation and a bad economy ensure that many people can’t afford a gaming computer. And that a lot of younger gamers can’t afford to start this hobby.

    And that means a shrinking audience for games that need all this GPU power. If you’re an AAA publisher, it looks kind of crazy to invest multiple millions into a game when you can’t be sure your audience will be able to afford to play it.

    • starik@lemmy.zip 2 weeks ago

      All games will be streamed, with a subscription

      • muusemuuse@sh.itjust.works 2 weeks ago

        RetroArch disagrees. I don’t need your newfangled enshittified slop. I have Mega Man X and wine.

      • dustman0192@lemmy.zip 2 weeks ago

        Shhh! Don’t give them any ideas!

      • WorldsDumbestMan@lemmy.today 2 weeks ago

        Just stop playing games, and they will have no hold over you.

        Or code your own, it’s simple to code a simple game.

    • 4am@lemmy.zip 2 weeks ago

      Don’t worry, you can Stream It From the CLOUD™️ for the low low price of 6x what a GPU would cost you over 5 years.

      • nutsack@lemmy.dbzer0.com 2 weeks ago

        Yeah, you can wait in a queue to play your unmodded single-player game

    • chonglibloodsport@lemmy.world 2 weeks ago

      Definitely a shrinking audience for AAA games, but I don’t think it will be too bad for gamers overall. Consoles will keep marching forward, as will Valve with the Steam Deck and Steam Machine.

      I think the highest of the high-end graphics stuff has long since hit diminishing returns. You can do a hell of a lot with yesterday’s hardware and less-than-bleeding-edge process nodes for newer hardware. Consoles have never used bleeding-edge GPUs and they’ve always done fine with sales (across the whole market, if not always individually). I think we’re highly unlikely to see a repeat of the 1983 gaming crash.

      • MolochHorridus@lemmy.ml 2 weeks ago

        They do fine in sales because the consoles are sold at a loss and they make money on game sales.

        Microsoft, Sony and Nintendo would love to stop producing consoles and start selling you a monthly service via a thin client. They just need a ready-to-go platform to gain enough mass first.

      • ageedizzle@piefed.ca 2 weeks ago

        I think we’re highly unlikely to see a repeat of the 1983 gaming crash

        What happened in 1983?

    • timestatic@feddit.org 2 weeks ago

      No, not really. AMD is still producing cards. Most people play on older or used cards anyway. Maybe don’t make Crysis-level graphics, but other than that, one year of fewer GPU releases won’t kill gaming. Once the AI bubble bursts, Nvidia might have lost a lot of its edge over AMD in the gaming market and they’ll scramble to get it back.

      • Tetsuo@jlai.lu 2 weeks ago

        AMD hasn’t stopped making consumer GPUs yet.

        OpenAI owns a good chunk of AMD, and AMD definitely also wants its share of the AI pie.

        I wouldn’t look at AMD as some savior that wouldn’t ditch consumers for big AI.

      • dreamkeeper@literature.cafe 2 weeks ago

        I thought AMD also said they were cutting production?

    • xektop@lemmy.zip 2 weeks ago

      Yes, but think about the money in mobile and console gaming… PC gaming was niche even before that, and we represent a very small percentage of the overall gaming industry. Nobody has given a fuck about us for some time already. Now they just show it to us in broad daylight.

      • PalmTreeIsBestTree@lemmy.world 2 weeks ago

        It’s not as niche as it used to be. In the last 10 years it’s grown quite a bit compared to what it was 20 years ago, when bad PC ports were the norm. Due to AI, I guess console gaming will go back to being the main way people game again.

      • jjlinux@lemmy.zip 2 weeks ago

        That’s a fact. PC gaming vs consoles is to gaming as Linux vs Winblows is to Computers. It’s a weird world we’re living in.

    • boogiebored@lemmy.world 2 weeks ago

      “don’t you have phones?”

    • Atomic@sh.itjust.works 2 weeks ago

      I think it’ll have the opposite effect. Knowing the hardware won’t change in the next year, devs don’t have to worry about making games compatible with the new cards. They can focus on building upon what they already have.

      And as someone who helped pick out a fantastic PC for my little cousin in December last year: she paid ~$500 for pretty decent hardware, and so far she hasn’t found a game in my library her PC can’t handle, including WH40K: Space Marine 2.

      There’s plenty of hardware for younger people who want to get into the hobby. You don’t always need the absolute latest.

  • Brkdncr@lemmy.world 2 weeks ago

    Someone is going to make bank by catering to consumers. Will the market accept Nvidia back with open arms if/when the AI investments fall through?

    • tidderuuf@lemmy.world 2 weeks ago

      Well what do most victims of exploitation and abuse do?

      • giminic@lemmy.world 2 weeks ago

        Visiting Stockholm?

    • ToTheGraveMyLove@sh.itjust.works 2 weeks ago

      Most people are willing to sell their morals. When Nvidia comes crawling back, it will be like nothing ever happened.

    • neclimdul@lemmy.world 2 weeks ago

      As a Linux gamer, Nvidia was already on thin ice.

      Also, I had passed them up on recent-ish purchases since they only really controlled the highest end of the market, which I don’t have the budget for. So honestly I have no intention of welcoming them back unless there is literally no other option. You made your bed.

      • jjlinux@lemmy.zip 2 weeks ago

        This is still a pain point for me. I have been looking for a laptop with an AMD GPU for years to use with Linux, but System76, Star Labs, Framework, etc. insist on only offering Nvidia as a discrete option. Or is it that AMD does not have laptop GPUs? Could be.

    • mycodesucks@lemmy.world 2 weeks ago

      That would be nice. But video cards are a VERY niche piece of engineering. The knowledge of HOW to make them is locked in a handful of people, and the ability to make them is locked behind a very niche set of equipment that will ALSO be exploding in cost.

      One does not simply start a graphics card company.

      • Brkdncr@lemmy.world 2 weeks ago

        I don’t think a newcomer could do it, but a company like Intel is poised to be in a good position. They don’t have much market share, but they have a good product.

    • psvrh@lemmy.ca 2 weeks ago

      Intel, here’s your big chance!

      • ThePantser@sh.itjust.works 2 weeks ago

        Intel is partly owned by the US government now. You think they want tech going to the people when they themselves want it for Skynet?

    • markz@suppo.fi 2 weeks ago

      Maybe, unless it takes so long that everyone already has a Chinese card or something.

    • MonkderVierte@lemmy.zip 2 weeks ago

      I hope they don’t halt the production of non-AI chips, then.

      • wonderingwanderer@sopuli.xyz 2 weeks ago

        I get the disdain for GenAI, but are AI chips really the problem? Maybe they’re more expensive and price people out, but it’s not like they’re built on plagiarism like most generative AI models.

        As far as I’m aware, they’re just capable of running highly complex multivariable calculations in parallel, making them more efficient for AI applications. But wouldn’t the same features make them better for more realistic physics and other game mechanics like procedural generation, NPC pathfinding and behaviors, etc.?

        I guess it would suck for anyone who doesn’t have the hardware to play a game, but there could always be options to configure in the settings to make it playable, like “don’t use tensor calculus in game physics” or whatever.

  • scala@lemmy.ml 2 weeks ago

    Stop buying Nvidia

    • UnderpantsWeevil@lemmy.world 2 weeks ago

      Easy enough when they’re not selling

    • jj4211@lemmy.world 2 weeks ago

      Well, they are helping out with that one…

    • FirmDistribution@lemmy.world 2 weeks ago

      I wish there were more laptops using AMD GPUs here in Brazil. You basically can’t find any laptop with an AMD GPU if you search for "gamer laptop" in Brazilian stores.

      • MasterNerd@lemmy.zip 2 weeks ago

        Gaming laptops are not really worth it imo. They’re underpowered, overheat easily, and tend to break quickly. That doesn’t even touch on their battery life, even when not under load. I’d recommend getting a Steam Deck if you really need the portability, but it doesn’t look like they’re available in Brazil :/

    • Psythik@lemmy.world 2 weeks ago

      I will when someone makes a GPU that can surpass a 4090. Not even Nvidia themselves can pull that off, so I’m not getting my hopes up. I’m going to be stuck with this GPU for the next decade the way things are going (not that I’m complaining; it’s a beast of a card, especially coming from someone who could only ever afford bargain-bin parts until one day I came into a windfall. That was a fun 4 years.)

      • doingthestuff@lemy.lol 2 weeks ago

        My 4070 Ti isn’t as beefy, but there’s still no non-Nvidia upgrade. And I am able to play most of my games at 4K/120. I’d like to upgrade mine so I can give my card to my daughter and give her 3060 Ti to her brother, who is currently running a 1060.

      • cheesorist@lemmy.world 2 weeks ago

        You’re not actively buying from them by using what you have. Nobody buys a downgrade.

    • Eagle0110@lemmy.world 2 weeks ago

      The time to stop buying Nvidia and prevent them from practically gaining a total monopoly on the entire market was 10 years ago, not now.

      Now, I’ll consider buying a GPU from you instead if you can make a GPU that satisfies technical needs like Nvidia can, but you cannot.

      • scala@lemmy.ml 2 weeks ago

        AMD GPUs over the last 10 years have been great. No overheating.

  • percent@infosec.pub 2 weeks ago

    Maybe some Chinese manufacturer will find a way to fill the gap in the market

    • CovfefeKills@lemmy.world 2 weeks ago

      Here’s hoping

    • Blackmist@feddit.uk 2 weeks ago

      Careful what you wish for.

      🇨🇳 🚣 🇹🇼

      • percent@infosec.pub 2 weeks ago

        Oh I wasn’t wishing for anything, just pointing out the possibility. There are some Chinese companies gearing up to fill the gap in the memory market. GPUs would be much harder, but maybe very profitable.

  • RejZoR@lemmy.ml 2 weeks ago

    While AMD is no angel, I’m glad I went for the Radeon RX 9070 XT this time. Really good GPU, and fuck NVIDIA. I hope unified RDNA5 will work out for AMD.

    • chronicledmonocle@lemmy.world 2 weeks ago

      I have gone all AMD graphics since converting to Linux. My 9060XT 16GB and 6600 8GB both are going strong.

      Fuck NVidia.

    • muusemuuse@sh.itjust.works 2 weeks ago

      I went with Intel Arc since I don’t actually need GPU processing power so much as a decent media engine and VRAM for future projects, and Intel has that ready to go under Linux. On the CPU side, AMD is the only option that makes sense, and for gaming AMD’s GPUs have already been the practical option for years, but their media engines are trash.

      But we don’t need NVIDIA and we don’t even need high end GPUs as much as we think we do.

  • phoenixz@lemmy.ca 2 weeks ago

    Let Nvidia go bankrupt, we won’t miss it

    • sbbq@lemmy.zip 2 weeks ago

      If consumers can’t get new GPUs, devs aren’t going to bother spec’ing for them. This’ll probably just result in a stalling of the tech you’ll see at home for a few years.

      • Blackmist@feddit.uk 2 weeks ago

        I want devs to write games for £400 Steam Decks. I don’t want them to write games for £3000 GPUs.

        There are realistically no games that won’t run on PS5-level hardware. Every effect that can be done with raytracing can be done a little worse without it.

      • dovahking@lemmy.world 2 weeks ago

        There are already games that lie on the fringe of photorealism, like Bodycam. As the other guy said, we need more games with better stories rather than better graphics. It’ll be good for the industry if not every AAA game requires an RTX 69000.

  • A_Random_Idiot@lemmy.world 2 weeks ago

    I really hope Nvidia collapses when the AI bubble pops. They’ve been more harm than good for consumers for too long.

    • hamsterkill@lemmy.sdf.org 2 weeks ago

      It won’t collapse. It’ll lose a huge chunk of its stock price, but it has other businesses to fall back on, and its chips will still likely be used in whatever the next tech trend is - probably neural network AI or something.

      • jj4211@lemmy.world 2 weeks ago

        I am not sure. They have other businesses, but I’m not sure those other businesses are able to sustain the obligations that Nvidia has committed to in this round. They are juggling more money than their pre-AI-boom market cap by a wide margin, so if the bubble pops, it’s unclear how big a bag Nvidia will be left holding and whether the rest of their business can survive it. Guess they might go bankrupt and eventually come out of it to continue business as usual after having their financial obligations wiped away…

        Also, they have somewhat tarnished their reputation by going all in on datacenter equipment to the point of, seemingly here, abandoning the consumer market to free up more capacity for the datacenters. So if AMD ever had an opportunity to cash in, well, here it might be… except they also dream of being a big datacenter player, and weaker demand may leave them with leftover capacity…

  • horse@feddit.org 2 weeks ago

    As someone not looking to spend a ton of money on new hardware any time soon: good. The longer it takes to release faster hardware, the longer current hardware stays viable. Games aren’t going to get more fun by slightly improving graphics anyway. The tech we have now is good enough.

    • ExLisper@lemmy.curiana.net 2 weeks ago

      People don’t just use computers for gaming. If this continues, people will struggle to do any meaningful work on their personal computers, which is definitely not good. And I’m not talking about browsing Facebook, but about coding, doing research, editing videos and other useful shit.

      • SoleInvictus@lemmy.blahaj.zone 2 weeks ago

        But wait! They can pay for remote computing time for a fraction of the cost! Each month. Forever.

        I fully expect personal computers to be phased out in favor of a remote-access, subscription model. The AI bubble popping would leave these big data centers with massive computational power available for use, plus it’s the easiest way to track literally everything you do on your system.

      • balsoft@lemmy.ml 2 weeks ago

        You can write code just fine on 20- or even 30-year-old hardware. Basically, if it runs Linux, chances are it can also run vim and compile code. If you spring for 10-year-old hardware, you can even get an LSP + coc or helix, for error highlighting, goto definition and code actions. And you definitely don’t need a GPU for it (unless you’re doing something GPU-specific, of course).

        Editing 720p video (which, if you encode with a high enough bitrate, still looks alright) can be done on 10- to 15-year-old hardware.

        Research is where it gets complicated. It does indeed often require a lot of computing power to do modern computational research. But for some simpler stuff - especially outside STEM - you can sometimes get away with a LibreOffice spreadsheet on an old Dell or something.

        From the looks of it we will have to get used to doing more with less when it comes to computers. And TBH I’m all for it. I just hope that either my job won’t require compiling a lot more stuff, or they provide me with a modern machine at their expense.

      • wonderingwanderer@sopuli.xyz 2 weeks ago

        Scientific modeling and simulations

      • UnderpantsWeevil@lemmy.world 2 weeks ago

        If this continues people will struggle to do any meaningful work on their personal computers

        Excel users devastated.

  • Mk23simp@lemmy.blahaj.zone 2 weeks ago

    Hey, I’ve seen this one before.

    Last time it was crypto instead of AI, but other than that it’s just the same shit again.

    • MonkderVierte@lemmy.zip 2 weeks ago

      Not really. This time it’s bigger, on the scale of the whole world economy.

  • M0oP0o@mander.xyz 2 weeks ago

    What could a GPU cost? $5000?

    • BreadstickNinja@lemmy.world 2 weeks ago

      [Image]

  • UltraBlack@lemmy.world 2 weeks ago

    We’re running straight into a future where consumers’ only option for computers is a cloud solution like MS 365.

    • WorldsDumbestMan@lemmy.today 2 weeks ago

      The only future is one where billionaires aren’t in it.

      • UnderpantsWeevil@lemmy.world 2 weeks ago

        Brother, we’re up to trillionaires now and they don’t seem like they’re going anywhere.

    • SocialMediaRefugee@lemmy.world 2 weeks ago

      Pushing constantly towards a subscription economy.

      • M0oP0o@mander.xyz 2 weeks ago

        That "economy" is already falling apart. Subscriptions are down, services on "the cloud" are becoming less reliable, piracy is way up again, and major nations and companies are moving to alternatives.

        Hell, DDR3 is making a comeback. All that is needed is one manufacturer to start making 15-year-old tech again and bam, the house of cards falls.

  • Paranoidfactoid@lemmy.world 2 weeks ago

    If you want to do work with the GPU, you’re still buying NVIDIA. Particularly for 3D animation, video/film editing, and creative tools. Even FOSS tools like GIMP and Krita prefer NVIDIA for GPU-accelerated functions.

  • boaratio@lemmy.world 2 weeks ago

    I know Radeons don’t really have the performance crown, but as a lifelong Nvidia GPU and Linux user, the PITA drivers are not a problem when you use an AMD Radeon card.

  • Bullerfar@lemmy.world 2 weeks ago

    Would this mean AMD finally gets the supply and demand it deserves?

    • Atherel@lemmy.dbzer0.com 2 weeks ago

      Unfortunately AMD is affected by the RAM shortage too.

      • Brkdncr@lemmy.world 2 weeks ago

        AMD is arguably more affected. Intel’s CPUs have memory built into them, and Intel bought about a year’s worth of memory.

  • anon_8675309@lemmy.world 2 weeks ago

    They’re AI only now.

  • sturmblast@lemmy.world 2 weeks ago

    This is gonna suck

  • bitwolf@sh.itjust.works 2 weeks ago

    So they can sell them for even more thanks to scarcity I’m guessing.
