Intel announced plans to start making GPUs, challenging NVIDIA's dominance

289 likes

Submitted 2 days ago by Innerworld@lemmy.world to technology@lemmy.world

https://techcrunch.com/2026/02/03/intel-will-start-making-gpus-a-market-dominated-by-nvidia/

Comments

  • wioum@lemmy.world 2 days ago

    I had to check the date on the article. They’ve been making GPUs for 3 years now, but I guess this announcement, although weird, is a sign that Arc is here to stay, which is good news.

    • tomalley8342@lemmy.world 2 days ago

      This article was based on what the CEO said at the Second Annual AI Summit, following the news of their newly hired head of GPU, who says he “will lead GPU engineering with a focus on AI at Intel”. The AI pivot is the actual news.

      • SinningStromgald@lemmy.world 2 days ago

        Just what every consumer needs. More AI focused chips.

        Intel is just trying to cash in on the AI hype to buoy the sinking ship, as far as investors are concerned.

      • Reygle@lemmy.world 12 hours ago

        focus on AI

        Never mind guys, it’s a nothing burger

      • CosmoNova@lemmy.world 1 day ago

        Oh, so they will actually not focus on GPUs as end-consumer products for you and me. They’re just like Nvidia and AMD. This news really just shows how cooked gaming is.

      • CIA_chatbot@lemmy.world 2 days ago

        It feels like TechCrunch is allowing a drunk AI to write all its articles now.

      • AdrianTheFrog@lemmy.world 2 days ago

        It’s not even a pivot. They’ve been focusing on AI already. I’m sure they want it to seem like a pivot (and build up hype); apparently just having the hardware and software wasn’t enough before. Nobody cared when the Gaudi cards came out, nobody uses SYCL or oneDNN, etc.
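
        (For context, since SYCL keeps coming up: it’s the Intel-backed C++ model that competes with CUDA. A minimal vector-add sketch, assuming a oneAPI/DPC++ toolchain; illustrative only, nothing here is from the article:)

        ```cpp
        #include <sycl/sycl.hpp>
        #include <iostream>
        #include <vector>

        int main() {
            sycl::queue q;  // default selector: an Intel GPU if present, else the CPU
            constexpr size_t n = 1 << 20;
            std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);
            {
                // Buffers make the host vectors visible to the device within this scope
                sycl::buffer bufA(a), bufB(b), bufC(c);
                q.submit([&](sycl::handler &h) {
                    sycl::accessor A(bufA, h, sycl::read_only);
                    sycl::accessor B(bufB, h, sycl::read_only);
                    sycl::accessor C(bufC, h, sycl::write_only, sycl::no_init);
                    // One work-item per element: the basic GPGPU pattern
                    h.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
                        C[i] = A[i] + B[i];
                    });
                });
            }  // buffer destructors wait for the kernel and copy results back
            std::cout << "c[0] = " << c[0] << "\n";  // prints c[0] = 3
            return 0;
        }
        ```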

      • ParlimentOfDoom@piefed.zip 2 days ago

        Weird, they’re a bit late boarding this train as it’s already starting to derail… MS just stumbled hard as their AI shit isn’t paying off and is driving consumers away.

    • BarbecueCowboy@lemmy.dbzer0.com 2 days ago

      The actual chips are farmed out to TSMC; I don’t believe they’ve made any in house, so I’m guessing maybe they’ve decided they’re going to do that sometimes now? But then, even some of their CPUs are made by TSMC, so I could be on a very wrong path.

      • ag10n@lemmy.world 2 days ago

        TSMC is how they stay competitive; that’s what everyone else uses

        Intel is still catching up with 18A

        The 18A production node itself is designed to prove that Intel can not only create a compelling CPU architecture but also manufacture it internally on a technology node competitive with TSMC’s best offerings.

        tomshardware.com/…/intels-18a-production-starts-b…

      • UnfortunateShort@lemmy.world 2 days ago

        They want to make Celestial on 18A, no?

    • fleem@piefed.zeromedia.vip 2 days ago

      thanks for your effort

  • imetators@lemmy.dbzer0.com 1 day ago

    Like if ARC has never existed before?

    • Rooster326@programming.dev 1 day ago

      If what‽

      • imetators@lemmy.dbzer0.com 23 hours ago

        Intel Arc is a GPU brand by Intel whose cards are half the price of a typical Nvidia card at almost the same performance. They’ve been unpopular due to shaky drivers, but they have never been canceled. So, stating that Intel will finally enter the GPU market is just plain misleading.

      • treesquid@lemmy.world 1 day ago

        English clearly isn’t their first language, but the intent is pretty obviously “As if they aren’t already making ARC GPUs?”

  • Reygle@lemmy.world 1 day ago

    Am I living in an alternate timeline? They’ve been making GPUs for quite some time, and the B580 was actually pretty good, incredibly good for the price.

  • Jaysyn@lemmy.world 1 day ago

    I guess the Arc a750 in my workstation is imaginary?

  • Goodeye8@piefed.social 2 days ago

    Well, that article was a waste of space. Intel has already stepped into the GPU market with their Arc cards, so at the very least the article should contain a clarification of what the CEO meant.

    And I see people shitting on the Arc cards. The cards are not bad. Last time I checked, the B580 had performance comparable to the 4060 for half the cost. The hardware is good; it’s simply meant for budget builds. And of course the drivers have been an issue, but drivers can be improved, and last time I checked Intel is actually getting better with theirs. It’s not perfect, but we can’t expect perfect. Even the gold standard of drivers, Nvidia, has been slipping in the last year.

    All this is to say: I don’t understand the hate. Do we not want competition in the GPU space? Are we supposed to have Nvidia and AMD forever, until AMD gives up because it becomes too expensive to compete with Nvidia? I’d prefer it to be someone other than Intel, but as long as the price comes down I don’t care who brings it down.

    And to be clear, if Intel’s new strategy is keeping prices as they are, I’m all for “fuck Intel”.

    • Sineljora@sh.itjust.works 2 days ago

      The USA owns 10% of the company, which might turn some people off.

      • gravitas_deficiency@sh.itjust.works 2 days ago

        This is a big part of it, imo. They kissed the ring.

        The other part of it is that, per the article, this is an “AI” pivot. This is not them making more consumer-oriented GPUs. Which is frustrating, because they absolutely could be a viable competitor in low-mid tier if they wanted to. But “AI” is (for now) much more lucrative. We’ll see how long that lasts.

    • ZeDoTelhado@lemmy.world 2 days ago

      The CPU overhead issue is quite well known and does real damage to the Arc cards’ position in the budget class.

  • REDACTED@infosec.pub 20 hours ago

    Slowpoke news

  • ApplyingAutomation@lemmy.world 1 day ago

    Image

  • Itdidnttrickledown@lemmy.world 1 day ago

    The problem with Intel: they never just keep going. They announce some new GPU/graphics product, and when it falls short they don’t or won’t stick with it. They abandon it and use it as a write-off. They have done this multiple times, and I have no reason to believe they will do anything different. The last time was just a few years ago; when sales and performance lagged, they just quit.

  • ShinkanTrain@lemmy.ml 1 day ago

    Tom Petersen, who’s been working at Intel for half a decade: Am I a joke to you?

    • bhamlin@lemmy.world 1 day ago

      No, now they’re going to make good video cards!

  • RememberTheApollo_@lemmy.world 1 day ago

    Oh great, some wildly overpriced and underperforming GPUs.

    • PrivateNoob@sopuli.xyz 1 day ago

      At least it’s a third contender. AFAIK the Arc series had decent enough pricing, although AMD’s prices seemed better, but I’m not sure.

    • Zetta@mander.xyz 1 day ago

      “oh great, competition in a market with no competition. Horrible.”

      Intel has already been making discrete GPUs for two generations; they’re very cheap, and while they aren’t the most performant, they’re fantastic for the price.

    • notthebees@reddthat.com 1 day ago

      They’ve been quite good on the pricing front?

    • anon_8675309@lemmy.world 1 day ago

      They won’t be for you.

  • Diplomjodler3@lemmy.world 2 days ago

    What the fuck? What kind of idiotic article is that? Did TechCrunch go down the drain too?

    • LodeMike@lemmy.today 2 days ago

      The comma should be replaced with " which"

  • Darkness343@lemmy.world 1 day ago

    Oh no, Nvidia’s pet is rebelling. Maybe they should be reminded of their current status.

  • angrywaffle@piefed.social 2 days ago

    Doesn’t Nvidia have a $5B stake in Intel? I wonder how that influences their decisions.

    • Innerworld@lemmy.world 2 days ago

      So does the government.

  • tal@lemmy.today 2 days ago

    I don’t know if “GPUs” is the right term, but the only area where we’re seeing large gains in computational capacity now is parallel compute, so I’d imagine that if Intel intends to be doing high-performance computation stuff, they probably want to be doing parallel compute too.

    • badabim@lemmy.world 1 day ago

      The term you’re looking for is GPGPU (general-purpose computing on GPUs).

  • DelightfullyDivisive@discuss.online 2 days ago

    It isn’t much of a challenge if they suck. Just planning to make them doesn’t mean shit.

    Also, why do none of these articles have a summary posted for them? These are some seriously low-effort posts.

  • TropicalDingdong@lemmy.world 2 days ago

    Not gonna make a lick of difference without the support to run CUDA.

    • woelkchen@lemmy.world 2 days ago

      ZLUDA exists.

      • AdrianTheFrog@lemmy.world 1 day ago

        Intel GPU support?

        ZLUDA previously supported Intel GPUs, but not currently. It is possible to revive the Intel backend. The development team is focusing on high‑quality AMD GPU support and welcomes contributions.

        Anyways, no actual AI company is going to buy $100M of AI cards just to run all of their software through an unfinished, community-made translation layer, no matter how good it becomes.

        oneAPI is decent, but apparently fairly cumbersome to work with, and people prefer to write software in CUDA as it’s the industry standard (and the standard in academia).

  • devolution@lemmy.world 2 days ago

    You mean non-shit non-Arcs? They already tried and failed with Battlemage.

  • thedeadwalking4242@lemmy.world 2 days ago

    Aren’t TPUs like dramatically better for any AI workload?

    • AdrianTheFrog@lemmy.world 1 day ago

      Intel’s Gaudi 3 datacenter GPU from late 2024 advertises about 1800 TOPS in fp8, at 3.1 TOPS/W. Google’s mid-2025 TPU v7 advertises 4600 TOPS fp8, at 4.7 TOPS/W. That’s a difference, but not that dramatic of one. The reason it’s so small is that GPUs are basically TPUs already; almost as much die space is allocated to matrix accelerators as to the actual shader units, I have heard anecdotally.
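
      Taking both spec sheets at face value, the advertised gap works out to roughly

      $$\frac{4600\ \text{TOPS}}{1800\ \text{TOPS}} \approx 2.6\times \text{ (raw fp8 throughput)}, \qquad \frac{4.7\ \text{TOPS/W}}{3.1\ \text{TOPS/W}} \approx 1.5\times \text{ (per-watt efficiency)}$$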

      • thedeadwalking4242@lemmy.world 1 day ago

        At scale the power efficiency is probably really important though

  • ag10n@lemmy.world 2 days ago

    Been looking at their Arc B50/B60, but they’re still too expensive in Canada.

  • DrFistington@lemmy.world 2 days ago

    Good luck fucking things up like you always do

    • MentalEdge@sopuli.xyz 2 days ago

      Wut?

      Alchemist and Battlemage cards were fine.

  • Paragone@piefed.social 2 days ago

    From what I’ve read about the “quality” of their drivers… NVidia isn’t under any threat whatsoever.

    Years pass before bugs get fixed, etc.

    (Linux, not MS-Windows, but it’s Linux where the big compute gets done, so that’s relevant.)

    See https://www.phoronix.com/review/llama-cpp-vulkan-eoy2025/5 for some relevant graphs: Intel isn’t a real competitor, and while they may work to change that, the lag behind NVidia is seriously bad.

    _ /\ _

  • tidderuuf@lemmy.world 2 days ago

    At least they are admitting the Intel Arc was more of a joke than a graphics card.

    • RejZoR@lemmy.ml 2 days ago

      Intel Arc is no joke. Technologically it’s very capable; they just never really scaled it to compete at any higher level…

      • AdrianTheFrog@lemmy.world 1 day ago

        At the datacenter scale Gaudi 3 was pretty good, at least when it came out.
