
Xbox 360/PS3/(to a lesser extent) Wii owners represent

204 likes

Submitted 2 days ago by heythatsprettygood@feddit.uk to [deleted]

https://feddit.uk/pictrs/image/3cc262d2-3044-4211-98f3-8db3a6449115.webp


Comments

  • Thorry84@feddit.nl 2 days ago

    Are you referring to the red ring of death on the Xbox? Because that has absolutely nothing to do with ATI. They just made the chips; it’s Microsoft that put them on the board. Most of the issues were caused by a poor connection between the chip and the board, and there wasn’t a hell of a lot ATI could do about that.

    A lot of it was engineers underestimating the effect of sustained thermals between 80 and 95 °C, with cool-down cycles in between. The thinking was that this would be just fine and wouldn’t be an issue. It turned out it was, so they learnt from that and later generations didn’t really have the same problem.

    • heythatsprettygood@feddit.uk 2 days ago

      As far as I am aware, the 360 GPUs had faulty solder connections between the chips and the interposer, not between the interposer and the board, due to a poor underfill choice by ATI that couldn’t withstand the temperature. This is shown by the fact that a lot of red ring 360s show eDRAM errors (i.e. the GPU can’t communicate with the module on the same interposer, which rules out poor board connections). Microsoft even admitted this in a documentary they made (link), where they said it wasn’t the board balls, it was the GPU-to-interposer balls. A similar underfill choice is also why early Wiis have slightly higher failure rates, although nowhere near as bad as the 360 thanks to the low power of the GPU in there.

      • cepelinas@sopuli.xyz 2 days ago

        I thought TSMC chose the poor underfill.

      • jyl@sopuli.xyz 2 days ago

        I don’t know how much of it was ATI’s fault or the fab’s, but my understanding is that no one had experience handling that amount of heat.

  • ColeSloth@discuss.tchncs.de 1 day ago

    Wiis and PS3s weren’t crapping out, and the 360 failures weren’t due to ATI. This meme is dumb. Dumb in the bad-meme kind of way.

    • Sylvartas@lemmy.dbzer0.com 1 day ago

      GameCubes also had ATI GPUs. I never heard of any issues with those.

    • heythatsprettygood@feddit.uk 1 day ago

      The Wii was mostly okay, but boards with a 90 nm Hollywood GPU are somewhat more likely to fail than later boards with the 65 nm Hollywood-A (RVL-CPU-40 boards and later), especially if you leave WiiConnect24 on, as it keeps the Starlet ARM chip inside active even in fan-off standby. Most 90 nm consoles will be fine thanks to low operating temperatures, but some (especially as thermal paste ages and dust builds up) are more likely to die from bumpgate-related problems.

      PS3s did crap out with yellow lights of death, although not as spectacularly as 360 red rings (a lower proportion failed, thanks to beefier cooling and a different design that made the flaws less immediately obvious, but it was still a problem). NVIDIA made the same mistakes on the RSX as ATI did on the Xenos: a poor underfill and bump choice that could not withstand the thermal cycles, which should have been caught (NVIDIA and bumpgate is a whole wild story in and of itself, considering it plagued their desktop and mobile chips too). The Cell CPU in there is very reliable though, even though it drew more power and consequently output more heat; it was just the GPU that could not take it.

      360s mostly red ringed due to faulty GPUs; see the previous comments about the PS3’s RSX. ATI had a responsibility to choose the right materials, design, and packaging partner before shipping to Microsoft for final assembly, so they must take some responsibility (like NVIDIA, they also had trouble with their other products at this time, leading to high failure rates in devices like the early MacBook Pros). However, it is unknown whether they are fully to blame, as it is unknown who made the call on the final package design.

      • DacoTaco@lemmy.world 1 day ago

        Ok so, let me set this all straight.
        The Wii issue had nothing to do with the GPU itself but with the die package that Nintendo had designed and kept secret.
        Inside the GPU package is not just the GPU (Hollywood) but also an ARM core called Starlet. It runs the security software, and that’s where things went wrong (rarely, but it happened), as it was always running code, even in standby. This had nothing to do with ATI.

        And the PS3 was not what you said. The PS3’s problem was that the IHS wasn’t making good enough contact with the core, so the heat of the CPU didn’t transfer well into the cooler. You can fix this, but it’s very tricky, and it’s easy to permanently damage the PS3 in doing so (you have to cut the silicone under the IHS without touching the die or the PCB, remove it, and reattach the IHS with less glue). This could be attributed to the manufacturer, I suppose.

      • ColeSloth@discuss.tchncs.de 1 day ago

        I dunno, man. I never knew anyone who had a yellow-light PS3, and the only ones I read about were from people who had kept them in enclosed cabinets. I also watched a very in-depth two-hour documentary on the 360 RROD, and it wasn’t due to ATI.

    • zipzoopaboop@lemmynsfw.com 1 day ago

      It is a shitpost, to be fair.

  • NotAGamer@lemmy.org 2 days ago

    Apparently you are unaware of the shitstorm that NVIDIA has been lately.

    • heythatsprettygood@feddit.uk 2 days ago

      Oh, NVIDIA have always been a shitstorm. From making defective PS3 GPUs (the subject of this meme) to the constant hell that is their Linux drivers to melting power connectors, I am astounded anyone trusts them to do anything.

      • bodaciousFern@lemmy.dbzer0.com 1 day ago

        CEO Jensen Huang after reading your disparaging remarks about his company:

        Image

  • edinbruh@feddit.it 1 day ago

    The PS3 doesn’t have an ATI GPU. TL;DR: it’s NVIDIA.

    The PS3 has a weird, one-of-a-kind IBM processor called Cell. You can think of it as a kind of hybrid design that is both a CPU and a GPU (not “a chip with both inside” but “a chip that is both”), meant for multimedia and entertainment applications (like a game console). It was so peculiar that developers took a long time to learn how to use it effectively. Microsoft didn’t want to risk it, so they went with a different CPU, also from IBM, that shared some of the Cell’s design but without the special GPU-ish parts, and paired it up with an ATI GPU.

    Now, Sony wanted to get away with only the Cell and use it as both CPU and GPU, but various tests showed that, despite everything, it wasn’t powerful enough to keep up with the graphics they expected. So they reached out to NVIDIA (not ATI), who designed a modified version of the 7800 GTX to work together with the Cell. To fully utilise the PS3’s graphics hardware, one would have to mainly use the GPU for graphics and assist it with the special Cell hardware. Which is harder.

    • heythatsprettygood@feddit.uk 1 day ago

      Ah, I should have made that clearer in the meme. Both NVIDIA and ATI messed up badly in this era. Sony’s efforts with the Cell are always so fascinating: there is so much potential in it (just look at the late PS3-era games), but they just could not get it to the point of supplanting the GPU.

  • Kolanaki@pawb.social 1 day ago

    Until they got bought by AMD, ATI cards were more reliable than nVidia cards, which were prone to bursting into flames.

  • spicytuna62@lemmy.world 2 days ago

    Still got my PS3. What a great console. Uncharted just has no business looking as good as it does running on hardware as old as the PS3.

  • darkevilmac@lemmy.zip 2 days ago

    It’s more that team red has been making their cards competitive on price and performance lately.

    • heythatsprettygood@feddit.uk 2 days ago

      AMD have been amazing lately. The 9070 XT makes buying most other cards in that price range pointless, especially with NVIDIA’s melting connectors being genuine hazards. ATI (dissolved in 2010 after being bought out by AMD) and NVIDIA in the mid-to-late 2000s, however, were dumpster fires in their own ways.

      • darkevilmac@lemmy.zip 2 days ago

        That depends on whether you can actually find a 9070 XT at the price they advertised. Once that happens I’ll be convinced; right now, though, it has very much felt like a bit of a bait and switch. Holding out hope though.

  • deegeese@sopuli.xyz 2 days ago

    How old is this meme? Older than that kid, I think.

  • CurlyWurlies4All@slrpnk.net 1 day ago

    I’ve been refurbishing my PS3. It’s running like a dream.
