The late 90s and early 2000s were a time of rapid advances in game graphics.
We went from DOOM in 1993, with sprite enemies, abstract textures, and technical limits like not even being able to stack rooms on top of each other, to Half-Life in 1998, with fully 3D characters and objects, physics, and much higher-resolution textures.
Jumps in graphics back then could be huge. As graphics get better, though, further improvements run into diminishing returns. It stops being a leap from 2D to 3D, or from flat head models with textures pasted on to fully modeled faces, and starts being things like subsurface light scattering. Things will keep looking better and better, but we hit a solid baseline long ago.
Mass Effect was made on Unreal Engine 3. While we're currently on Unreal Engine 5, plenty of games released in the intervening years used Unreal 3 or a modified version of it.
kakes@sh.itjust.works 10 months ago
I would argue there’s some merit to catching the cultural “wave” of a new AAA release every now and again.
Obviously I don’t do it often, but I recently picked up Baldur’s Gate 3, and it’s been fun to talk to people about it at work and such.
tuckerm@supermeter.social 10 months ago
That's a really good point. Sometimes the fun you can have with a game's "multiplayer" community isn't in the game itself.
Baldur's Gate 3 is probably the best example I can think of. (And I don't even own it, though it's really tempting for the reason you just gave.) I overheard two people talking about it at a coffee shop today, and three people talking about it on the train a couple weeks ago. I can't think of any other game that's been like this.