Comment on Why do low framerates *feel* so much worse on modern video games?
SolidShake@lemmy.world 9 months ago
Bro when Majora’s mask came out nothing was 60fps lol. We weren’t used to it like how we are today. I’m used to 80fps so 60 to me feels like trash sometimes.
Bro when Majora’s mask came out nothing was 60fps lol
Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.
Actually, 60.0988fps according to speed runners.
Wrong but also not completely wrong.
FPS and alternating current frequency are not at all the same thing
I was looking it up, and games like Super Mario World are allegedly at 60fps according to some random things on the internet
Because CRTs (and maybe other displays) are slaved to the input signal and insensitive to exact timing, and because console chipset designers used convenient line counts and clock frequencies, consoles often have frame rates slightly different from standard NTSC (which is 60000/1001, or ~59.94 fields per second).
The 262 AND A HALF lines per field that NTSC uses, to trick the dumb oscillator in a CRT into interlacing, are not convenient. “240p” moves the VSYNC pulse so every frame is a whole 262 lines, which slightly shortens the frame duration.
So NESes run at ~60.1 FPS.
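For the curious, here's a quick back-of-the-envelope check of that number, using the commonly cited NTSC NES clock and dot-count figures (those values come from NES hardware documentation, not from this thread):

```python
# NTSC NES timing (assumed, per commonly cited hardware docs):
# master clock = 6x the NTSC color subcarrier, the PPU runs at
# master/4 and draws 341 dots/scanline x 262 scanlines/frame.
subcarrier_hz = 315e6 / 88        # NTSC color burst, ~3.579545 MHz
master_hz = 6 * subcarrier_hz     # ~21.477 MHz master clock
ppu_hz = master_hz / 4            # ~5.369 MHz dot clock

dots_per_frame = 341 * 262        # 89,342 dots
print(ppu_hz / dots_per_frame)    # ~60.0985 fps

# The oft-quoted 60.0988 accounts for the PPU skipping one dot every
# other frame while rendering: 89,341.5 dots per frame on average.
print(ppu_hz / 89341.5)           # ~60.0988 fps
```

Either way: slightly above 60, and measurably above NTSC's 59.94.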
The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or 50 times per second in Europe), but that’s irrelevant if the game only throws 20 new frames per second at the TV. The effective frame rate will still be 20 fps.
That’s just a possible explanation. I don’t know what the refresh rate of Majora’s Mask was.
I’m pretty sure 16-bit era games were generally 60 FPS.
Framerates weren’t really a thing before consoles had frame-buffers.
The first console with a framebuffer was the 3DO. The first console people cared about with a framebuffer was the PSX.
Before that, you were in beam-racing town.
If your processing couldn’t keep up with the TV’s refresh rate (60i/30p in NTSC territories, 50i/25p in PAL), things didn’t get stuttery or drop frames like modern games do. They’d either literally run in slow motion, or stop displaying sprites (often both!).
You had the brief window of the TV’s HBlank and VBlank intervals to do your calculations and get the next frame ready.
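A toy sketch of that loop in Python, just to show the timing behavior (obviously not real console code; the field rate and work durations are made up for illustration):

```python
import time

FIELD_HZ = 60.0988             # NTSC-ish field rate
FIELD_S = 1.0 / FIELD_HZ       # ~16.6 ms per field

def run(frames, work_s):
    """Vblank-synced loop: do `work_s` worth of 'game logic',
    then wait for the next vertical blank before continuing."""
    start = time.perf_counter()
    deadline = start + FIELD_S
    for _ in range(frames):
        time.sleep(work_s)                 # stand-in for game logic
        # If the work overran this field, sync to a later one. Nothing
        # tears or stutters -- the game just advances state less often,
        # so everything literally moves in slow motion.
        while deadline < time.perf_counter():
            deadline += FIELD_S
        while time.perf_counter() < deadline:
            pass                           # busy-wait for "vblank"
        deadline += FIELD_S
    elapsed = time.perf_counter() - start
    print(f"{frames} updates in {elapsed:.2f}s -> {frames/elapsed:.1f} Hz")

run(60, work_s=0.010)   # fits in one field: ~60 updates/sec
run(60, work_s=0.020)   # overruns every field: ~30 updates/sec
```

(`time.sleep` is coarse on some OSes, so the first case may land a touch under 60; the point is that overrunning a field halves the update rate rather than dropping a frame.)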
Buuuut, as of the PSX/N64/Saturn, most games were running anywhere between 15 and 60 FPS, with most sitting in the 20s.
Yeah, but even now you can go back and play Majora’s Mask and it doesn’t feel bad.
But as mentioned, the real thing is consistency, along with the scale of the action, the pace of the game, etc. Zelda games weren’t sharp, pinpoint-control games like, say, a modern FPS; gameplay was fairly slow. And the second factor is simply that games that were 20 FPS were made to run at a 100% consistent 20 FPS. A game locked at 20 will feel way smoother than one that alternates between 60 and 45.
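To make that concrete with made-up but representative numbers: on a 60 Hz display each game frame has to be held for a whole number of refreshes, so the frame *times* are what you actually feel:

```python
REFRESH_MS = 1000 / 60          # ~16.7 ms per refresh on a 60 Hz display

def frame_times_ms(refresh_counts):
    """How long each game frame stays on screen, given how many
    display refreshes it's held for."""
    return [round(n * REFRESH_MS, 1) for n in refresh_counts]

# Locked 20 fps: every frame is held for exactly 3 refreshes.
print(frame_times_ms([3, 3, 3, 3]))        # [50.0, 50.0, 50.0, 50.0]

# "45 fps" on a 60 Hz display: frames get held for 1, 1, 2 refreshes,
# so the cadence lurches between 16.7 ms and 33.3 ms.
print(frame_times_ms([1, 1, 2, 1, 1, 2]))  # [16.7, 16.7, 33.3, ...]
```

The locked 20 is a perfectly even 50 ms heartbeat; the “45” keeps changing rhythm, and that rhythm change is what reads as judder.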
Games just don’t get optimized like they used to. The slack has to be compensated for with computing power, i.e. by the end user, and the reasons are cost. Beyond that, the scope of games has become much greater, making optimization more time-consuming and therefore more expensive. With consoles there’s also the fact that optimizations have to target one specific hardware configuration, unlike PCs, where the range of available components keeps growing.
Ackshuli – By late 2000 there were a couple games on PC that could get there.
… If you were playing on high-end hardware. Which most PC gamers were not. (Despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes pushed to their crying limits trying to run games they were never meant to.)
tomkatt@lemmy.world 9 months ago
F-Zero X ran at 60 fps. Also Yoshi’s Story, Mischief Makers, and probably a few others.
Also, the PS1 had many games that ran at 60 fps, too many to list here in a comment.