Like I’m not one of THOSE. I know higher = better with framerates.
BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.
The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!
… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.
Yet like.
I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.
And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?
Probably consistency.
Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.
Modern games fluctuate so much that it pulls you right out.
Well a good chunk of it is that older games only had ONE way they were played. ONE framerate, ONE resolution. They optimized for that.
Nowadays they plan for 60 or 120, and if your hardware can't hit that, too bad. Upgrade for better results.
Game design is a big part of this too. First-person games, or anything else with fine camera control, feel especially bad when mouse movement lags.
I agree with what the other commenters are saying too: if it feels awful at 45 FPS, your 0.1% lows are probably down around 10 FPS.
I think it has something to do with frame times.
I’m not an expert, but I feel like I remember hearing that a low framerate where each frame takes a long time to actually reach the screen feels worse than a low framerate where frames get displayed promptly. Something to do with the delay before the frame is actually shown?
As some of the other comments mentioned, it’s probably also the framerate dropping in general too.
Stuttering, but mostly it’s the FPS changing.
Lock the FPS to just below the lowest point where it lags, and suddenly it won’t feel as bad, since it’s consistent.
The display being at a higher resolution doesn’t help either. Running retro games on my fancy flatscreen hi-def massive TV makes them look and feel so much worse than on the smaller, fuzzy CRT screens of the time.
I can’t stand modern games with lower frame rates. I had to give up on Avowed and a few other recent titles on the Series S because it makes me feel sick when turning the camera. I assume most of the later Xbox titles will be doing this, as they’re starting to push what the systems are capable of and the Series S can’t really cope as well.
Are you using an OLED screen?
I had to tinker with mine a fair bit before my PS1 looked good on it.
Stability of the FPS is even more important. Modern games have problems with that.
Because they all fuck with the frame timing in order to try to make the fps higher (on paper)
Also latency! Playing with the high input latency you typically get when a modern game engine chokes is awful.
It’s a few things, but a big one is the framerate jumping around (inconsistent frame time). A consistent 30 FPS feels better than 30, 50, 30, 60, 45, etc. Many games have a frame cap feature, which is helpful here. Cap the game at whatever rate you can consistently hit and your monitor can display. If you have a 60 Hz monitor, start with the cap at 60.
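Under the hood a cap is basically just this; a toy sketch of the idea, not how any real game or limiter tool actually implements it (the update/render callbacks are made up):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds each frame is allowed to take

def run_capped(update, render):
    """Toy frame limiter: never present frames faster than the cap,
    so frame pacing stays even instead of bouncing around."""
    while True:
        start = time.perf_counter()
        update()
        render()
        # Sleep off whatever is left of this frame's time budget.
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # real limiters busy-wait the last bit for precision
```

The point is that every frame gets the same time budget, even when the machine could occasionally go faster.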
Also, many games add motion blur, AI generated frames, TAA, and other things that really just fuck up everything. You can normally turn those off, but you have to know to go do it.
If you are on console, good fucking luck. Developers rarely include such options on console releases.
30, 50, 30, 60, 30… that’s FPS. Frame time means the time between each individual frame within that second.
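With some made-up numbers, that’s why an FPS counter can lie to you about how it feels:

```python
# Made-up frame times (in milliseconds) for roughly one second of gameplay:
# mostly smooth 16.7 ms frames with a handful of 66 ms hitches mixed in.
frame_times_ms = [16.7] * 40 + [66.0] * 5

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(f"FPS counter says: {avg_fps:.0f}")          # ~45
print(f"the hitches feel like: {worst_fps:.0f}")   # ~15
```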
It depends on what you’re running, but if the frame rate is rock solid and consistent, it often feels a lot less stuttery. Fallout games are not known for their stability and smooth performance, unfortunately.
For comparison, Deltarune came out a few days ago, and that’s locked to 30 FPS. Sure, it’s not a full 3D game or anything, but there’s a lot of complex motion in the battles and it’s not an issue at all. Compare that to something like Bloodborne or the recent Zeldas: even after getting used to the frame rate, they feel awful because they’re stuttering all the damn time.
I think player expectations play a big role here. It’s because you grew up with 20 FPS on Ocarina of Time that you accept how it looks.
I’m pretty sure that game is not a locked 20 FPS and can jump around a bit between 15 and 20, so the argument that it’s a locked 20 and therefore feels smooth doesn’t really convince me.
If that game came out today as an indie game it would be getting trashed for its performance.
Couple things. Frame timing is critical and modern games aren’t programmed as close to the hardware as older games were.
Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).
Lastly with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post processing and why there’s no “dumb” TVs anymore. Removing the post process improves input delay, but also removes everything that makes the image smoother, so higher frame rates are your only option there.
First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that’s a selling point that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.
Also, the inherent LCD latency thing is a myth; modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they’re faster than 60 Hz CRTs.
Edit: to be clear, this is the screen’s refresh rate, the game doesn’t need to run at hfr to benefit.
I don’t understand all the technicals myself but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, it can make low frame motion look very stilted.
Also, the inherent LCD latency thing is a myth; modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they’re faster than 60 Hz CRTs.
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
The LCD latency has to do with input polling and timing based on display latency and polling rates. Also, there’s added latency from things like wireless controllers as well.
The actual frame rate of the game isn’t necessarily relevant: if you run a 60 fps game on a 120 Hz display and enable black frame insertion, you’ll have reduced input latency at 60 fps, because doubling the display’s refresh rate also increases the polling rate, which is tied to frame timing. Black frame insertion or frame doubling doubles up each frame, cutting input delay roughly in half (not quite, because of overhead, but hopefully you get the idea).
This is why, for example, the Steam deck OLED has lower input latency than the original Steam Deck. It can run up to 90Hz instead of 60, and even at lowered Hz has reduced input latency.
Also, regarding LCD, I was referring more to TVs, since we’re talking about old games (I assumed consoles). Modern TVs do a lot of post-processing compared to monitors, and in a lot of cases there’s gonna be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LG’s, at as low as 8 or 9 ms, while Sony tends to be awful, sitting between 20 and 40 ms even in “game mode” with processing disabled.
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
Essentially, the speed of the beam determined how many lines you could display, and the more lines you tried to display, the slower the screen was able to refresh. So higher resolutions would have lower max refresh rates. Sure, a monitor could do 120 Hz at 800x600, but at 1600x1200, you could probably only do 60 Hz.
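With toy numbers (ignoring blanking intervals), that trade-off looks roughly like this; the scan limit here is hypothetical, just picked to match the example above:

```python
# What limits a CRT is roughly how many scanlines the beam can draw per second,
# not some fixed "refresh rate".
MAX_LINES_PER_SECOND = 72_000  # a hypothetical monitor's horizontal scan budget

def max_refresh_hz(vertical_lines):
    # More lines per frame -> fewer full frames per second.
    return MAX_LINES_PER_SECOND / vertical_lines

print(max_refresh_hz(600))   # 120 Hz at 800x600
print(max_refresh_hz(1200))  # 60 Hz at 1600x1200
```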
Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
That’s why I specified 60hz :)
I see that you meant TVs specifically, but I think it’s misleading to call processing delays “inherent”, especially since the LG TV you mentioned (which I assume runs at 60 Hz) is close to the minimum possible latency of 8.3 ms.
True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game but there are many.
Modern displays, even the fastest ones, have frame buffers for displaying color channels. That’s one link in the latency chain. Even if the output were otherwise just as fast as a CRT, this would cause more latency in 100% of cases, since CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
Modern games sometimes add delays to synchronize data between the CPU and GPU.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.
The latency numbers for displays, i.e. the 8-9 ms or 40 ms, include any frame buffer the display may or may not have. If the latency is less than the frame time, it’s safe to assume it isn’t buffering whole frames before displaying them.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
And sometimes less, like when vsync is disabled.
That’s not to say the game is rendered from top left to bottom right as it’s displayed, but since the render time has to fit within the frame time, you can be certain the render started at most one frame time before it finished, and it’s displayed on the next vsync (if vsync is enabled). That’s 22 ms for 45 fps, plus another 16 ms for a worst-case vsync miss and 10 ms of display latency, which makes 48 ms. Majora’s Mask at 20 fps would have 50 ms render + 8 ms display = 58 ms of latency, assuming it doesn’t miss vsync either.
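Same back-of-the-envelope numbers, just added up in one place so they’re easier to compare (nothing measured, all approximate):

```python
# Rough end-to-end latency budgets from the comment above.
def total_latency_ms(render_ms, vsync_miss_ms, display_ms):
    return render_ms + vsync_miss_ms + display_ms

fallout_4_at_45fps = total_latency_ms(1000 / 45, 16, 10)    # modern panel, worst-case vsync miss
majoras_mask_at_20fps = total_latency_ms(1000 / 20, 0, 8)   # CRT, assuming it makes vsync

print(f"{fallout_4_at_45fps:.0f} ms vs {majoras_mask_at_20fps:.0f} ms")  # ~48 vs ~58
```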
Old games’ animations were sometimes made frame by frame. Like, the guy who drew the character pixel by pixel decided “and in the next frame of this attack the sword will be here.”
Two things that haven’t quite been mentioned yet:
1) Real life has effectively infinite FPS, so you might expect that the closer a game is to reality, the higher your brain wants the FPS to be in order for it to make sense. This might not be true for everyone, but I imagine it could be for some people.
More likely: 2) If you’ve played other things at high FPS you might be used to it on a computer screen, so when something is below that performance, it just doesn’t look right.
These might not be entirely accurate on their own; some mix of these and the other things mentioned elsewhere might be at play.
Source: kind of an inversion of the above. I can’t focus properly if games are set higher than 30 FPS; it feels like my eyes are being torn in different directions. But then, the games I play are old or deliberately blocky, so they’re not particularly “real” looking, and I don’t have much trouble with real life’s “infinite” FPS.
Part of it is about how close you are to the target FPS. They likely made the old N64 games to run somewhere around 24 FPS, since that was an extremely common “frame rate” for the CRT TVs of the time. Therefore, the animations of, well, basically everything that moves in the game can be tuned to that frame rate. It would probably look like jank crap if they’d made the animations 120 frames for 1 second of animation, but they didn’t.
On to Fallout 4… Oh boy. Bethesda jank. The Creation Engine’s game speed is tied to frame rate. They had several problems at the launch of Fallout 76 because if you had a really powerful computer and unlocked your frame rate, you would be moving 2-3 times faster than you should have been. It’s a funny little thing to do in a single-player game, but absolutely devastating in a multiplayer game. So, if your machine is chugging a bit and the frame rate slows down, it isn’t just the rate at which new images appear that slows down, it’s the speed at which the entire game does stuff. It feels bad.
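Roughly the difference, as a toy sketch; this isn’t actual Creation Engine code, just the general idea, with made-up names and numbers:

```python
RUN_SPEED = 5.0           # units per second the game was balanced around
TUNED_STEP = 1.0 / 60     # the frame time the devs assumed

def move_frame_tied(position):
    # The jank way: advance a fixed amount every frame.
    # At 120 FPS you move twice as fast as intended; when the frame rate
    # chugs, the whole game world slows down with it.
    return position + RUN_SPEED * TUNED_STEP

def move_delta_timed(position, dt):
    # Frame-rate independent: scale by how long the frame actually took,
    # so game speed stays the same whether you render 20 or 240 FPS.
    return position + RUN_SPEED * dt
```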
And as others have said, frame time, dropped frames, and how stable the frame rate is make a huge difference in how it “feels”, too.
I have never come across a CRT whose native “frame rate” was 24
My favorite game of all time is Descent, PC version to be specific, I didn’t have a PlayStation when I first played it.
The first time I tried it, I had a 386sx 20MHz, and Descent, with the graphics configured at absolute lowest size and quality, would run at a whopping 3 frames per second!
I knew it was basically unplayable on my home PC, but did that stop me? Fuck no, I took the 3-floppy-disk installer to school and installed it on their 486dx 66MHz computers!
I knew it would just be a matter of time before I got a chance to upgrade my own computer at home.
I still enjoy playing the game even to this day, and have even successfully cross compiled the source code to run natively on Linux.
But yeah I feel you on a variety of levels regarding the framerate thing. Descent at 3 frames per second is absolutely unplayable, but 20 frames per second is acceptable. But in the world of Descent, especially with modern upgraded ports, the more frames the better 👍
Descent broke my brain. I’m pretty good at navigating in FPSes, but Descent’s 4 axes of movement just didn’t click for me. I kept getting lost. I recently tried it again after many years, and I just can’t wrap my head around it.
Same with space sims. I’m dog awful in dog fights.
Indeed, it’s not quite a game for everyone, especially if you’re prone to motion sickness. Initially it only took me about a half hour to get a feel for the game, but configuring the controls can still be a headache.
Every time I set the game up on a new or different system, I usually go for loosely the same sort of controls, but with each new setup I might change them up a bit, like an endless guessing-and-testing game to see what controls might be ideal, at least for me.
By the way, Descent is considered a 6 Degrees Of Freedom game, not 4. But hey, at least they have a map feature, I’d go insane without the map sometimes…
I meant 6, not sure why I typed 4.
Great games. FreeSpace was almost mind-blowing when I first played it as well.
I haven’t actually played FreeSpace before, but I did manage to get a copy and archive it a few years ago.
I also got a copy of Overload and briefly tried that, but on my current hardware it only runs at about 3 frames per second…
The Descent developers were really ahead of their time and pushing gaming to the extreme!
Definitely give it a shot. It’s obviously different, but I loved it. My mom actually banned me from playing Descent 3: Vertigo because she had vertigo and it made her sick.
Vertigo was actually an expansion for Descent 2. I made the NoCD patch for it via a carefully hex-edited mod based on another NoCD patch for the original Descent 2.
Any which way, yeah, anyone with vertigo wouldn’t be comfortable or oriented in any way if they’re watching or playing the game, no matter what version.
Descent is pretty fun. Not as big of a fan as you are, but I definitely dig it.