Like I’m not one of THOSE. I know higher = better with framerates.
BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.
The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!
… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.
Yet like.
I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.
And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?
That’s why I specified 60 Hz :)
I see that you meant TVs specifically, but I think it’s misleading to call processing delays ‘inherent’, especially since the LG TV you mentioned (which I assume runs at 60 Hz) is close to the minimum possible latency of 8.3 ms.
True, but even that is higher than the latency of the original systems on a CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game, but there are many potential sources.
Modern displays, even the fastest ones, have frame buffers for displaying color channels. That’s one link in the latency chain. Even if the output were otherwise just as fast as a CRT’s, this would add latency in 100% of cases, because CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
Modern games sometimes add delays to synchronize data between the CPU and GPU.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex, and despite being faster, the communication overhead between components on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone adds that much latency, but in aggregate it can add up (rough numbers sketched below).
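To make the “it adds up” point concrete, here’s a toy sketch. The per-stage numbers are completely made up (real values vary wildly per game, GPU, and display); the point is only how a handful of small buffers and sync points stack into something you can feel:

```python
# Toy illustration of how several small latency sources accumulate.
# Every number below is a hypothetical placeholder, not a measurement.
latency_sources_ms = {
    "game simulation / input sampling": 16.7,  # roughly one 60 Hz tick
    "CPU-GPU sync / render queue":      16.7,  # roughly one buffered frame
    "GPU frame buffer / vsync wait":    16.7,  # up to one more frame
    "display processing + scan-out":    10.0,  # varies a lot by TV/monitor
}

for name, ms in latency_sources_ms.items():
    print(f"{name:36s} {ms:5.1f} ms")

total = sum(latency_sources_ms.values())
print(f"{'total (worst-case-ish)':36s} {total:5.1f} ms")
```

Each stage on its own sounds harmless, but the sum is several whole frames of delay between pressing a button and seeing the result.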
The latency numbers for displays, i.e. the 8-9 ms or 40 ms figures, include any frame buffer the display might or might not have. If the latency is less than the frame time, it’s safe to assume the display isn’t buffering whole frames before displaying them.
And sometimes less, like when vsync is disabled.
That’s not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, one can be certain the render started at most one frame time before it finished, and it is displayed on the next vsync (if vsync is enabled). That’s 22 ms for 45 fps, another 16 ms for a worst-case vsync miss, and 10 ms of display latency, which makes it 48 ms. Majora’s Mask at 20 fps would have 50 ms of render + 8 ms of display = 58 ms of latency, assuming it, too, doesn’t miss vsync.