
Topic: The framerate hill I'm willing to die on.

Posts 21 to 30 of 30

kkslider5552000

Now 60 FPS is objectively cool and based, but so are many things that aren't in...most games I like.

Oh well.

Not a hill I'd die on, seems like a pathetic reason to die. Imagine your family being told why you died. Because of 60 FPS video games. Shameful.

Non-binary, demiguy, making LPs, still alive

Megaman Legends 2 Let's Play!:
LeT's PlAy MEGAMAN LEGENDS 2 < Link to LP

FishyS

I can't say I have ever noticed the difference between a 30 fps game and a 60 fps game. I can definitely tell when the frame rate is like 7 frames per second (cough parts of Pokemon Violet) but for the subtlety between a solid 30 and 60... I can never decide whether my eyesight is just bad or if everyone else is just hallucinating. 😆

As a related note, the medical profession doesn't even seem to consistently agree that all people can see at a rate of 60 fps. It seems to vary by person how fast visual cues can be picked up and, completely separately, how fast brains interpret them. For more complicated visual data, brains seem to take about 1/10th of a second just to process it, and for simpler things more like 1/100th. Again, that also varies by person, and the research doesn't seem to be fully settled. Two things out of this:

  1. The difference might in fact be very noticeable to one person but literally, physically impossible for another to notice.
  2. Although real human processing isn't literally that discrete, round numbers like 60 still might not be the best brain-wise. Maybe 43.5 is the sweet spot for one person, 67 for another and 28 for a third.

FishyS

Switch Friend Code: SW-2425-4361-0241

skywake

@FishyS
Yeah, there's some truth in that. I think the trap a lot of people fall into when talking about "what the human eye can see" is forgetting that we're talking about an organic device here. There's not really a "framerate" for eyes; it doesn't work like that. Instead what we have is a light receptor that takes some time to react to a change in stimulus, combined with a brain that's incredibly good at "filling in the gaps". There's not really a "resolution" of the eye either: there's a limited number of receptors, but they're not evenly distributed, your eye constantly moves around and, again, your brain fills in the gaps.

And your brain, your brain is the key bit here. You know what the brain is also pretty good at? Taking shortcuts. You can learn to be good at picking this stuff up; you can also learn to ignore it. It's like how you can fool yourself into thinking your vision is better than it actually is by reading text at a distance, because your brain knows what shapes letters generally are and what that word probably is. So you have to add all of that mental gap-filling on top of what's an entirely organic process to begin with. It gets a bit fuzzy.

But I think fundamentally there's an even bigger problem with how @Ironcore presented this question. This here is not at all showing what the negative impacts of "lower framerates" are:
[image from @Ironcore's post]
What it is showing is a relatively slow pixel response time, either from the camera taking that picture or from the screen itself. This happens in the transitions between frames, so, somewhat perversely, the phenomenon shown in this image happens MORE often (although admittedly less pronounced) at higher framerates.

It's also the same kind of motion blur that occurs when recording video, which is what makes 24fps acceptable. So in some sense this is a phenomenon that improves the perception of motion, if done correctly. But going back to this specific "problem"... it's frame persistence, not framerate. The solution is a technology with a faster pixel response time, like OLED, and things like Black Frame Insertion. Higher framerate doesn't really factor into it.
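
To put rough numbers on the persistence point, here's a quick Python sketch. The 8ms pixel response is an assumed, purely illustrative figure, not measured from any real panel:

```python
# Illustrative only: assume a fixed 8ms pixel response time and see how much
# of each frame's time on screen is spent mid-transition.
def transition_share(fps, response_ms=8.0):
    frame_ms = 1000.0 / fps
    return min(response_ms, frame_ms) / frame_ms

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: frame lasts {1000/fps:5.1f} ms, "
          f"{transition_share(fps):.0%} spent transitioning")
```

The transition takes the same 8ms either way; at higher framerates it just happens more often and covers a smaller slice of motion each time.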

Edited on by skywake

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

Rambler

@skywake
Back in the day, if a computer was shown on TV, the monitor screen was all messed up. I always reckoned that was to do with the resolution of the monitor vs the filming medium.
I guess your reasoning above is similar? It's the disparity between the recording medium and the game.
I suppose if the game was at 30fps and the screen was at, say, 80Hz, something like this would occur?

Rambler

FishyS

@skywake Brain training is a good point, one that medical studies on this type of topic often ignore, focusing instead on the easier-to-test static case.

It makes me wonder if people who only play Switch have inadvertently trained their brains to be better at frame interpolation than people who play e.g. high-FPS PC games.

FishyS

Switch Friend Code: SW-2425-4361-0241

Greatluigi

I seriously don’t get why people complain about games that run at 30 FPS. As long as it’s a stable frame rate, it’s perfectly fine.

Greatluigi

skywake

@Rambler
Yeah, it's a similar sort of thing, not quite but similar. Basically, if you have a monitor running at 60Hz and your game running at 30fps, the image only updates on every second refresh. At a locked 30fps, the image on screen changes every ~33ms. But if you miss that window you have to either sit around and wait for the next refresh cycle, which means you're waiting another 16ms on top of the 33ms, or you display your partially updated frame. Judder or tear, those are your options.

That's what I meant in my first post. 30fps locked? That's fine. You get a consistent 33ms between every single frame. But if you drop to 29fps, roughly every 15th frame takes 50ms, so that frame sits on the screen for an additional 16ms. That's what you notice. But if you target 60fps on a 60Hz display and miss it by the same amount, i.e. you only get 58fps? You're generally getting a new frame every ~17ms, and only roughly every 30th frame jumps to 33ms.
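
Here's a little Python toy model of that, if anyone wants to see the numbers fall out. It just snaps each finished frame to the next 60Hz refresh tick and counts the on-screen intervals (the frame count and perfectly steady render times are simplifying assumptions):

```python
import math
from collections import Counter

REFRESH_MS = 1000.0 / 60.0   # one 60Hz refresh tick, ~16.7ms

def display_intervals(render_fps, n_frames=120):
    """Snap each finished frame to the next refresh tick (V-Sync)."""
    render_ms = 1000.0 / render_fps
    shown = []
    for i in range(n_frames):
        done = (i + 1) * render_ms                           # frame is ready
        tick = math.ceil(done / REFRESH_MS - 1e-9) * REFRESH_MS
        if shown and tick <= shown[-1]:                      # one frame per tick
            tick = shown[-1] + REFRESH_MS
        shown.append(tick)
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

for fps in (30, 29, 60, 58):
    print(f"{fps} fps -> on-screen intervals: {Counter(display_intervals(fps))}")
```

A locked 30fps comes out as nothing but 33.3ms intervals, while 29fps is mostly 33.3ms with the occasional 50ms frame, and 58fps is mostly 16.7ms with the occasional 33.3ms, which is the difference described above.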

As I said in the earlier post, it's similar to what happens when you watch a 24fps movie on a display running at a refresh rate that's not a multiple of 24Hz. Did a quick google and found this example:

With pre-recorded content you can control the framerate and, ideally, run it on a display with a refresh rate that lines up. Games don't have that luxury. Games are inherently a variable-framerate medium. The only way to avoid that kind of inconsistency with a game is to overshoot a target framerate and have the GPU wait for the display (i.e. framerate targets plus V-Sync)... or have fairly stable performance at an arbitrary framerate and have the display wait for the GPU (i.e. VRR). Or you run at a high enough framerate that the frame variability is imperceptible.
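
And a rough sketch of that last comparison, with made-up jittery frame times around a 60fps target: under V-Sync on a fixed 60Hz panel the on-screen intervals quantise to refresh ticks, while under VRR they simply equal the render times:

```python
import random, statistics

REFRESH_MS = 1000.0 / 60.0
random.seed(1)

# Made-up render times: a game hovering around 60fps with a few ms of jitter.
render_ms = [16.7 + random.uniform(-3.0, 3.0) for _ in range(300)]

# Fixed 60Hz panel with V-Sync: each finished frame waits for the next tick.
shown, tick, t = [], 0.0, 0.0
for r in render_ms:
    t += r                         # when this frame finishes rendering
    while tick < t:                # wait for the next refresh boundary
        tick += REFRESH_MS
    if shown and tick == shown[-1]:
        tick += REFRESH_MS         # a tick can only show one frame
    shown.append(tick)
vsync = [b - a for a, b in zip(shown, shown[1:])]

# VRR: the display refreshes whenever the frame is ready, so the on-screen
# interval is just the render time itself.
vrr = render_ms[1:]

print(f"V-Sync interval stddev: {statistics.pstdev(vsync):.1f} ms")
print(f"VRR interval stddev:    {statistics.pstdev(vrr):.1f} ms")
```

The V-Sync intervals bounce between 16.7ms and 33.3ms while the VRR ones only wobble by a couple of milliseconds, which is the whole appeal of VRR.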

Edited on by skywake

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

RupeeClock

So light, and the way we process it, are a lot more complex than people consider.
A cheaper TV or monitor may exacerbate the issues with lower-frame-rate content, showing visible juddering, ghosting, blurring, etc.
The UFO Test is a popular way to test display technology, using a horizontally scrolling image at fixed framerates and movement speeds.
[image: UFO Test]
https://www.testufo.com/

Photo capture and video recording equipment can also exaggerate the appearance of blur or ghosting on TV or monitor displays, due to the wavelength of light they emit.

What Ironcore is saying about 30 FPS looking blurry to the human eye isn't necessarily wrong, just from looking at the UFO Test myself.
However, it can depend on whether you are visually tracking the moving object or not. Try looking at a fixed position on the screen and you may be able to discern the individual frames of the UFO clearly.

Even simple LED lightbulbs have refresh rates: a frequency of how often they emit light to illuminate their surroundings. Cheaper bulbs can have lower frequencies, which can cause eye strain, discomfort and an observable stroboscopic effect.

Now as for video game software, frame pacing and latency are typically more important than frame rate.
If the game feels perfectly responsive and smooth, that matters more than whether the video output is 60 FPS or 30 FPS (though the former is typically preferable).
It's possible to have juddery 60 FPS where the frames are not drawn at consistent intervals across the second, which can throw off your timing.
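
As a quick illustration of that judder point, compare two hypothetical one-second captures that both average ~60 FPS (the numbers are invented):

```python
import statistics

# Two hypothetical one-second captures, both averaging ~60 FPS:
steady = [16.7] * 60          # evenly paced
uneven = [8.4, 25.0] * 30     # same average, alternating short/long frames

for name, times in (("steady", steady), ("uneven", uneven)):
    avg_fps = 1000.0 / statistics.mean(times)
    print(f"{name}: {avg_fps:.0f} fps average, "
          f"stddev {statistics.pstdev(times):.1f} ms, "
          f"worst frame {max(times):.1f} ms")
```

Same headline framerate; only one of them actually feels smooth.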

It can also depend on the implementation of video output and game logic.
Some games tie their program logic and internal tick-rate to the video output (fixed frame rate), others do not (variable frame rate).
The NES game Dr. Jekyll and Mr. Hyde is an interesting example of the former.
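
A minimal sketch of those two approaches, with hypothetical numbers (this isn't the actual code of any particular game, obviously):

```python
# Two ways to advance game logic; the frame times are hypothetical.
SPEED = 100.0  # units per second

def fixed_tick(frame_times_ms, tick_hz=60):
    # Logic tied to the video output: one fixed-size tick per rendered frame.
    pos = 0.0
    for _ in frame_times_ms:
        pos += SPEED / tick_hz
    return pos

def delta_time(frame_times_ms):
    # Logic scaled by the measured frame time instead.
    pos = 0.0
    for dt in frame_times_ms:
        pos += SPEED * (dt / 1000.0)
    return pos

smooth = [16.7] * 60   # ~one second at a solid 60fps
choppy = [33.3] * 60   # the same frame count at 30fps (~two seconds)

# Fixed tick: same distance either way, so the choppy run plays at half speed.
print(fixed_tick(smooth), fixed_tick(choppy))   # ~100.0 ~100.0
# Delta time: distance tracks real elapsed time instead of frame count.
print(delta_time(smooth), delta_time(choppy))   # ~100.2 ~199.8
```

With the fixed tick, a choppy framerate slows the whole game down; with delta time, the game world keeps real-time pace even as the picture stutters.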

As for the Princess Peach Showtime demo, it's been established that it's an Unreal Engine 4 game and that it runs with a 30 FPS cap, which can be bypassed to let the game run at 60 FPS (with system performance being the limiting factor).

Edited on by RupeeClock

RupeeClock

Takoda

The things I never thought I’d learn in the seemingly rather pointless discussion of 30 fps vs 60.
But yeah, that blurred image definitely isn’t how the game is perceived usually, and the brain training angle is also super interesting!

I keep buying fighting games for some reason, even though I barely got anyone to play against.

Switch Friend Code: SW-7519-0735-1595

Ryu_Niiyama

@sdelfin Whoops sorry, looks like NL ate my comment yesterday. I will try to recreate it.

I'm intrigued to hear someone else has the same/similar issue, although you described it in a way I hadn't thought of but realize fits: unnatural. I don't have correct depth perception, as I don't have stereoscopic vision, so I assumed it was a me/people-like-me problem. Good to know!

Taiko is good for the soul, Hoisa!
Japanese NNID:RyuNiiyamajp
Team Cupcake! 11/15/14
Team Spree! 4/17/19
I'm a Dream Fighter. Perfume is Love, Perfume is Life.

3DS Friend Code: 3737-9849-8413 | Nintendo Network ID: RyuNiiyama
