I mean, I don't understand much about games, but why do people care so much if a game is running at 60 fps?

What's the difference between 30 fps and 60 fps in gaming, to the human eye? (not trolling)

Edit: I've read a lot of answers. Very interesting, thanks. Let's use a real example: Dark Souls Remastered.

It's said to run at 60 fps on PC, PS4, and Xbox One, but at 30 fps on Switch. I've seen some people arguing about it on other forums.

Some people here said the difference is clearly visible on a PC, but that it doesn't matter much if you play on a TV (what about a portable screen?). So what's the big deal?

6 years ago*

Smoothness and input lag

6 years ago

30.

6 years ago

I see what you did there, nice one 🤣

6 years ago

Clearly...

6 years ago

Everything is smoother at 60 FPS, most importantly animations and input response.
I simply don't understand how some people can't see and feel the difference.

6 years ago

Indeed.

6 years ago

I think officially the human eye can't see more than 30 fps, or something like that...
But when I check some of the comparisons, I think 60 fps looks a bit smoother.
For example

But it's really not a big deal I think, as long as the game runs at all :P
Most of the time I play Ark at 25 fps and it's fine :D

6 years ago

This has been debated so many times in every gaming discussion about human eyes and FPS. It was said the human eye can't see more than 60 FPS, but it has already been proven many times that the human eye can see well over 100 frames per second.

EDIT: found the source about fighter pilots: "Tests with Air Force pilots have shown that they could identify the plane on a picture that was flashed for only 1/220th of a second." Which would mean 220 frames per second.

6 years ago*

Baloney conclusion. There's a huge difference between a high-contrast transient causing a detectable response on the retina and the retina's ability to distinguish repeated stimuli in rapid succession. If that weren't so, everyone would see fluorescent lights flickering all the time. In reality, the nerve response can't recover between stimuli much faster than 30-60 Hz.

6 years ago

The conclusion is not baloney; it just means we can't directly translate it to games or movies, as the conditions are very different. It does prove that the whole "the human eye can't see above X FPS" claim is bogus, though.

6 years ago

The human eye doesn't have a shutter, so it doesn't see in frames per second. The human brain can deal with hundreds of frames per second and still pick out details.

6 years ago

I almost can't believe you're being serious :p I can't play any action game below 60 fps. My upgrade from a 60 Hz to a 144 Hz monitor was one of the best investments I ever made.

6 years ago

Yeah, of course it would be better with higher fps :)

But at the moment I can't afford better hardware :(
And you get used to the low fps :P

6 years ago

Do you then value graphics over fps, or do you play most games on low as well because of your hardware?

You probably do :p I just try to avoid it as much as possible ^^ In The Witcher 3, I play on low (a few settings on medium) in order to get to 60. If I ever win the lottery, I'll buy us both a new graphics card (y)

6 years ago

Well... yes, I don't really care much about fps, as long as it feels okay to me.

But usually I don't have that problem, because it doesn't happen very often that I buy a game which costs more than 10 or 15 dollars/euros.
So by the time I buy these graphically demanding games, my hardware should be able to handle them :)

Ark is just an exception <3

Thanks for sharing your lottery win :D

6 years ago

24 fps is roughly the minimum below which we firmly perceive visual input as "stuttery". Active perception doesn't start smoothing out until about 48-60 fps (see SpankyPie's image below), and we can actively assess information up to about 120 fps; past that point we can still recognize and respond to motion cues via reflex or instinct, and can still perceive information to be assessed (even if we can't actively respond to it as it's happening).

As an interesting quirk, movies are shot at 24 fps but shown at 48 Hz (each frame is flashed twice). This creates an odd visual effect: we perceive the motion as a smoother, more natural 48 Hz pace, yet still get the "fill in the blanks" motion-blur effect that 24 fps normally provides. As such, our brain is better able to ignore the visual cues which indicate that what we're seeing is fake; something which, obviously, is of benefit to a film [hence why the 24/48 approach is industry standard].

Slower framerates are actually rather hard for us to perceive natural motion in (which is why rapid scene shifts in certain media can be disorienting), and that's why higher framerates feel so much "smoother". On the other hand, people may at times find higher framerates "unnatural"; that's not because we can't perceive information at that rate, but because the content is either being upscaled to a higher framerate than it was made for, or because the cues we perceive as fake are no longer being disguised by our brain.

But it's really not a big deal I think

The difference between being able to naturally perceive something and not makes a massive difference in "active" games: any "twitch" game, racing game, fighting game, FPS, and so on. Likewise, higher framerates past that point can give a distinct edge to your "twitch" response rate. You're correct that for any game not based on quick responses, the difference isn't of much importance, other than smoother (and thus more aesthetically pleasing) transitions between visuals; but there are definite genre niches where a faster framerate is of special note to fans of the genre. :)

Edit: Further reading
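
To put rough numbers on the rates mentioned above, here's a minimal back-of-the-envelope sketch in Python (the rates are the ones discussed in this thread; the arithmetic is just frame time = 1000 ms / fps):

```python
# Frame time is simply 1000 ms divided by the frame rate.
for fps in (24, 30, 48, 60, 120):
    print(f"{fps:>3} fps -> a new image every {1000 / fps:.1f} ms")

# 24 fps film projected at 48 Hz: each frame is flashed twice, so the
# flicker rate doubles while motion still only updates at 24 fps.
print(f"film: motion updates every {1000 / 24:.1f} ms, "
      f"screen flashes every {1000 / 48:.1f} ms")
```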

6 years ago*

Thanks, that's informative :)
Didn't know that about movies!

6 years ago

I'll second that thanks. Very informative.

6 years ago

As an interesting quirk, movies are shot at 24 fps but shown at 48 Hz (each frame is flashed twice). This creates an odd visual effect: we perceive the motion as a smoother, more natural 48 Hz pace, yet still get the "fill in the blanks" motion-blur effect that 24 fps normally provides. As such, our brain is better able to ignore the visual cues which indicate that what we're seeing is fake; something which, obviously, is of benefit to a film [hence why the 24/48 approach is industry standard].

You mean interlaced, right? But that's only half the image per frame: double the frame rate with half-frames, which is effectively the same as the full-frame rate.

6 years ago

Films are not interlaced; interlacing is only used in TV.

It's the way a cinema projector works: the light needs to be turned off while advancing to the next frame (otherwise we would see the film strip moving). But a 24 Hz flickering light is very noticeable and would cause headaches, so they increase the flicker frequency to 48 Hz; the effect in our brain is that each frame is shown twice.

This is for classic projectors. Digital projectors, I guess, don't need any flickering, since there's no film movement that needs to be hidden (but I don't know).
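
To put the flicker numbers side by side, a tiny sketch; the 48 Hz (two-bladed) and 72 Hz (three-bladed) shutter rates are the classic projector configurations, and the rest is plain arithmetic:

```python
# Length of one light/dark cycle at each classic shutter rate; shorter
# cycles are harder to perceive as flicker.
for shutter_hz in (24, 48, 72):  # 48 = two-bladed, 72 = three-bladed shutter
    print(f"{shutter_hz} Hz shutter: one cycle every {1000 / shutter_hz:.1f} ms")
```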

6 years ago

ARK is a spec-eating monster; I play it at 15 fps whenever I enter my tribe's base, and it runs smoothly at 60 fps everywhere else x) I don't mind having less than 30 fps unless it's during combat; there you clearly see a difference.

6 years ago

I lock all my games to 24 fps for that cinematic look. That, and my PC is sh!t.

6 years ago

[Attached image]
6 years ago

Nice example

6 years ago

Top! thumbs up

6 years ago

Actually, if you can only play at 30 FPS, it's more than likely your framerate will drop well below that mark during more intense moments, and anything below 25 FPS is really unpleasant. So if you play at 60 FPS, you ensure that your drops are no worse than 30 FPS, which in all fairness is pretty okay. Playing at 30 FPS max means you'll be dealing with occasional stuttering and real lag, easily with moments at 15 FPS, and that's a bummer.
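
To see that headroom argument in numbers, a minimal sketch; the 2x slowdown for intense moments is an assumed illustrative figure, not a measurement:

```python
# Assume an intense scene takes twice as long to render as an average one.
SLOWDOWN = 2.0  # hypothetical worst-case render-time multiplier

for average_fps in (60, 30):
    worst_ms = (1000 / average_fps) * SLOWDOWN
    print(f"average {average_fps} fps -> intense moments around {1000 / worst_ms:.0f} fps")
```

Under that assumption, 60 fps bottoms out around 30, while 30 fps bottoms out around 15, matching the figures above.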

6 years ago

I can see stuttering at 30 fps and below.
Going from 30 to 60 fps adds animation smoothness, but not everyone can see it, due to physiology.

I changed my GPU some months ago, so I'm still adapting to the new lag-free 60 fps after the old laggy 20-30 (plus low graphics quality). When I use the mouse to control the view, it now feels like I'm moving through "water", or something close to it, versus "piloting a mech" before.

6 years ago*

They can see it; they're just either in denial or aren't particularly bothered by the difference. Or maybe they just looked at it for 5 seconds and said "it looks the same!". There's a reason people who switch from consoles to PC for the first time are so impressed by the difference after a while of playing: it's not just the superior graphics (if the game allows the PC version to have them) or accessibility, but the FPS "smoothness".

6 years ago

Ever played at 3 fps? That's three pictures per second, which is just a slideshow. Ever played at 30? That's a movie.

6 years ago

I was confused about the same thing until I played Payday 2 and toggled the FPS lock back and forth between 30 and 60. The animations of falling particles became noticeably laggy at 30 fps compared to 60.
Over time you'll get used to it, of course.

Now, that's just the visuals. It's a whole other thing to actually play something.
Put hard FPS locks on some game, like Mad Max: first lock the FPS to 20, then 30, then 60. Each of those will feel distinctly different. At 60 fps the input lag is almost non-existent. At 30 fps you can feel a slight input lag if you pay attention to it, but at 20... you'll feel it.

We can try to explain it, but if you're anything like me, you need to experience it for yourself. It'll take you a few minutes overall (15 max) and you'll probably get it.
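
The input-lag side of that follows almost directly from frame times. A minimal sketch, assuming a deliberately simple model in which an input can wait up to one frame to be sampled and one more to be displayed (real pipelines differ):

```python
# Two-frame latency model: worst case, an input just misses the frame
# being rendered, so it appears roughly two frame times later.
for fps in (20, 30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: frame time {frame_ms:.1f} ms, "
          f"worst-case input delay about {2 * frame_ms:.0f} ms")
```

Under that model, 20 fps leaves you roughly 100 ms behind your own inputs, while 60 fps keeps it nearer 33 ms, which is why the difference is something you feel rather than just see.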

6 years ago

I prefer RTS or RPG games.

6 years ago

Easy solution: play a game at 30 and then at 60 FPS, and you'll feel the difference if it's a game that requires any sort of fast-paced action.

60 FPS is unnecessary in, say, a turn-based strategy game. Even in an RTS it's not mandatory, just nice. In an FPS/TPS, or an action game in general, it is very, very pleasant.

And yes, we can see the difference. That's why consoles are striving to reach 60 FPS as well.

6 years ago

Well, even a game like Hearthstone benefits from 60 fps. I wouldn't say it's necessary, but it definitely feels a lot better. :)

6 years ago

When I play at 30 fps, I start to cry in less than an hour xD

6 years ago

[Attached image]
6 years ago

60 is okay. 30 is sometimes too little, and it starts to feel laggy. The better question is why you'd want 120 fps or more.

6 years ago

I can't stand games at 30 fps. I'm so used to 60 that I notice within about 2 seconds when a game runs at 30. As others said, the graphics are way smoother and the input is more direct.

I'm still limited to 60 fps by my monitor, but that will change soon. As soon as Acer launches the new X35, I'm on board (supposed to happen in Q1). 21:9, 1440p, 200 Hz, G-Sync! It will be the queen of all monitors! It will be glorious! ^^

6 years ago

60 Hz is the most common refresh rate for monitors, so a 60 fps game not only uses the display to its full extent, it's also easier on the eyes in general and loses the least graphical quality in motion.

As a side note, if you ever wondered why every game now has to include motion blur and some depth-of-field filter, it's only partly because it looks good, and mostly because it hides a ton of the visually less impressive things that came with the old and underpowered PS3 and X360 hardware. Essentially, they blurred everything that looked bad.

Nowadays on PC, 60 fps is almost entry level, as 120 and 144 Hz monitors are starting to spread. Once you've seen motion that smooth, it is actually difficult for your eyes to get used to a lower framerate again. It's common for people who switched from consoles to PC gaming to have issues playing games at 30 fps, as the movement looks choppy to them despite the motion blur. The same goes for those who have experienced 144 Hz (with matching fps) for a longer time: 60 Hz feels straining on the eyes until they get used to it again.

Edit: This is a relatively nice comparison video showing that the difference is not as drastic as people make it out to be, because motion blur masks a lot; but even if you have a 60 Hz monitor and can only see the bottom two, the difference is clear: https://www.youtube.com/watch?v=GWZWCkljsmQ

6 years ago*

If you play at 120 Hz, you won't be able to stand even 60 Hz. Basically, we get used to that smoothness, and anything lower becomes a chore on the eyes. I'm talking about the monitor/TV frequency here, which of course requires the game to run at the same framerate or higher to be fully used.

And of course there's input delay: more frames make the game quicker to respond to your inputs, so controls feel more responsive, fluid, and precise in general. The more frames you have, the lower the input lag, and that holds even if the display only shows 60 Hz, provided you turn vsync off, of course.
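
A minimal sketch of that last point; the "input age" model here is a deliberate simplification that assumes input is sampled at the start of each rendered frame and ignores the rest of the pipeline:

```python
# With vsync off, the display scans out the most recently finished
# frame, whose input was sampled roughly one render-frame time ago.
DISPLAY_HZ = 60

for render_fps in (60, 120, 240):
    print(f"rendering at {render_fps} fps on a {DISPLAY_HZ} Hz display: "
          f"newest frame's input is about {1000 / render_fps:.1f} ms old")
```

So even though the screen still refreshes 60 times a second, rendering at 240 fps means the displayed image reflects input from about 4 ms ago instead of about 17 ms.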

6 years ago*

Yep... A few months ago I discovered that my monitor goes up to 75 Hz, not 60 as I first thought. Now if I play at less than 70 FPS, I just close the game, because it's very hard to look at.

6 years ago

If you always play at 30 FPS, you probably adapt over time (you have no idea it could be much smoother). But if you switch back and forth between 30 and 60 FPS, you will definitely notice a significant difference.

6 years ago

I must be superhuman, because I notice the difference. However, 30 fps isn't a massive deal if the game is good.

6 years ago

Well, it depends on the person. For example, I'm a big Total War and Warhammer fan, but I couldn't play more than 5 minutes of Total War: Warhammer due to the low fps (25-35 max)...

6 years ago

This used to be an argument between console and PC players: if you put two games side by side, one at 30 and one at 60, you will feel the difference. People who game on consoles comforted themselves with the idea that there is no difference, but seeing how consoles strive for 60 fps nowadays, now that the technology makes it doable, I believe those same people are happier for it.
Myself, I can play a game at 30 fps or 60 fps, and I mostly judge the gameplay, the story, and whether it's fun. So if a game is locked at 30 I won't get mad, but I do prefer to play at 60.

6 years ago

30 fps is also a lot more tolerable on a TV that's 8' away from me than on a monitor that's 18" from my nose. I can tolerate 30 fps games on my PS4. Obviously the 60 fps games like Doom are nicer, but all the big open-world action-adventure games are 30 fps and quite playable.

On a monitor, though, it's a different story. I was playing The Witcher 3 at around 45 fps and it was barely tolerable.

6 years ago

This was already an issue on very old consoles: NTSC output was 60 Hz, PAL was 50 Hz, and depending on the port, Europe often got a slower game.
Back then the fps was tied directly to the TV output signal rate.
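
The slowdown is easy to quantify if you assume, as was typical in that era, that the game logic advanced one step per video field:

```python
# One logic tick per video field: a game designed for 60 Hz NTSC runs
# at 50/60 of its intended speed on a 50 Hz PAL display.
ntsc_hz, pal_hz = 60, 50
print(f"PAL speed: {pal_hz / ntsc_hz:.0%} of NTSC "
      f"(about {1 - pal_hz / ntsc_hz:.0%} slower)")
```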

6 years ago

Is it still 2009? 144 Hz is where it's at.

6 years ago

The game looks smoother, and it's better when you're playing games that need quick reflexes, like some hack-and-slash titles.

Such as Killer is Dead 🖤

6 years ago

It should depend on the type of game. In a point-and-click adventure, you shouldn't care whether there's more than 24 fps. In an FPS like Quake, you need every frame to have more headroom for reaction (especially in competition).

It's generally considered that 30 fps is okay for most 2D games, while fast-paced action games need 50-60 fps (constant fps is more important than high fps) so the action doesn't get "choppy" and "laggy".

Some people swear they can't stand anything under 120 fps or even more, so... whatever.

In my experience, 45-50 fps is okay, and 60+ is perfect as long as it's stable.

6 years ago
