Okay! Every time frame rate and performance are discussed in the video game sphere, there is somebody who claims that there is no noticeable difference between 30 frames per second and 60 frames per second in games.

My current theory is that there is some fraction of people who neurologically cannot see the difference between high and low framerates. I’ve heard similar discussions regarding frame averaging and 48 FPS versus 24 FPS in films – not that one is better than the other, but that there is no perceivable difference. This could be something like amusia, a condition where a person cannot discern between different frequencies of sound.
It might also be part of IQ – some people’s internal clocks are slower, so they cannot properly process the extra frames, and due to the deficiencies in theory of mind that go along with low IQ, they think everyone is simply lying about perceiving higher framerates. Perhaps some of them are trolling, but this debate goes way back, and too many people have been serious about this in the past for me to think it is a long, elaborate hoax. The debate was settled well over 10 years ago, or so I thought, and yet here we are.
Speaking of the debate, its premise has always been false, in my opinion. Trying to say 30 FPS is “more cinematic” (a phrase frequently used by game publishers) because movies run at 24 FPS is a non sequitur, partly because 30 FPS is not 24 FPS, but also because games are not movies. Games are an interactive medium that, at times, requires quick reactions to things moving on screen. Fewer frames means less visual data, hindering the player’s ability to make predictive or reactive decisions in the game. 60 FPS is always better than 30 FPS when active gameplay is involved, and I would say it is always better in things like cutscenes, too.
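To put rough numbers on that (a back-of-the-envelope sketch of my own, not measurements from any particular game): at 30 FPS each frame sits on screen for about 33 ms, while at 60 FPS it is about 17 ms, so the player gets twice as many snapshots of the action and waits roughly half as long before a change in the game world can even appear on screen.

```python
# Back-of-the-envelope illustration (my own numbers, not from any specific game):
# frame time at 30 vs. 60 FPS, which is also the worst-case extra wait before
# a new on-screen event can be displayed (ignoring input and display lag).
for fps in (30, 60):
    frame_time_ms = 1000.0 / fps
    print(f"{fps} FPS: {frame_time_ms:.1f} ms per frame "
          f"(up to ~{frame_time_ms:.1f} ms before the next frame can show a change)")
```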
The fact is that live-rendered computer graphics do not look or behave like film, even when motion blur is added. In fact, motion blur is among the most annoying post-processing effects in games (even worse than chromatic aberration, a visual flaw that’s a function of poor optics): it makes low framerates look slightly better on video but feel even worse when playing, since it further obscures the visual information the player needs in order to react. Low framerates can even cause motion sickness.
It is interesting to me that low framerates are mostly a post-2007 (gaming ground zero) phenomenon, since that is the generation where smooth gameplay was routinely traded for higher “fidelity” graphics. This sets aside the fact that smooth motion is a form of higher fidelity itself, but starting with the Xbox 360, there was an increasing emphasis on higher resolution rendering, more complex 3D models, and higher quality textures. The consensus for almost two generations of consoles was “gamers want sharper frames, even if there are fewer of them.” I remember a particular low point in 2014 when I bought a new Need for Speed game for the PS4, which ran at a locked 30 FPS. It was nearly unplayable.
Back in the late 90s, high framerates were the norm in the PC space. Consoles were jumping to 3D, and many had poor framerates (which a few of us noticed), but PC and arcade games all ran at 60 FPS or more. Spend some time playing the original Quake and Unreal games, and you will see why. They are incredibly fast. We would get excited about hitting 100 frames in Quake because it made the game, especially in multiplayer, so much easier to handle. And yes, we could see 100 FPS, because we played on CRTs with high refresh rates. Those high-refresh displays disappeared for about 15 years once LCDs, which typically topped out at a 60Hz refresh rate, took over; that is why 60 FPS became the norm. Go find an arcade cabinet of something like Street Fighter Alpha, King of Fighters, or even Tekken. You need a high framerate to deal with the speed of those old games.
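A simplified way to see why the monitor mattered (my own sketch, assuming a fixed refresh rate and ignoring tearing and variable-refresh tech): a display can only present as many distinct images per second as its refresh rate, so rendering 100 frames only paid off on a CRT that could actually refresh that fast.

```python
# Simplified model (my assumption: fixed refresh rate, no tearing or VRR):
# the number of distinct images a display can show per second is capped by
# its refresh rate, no matter how fast the game renders.
def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    return min(render_fps, refresh_hz)

print(displayed_fps(100, 60))   # 60  -> a 60Hz LCD can't show all 100 rendered frames
print(displayed_fps(100, 120))  # 100 -> a high-refresh CRT can
```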
So, what changed?
Well, gaming became more mainstream and moved even more into the home than it was in the 1990s. Graphics had always moved hardware, so emphasizing HD was part of the greater marketing zeitgeist of the post-2007 gaming world. There was a desire from gaming media as well as publishers to make gaming “grow up,” which meant making things more realistic and serious. If poor-quality “cinematic” framerates were the trade-off, so be it. 30 FPS became the new target, which meant many games fell below even that.
One of the best things I can say about the PS5 is that there is an abundance of games that at least give the player the option to run at a higher framerate while reducing some graphical settings like ray tracing. The biggest reason I can give to play the Demon’s Souls remake for the PS5 is the fact that it runs at 60 FPS (or more), which makes a big difference in an action game like that. I can’t think of a single game offering that option where the higher framerate didn’t deliver both a better visual and a superior gameplay experience.
But people will tell themselves whatever they need to in order to justify decisions. Ports to Nintendo’s new hardware will likely run at 30 FPS, which is actually impressive, as it is mobile hardware, and some of these games (like Cyberpunk) can bring a gaming PC to its knees at 4K. 30 FPS is playable for many kinds of games. It’s a trade-off, but undoubtedly some people will have a psychic need for their hardware to be “just as good” as something else. Meanwhile, I expect Mario Kart World to run at 60 FPS, and it is very noticeable when it is not (like in 4-player splitscreen).
Maybe instead of asking how people would feel if they hadn’t eaten breakfast, we should just ask them if they can tell the difference between 30 and 60 FPS.
And if you don’t know what the breakfast question is, I have you covered:
I am an independent artist and musician. You can get my books by joining my Patreon, and you can listen to my current music on YouTube or buy my albums at BandCamp.



There seem to be people who are unable to see proper 3D, so there might be something to your theory…
“Stereo Vision: The Haves and Have-Nots
It is widely thought that about 5% of the population have a lazy eye and lack stereo vision, so it is often supposed that most of the population (95%) have good stereo abilities. We show that this is not the case; 68% have good to excellent stereo (the haves) and 32% have moderate to poor stereo (the have-nots).”
https://pmc.ncbi.nlm.nih.gov/articles/PMC4934608/
Interesting. I’ve read in the past that we mostly use one eye for “seeing” and the other mostly for depth perception. I’m farsighted in my left eye, but without my glasses I can still perceive depth very well.
Some people claim that films with high frame rates appear too realistic, leading directly to the uncanny valley, where flaws become visible and immersion is destroyed. A lower frame rate might allow the brain to “fill in the blanks” more easily…
I wish there were more films in 48 fps so we could really see the difference, but part of the problem is that doubling the framerate also increases the data size massively, and there is really no consumer standard for 48 fps. It’s a discussion where most of us just don’t have enough exposure to make a real judgment. Peter Jackson did want to try 48 fps, though, which is interesting. Where I really notice the lower framerate of 24 fps is in specific film techniques, like panning.
Demonstrating the difference with bouncing miqo’te boobs is certainly an eye-catching method, I’ll give you that.
In regard to the visibility issue, it’s certainly possible that the inability to see the difference has something to do with mental faculty, but I’m not fully convinced of it. My reasoning for this is simple: I actively know people in my life who are low enough IQ that they don’t understand the breakfast question, but they can see and swear by the difference between 30 and 60 fps. Which then makes me wonder: could it also be an issue with our eyes? Could it be that people don’t notice this difference because their retinas don’t have enough rod and cone cells to properly distinguish the subtle changes in light that lead to 60 frames appearing smoother than 30, similar to how lacking certain numbers of these cells affects how we perceive variances in light and color?
These are just guesses on my part at the end of the day. I’m not versed enough in any of these sciences to make any claim with certainty. What I can say is that when it comes to the feel between 30 and 60 frames, I rarely find this difference feels significant to me. That’s likely down to the types of games I tend to play, though. Excluding From’s ARPGs, I rarely play games that require quick twitch reactions, so I tend not to feel a significant difference beyond a smoothing of the visuals. At the end of the day, I think what the game’s trying to achieve is the real deciding factor between whether 60 fps is the superior and arguably required choice or not. Games like racing sims that require precision timing, FGs that involve lots of frame counting, or quick-twitch action games and shooters are going to feel a much bigger difference than a rogue-lite deck builder RPG, the latest Skyrim clone, or an MMO that gives you lengthy windows to react to mechanics will.
It could be neurological, but if so, the colorblind should simply trust the color-sighted that color matters.
I don’t see the difference, so I guess I’m low IQ according to this article. I can see that both are in 3D and moving at relatively the same speed.
Meh, don’t care about this issue and genuinely don’t know why anyone would care.