
the8thark

macrumors 601
Original poster
Apr 18, 2011
4,628
1,735
Why do people insist on more than 60fps in games?
A question I fail to understand. The 60 here being for 60Hz screens. All the extra frames are wasted. Maybe there's something I don't know, but from my understanding, if your game's fps constantly equals your screen's refresh rate you have the best result, and anything more is superfluous.
 

antonis

macrumors 68020
Jun 10, 2011
2,085
1,009
Why do people insist on more than 60fps in games?
A question I fail to understand. The 60 here being for 60Hz screens. All the extra frames are wasted. Maybe there's something I don't know, but from my understanding, if your game's fps constantly equals your screen's refresh rate you have the best result, and anything more is superfluous.

There's no real benefit to all these fps over 60 (for 60Hz screens, at least). It's just that for most games the framerate is not constant: heavy parts/scenes tend to sink it. So, for a consistently smooth experience, the average fps should be well above 60.

Having said that, I'm very happy running any game at 30 fps as long as it doesn't go lower than that.
 

Pakaku

macrumors 68040
Aug 29, 2009
3,134
4,440
More headroom for the parts of the game that are more processor/graphics-intensive. That way the game can ideally stay at a consistent 60fps.
 

InfiniteDeath

macrumors newbie
Jun 19, 2012
21
0
I have 120Hz screens, and games feel smoother on them than at 60Hz. It's something you need to experience to tell the difference.
 

edddeduck

macrumors 68020
Mar 26, 2004
2,061
13
Why do people insist on more than 60fps in games?
A question I fail to understand. The 60 here being for 60Hz screens. All the extra frames are wasted. Maybe there's something I don't know, but from my understanding, if your game's fps constantly equals your screen's refresh rate you have the best result, and anything more is superfluous.

If the game's physics and your user input are tied to the renderer (your fps), then at 120fps your inputs and the physics engine's reactions will be smoother even if the rendered image is not.

Fluctuating fps is usually a bigger killer than a low frame rate. Being able to run at a crazy high fps means you can lock the fps to 60, and no matter what happens on screen your framerate stays constant.

There are a few ways that running over your refresh rate can improve the game experience in subtle ways that the really focused ;) users might notice, but in general most users are happy with the fps once it's over 24, as long as you don't tell them the number!

The human eye is pretty rubbish at noticing once you go over 24fps; that's why cinemas have 24fps as standard. In games with fast movement the gap between frames has more impact and is more likely to be seen, which is why racing games (Gran Turismo) are usually 60fps and other games (Call of Duty) 30fps on consoles.

For example, people say F1 2012 on the Mac feels "smooth" at maybe 45-50 fps, yet those same people will say XCOM is smooth when it's running at just 15 fps. How the game works and what it's trying to do vastly affects what fps it needs to feel smooth.

When you get down to it, it's about user preference, and in some cases just wanting to post bigger numbers: the old "mine's bigger than yours" ego thing. :)
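For reference, one common way to get smooth, render-rate-independent physics in a single thread is a fixed-timestep loop with an accumulator. This is a generic sketch of that pattern, not anything from a specific engine; the function name, timings, and `sleep` stand-ins are all illustrative:

```python
import time

PHYSICS_DT = 1 / 120   # physics always advances in fixed ~8.3 ms slices

def run(frames=3, render_time=1 / 40):
    """Single-threaded loop: physics rate stays fixed even when rendering is slow."""
    sim_steps = 0
    accumulator = 0.0
    previous = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        accumulator += now - previous   # real time elapsed since the last frame
        previous = now
        while accumulator >= PHYSICS_DT:
            sim_steps += 1              # stand-in for one physics update
            accumulator -= PHYSICS_DT
        time.sleep(render_time)         # stand-in for rendering one frame
    return sim_steps

print(run())  # more physics steps than rendered frames
```

Because the simulation consumes elapsed time in fixed slices, a slow renderer just means more physics steps per drawn frame, not slower gameplay.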

Edwin
 

barkmonster

macrumors 68020
Dec 3, 2001
2,134
15
Lancashire
TVs are 60Hz (or 50Hz in the UK), and video frame rates are either 30, 25, or, in the case of HD video, 24 FPS.

Those lower framerates get away with it because of motion blur, which gives the impression of smoother movement. To get the same level of fluid motion in a game as a typical DVD/Blu-ray or TV show, the game needs roughly double the framerate to compensate for the motion blur your brain would otherwise smooth over.
 

mm201

macrumors regular
Feb 17, 2013
113
1
To reduce input lag.

A game running at 60fps could potentially have a whole extra frame of lag compared to the same game running at a theoretically infinite FPS. This is because of the time difference between polling for input and displaying the rendered frame.

Let's break down the lifecycle of a frame at 60fps with vsync:
1. Poll for input
2. Compute game logic, physics, etc. based on input and the game state
3. Render a frame
4. Wait
5. Display the rendered frame.

Now consider the same lifecycle if vsync is turned off and you get 180fps:
1. Poll for input
2. Compute game logic, physics, etc. based on input and the game state
3. Render a frame
4. Poll for input
5. Compute game logic, physics, etc. based on input and the game state
6. Render a frame
7. Poll for input
8. Compute game logic, physics, etc. based on input and the game state
9. Render a frame
10. Display the rendered frame.

Note how in the second lifecycle, much less time is spent after input has been polled before the frame is shown onscreen.
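The two lifecycles above can be put into rough numbers. This is a back-of-the-envelope sketch, not a measurement; the 60 Hz refresh and 180 fps render rate are the illustrative figures from the post:

```python
# Illustrative numbers: 60 Hz display, renderer capable of 180 fps.
REFRESH_MS = 1000 / 60    # a new image reaches the screen every ~16.7 ms
RENDER_MS = 1000 / 180    # one poll + simulate + render pass takes ~5.6 ms

# Vsynced at 60 fps: input is polled once at the start of the refresh slot,
# then the loop waits, so the displayed frame reflects input that is a
# whole refresh interval old.
vsync_lag_ms = REFRESH_MS

# Uncapped at 180 fps: three passes fit into one refresh interval and the
# most recently completed frame is shown, so the displayed input is only
# about one render pass old.
uncapped_lag_ms = RENDER_MS

print(f"vsynced 60 fps  : input ~{vsync_lag_ms:.1f} ms old when displayed")
print(f"uncapped 180 fps: input ~{uncapped_lag_ms:.1f} ms old when displayed")
```

So under these assumptions the uncapped renderer shows input roughly 11 ms fresher, even though the screen still only updates 60 times a second.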
 

edddeduck

macrumors 68020
Mar 26, 2004
2,061
13
To reduce input lag.....

Thanks for adding all the detail; I said the same thing but skipped the fully detailed example :)

If the physics of the game and your user input is tied to the renderer (your fps) then if you have fps of 120 your user inputs and the physics engines reactions will be smoother even if the rendered image is not smoother.
 

Irishman

macrumors 68040
Nov 2, 2006
3,392
843
Why do people insist on more than 60fps in games?
A question I fail to understand. The 60 here being for 60Hz screens. All the extra frames are wasted. Maybe there's something I don't know, but from my understanding, if your game's fps constantly equals your screen's refresh rate you have the best result, and anything more is superfluous.

It's bragging rights. If I pick up a new video card that boosts my framerate from 46 to 72, and I'm into greater performance for its own sake, you're darned right I'm going to brag about it! It helps hardware manufacturers set their products apart, and gamers have latched onto it (like refresh rates on TVs).
 

Psychj0e

macrumors regular
Jun 5, 2010
180
0
A refresh rate of 25 frames per second is completely fine; that's an image update once every 40ms. You wouldn't perceive jerkiness.

Frame rates are chucked around like megapixels on a camera.
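The frame-interval arithmetic in this thread is just 1000 ms divided by the frame rate, which is easy to tabulate (the set of rates here is simply a sample of the ones mentioned in the discussion):

```python
# Frame interval = 1000 ms / frame rate; 25 fps is indeed one frame every 40 ms.
for fps in (15, 20, 24, 25, 30, 60, 120):
    print(f"{fps:3d} fps -> one frame every {1000 / fps:5.1f} ms")
```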
 
Last edited:

Renzatic

Suspended
a refresh rate of 25 frames per second is completely fine - thats an image update of once every 25ms. You wouldn't perceive jerkiness.

frame rates are chucked around like megapixels on a camera.

25 is the bare minimum for what we perceive as "smooth". It works well enough (I can play games at 25 FPS without any issues, so long as it doesn't dip below that), but the eye can discern so much more, and bumping the framerate up a good bit gives you a much better, more fluid feeling experience overall. This is especially true for games, where you're actively interacting with the image, and are thus more susceptible to noticing tiny differences in the way things move.

I'm fine with 30 FPS as a base framerate. It's smooth to me, and any game that runs at that framerate is perfectly playable. But 60 FPS is smoother, and does look and feel better overall.
 

Dranix

macrumors 65816
Feb 26, 2011
1,063
543
left the forum
To reduce input lag.

A game running at 60fps could potentially have a whole extra frame of lag compared to the same game running at a theoretically infinite FPS. This is because of the time difference between polling for input and displaying the rendered frame.

Let's break down the lifecycle of a frame at 60fps vsync:
1. Poll for input
2. Compute game logic, physics, etc. based on input and the game state
3. Render a frame
4. Wait
5. Display the rendered frame.

Any coder entangling input and the render loop this way should be shot and then hung outside for all the other coders to see. It's the worst possible way to do an engine run loop...
 

thejadedmonkey

macrumors G3
May 28, 2005
9,180
3,324
Pennsylvania
Any coder entangling input and the render loop this way should be shot and then hung outside for all the other coders to see. It's the worst possible way to do an engine run loop...

I don't do game development, so this is an honest question, but what's a better way? I have a few game design books from 2007-ish, and they all describe some version of a loop that checks for user input at the top and draws the frame somewhere near the bottom.
 

Psychj0e

macrumors regular
Jun 5, 2010
180
0
25 is the bare minimum for what we perceive as "smooth". It works well enough (I can play games at 25 FPS without any issues, so long as it doesn't dip below that), but the eye can discern so much more, and bumping the framerate up a good bit gives you a much better, more fluid feeling experience overall. This is especially true for games, where you're actively interacting with the image, and are thus more susceptible to noticing tiny differences in the way things move.

I'm fine with 30 FPS as a base framerate. It's smooth to me, and any game that runs at that framerate is perfectly playable. But 60 FPS is smoother, and does look and feel better overall.

EDIT - I made a typo first time around, 25fps is an update every 40ms, not 25ms.

Have you got a link for that? I'm not saying you're wrong, but a rule of thumb is that 50ms of exposure is required for the brain to process an image, which would work out to 20 frames per second.

For example, look at the difference between this video shot at 20fps:

http://www.youtube.com/watch?v=5RVepKauLTU


compared to this shot at 15:

http://www.youtube.com/watch?v=h1VTIenPXzM


Then compare the 20fps with the 30fps:
http://www.youtube.com/watch?v=f4q61AWv8pA

Subjectively, for me the bigger difference is between the 15fps and 20fps clips, rather than between 20fps and 30fps.
 
Last edited:

Psychj0e

macrumors regular
Jun 5, 2010
180
0
I don't do game development, so this is an honest question, but what's a better way? I have a few game design books from 2007ish, and they all are some version of a loop that checks for user input at the top of the loop and draws the frame somewhere at the bottom.

I don't program computer games, but I imagine that now we're no longer in 1980, you could assign one of the cores to monitor for inputs.
 

Dranix

macrumors 65816
Feb 26, 2011
1,063
543
left the forum
I don't do game development, so this is an honest question, but what's a better way? I have a few game design books from 2007ish, and they all are some version of a loop that checks for user input at the top of the loop and draws the frame somewhere at the bottom.

You use multithreading. Give I/O, simulation, etc. each their own thread with a timer to get a fixed interval. The render loop runs in another thread, always taking the current state of the game data from the other threads at that moment.

In the engine I built for a shoot 'em up I use GCD for all that stuff.
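A minimal sketch of this decoupled approach, in Python threads rather than GCD (the class, names, and intervals are all illustrative, not from any actual engine): a simulation thread ticks at a fixed interval while the "render loop" reads whatever state is current.

```python
import threading
import time

class GameState:
    """Shared state: the simulation thread writes it, the renderer reads snapshots."""
    def __init__(self):
        self._lock = threading.Lock()
        self._tick = 0

    def advance(self):
        with self._lock:
            self._tick += 1        # stand-in for physics/AI/input updates

    def snapshot(self):
        with self._lock:
            return self._tick      # renderer takes a consistent copy

def simulate(state, stop, interval=1 / 120):
    # Fixed-interval simulation: game speed no longer depends on render speed.
    while not stop.is_set():
        state.advance()
        time.sleep(interval)

state, stop = GameState(), threading.Event()
threading.Thread(target=simulate, args=(state, stop), daemon=True).start()

frames = []
for _ in range(5):                 # the "render loop": draw whatever is current
    frames.append(state.snapshot())
    time.sleep(1 / 30)             # pretend one frame takes ~33 ms to render

stop.set()
print(frames)                      # simulation ticks keep advancing between frames
```

The lock around the state is exactly the synchronization cost mentioned below: the classic single-threaded loop doesn't need it, but here it's what lets the renderer read while the simulation writes.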
 

thejadedmonkey

macrumors G3
May 28, 2005
9,180
3,324
Pennsylvania
You use multithreading. Give I/O, simulation, etc. each their own thread with a timer to get a fixed interval. The render loop runs in another thread, always taking the current state of the game data from the other threads at that moment.

In the engine I built for a shoot 'em up I use GCD for all that stuff.

Cool, next time I make a game I'll give it a shot.
 

Dranix

macrumors 65816
Feb 26, 2011
1,063
543
left the forum
Threads and GCD have the big advantage that multicore systems are better utilized and the game speed is not tied to the speed of GPU rendering.

But I won't lie: you have to write some synchronization code that isn't needed with the classic game loop from the '80s.
 

Erko

macrumors member
Aug 12, 2011
73
0
Estonia
EDIT - I made a typo first time around, 25fps is an update every 40ms, not 25ms.

Have you got a link for that? I'm not saying you're wrong, but a rule of thumb is that 50ms of exposure is required for the brain to process an image which would be 20 frames per second.
Have you ever gamed? There is a notable difference between 60 and 30 fps, easily distinguishable by the human eye. For example, Call of Duty runs at ca. 60 fps on consoles, as that's the target; in practice it's probably somewhere between 40-60. On the other hand, Battlefield 3 was developed with a target of 30 fps, and it's v-synced so it won't go above that. Prior to playing BF3 I had played Modern Warfare 2 for a fair amount of time, and when I started BF3 something seemed off, but I couldn't quite tell what. It didn't have the same smoothness MW2 had, due to the difference in framerate.
I hope this helps clarify some things. In console games, developers have to decide between more visual fidelity with a lower framerate, like BF3, or less visual detail with a higher fps, like COD.
 

Dranix

macrumors 65816
Feb 26, 2011
1,063
543
left the forum
It's not the difference in fps you notice, it's the difference in simulation and I/O speed that comes from the insane single-threaded game loop...
 

Erko

macrumors member
Aug 12, 2011
73
0
Estonia
It's not the difference in fps you notice, it's the difference in simulation and I/O speed that comes from the insane single-threaded game loop...
Yes you do. Have you ever compared a 60 fps video to a 30 fps video?
First link from Google: http://boallen.com/fps-compare.html
Yes, there is a difference in I/O as well, but saying that you don't notice the fps difference is BS. Did you see The Hobbit in 48fps? It was sooo smooth compared to a normal movie.
Today I tried 3D on a new Samsung TV, and the flickering in the glasses was quite noticeable and drove me crazy, as it occurs only 24 times per second.
 