Why run a game at a higher FPS than the monitor's refresh rate?

I've seen lots of gamers talking about running their games at frame rates of 100-200 FPS. Is there any point to this when monitors only refresh at 50, 60, or 75 Hz?

Or is it that Windows, being rubbish at real-time work, may delay some frames, so you want to have spares ready?

Is there an optimum relationship between screen refresh rate and game FPS, and does it vary between DVI and VGA?

Hope this gets you thinking.

3 Answers

  • 1 decade ago
    Favorite Answer

    As far as I'm aware, you are correct.

    It doesn't matter what your frame rate is; what actually gets shown is limited by the refresh rate of the monitor.

    FPS = Frames per second

    Hz = Cycles per second.

    FPS and Hz are basically the same kind of measure; in this case they both mean screen updates per second.

    The optimum ratio is 1:1: the graphics card sends one frame, that gets displayed on screen, then it sends another frame, and so on. If the game runs at 90 FPS and the refresh rate is 60 Hz, the monitor simply can't display all 90 frames. Either 30 frames per second get dropped, or the screen ends up showing parts of two different frames during a single refresh. That second case is screen tearing, which is pretty annoying.

    There's a detailed explanation of screen tearing here: http://en.wikipedia.org/wiki/Screen_tearing

    There is a way of syncing the FPS with the refresh rate, called V-Sync, and most games have it as an option (there's a rough frame-cap sketch at the end of this answer). So if your monitor is 60 Hz, it will try to run the game at 60 FPS. Using V-Sync prevents tearing and so improves quality, though it can add a little input delay compared with running uncapped.

    There's an explanation of V-Sync here: http://en.wikipedia.org/wiki/Vertical_synchronizat...

    There's no difference between DVI and VGA in this respect.
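
    To picture that 1:1 idea, here's a minimal sketch of a software frame cap in Python. It isn't real V-Sync (that's handled by the driver and GPU); REFRESH_HZ and update_and_render are made-up placeholders for this example only.

        import time

        REFRESH_HZ = 60                  # assumed monitor refresh rate
        FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh

        def update_and_render():
            """Placeholder for the game's real per-frame work."""
            pass

        start = time.perf_counter()
        frames = 0
        while time.perf_counter() - start < 2.0:   # run for about 2 seconds
            frame_start = time.perf_counter()
            update_and_render()
            # Sleep away whatever is left of the frame budget, so we never
            # produce more frames than the monitor can actually show.
            elapsed = time.perf_counter() - frame_start
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)
            frames += 1

        print(f"rendered roughly {frames / 2.0:.0f} frames per second")

    Run it and the printed rate sits near 60, no matter how fast update_and_render could go uncapped.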

  • Anonymous
    5 years ago

    Techno geeks get over-excited attempting to get a hundred FPS. Motion pictures run at 24 FPS, television and full-motion video refresh at 30 FPS, and your brain can't process motion much faster than that. Basically, anybody claiming to tell the difference between 50 FPS and 80 FPS is the same person who will claim they can tell the difference between 24-bit colour and 32-bit colour. The reason really high FPS numbers matter is that they're the only way for the graphics card makers to demonstrate their capability and future headroom for more demanding games.

  • Anonymous
    1 decade ago

    Basically, if the game only rendered at something like 10 FPS, it wouldn't matter if you had a 1000 Hz monitor; the game would still lag.

    FPS is how many times the game updates per second.

    Hz is how many times the monitor updates per second. (I think)

    To get the smoothest gameplay, your game's FPS should sit around the monitor's refresh rate.

    Say the game ran at 100 FPS and your monitor was 50 Hz: the game would look as if it were running at 50 FPS, because that's all the monitor can show (see the little snippet below).

    Bear in mind I could be completely wrong lol, it's just what I think.
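
    As a rough rule of thumb (ignoring tearing and uneven frame pacing), the rate you actually see is whichever is lower, the game's FPS or the monitor's refresh rate. A tiny illustrative Python snippet, where displayed_fps is just a made-up helper for this answer:

        def displayed_fps(game_fps: float, refresh_hz: float) -> float:
            # The screen can only show as many frames as it refreshes,
            # so the visible rate is whichever number is smaller.
            return min(game_fps, refresh_hz)

        print(displayed_fps(100, 50))  # -> 50, matching the example above
        print(displayed_fps(40, 60))   # -> 40, here the game itself is the bottleneck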
