How to benchmark Epic Games' Unreal Tournament

Discussion in 'Mac and PC Games' started by Caesar_091, May 2, 2017.

  1. Caesar_091 macrumors regular

    Caesar_091

    Joined:
    Jan 18, 2005
    Location:
    Italy
    #1
    I would like to run some benchmarks with this game, since it looks like the RX480 is not performing well at all with Epic settings at full HD resolution. How do I run them?


    TIA
     
  2. gkarris macrumors 604

    gkarris

    Joined:
    Dec 31, 2004
    Location:
    "No escape from Reality..."
    #2
    I'm running it on High 1080p on an RX470 and it seems fine - locked at 60fps...
     
  3. Irishman macrumors 68030

    Joined:
    Nov 2, 2006
    #3

    You can use the macOS frame counter Count It, but you have to turn off Apple's SIP to do so.

    http://www.macgamerhq.com/count-it-mac-frame-rate/
     
  4. Janichsan, May 5, 2017
    Last edited: May 6, 2017

    Janichsan macrumors 65816

    Janichsan

    Joined:
    Oct 23, 2006
    #4
    Out of curiosity: has the Mac version also been missing crucial parts of the user interface for you for a while now?

    For comparison:

    Mac version

    [image]

    Windows version (the shot is a bit older, but the problem is still essentially the same)

    [image]
     
  5. gkarris macrumors 604

    gkarris

    Joined:
    Dec 31, 2004
    Location:
    "No escape from Reality..."
    #5
    Unlocked I'm getting 95fps/ave.

    btw - what system are you running it on?
     
  6. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #6
    Apparently UT uses Metal. Curiously, the Mac Unreal Engine developer (Mark) said here that Metal games were forced to use V-sync. But UT can run above 60 fps, so I'm confused. :confused:
     
  7. marksatt macrumors regular

    Joined:
    Jun 26, 2013
    Location:
    Epic UK
    #7
    Metal uses CoreAnimation to display and CoreAnimation supports updating at up to 120fps but is always v-sync'd. On 60Hz monitors some frames will be dropped by CoreAnimation (otherwise it'd run at half-speed!) so our frame counter will say 120fps etc. but the monitor still only shows you 60 frames each second.
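    A rough sketch of that bookkeeping, assuming a fixed-refresh 60 Hz panel (the helper names are invented for illustration):

```python
# On a fixed-refresh monitor, CoreAnimation can show at most refresh_hz
# frames per second; anything rendered beyond that is silently dropped,
# which is why an in-game counter can read higher than the display rate.
def displayed_fps(rendered_fps, refresh_hz=60):
    """Frames per second that actually reach the monitor."""
    return min(rendered_fps, refresh_hz)

def dropped_fps(rendered_fps, refresh_hz=60):
    """Frames rendered each second that are never shown."""
    return max(0, rendered_fps - refresh_hz)
```

    So with a counter reading 120 fps on a 60 Hz panel, half of the rendered frames are never shown.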
    --- Post Merged, May 6, 2017 ---
    What has happened there? Get yourself onto the UT forum and report this so somebody fixes it...
    --- Post Merged, May 6, 2017 ---
    Well now that UT is running Metal it won't work for UT - but you shouldn't need it as UT has its own frame counter.
     
  8. jeanlain, May 6, 2017
    Last edited: May 6, 2017

    jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #8
    Thanks.
    So why is there a VSync setting in UT (I don't have the game, but it looks like there is one)? I've heard that V-Sync increases response time, which is the main reason why pro players turn it off. Does turning V-Sync off have any benefit for Metal games at all?
    I also see that GFXBench Metal can measure frame rates above 120 fps in "onscreen" mode. How is that possible?
    Finally, do you know how the "adaptive refresh rate" of the latest MacBook Pros (as Apple says) affects games?
    Lots of questions I know. :)
     
  9. marksatt macrumors regular

    Joined:
    Jun 26, 2013
    Location:
    Epic UK
    #9
    UT is cross-platform, so the v-sync option makes sense on Windows, or even back when using Mac OpenGL, because both supported toggling it. Metal's just a bit of an outlier and no-one's hidden the option.

    V-Sync ensures that display of the game is synchronised with the refresh rate of the physical monitor to avoid tearing, but this can reduce frame rate if you aren't locked at or above the refresh rate. It is a tradeoff between visual artefacts and apparent performance. For CoreAnimation, which renders most of the UI, it is essential for V-Sync to be enabled, as you don't want to see tearing while moving or interacting with your windows & apps.

    I doubt the display is showing you more than 120 frames - most monitors use a 60Hz refresh rate. Metal's display API allows you to buffer a number of frames before it stalls so there's scope for CoreAnimation to just not present the output of any intermediate frames it can't actually display.

    Don't have one of these so I genuinely don't have a clue.
     
  10. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #10
    Thanks Mark, but I was asking about the benefit of turning V-Sync OFF in Metal games. Are the additional non-displayed frames just wasted computation, or do they add some granularity to the gameplay, e.g. allowing the player to react quicker? I suppose not, because if it were possible, games would implement that solution: allow minimum response time without screen tearing.

    I'm not quite sure why Apple decided to force V-Sync in Metal just because Core Animation renders most of the UI. There's the window compositor (which uses Quartz Extreme) and there's what happens within windows. OpenGL used to power both 3D games and the window compositor, but you could still have a windowed game without V-Sync interacting with other windows. This didn't cause screen tearing of the whole desktop.

    And yes, in GFXBench Metal, there is no screen tearing so no more than 60 fps show on screen. But some tests can measure up to 500 fps and more.
     
  11. marksatt macrumors regular

    Joined:
    Jun 26, 2013
    Location:
    Epic UK
    #11
    Quite simply you can't. The V-Sync option in UT does nothing on Metal - nothing at all.

    Well if you are running at more than the display refresh rate then you were *already* rendering frames that were never displayed and could be considered wasted - that's why there's a frame-rate cap in the UT UI ;)

    The frames that are never displayed have to be generated because you can't know when you start the frame that you'll miss V-Sync and not be displayed. That's just how almost all game engines work, such frames don't typically add anything visually but obviously all the CPU work they do to update game state absolutely *is* necessary for the game to function...

    On iOS all APIs have always enforced V-Sync - even with OpenGL - so there was no disadvantage to using CoreAnimation to present Metal to the screen. You *really* don't want screen tearing on a small screen so close to your eyes - it would be immediately noticeable.

    Obviously when Metal came over to the Mac it was replacing a legacy API (OpenGL) that could present frames directly to the compositor, so not being able to disable V-Sync is a missing option.

    That's really the point: OpenGL offered display APIs that interacted with Quartz Compositor directly, so could disable V-Sync, Metal doesn't. With the exception of games there's no reason to disable V-Sync - all other users will want V-Sync enabled to avoid tearing.

    Normally when you enable V-Sync in an API the "Present" function will block the caller to synchronise the display with V-Sync. How much depends on the buffering mode (unbuffered, double-buffered, triple-buffered) and how fast the game is going vs. the display refresh. This can mean that if the game drops beneath 60fps on a 60Hz monitor it can find itself being blocked such that it runs at 30fps: at < 60fps the "Present" call misses a V-Blank event and has to wait for the next one. If you miss by only a small amount then you effectively wait 2 display frames...
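    A minimal sketch of that quantisation, assuming a blocking Present on a 60 Hz display (the function name is invented for illustration):

```python
import math

def vsync_fps(frame_ms, refresh_hz=60):
    """Effective frame rate when Present blocks until the next v-blank.

    A frame that misses a v-blank by even a little must wait for the
    following one, so rates snap to refresh_hz / 1, / 2, / 3, ...
    """
    vblank_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(frame_ms / vblank_ms))
    return refresh_hz / intervals
```

    So a 16 ms frame holds 60 fps, but a 17 ms frame — missing v-blank by a fraction — drops all the way to 30 fps.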

    Metal deals with this a bit differently: rather than directly telling the system when to present, you render into a Drawable, which provides you with a texture from an internal pool. You just tell CoreAnimation when this texture is ready to present and CoreAnimation decides when to actually present it. Provided your application & CoreAnimation have finished with at least one previous frame before you come around to render another, you'll never block the CPU. What you do see when you are <60fps is some stuttering, as the frame-pacing looks a bit "wrong" to your eyes. Turning off V-Sync wouldn't make the frame-rate better and would probably tear, but you wouldn't feel the game is stuttering.
     
  12. jeanlain, May 7, 2017
    Last edited: May 7, 2017

    jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #12
    Ok. Does it mean that a Metal game can run at any frame rate between 60 and 30 fps? Because I know some OpenGL games where it's either 30 fps (or less) or 60 (with V-Sync ON of course), which isn't great. I've noticed that Source games (including Source 2) behave like this on nVidia GPUs (OS X). But on AMD cards frame rates can take any value between 30 and 60. Not sure why.
    --- Post Merged, May 7, 2017 ---
    Which means that if a Metal game engine can generate much more than 60 fps, forced V-Sync does not reduce power consumption and does not increase battery life?
    OpenGL V-Sync does. At least I hear the fans slowing down when I enable V-Sync.

    EDIT: Ok, I'm starting to understand. With triple buffering, the game still goes as fast as possible and you don't increase battery life by enabling V-Sync. I suppose that Core Animation has the same effect as triple buffering, even if it works differently.
    One question though. Is it equivalent to DirectX "render ahead", which, as I understand it, may induce some lag?
     
  13. Irishman macrumors 68030

    Joined:
    Nov 2, 2006
    #13


    Mark,

    How do you find that frame counter? I can't seem to locate it.
     
  14. marksatt macrumors regular

    Joined:
    Jun 26, 2013
    Location:
    Epic UK
    #14
    There used to be an official one, but not anymore. Instead use '~' to bring down the console and 'stat unit' (no quotes obviously) to show the frame times.
    --- Post Merged, May 7, 2017 ---
    I bet the OpenGL implementation is vendor specific but Metal's is at the system level.

    Correct, frame-pacing is now the responsibility of the application. If you want to only render 60 fps then the application needs to pace itself as V-Sync won't do it for you.
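    A minimal sketch of such application-side pacing, assuming presentation never blocks (the names are invented for illustration):

```python
import time

def run_capped(render_frame, cap_fps=60, seconds=0.25):
    """Toy frame limiter: since Present won't block, the app sleeps off
    whatever is left of each frame's time budget to hold the cap."""
    budget = 1.0 / cap_fps      # seconds allowed per frame
    frames = 0
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        start = time.monotonic()
        render_frame()
        elapsed = time.monotonic() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # pace ourselves; V-Sync won't
        frames += 1
    return frames
```

    Without the sleep, the loop would spin as fast as the hardware allows, rendering frames that are never displayed.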

    Double & triple buffering just mean you can have either 1 or 2 frames between you and the display, reducing the likelihood of stalling at the expense of memory.

    I don't think Metal Drawable present is just render-ahead, i.e. a fixed swap chain where each frame must be displayed and will block if no element is available for the next frame to use. Since some applications seem to think they can render at very high frame rates (e.g. GFXBench's Onscreen test), my supposition is that they actually drop some late frames, like in this answer about D3D Render Ahead.
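    A minimal sketch of that supposed drop-late-frames behaviour, as opposed to a fixed swap chain that displays every frame (names and timings invented for illustration):

```python
from collections import deque

def present_drop_late(frame_ready_times, vblank_ms=1000 / 60):
    """At each v-blank, show the newest frame that is ready and discard
    any older queued ones. A fixed swap chain would instead display
    every frame in order and block the renderer when the queue filled."""
    shown = []
    pending = deque(sorted(frame_ready_times))  # when each frame finishes (ms)
    t = 0.0
    while pending:
        t += vblank_ms                # advance to the next v-blank
        newest = None
        while pending and pending[0] <= t:
            newest = pending.popleft()  # older ready frames are dropped
        if newest is not None:
            shown.append(newest)
    return shown
```

    E.g. frames finishing at 2, 4, 6, 8, 18 and 20 ms on a 60 Hz display show only the 8 ms and 20 ms frames — the rest were rendered but never presented.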
     
  15. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #15
    Interestingly, Sierra may have changed a few things in respect to frame rate limits compared to El Cap. Apparently, GFXBench could not go faster than 120fps onscreen before Sierra. https://forums.macrumors.com/threads/looks-like-metal-got-an-update.1977661/#post-23016160
     
  16. Irishman macrumors 68030

    Joined:
    Nov 2, 2006
    #16
    Thanks, Mark! :)

    But could you help me translate from ms per frame to frames per second?
     
  17. marksatt macrumors regular

    Joined:
    Jun 26, 2013
    Location:
    Epic UK
    #17
    Yeah, either GFXBench changed something or Sierra did, and I can believe either.
    --- Post Merged, May 7, 2017 ---
    You could also try 'stat fps' which is what you really wanted ;)

    The 'stat unit' numbers are times in milliseconds for the frame as a whole and on each thread, so 33.3 would be 30fps, 16.6 is 60fps etc.
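    The conversion is just fps = 1000 / milliseconds-per-frame; as a one-line sketch:

```python
def ms_to_fps(frame_ms):
    """Convert a 'stat unit' frame time in milliseconds to frames per second."""
    return 1000.0 / frame_ms
```

    So 33.3 ms works out to roughly 30 fps and 16.6 ms to roughly 60 fps, matching the figures above.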
     
  18. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #18
    That'd be Sierra since the poster who did the comparison said they used the same GFXBench executable.
     
  19. Janichsan macrumors 65816

    Janichsan

    Joined:
    Oct 23, 2006
    #19
    You tell me. :p I posted this on the UT forum, and it apparently has already been reported back in January.
     