You mean rendering more frames per second than your monitor actually can display?
You mean in doing so, producing more unnecessary heat and strain on every component in your computer?
You mean introducing screen tearing which does not look nice at all?
Then yes - you are right, you are "getting the most out of your hardware".

V-Sync affects in-game performance; triple buffering may help. You're better off with G-Sync, or better still a stronger GPU. My primary notebook has a full-fat GTX 1070, so even in the TDP-down silent mode it easily breaks 60 FPS; at full bore it can easily surpass three figures, right up to the display's limit of 144 Hz, although I prefer a cooler and quieter experience at around 60 FPS. Not big into gaming, just casually if the mood takes me.

Gaming on a Mac is certainly possible, equally it's not the optimal experience. The new Vega is very much a welcome addition, although Apple being Apple is once again overly greedy, as Vega should be the default dGPU in 2018. That said, if people keep paying, Apple will continue to "milk" the customer for all they're worth, simple as that...

Q-6
 
I'm unfamiliar with G-Sync, but the power of a video card doesn't really matter. In the past, all nVidia cards were horrible at managing screen tearing. They would push out as many frames as possible to get an average benchmark figure that wasn't a whole number (not matching the screen refresh rate), but over a large number of frames they would have an advantage. ATI tended not to do that much, so V-Sync (vertical sync) didn't do anything. nVidia would simply drop to the correct frame rate for the refresh rate, and their benchmarks would go down along with that, but not by much at all.

G-Sync, I'm completely unfamiliar with. Anyone have any experience with what that is?
 
I saw that. I’m more concerned that the Vega may not live up to what I need and I’ll have to exchange. If I order from Adorama though I’ll save $500 versus ordering from Apple

If the Vega won't, no laptop in that form factor will. You will need an iMac Pro then.
 
No, right from the article:

"Because VSync makes frames wait for when the monitor is ready, this can cause problems. You may find that your inputs, such as key-presses and mouse clicks, are slightly delayed. This can be fatal in games that require reflex and snap reactions to play. There are some technologies developed for VSync to help reduce this lag, but it’s worth keeping in mind if you enable VSync and notice your actions are less responsive than before."
True, and 1000 ms equal one second. Divide that by 60 frames (because everything happens only at these 60 frames per second) and you get 16.67 milliseconds. ESL players claim that they notice that input lag when playing FPS games, which are correctly described above as "games that require reflex and snap reactions". Just try it!
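A quick back-of-the-envelope check of that number, as a minimal Python sketch (the refresh rates are just example values):

```python
# Worst-case extra wait VSync can add to a single frame: one full refresh interval.
def frame_time_ms(refresh_hz: float) -> float:
    """Time between two display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):           # example refresh rates, not measurements
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per refresh")

# 60 Hz -> 16.67 ms, the input-lag figure quoted above.
```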
 
That happens if your frame rate drops below the monitor's refresh rate; VSync makes it a lot worse then. What I do is modify the game settings to get a consistent 60 FPS+ and then turn VSync on. This results in the best experience, no jitter or screen tearing, silky smooth, even if it's at the cost of some eye candy.
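For illustration only, here is a rough Python sketch of the same idea done in software: a frame limiter that stops the machine from rendering frames a 60 Hz panel can never show (with VSync the display does this pacing for you; render_frame and the 60 FPS target are placeholders, not from any real engine):

```python
import time

TARGET_FPS = 60                    # matches the panel's refresh rate
FRAME_BUDGET = 1.0 / TARGET_FPS    # ~16.67 ms per frame

def render_frame():
    """Placeholder for the actual game/render work."""
    pass

for _ in range(TARGET_FPS * 10):   # run the loop for roughly ten seconds
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the frame budget instead of immediately
    # starting the next frame, so the hardware isn't kept at 100 % load
    # producing frames the display can't present anyway.
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```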

My frame rate has fallen below the refresh rate on some games if I set them to 4K with everything on ultra. The last Tomb Raider game does that on a 1080 Ti. Some games like CoD Advanced Warfare were tearing even above 60 FPS though.
 
My frame rate has fallen below the refresh rate on some games if I set them to 4K with everything on ultra. The last Tomb Raider game does that on a 1080 Ti. Some games like CoD Advanced Warfare were tearing even above 60 FPS though.

My desktop can't run any game at 4K at a good frame rate, so I doubt the Vega 20 will either. You need a top-of-the-line graphics card, and AMD just doesn't make one yet.
 
His scores show W3 @ 1200p medium settings averaging 57 fps
NBC scores show W3 @ 1080p high settings averaging 55 fps for the Vega M GH

For modern gaming it seems Vega is good enough to maintain 60 fps+ on medium settings at FHD. I'd say that's decent for a MBP, and it would satisfy the average user's needs, especially on the go.

Any word on battery life while gaming in said res/settings?
Thanks for the mention. I appreciate it.
 
Yes, that article is a nice, albeit very simplified, account of the topic, targeted at gamers without any technical knowledge or interest in graphics programming. In the end, their basic recommendation is exactly what I said: if your hardware can draw faster than the display can refresh, turn VSync on ;)

Nope, I'll go by what is recommended and turn it off, which is how I play games 90% of the time.
 
Nope, I'll go by what is recommended and turn it off, which is how I play games 90% of the time.
You do know that this is actually what is responsible for the huge load on your system, and that you would stress out another, much newer system in exactly the same way, right?

If your current system is capable of, let's say, 70 fps with VSync off, and assuming you have a screen with a 60 Hz refresh rate, there will be no discernible difference if you get a new machine that can go up to 200 fps with VSync off. The weak link is your screen in that regard, and your CPU and GPU (the old ones as well as the new ones) will be pushed until one of the two units is at 100 % load. That being said, I'll bow out of this discussion now, as you've made it pretty clear that you are not using VSync either way.
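To put numbers on that point, a toy Python sketch (the fps figures are the hypothetical ones from this post, not benchmarks):

```python
# With VSync off the GPU renders as fast as it can, but a 60 Hz panel
# still only presents 60 frames each second; everything beyond that is
# work (heat, load) the viewer never sees.
REFRESH_HZ = 60

def shown_fps(rendered_fps: int, refresh_hz: int = REFRESH_HZ) -> int:
    """Frames per second the display can actually present."""
    return min(rendered_fps, refresh_hz)

for gpu_fps in (70, 200):          # the old and the new machine from the example
    wasted = gpu_fps - shown_fps(gpu_fps)
    print(f"GPU renders {gpu_fps} fps -> display shows {shown_fps(gpu_fps)} fps "
          f"({wasted} fps of rendering never displayed)")
```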
 
You may want to contact Adorama and ask the question.

I saw that. I’m more concerned that the Vega may not live up to what I need and I’ll have to exchange. If I order from Adorama though I’ll save $500 versus ordering from Apple
 
This got a wee bit off topic. To bring it back on topic...I'm willing to pay someone $20 if they have the i7 Vega 20 that they are willing to do a thermal benchmark on.
 
You do know that this is actually what is responsible for the huge load on your system, and that you would stress out another, much newer system in exactly the same way, right?

If your current system is capable of, let's say, 70 fps with VSync off, and assuming you have a screen with a 60 Hz refresh rate, there will be no discernible difference if you get a new machine that can go up to 200 fps with VSync off. The weak link is your screen in that regard, and your CPU and GPU (the old ones as well as the new ones) will be pushed until one of the two units is at 100 % load. That being said, I'll bow out of this discussion now, as you've made it pretty clear that you are not using VSync either way.

I think v-sync got a bad rep due to:


i. Higher input lag
ii. Where the frame rate does go below 60 fps, say in some demanding part of a game, double-buffered VSync causes it to drop all the way down to 30 fps (a rough sketch of this follows below). There are some methods to improve that, such as triple buffering, although they may have their own side effects

I haven't followed these things closely enough to know how much of an issue they still are, but that is probably where the negativity stems from historically.
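Point ii is easy to see with a small Python sketch, assuming plain double-buffered VSync on a 60 Hz panel (the frame times are made-up examples):

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ   # ~16.67 ms between refreshes

def vsynced_fps(frame_time_ms: float) -> float:
    """Displayed fps when every finished frame must wait for the next refresh."""
    refreshes_waited = math.ceil(frame_time_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / refreshes_waited

for ft in (15.0, 18.0, 25.0, 34.0):         # example frame render times in ms
    print(f"{ft:5.1f} ms per frame -> {vsynced_fps(ft):.0f} fps on screen")

# A frame that takes 18 ms just misses the 16.67 ms deadline, so the rate
# snaps from 60 straight down to 30 fps instead of landing at ~55 fps.
```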
 
This got a wee bit off topic. To bring it back on topic...I'm willing to pay someone $20 if they have the i7 Vega 20 that they are willing to do a thermal benchmark on.

A bit off topic indeed :D Speaking of which: matd2100 of bootcampdrivers.com is still waiting for the Bootcamp Vega drivers!!! ;)
 
You do know that this is actually what is responsible for the huge load on your system, and that you would stress out another, much newer system in exactly the same way, right?

If your current system is capable of, let's say, 70 fps with VSync off, and assuming you have a screen with a 60 Hz refresh rate, there will be no discernible difference if you get a new machine that can go up to 200 fps with VSync off. The weak link is your screen in that regard, and your CPU and GPU (the old ones as well as the new ones) will be pushed until one of the two units is at 100 % load. That being said, I'll bow out of this discussion now, as you've made it pretty clear that you are not using VSync either way.

https://www.reddit.com/r/Games/comments/1doh5l/vsync_and_input_lag/
 
^ Saw this. What is interesting is that he noted 45 W on his Vega 16 unit, hence the battery life will be worse than the 555X, which only used 30 W.
 

If that really is true, then there would not be much left for the CPU, so it would throttle severely. Performance would suffer greatly, and in the end we wouldn't see the frame rates we have already seen on the Vega 20. Something isn't right. Isn't HBM2 way more energy efficient, so that the Vega 20/16 can fit within the 35-watt envelope?
 
^ Saw this. What is interesting is that he noted 45 W on his Vega 16 unit, hence the battery life will be worse than the 555X, which only used 30 W.

And how does he know that? What is he using to measure the power consumption of the GPU? (I didn't watch the video; I can't really be bothered to spend 20 minutes on this.)
 
And how does he know that? What is he using to measure the power consumption of the GPU? (I didn't watch the video; I can't really be bothered to spend 20 minutes on this.)

I am curious about this as well. It looked like he was using MSI Afterburner at one point. Does that one even work without proper drivers?

Do you think GPU-Z is working already? One could possibly find out the GPU's power consumption from there.
 