And what about the Vega 20? Can anybody who has it check this? Because this changes the situation completely when comparing to Nvidia.
There is a question whether iStat Menus is reporting just GPU power, WITHOUT GDDR5 power.

Vega can have a 45W power limit for the whole package. The four memory chips on the Radeon Pro 555 will consume up to 16W of power (4W each). If so, then there is certain logic to this situation.

Just for the record: in EFI, the BIOSes of the GPUs do not report anything apart from GPU voltages, amperages, and power limits. If Vega's design counts GPU power as a whole package, contrary to GDDR5/6 GPUs, then it makes sense.
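
A quick back-of-the-envelope check of that reasoning; all figures are the assumptions from above (45W package limit, ~4W per GDDR5 chip), not measurements:

[CODE]
# Back-of-the-envelope check of the reasoning above. All figures are the
# thread's assumptions, not measured values.

GDDR5_CHIPS = 4                # memory chips on the Radeon Pro 555
WATTS_PER_CHIP = 4             # assumed worst-case draw per GDDR5 chip
VEGA_PACKAGE_LIMIT_W = 45      # assumed whole-package power limit for Vega

memory_w = GDDR5_CHIPS * WATTS_PER_CHIP              # ~16 W, invisible to a core-only sensor
vega_core_share_w = VEGA_PACKAGE_LIMIT_W - memory_w  # ~29 W left for the core if memory is counted

print(f"GDDR5 memory not counted by a core-only reading: ~{memory_w} W")
print(f"Core share of a 45 W package reading: ~{vega_core_share_w} W")
[/CODE]

If that holds, a ~29W core-only reading on a GDDR5 part and a 45W package reading on Vega would describe roughly the same class of chip.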
 
That's actually simple to verify: compare total system power draw when the GPU is loaded, if somebody could replicate what we did here:

Thoughts and Tests on the 2.2GHz MacBook Pro

When comparing the 555X and 560X: run Heaven 4 times and take a screenshot of the iStat Menus sensor page at the 255-second mark of the last run. Ideally lock the fans to 4600 RPM (left) and 4200 RPM (right).
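
If anyone wants to replicate that, here is a minimal sketch of the screenshot-timing part only; the fan lock still has to be done by hand in a fan-control utility, and the output file name is just a placeholder:

[CODE]
# Start this at the same moment you start the last Heaven run: it waits for
# the 255-second mark, then grabs a screenshot with macOS's built-in
# `screencapture`. Keep the iStat Menus sensor page visible on screen.
import subprocess
import time

MARK_SECONDS = 255                      # timestamp from the test procedure
OUTPUT = "heaven_run4_sensors.png"      # placeholder file name

time.sleep(MARK_SECONDS)
# -x suppresses the shutter sound.
subprocess.run(["screencapture", "-x", OUTPUT], check=True)
print(f"Saved {OUTPUT} at the {MARK_SECONDS}s mark")
[/CODE]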
 
Radeon High Side. So it reports only VRM power.

There are separate VRMs on the board for the GPU and for the GDDR5. It is the same for all hardware, desktop or laptop: GPUs have GPU VRMs separate from memory VRMs, and on the CPU side, CPU VRMs are separate from RAM VRMs.

In general, the power section for memory is a laugh compared to CPUs (the highest power draw of GDDR5 memory ever reported was on the Radeon R9 390X, with around 65W drawn).

If Vega Pro 20 is considered an SoC, it will have a higher Radeon High Side power draw because of the unified design of the memory subsystem with the GPU.

45W TDP. It is quite interesting, after all.
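
To make the two layouts above concrete, here is a toy model of what a core-side sensor would report in each case; the wattages are placeholders, not measurements:

[CODE]
# Toy model of the two board layouts described above: if "Radeon High Side"
# sits on the GPU core VRM only, memory draw never shows up in the reading;
# if the memory shares the package (HBM2 on Vega), the same total reads higher.

def radeon_high_side_w(core_w: float, memory_w: float, unified_package: bool) -> float:
    """Power a core-side sensor would report for a given layout."""
    return core_w + memory_w if unified_package else core_w

print(radeon_high_side_w(core_w=29, memory_w=16, unified_package=False))  # GDDR5 part: 29
print(radeon_high_side_w(core_w=29, memory_w=16, unified_package=True))   # Vega/HBM2: 45
[/CODE]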
 
So what’s the verdict, Vega or no Vega? I’m ready to pull the trigger before the Adorama sale ends tomorrow.
 
And this was Vega 16. So what about Vega 20? 60W?

The delta between the 555X and 560X is 5W of TDP and 100MHz of core speed. So given that the core speed delta between the Vega 16 and Vega 20 is also 100MHz, I'd guess that it'll be 50W (worked through in the sketch below).

Do you know if there's a way we can limit the GPU TDP in software like we can do on the CPU?
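
For what that 50W guess is worth, here is the extrapolation spelled out; the 45W Vega 16 figure and the 100MHz clock delta are assumptions taken from this thread:

[CODE]
# Linear extrapolation of the guess above: the 555X -> 560X delta is
# +5 W per +100 MHz, applied to an assumed 100 MHz Vega 16 -> Vega 20 bump.

WATTS_PER_MHZ = 5 / 100        # from the 555X/560X delta
VEGA16_TDP_W = 45              # Vega 16 figure quoted earlier in the thread
CLOCK_DELTA_MHZ = 100          # assumed Vega 16 -> Vega 20 core clock delta

vega20_guess_w = VEGA16_TDP_W + WATTS_PER_MHZ * CLOCK_DELTA_MHZ
print(f"Guessed Vega 20 TDP: ~{vega20_guess_w:.0f} W")   # ~50 W
[/CODE]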
 
On the Mac Pro you can modify the power table in the AMD kext to get whatever you want, but no idea if that's doable on the MBP.
 
And this was Vega 16. So what about Vega 20? 60W?
Both the Radeon 555 and 560, from what I remember of their BIOSes, have had the same TDP power limit set in the BIOS. That's how Apple does this; don't ask me why.

I don't believe it would be different for Vega GPUs.
 
I've been tracking the Vega performance differences in Geekbench.

i7/V20:
https://browser.geekbench.com/v4/compute/search?dir=desc&q=AMD+Radeon+Pro+Vega+20+8850H&sort=score

i9/V20:
https://browser.geekbench.com/v4/compute/search?dir=desc&q=AMD+Radeon+Pro+Vega+20+8950HK&sort=score

Combined V20:
https://browser.geekbench.com/v4/compute/search?dir=desc&q=AMD+Radeon+Pro+Vega+20&sort=score

V16:
https://browser.geekbench.com/v4/compute/search?dir=desc&q=AMD+Radeon+Pro+Vega+16&sort=score


Some interesting tidbits:
  • Not a single person who ran a test with the i9 had 16GB RAM; everyone had 32GB.
  • The top 18 scores on the i7 had 32GB RAM; the extra RAM definitely appears to make the test run faster.
  • i7/16GB top score was 72287
  • i7 scored in the top 8 out of the combined results
  • Based on the time stamps, only 4 unique people (I'm looking at you, @ashcairo) have tested the V16, versus 100+ results for the V20.
Interesting results indeed.
 
The AMD drivers never get updated in Boot Camp; it's still on last year's 17.10. Why don't they update them? It makes the Mac feel old because of the old drivers.
 
It's still accurate. 200 fps on a 60Hz monitor is still far more responsive to input than 60, 70, 80, 100Hz, etc.
And still, we are talking about milliseconds here. An average gamer won't see a difference; the same discussion is had about 30 vs 60 fps. Some people claim they do not see any difference, other people do. I notice a difference between 30 and 60 fps, but I'd never say I lose in Overwatch or CS:GO because of input lag.
 
How many ms are we talking? I know some games that rely on one-frame input links (giving only 16ms, based on a 60 FPS game lock); a few ms can completely put you off making those links if you're a competitive gamer.

Most of us probably game on IPS panels, but if you were to play competitively, you'd have to go with TN panels or CRTs.
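
To put numbers on that one-frame window (and on the earlier point about 200 fps on a 60Hz panel), a quick frame-time calculation; it ignores engine, OS and display processing, so treat it as a rough sketch:

[CODE]
# Rough frame-time arithmetic for the milliseconds discussion above.

def frame_time_ms(rate_hz: float) -> float:
    return 1000.0 / rate_hz

for hz in (30, 60, 120, 144, 240):
    print(f"{hz:>3} Hz/fps -> {frame_time_ms(hz):5.1f} ms per frame")

# Why 200 fps can still help on a 60 Hz panel: the frame handed to the display
# at each refresh was rendered at most ~5 ms ago (1000/200) instead of up to
# ~16.7 ms ago (1000/60) when rendering is locked to the refresh rate.
print(f"Worst-case frame age at  60 fps: {frame_time_ms(60):.1f} ms")
print(f"Worst-case frame age at 200 fps: {frame_time_ms(200):.1f} ms")
[/CODE]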
 
It's still accurate. 200 fps on a 60Hz monitor is still far more responsive to input than 60, 70, 80, 100Hz, etc.

This has nothing to do with FPS though, and everything to do with bad game design and coding. Input processing should be separate from the rendering loop in a well-designed system. Which of course doesn't change the end result, unfortunately.
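
A minimal sketch of what separating input from the rendering loop looks like in practice; this is the generic fixed-timestep pattern, not any particular engine's code:

[CODE]
# Generic fixed-timestep loop: input and simulation run at a fixed rate while
# rendering runs at whatever rate the GPU manages, so a slow renderer costs
# visual smoothness rather than input sampling rate.
import time

SIM_DT = 1.0 / 120.0            # fixed input/simulation step (assumed 120 Hz)

def poll_input(): ...           # read keyboard/mouse state
def simulate(dt): ...           # advance game state by dt seconds
def render(): ...               # draw the most recent game state

def run(duration_s: float = 5.0) -> None:
    accumulator = 0.0
    previous = time.perf_counter()
    deadline = previous + duration_s
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= SIM_DT:   # consume elapsed time in fixed steps
            poll_input()
            simulate(SIM_DT)
            accumulator -= SIM_DT
        render()                       # frame rate is independent of SIM_DT
[/CODE]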
 
And still, we are talking about milliseconds here. An average gamer won't see a difference; the same discussion is had about 30 vs 60 fps. Some people claim they do not see any difference, other people do. I notice a difference between 30 and 60 fps, but I'd never say I lose in Overwatch or CS:GO because of input lag.

I can't believe you're calling yourself a gamer yet implying milliseconds don't matter. These things don't matter to most people, but they matter to competitive gamers. Ask anyone who played CS since the beta or Quake 3.
 
All of Apple's GPUs will be hindered by drivers under Windows.

I am completely staggered that Apple uses Crimson 17.12 on Windows.

Absolutely horrible.
 
I can't believe you're calling yourself a gamer yet implying milliseconds don't matter. These things don't matter to most people, but they matter to competitive gamers. Ask anyone who played CS since the beta or Quake 3.
I said "the average gamer". Well, if we're all playing at ESL level in these forums, I stand corrected :)
 
I will contact AMD support when I get mine if this is the case.
AMD really doesn't care; it's an Apple computer, and Apple is responsible for providing the drivers.

How many ms are we talking? I know some games that rely on one-frame input links (giving only 16ms, based on a 60 FPS game lock); a few ms can completely put you off making those links if you're a competitive gamer.

Most of us probably game on IPS panels, but if you were to play competitively, you'd have to go with TN panels or CRTs.
Not really; there are 120Hz IPS gaming monitors. I recently bought an Acer Predator X34P and I'm waiting for delivery within a few days.
 
Yes, we "casual" gamers who like to be competitive will buy monitors like the Predator or the ASUS 144Hz IPS monitors, and at large screen sizes of 27"-34". I am talking about competitive gamers who play tournaments; none of them ever use these monitors, usually going for a 22-24" TN 144-240+Hz monitor depending on the game. Even for MOBAs like DOTA 2, they don't use the high-refresh-rate IPS monitors.
 
All of Apple's GPUs will be hindered by drivers under Windows.

I am completely staggered that Apple uses Crimson 17.12 on Windows.

Absolutely horrible.
Yeah, the reason gaming etc. isn't as good as it should be is because of the drivers; Apple needs to update them. I know there are hacks like third-party Boot Camp drivers, but I don't trust them.
 