How would that work? The VRAM on the 560x is not part of the GPU package. Different GPU package, different heatsink.
To be fair, the XPS 15 has the VRAM under the GPU heatsink. I know since I had to bend the hell out of it and put in thicker thermal pads to get them to actually make contact :) That was on a 960M + GDDR5. Apple could have changed the design; highly unlikely, but plausible.
 
Sure, but the package is also different... The 560X package only includes the GPU itself; the VRAM is separate (you can see it in the first image, covered by the large heatsink that's not connected to the heatpipe). The Vega 12 package is an overall more compact GPU + HBM2 combination, and the heatsink has been enlarged to cover the entire package, meaning that the VRAM is also actively cooled now. But the size, arrangement and number of heat pipes are still the same, and that's what I mean. They obviously had to change the heatsink geometry to accommodate the new chip, but that is pretty much the full extent of the changes one can see in the photo.



How would that work? The VRAM on the 560x is not part of the GPU package. Different GPU package, different heatsink.
How would you explain the professor's Vega running 13 degrees cooler?
 
The image on the left is idle and the image on the right is after four runs of the Heaven benchmark. For the record, temperatures stabilized after the second run and did not change appreciably over the final two runs.
Intel Power Gadget right after the fourth run.
 

Attachments

  • Screen Shot 2018-12-03 at 8.44.25 AM.png (714.4 KB)
  • Screen Shot 2018-12-03 at 8.46.37 AM.png (60.8 KB)
How would you explain the professor's Vega running 13 degrees cooler?

Can't say anything about it since I don't know what the test conditions were. If it's with an external display attached, I'd speculate that the Vega Pro could be more efficient with its low power mode. If it's with the iGPU running, I'd speculate that the older laptop had defective cooling :)
 
The image on the left is idle and the image on the right is after four runs of the Heaven benchmark. For the record, temperatures stabilized after the second run and did not change appreciably over the final two runs.
Intel Power Gadget right after the fourth run.
Thank you so much. Do you have iStat Menus by any chance, to look at the wattages? It definitely runs hotter than the 560X; your fans are maxed out to maintain the same GPU temperature (they are around 4600/4200 left/right on the 560X), so it does indeed look like 45W that needs to be dissipated. I'm glad I didn't ask you to freeze the fans at the same RPM. Does anybody know if that "GPU diode" sensor is present on Polaris GPUs?
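For anyone who wants to log these numbers over a benchmark run instead of eyeballing iStat Menus, here is a minimal sketch that samples macOS's built-in powermetrics tool from Python. It assumes the smc and gpu_power samplers are available on your machine and that you run it with sudo; the keyword filter is only a guess at the interesting field names, so adjust it to whatever your system actually prints.

```python
#!/usr/bin/env python3
"""Rough GPU power / fan speed logger for a benchmark run.

Assumptions (adjust for your machine):
  * macOS `powermetrics` is present (it ships with the OS) and this
    script is run with sudo, since powermetrics needs root.
  * The `smc` and `gpu_power` samplers exist on your hardware; the
    field names in the output may differ, so the keyword filter below
    is just a starting point.
"""
import subprocess
import time

# Words that usually appear in the lines we care about (a guess, not a spec).
KEYWORDS = ("fan", "gpu", "power", "temperature")


def sample_once(duration_ms: int = 1000) -> list[str]:
    """Take one powermetrics sample and return the lines that look relevant."""
    out = subprocess.run(
        ["powermetrics", "--samplers", "smc,gpu_power",
         "-i", str(duration_ms), "-n", "1"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [ln.strip() for ln in out.splitlines()
            if any(k in ln.lower() for k in KEYWORDS)]


if __name__ == "__main__":
    # Print a filtered sample every 5 seconds while Heaven is running.
    for _ in range(60):
        for line in sample_once():
            print(line)
        print("-" * 40)
        time.sleep(5)
```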
 
I'd speculate that the older laptop had defective cooling :)
The only thing the service found with mine was that it ran "hotter than it was supposed to" (despite the flicker being very obvious visually).
The fans did ramp up much sooner than on my 2012 while doing similar tasks.

Still, wouldn't the larger heatsink allow more heat to dissipate through the enclosure while the Vega is inactive?
 
I will go purchase it now... standby.

Thank you so much. Do you have iStat Menus by any chance, to look at the wattages? It definitely runs hotter than the 560X; your fans are maxed out to maintain the same GPU temperature (they are around 4600/4200 left/right on the 560X), so it does indeed look like 45W that needs to be dissipated. I'm glad I didn't ask you to freeze the fans at the same RPM. Does anybody know if that "GPU diode" sensor is present on Polaris GPUs?
 
Still, wouldn't the larger heatsink allow more heat to dissipate through the enclosure while the Vega is inactive?

Not really in this design. The only larger component is the baseplate, but they made it larger to accommodate the VRAM; to transfer more heat, the die size would have to be bigger (I don't know how that compares to the RX 560X), and here you are also limited by the heatpipe size and the actual heatsink (the element at the end, with fins, attached to the fan), which look the same.
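One way to see why the baseplate alone doesn't buy much: at steady state the die temperature rise is roughly the dissipated power times the sum of the thermal resistances along the whole path (die → baseplate → heatpipe → fins → air). A quick sketch with completely made-up resistance values, just to show that improving one element barely moves the total when the heatpipes and fin stack stay the same:

```python
# Steady-state estimate: T_die ~ T_ambient + P * (sum of thermal resistances).
# Every number below is an invented placeholder, not a measured value for
# either the 560X or the Vega 20 cooler.
P_WATTS = 45.0      # power the cooler has to move (figure quoted in this thread)
T_AMBIENT = 30.0    # assumed air temperature inside the chassis, in C

resistances_c_per_w = {
    "die_to_baseplate": 0.15,
    "baseplate": 0.10,
    "heatpipe": 0.20,
    "fins_to_air": 0.55,   # usually the dominant term: fin area + fan airflow
}

t_die = T_AMBIENT + P_WATTS * sum(resistances_c_per_w.values())
print(f"estimated die temperature: {t_die:.0f} C")

# Halving only the baseplate resistance changes the total very little,
# because the other terms in the chain are untouched.
resistances_c_per_w["baseplate"] /= 2
t_die_better_plate = T_AMBIENT + P_WATTS * sum(resistances_c_per_w.values())
print(f"with a better baseplate:    {t_die_better_plate:.0f} C")
```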
I will go purchase it now... standby.
I just realized that iStat Menus from the App Store has some functions disabled compared to the version from their website.

Anyway, as a baseline, here is my 555X with fans at full blast after four runs of Heaven Extreme; screenshot taken at the 250s mark of the last run. Note: it looks like the fans on the Vega can go higher than on mine.

555xheaven.jpg
 
Where do you see CPU and GPU power dissipation?

Not really in this design. The only larger component is the baseplate, but they made it larger to accommodate the VRAM; to transfer more heat, the die size would have to be bigger (I don't know how that compares to the RX 560X), and here you are also limited by the heatpipe size and the actual heatsink (the element at the end, with fins, attached to the fan), which look the same.
I just realized that iStat Menus from the App Store has some functions disabled compared to the version from their website.

Anyway, as a baseline, here is my 555X with fans at full blast after four runs of Heaven Extreme; screenshot taken at the 250s mark of the last run. Note: it looks like the fans on the Vega can go higher than on mine.

View attachment 808131
 
Power dissipation is the correct term. I was looking for the GPU and CPU power dissipation, but I see now that iStat only calculates CPU power dissipation. I will run this test as soon as I get home from the university.

One thing I have noticed is that the Heaven benchmark stops running as soon as I click on the iStat sensor menu. Do you know of a way around this?

Joe

I think you meant power draw; you can see it on my screenshot, the watts for individual components and the totals.
 
Power dissipation is the correct term. I was looking for the GPU and CPU power dissipation, but I see now that iStat only calculates CPU power dissipation. I will run this test as soon as I get home from the university.

One thing I have noticed is that the Heaven benchmark stops running as soon as I click on the iStat sensor menu. Do you know of a way around this?

Joe
There is an "Always run" option in the settings; it doesn't have any effect in my case, my installation runs in the background regardless, but you can try it. And the GPU power is shown under "Radeon High Side".
About power consumption/dissipation: the values shown in the sensors are based on the current measured at the voltage regulators. That's the input power, and I don't know if it takes the VRM efficiency into account, so the actual GPU power dissipation may be lower than (or identical to) the reported power draw.
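To put rough numbers on that: if the sensor reports input power at the voltage regulators, then the heat generated in the GPU package is the input power times the VRM efficiency, and the rest is dissipated in the regulators themselves (still inside the same chassis). A quick sketch using the 45W reading from this thread and a typical 85-95% laptop VRM efficiency as placeholder assumptions:

```python
# Split a reported "Radeon High Side" reading into GPU-die heat vs VRM heat.
# The 45 W figure is the reading discussed in this thread; the efficiency
# values are typical guesses for laptop VRMs, not measurements.
reported_input_w = 45.0

for vrm_efficiency in (0.85, 0.90, 0.95):
    gpu_die_w = reported_input_w * vrm_efficiency   # heat in the GPU package
    vrm_loss_w = reported_input_w - gpu_die_w       # heat in the regulators
    print(f"eff={vrm_efficiency:.0%}: die ~{gpu_die_w:.1f} W, VRM ~{vrm_loss_w:.1f} W")
```

Either way the cooling system still has to move roughly the full reported wattage out of the chassis; the VRM efficiency only changes where along the board the heat shows up.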
 
Here you go. After four runs: highest frame rate 64, lowest frame rate 27.
 

Attachments

  • Screen Shot 2018-12-03 at 4.08.28 PM.png (49 KB)
  • Screen Shot 2018-12-03 at 4.11.19 PM.png (83.5 KB)
  • Screen Shot 2018-12-03 at 4.11.39 PM.png (185.9 KB)
Here you go. After four runs: highest frame rate 64, lowest frame rate 27.

Thanks again! This is the first time somebody has posted readable values. Even I eventually got caught up in the hype that this was going to be a 35W GPU, based on repeated claims that it runs cooler; I need to edit my posts where I admitted being wrong in my predictions. Dang it, I almost sold my 555X to get a better balanced laptop. So, realistically, we're looking at a 10-15% power efficiency improvement over Polaris, not 60%.

Did you use some third-party charger? Your system was tapping into the battery.

To answer the OP's question: Vega is hotter, by a lot.
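For anyone following along, the "power efficiency" figure here is just performance per watt, compared between the two cards. A back-of-the-envelope sketch; the benchmark scores and the 560X wattage below are made-up placeholders (only the ~45W Vega reading comes from this thread), so the point is the formula, not the exact result:

```python
# Performance-per-watt comparison. All inputs except the ~45 W Vega reading
# discussed in this thread are hypothetical placeholders.
vega_score, vega_watts = 127.0, 45.0        # placeholder score, reported draw
polaris_score, polaris_watts = 100.0, 40.0  # placeholder score and wattage

vega_perf_per_watt = vega_score / vega_watts
polaris_perf_per_watt = polaris_score / polaris_watts

improvement = vega_perf_per_watt / polaris_perf_per_watt - 1
print(f"perf/W improvement: {improvement:.0%}")   # ~13% with these placeholders
```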
 
You are most welcome. At idle, this computer is 13°C cooler than an identically configured 560X. Unfortunately, I did not run any benchmarks to measure load temperatures. With respect to your question, I am using the CalDigit TS3 Plus docking station.

Thanks again! This is the first time somebody has posted readable values. Even I eventually got caught up in the hype that this was going to be a 35W GPU, based on repeated claims that it runs cooler; I need to edit my posts where I admitted being wrong in my predictions. Dang it, I almost sold my 555X to get a better balanced laptop. So, realistically, we're looking at a 10-15% power efficiency improvement over Polaris, not 60%.

Did you use some third-party charger? Your system was tapping into the battery.

To answer the OP's question: Vega is hotter, by a lot.
 
Thanks again! This is the first time somebody has posted readable values. Even I eventually got caught up in the hype that this was going to be a 35W GPU, based on repeated claims that it runs cooler; I need to edit my posts where I admitted being wrong in my predictions. Dang it, I almost sold my 555X to get a better balanced laptop. So, realistically, we're looking at a 10-15% power efficiency improvement over Polaris, not 60%.

Did you use some third-party charger? Your system was tapping into the battery.

To answer the OP's question: Vega is hotter, by a lot.

Curious, Vega is more efficient at lower power, apparently. I wonder if the power draw isn't linear vs. workload. So if the card pulls 50W max, is it using 50% of its number-crunching ability at 25W, or is it 60% or 70%, with diminishing returns up to 100% power draw? If that is the case, the efficiency could be higher for normal workloads that aren't benchmarks. Just spitballing here, I have no clue how the architecture works.
 
Thanks again! This is the first time somebody has posted readable values. Even I eventually got caught up in the hype that this was going to be a 35W GPU, based on repeated claims that it runs cooler; I need to edit my posts where I admitted being wrong in my predictions. Dang it, I almost sold my 555X to get a better balanced laptop. So, realistically, we're looking at a 10-15% power efficiency improvement over Polaris, not 60%.

Did you use some third-party charger? Your system was tapping into the battery.

To answer the OP's question: Vega is hotter, by a lot.

I don't think anybody ever claimed a 60% power efficiency improvement; where did that come from?

I'm not sure why you thought it was a 35W GPU when the info has been out for a while that even the Vega 16 is 45W.

3:17 in
 
I don't think anybody ever claimed a 60% power efficiency improvement; where did that come from?

I think the Apple website says it’s 60% faster on some video software benchmark. Cinema 4D or something?
 
I don't think anybody ever claimed a 60% power efficiency improvement; where did that come from?

I'm not sure why you thought it was a 35W GPU when the info has been out for a while that even the Vega 16 is 45W.

Just go through the older posts, pretty much all the way from the announcement: 60% faster at 35W. Architecture, HBM, circuit optimization, etc. I was skeptical the whole time, but then reviews started coming in and everybody said it was cooler than the RX 560. So, if it is cooler, there is no way it can consume more power, right? AMD made a breakthrough and the video card market will never be the same. This dude with the Vega 16 was the first one who showed any kind of power-related information, but he shot it at a resolution low enough that I couldn't make out individual numbers from his screens. Maybe it was showing 45W but making up for it with a lower memory power requirement? So now I have my answers. Still no idea how anybody could have observed lower temperatures.

EDIT: There was also a suspicion that the 45W value returned by iStat Menus is not real.

Curious, Vega is more efficient at lower power, apparently. I wonder if the power draw isn't linear vs. workload. So if the card pulls 50W max, is it using 50% of its number-crunching ability at 25W, or is it 60% or 70%, with diminishing returns up to 100% power draw? If that is the case, the efficiency could be higher for normal workloads that aren't benchmarks. Just spitballing here, I have no clue how the architecture works.
The big Vega, 56/64, has a more or less linear performance/power ratio until about 1250 MHz. Above that it just takes off, and to get something like a 15% performance improvement you need to supply 50% more power. There was a nice analysis, I think on AnandTech.
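A minimal sketch of why the curve bends like that, using the standard dynamic-power rule of thumb P ≈ C·V²·f: once you are past the efficient part of the voltage/frequency curve, every extra clock step also needs more voltage, so power grows much faster than performance. The (frequency, voltage) pairs below are invented placeholders, not real Vega operating points:

```python
# Why perf/W falls off past the efficient clock range: P ~ C * V^2 * f.
# The (MHz, volts) pairs are invented placeholders, not actual Vega bins.
operating_points = [
    (1100, 0.85),
    (1250, 0.90),
    (1400, 1.05),   # past the "knee": voltage rises steeply with clock
]

base_f, base_v = operating_points[0]
base_power = base_v ** 2 * base_f

for f, v in operating_points:
    perf_gain = f / base_f - 1                 # performance ~ clock, to first order
    power_gain = (v ** 2 * f) / base_power - 1
    print(f"{f} MHz: about {perf_gain:+.0%} perf for {power_gain:+.0%} power")
```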
 
The big Vega, 56/64, has a more or less linear performance/power ratio until about 1250 MHz. Above that it just takes off, and to get something like a 15% performance improvement you need to supply 50% more power. There was a nice analysis, I think on AnandTech.

Interesting, good to know. So the real benefit is from the higher memory bandwidth, and that's about it.
 
13°C cooler at idle than my identically equipped 560X under the exact same test conditions.

I switched back to High Sierra and achieved an average 15° drop on an MBP i9 2.9 1TB 32GB Radeon 560X. Video memory pressure using an external screen (Acer XR382CQK) went from 90-100% down to a running figure of around 25%. It does still peak at around 35%, but never goes much above that now. On Mojave I could switch from extended to mirrored and it would drop to 21% and then start a steep climb back up.

I'm still using High Sierra now.
 