You can turn the demos off in 3DMark, but for some reason identifying the system was painfully slow. I'm not sure why; it wasn't that bad for the Razer.

The time it took for the benchmark to identify my system was as long as the actual benchmark.
Yeah, that's my thinking. Maybe it couldn't fully identify what the Radeon Pro 560X is.
Looks like for older versions there are modified drivers that allow Wattman to show up; with those you should be able to set the GPU clock speeds to what you need. I don't know about any fan-control software for Boot Camp, and for now this doesn't work on the 560X/555X anyway.

@Thysanoptera Thanks for taking all those tests! I wonder if I could somehow manually control the fans on the Boot Camp side so that I could take full advantage of both the GPU and the CPU. Maybe I could thermally limit the CPU to 40 W to make sure the GPU has all the headroom it needs.
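Not from the thread, but on the Boot Camp side the usual tools for reining in the CPU are ThrottleStop or Intel XTU, and Macs Fan Control also ships a Windows build for fan speed. A cruder, built-in option is Windows' own power-plan limit on maximum processor state. Below is a minimal sketch of that approach, assuming an elevated prompt; it caps the top P-state as a stand-in for a literal 40 W limit, and the 80% figure is only an illustrative starting point.

```python
# Hedged sketch: cap the CPU's "maximum processor state" under Windows/Boot Camp
# via powercfg. This is NOT a direct 40 W limit -- it only restricts the highest
# processor state, which in practice reduces package power and heat.
# Run from an elevated (administrator) prompt.
import subprocess

def set_max_processor_state(percent: int) -> None:
    """Limit the maximum processor state (on AC power) to `percent` for the active plan."""
    subprocess.run(
        ["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
         "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(percent)],
        check=True,
    )
    # Re-apply the current scheme so the new value takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

if __name__ == "__main__":
    set_max_processor_state(80)  # illustrative value -- tune while watching temps
```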
Can someone make a thermal (temps) comparison between the 555X and the 560X? Thanks.
I'd appreciate a temp comparison between the two too, please, if possible?
That may be more complicated. I guess we could use iStat Menus on the Mac side to capture the 'GPU proximity' sensor (Valley and Heaven work on the Mac) while running benchmarks. On Windows, 3DMark saves logs for temperature, frequency, and utilization, but if you look at the screenshots they don't look realistic at all (spikes to 120 C, and flat in general), except for FPS. I think the driver doesn't expose that information yet and 3DMark cheats by calculating the values instead of reading the sensor.
For what it's worth, these are screenshots after Heaven benchmark runs on the Extreme preset; I had to run it four consecutive times to get the fans and temps to stabilize. After the first run the fans barely moved from their idle settings. The screenshots were taken near the end of the benchmark, around the 255-second mark. Someone with a 560X can easily replicate this.
View attachment 775463 View attachment 775464
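For anyone doing the Mac-side capture described above without eyeballing iStat Menus at the 255-second mark: if the sensor readings are exported or logged to a CSV, a few lines of Python can pull out the average, the peak, and the reading nearest a chosen timestamp. This is only a sketch; the column names "elapsed_s" and "gpu_proximity_c" are hypothetical and should be changed to whatever your logger actually writes.

```python
# Hedged sketch: summarize a GPU-temperature log exported as CSV.
import csv

def summarize(path: str, mark_s: float = 255.0) -> None:
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Hypothetical column names -- adjust to your export.
            rows.append((float(row["elapsed_s"]), float(row["gpu_proximity_c"])))
    temps = [t for _, t in rows]
    # Reading closest to the chosen mark (the thread screenshots at ~255 s).
    at_mark = min(rows, key=lambda r: abs(r[0] - mark_s))
    print(f"samples: {len(rows)}")
    print(f"avg GPU proximity: {sum(temps) / len(temps):.1f} C")
    print(f"max GPU proximity: {max(temps):.1f} C")
    print(f"at ~{mark_s:.0f} s: {at_mark[1]:.1f} C")

if __name__ == "__main__":
    summarize("gpu_sensor_log.csv")
```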
@Thysanoptera
Attempted to recreate the same procedure as above: ran it on Extreme four times and took a screenshot at 255 seconds into the last run. This was without my external monitor attached, so it shouldn't be any more stressed than normal.
It has higher temperatures for sure, though it does look to eke out a bit more performance at the cost of those higher temps. 4% higher temperature for 17% higher performance is not the worst tradeoff in the world.
The temps are a function of fan RPM, and it looks like dissipating a mere 4 W of additional power takes about 1,000 RPM more. Looks like they're trying to hold around 70 C. To get an idea of the actual temperature difference you'd need to run at the same fan RPM. Give me a sec, I'm just curious now.
I've set the fans to 4626 and 4263 RPM in Macs Fan Control and run the benchmark four times. Ended up with 59 C on the GPU and CPU cores around 60 C. The 560X was at 72 C on the GPU and 74 C on the CPU cores. So that's actually a 22% temp increase for the 17% gain, at least in this benchmark, but it gives a general idea. I actually like the fans at those settings; they're not too loud and the keyboard stays cool to the touch. I think Apple is too skittish with its fan curves.
View attachment 775489
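For clarity, the percentages being traded back and forth above are plain relative increases, treating the Celsius readings as a simple ratio the way the posts do. With the matched-RPM numbers from the post (59 C vs 72 C on the GPU), the arithmetic looks like this; the 17% performance delta comes from the Heaven screenshots and is taken at face value rather than recomputed here.

```python
# Relative increase, as used informally in the posts above.
def pct_increase(baseline: float, value: float) -> float:
    return (value - baseline) / baseline * 100.0

print(f"GPU temp increase at matched fan RPM: {pct_increase(59, 72):.0f}%")  # ~22%
print("Performance gain quoted from Heaven: 17%")  # from the screenshots, not recomputed
```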
Does anybody else want the 2.6 but just have the base model? I can't shake the feeling that I'm somehow supposed to have it, but this performance ceiling on the 2.2 is pretty interesting. I'm also coming from the 3.5 GHz i7 13" 2017. It's just unreal how much faster this is, huh?
My usage isn't that intensive, but I like s*** to fly when I ask it to. Is your workflow CPU intensive?

@Thysanoptera Thanks for the secondary round of testing to actually confirm the difference. I was a little surprised to see such a low temperature difference the first time around; this honestly makes more sense than just a 4% difference.
I actually had the 2.6 model for a week or so! I found myself hitting the thermal limit of the chassis a lot quicker than on my current 2.2 GHz model. On average, the 2.2 GHz model seems to run about 3-5 C cooler at any given task. The one nice thing with the 2.6 model was that I could consistently hit 1000+ in Cinebench. Plus, I felt that if the 2.2 GHz was going to run cooler, then that would give me more headroom when playing games with the 560X.
Yeah, I notice a difference in my day to day. For some reason my 2017 laptop began slowing down pretty dramatically, and battery life tanked to 3 hours, lmao. I took it to the Apple Store, who said everything was normal, so what can you do. The same workflow on this laptop easily gets me 8 hours compared to the 3 I was getting before. That upgrade in itself made it worth the price, tbh.
Out of interest, what games have you tried on the 560X?

I personally haven't really gamed much on the computer; too many things happening for me to settle down and play games. Maybe in a week or so.
@DannEboE @Thysanoptera Do you guys notice a big difference in temps when using an external monitor? I only just learned that the dGPU drives any external displays you connect, so obviously that would cause it to heat up.
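One hedged way to confirm this on the macOS side (not something the thread mentions): `system_profiler SPDisplaysDataType` lists each GPU together with the displays attached to it, so you can see directly whether an external monitor is hanging off the Radeon. A minimal sketch:

```python
# Print the display/GPU report; each display appears nested under the GPU
# that is actually driving it (external displays on these MacBook Pros
# are expected to show up under the Radeon dGPU).
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout
print(report)
```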