
akis-k
macrumors member · Original poster
Feb 4, 2010 · Greece
Yet another debate, I guess, but I am in search of a thin and light mobile workstation. I am a freelance architect with my own practice, and I am looking for a laptop to use at home and on the go (work never stops in my profession :)). I already have a desktop in my office. The programs I use are AutoCAD with a lot of 3D modeling (I need to get into BIM soon) and Cinema 4D with intensive rendering, so those two extra cores matter a lot, plus post-editing in Photoshop and some video editing in Premiere or Final Cut. Casual gaming here and there wouldn't be bad either; I am into strategy games (Civilization) as well as some FPS games (Battlefield), but I don't consider myself a hardcore gamer... So is the 560X enough for my needs, or do I get a vast boost in performance with the Vega 20?
P.S.1 I usually work on both platforms (Windows/macOS), so I will use Boot Camp.
P.S.2 I know that there are plenty of Windows laptops at a lower cost (XPS 15, Razer Blade, MSI GS65, Asus ZenBook 580), but I believe that nothing beats the quality and the aesthetics of a MacBook Pro.
 
Vega 20 will help a lot with 3D modeling. But if your life depends on AutoCAD, its macOS version is crippled compared to the Windows one. You say you're using both platforms, so make sure you compare both versions before making a decision.

And if I were you I'd wait a month for CES. Some benchmarks of new Nvidia mobile graphics leaked this week, and the card with power requirements similar to Vega 20 (i.e. one that can fit into a thin-and-light 15-incher) shows roughly 3x better OpenCL scores. From my own experience, maintaining two different systems on the same machine is pretty frustrating and eats into productivity, and no amount of aesthetic swag is going to compensate for it. If you need Windows for anything other than games, stick with a Windows laptop.
 
Thanks for the input! AutoCAD for Mac has changed a lot since the 2011 release, and it is now quite competitive with the Windows version. Apart from that, I already have an iMac 4K (2015 model, i7, M395X) at the office (alongside a PC), and I'm planning to buy an iPad Pro as well for building surveying on site, using tools like OrthoGraph Architect or MagicPlan. So I was thinking that, because of the overall interaction between iOS and macOS devices, I would have a smoother flow through the whole process, from day 1 on site till the last "tile" in the building...
 
but I believe that nothing beats the quality and the aesthetics of a MacBook Pro

Might wanna give the Surface Book 2 a try. It has one of the best keyboards and trackpads, insane battery life, a built-in tablet (so you don't have to spend extra $$$ on the iPad), a quad-core 8th-gen processor, and a GTX 1060 in the 15'' model, making rendering anything a breeze. My favorite laptop of all I've tried.

If you still want to stick with the Mac, then take the Vega; it will be noticeably faster for you. Don't save money on machines that make you money :D
 
Thanks for the input! AutoCAD for Mac has changed a lot since the 2011 release, and it is now quite competitive with the Windows version. Apart from that, I already have an iMac 4K (2015 model, i7, M395X) at the office (alongside a PC), and I'm planning to buy an iPad Pro as well for building surveying on site, using tools like OrthoGraph Architect or MagicPlan. So I was thinking that, because of the overall interaction between iOS and macOS devices, I would have a smoother flow through the whole process, from day 1 on site till the last "tile" in the building...
If you have an iMac then just try to use it exclusively for some time with Windows loaded as a VM and see if it works for you. Forget about Boot Camp for anything work related: the Mac and Windows partitions don't see each other, at least when they are encrypted. And the Boot Camp drivers are bad; the touchpad goes from the best in the business to the worst (I just disable it and use a mouse), and the dGPU is always active, draining the battery and running hot. I wouldn't pay much attention to iOS/macOS integration. Handoff is a nice feature but it is limited to Apple's native apps and some third-party apps, and I couldn't get it to work reliably every time, especially in the third-party ones. So it all ended up in cloud storage anyway, which is system agnostic.
 
Might wanna give the Surface Book 2 a try. It has one of the best keyboards and trackpads, insane battery life, a built-in tablet (so you don't have to spend extra $$$ on the iPad), a quad-core 8th-gen processor, and a GTX 1060 in the 15'' model, making rendering anything a breeze. My favorite laptop of all I've tried.

If you still want to stick with the Mac, then take the Vega; it will be noticeably faster for you. Don't save money on machines that make you money :D
You are absolutely right as far as the last sentence is concerned! :) The Surface Book is indeed a powerful machine with a great graphics card, but it lacks the two extra cores that the Intel H series has, and I need cores in C4D. More cores mean faster rendering times, which makes it easier to hit deadlines; job done, and then the money comes in... :D:D:D
 
So is the 560X enough for my needs, or do I get a vast boost in performance with the Vega 20?

From the usage you describe, Vega Pro 20, no discussion whatsoever.

And if I were you I'd wait a month for CES. Some benchmarks of new Nvidia mobile graphics leaked this week, and the card with power requirements similar to Vega 20 (i.e. one that can fit into a thin-and-light 15-incher) shows roughly 3x better OpenCL scores.

There are leaks of 2050? Where? Couldn't find any...

Might wanna give the Surface Book 2 a try. It has one of the best keyboards and trackpads, insane battery life, a built-in tablet (so you don't have to spend extra $$$ on the iPad), a quad-core 8th-gen processor, and a GTX 1060 in the 15'' model, making rendering anything a breeze. My favorite laptop of all I've tried.

Yeah, but all that is achieved a) by making the laptop unnecessarily bulky and b) by using a 15-watt CPU that will clock down the moment you put any kind of serious work on it. If you play games or rely solely on the GPU for work, the SB2 is a great option. If you also need CPU performance, not so much.
 
Nope, just this December's ones. Unlike AMD, Nvidia has a tendency to double the performance of its mobile SKUs between generations. Geekbench OpenCL scores: RTX 2080 Ti - 287k at mobile 1070 TDP, RTX 2070 Max-Q - 224k at probably the current 1050 Ti or 1060 TDP. Vega Pro 20 does 80k-85k.
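
Just to put a number on it, here's the advantage those leaked scores imply over Vega Pro 20; plain arithmetic on the figures above, and the scores themselves are still unverified leaks:

```python
# Ratio implied by the leaked Geekbench 4 OpenCL scores quoted above
# (unverified leaks, so treat the result accordingly).
rtx_2070_max_q = 224_000   # leaked score
vega_pro_20    = 82_500    # midpoint of the 80k-85k range quoted above

print(f"Implied advantage: {rtx_2070_max_q / vega_pro_20:.1f}x")  # -> ~2.7x
```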

What makes you think that the RTX 2070 Max-Q will be 50-60 watts? The Turing GPUs so far have higher TDPs than the corresponding Pascal cards, and the perf-per-watt increases are moderate at best. This makes me doubtful that a 50-watt Turing will perform that much better. But we will see next year.
 
Nope, just this December's ones. Unlike AMD, Nvidia has a tendency to double the performance of its mobile SKUs between generations. Geekbench OpenCL scores: RTX 2080 Ti - 287k at mobile 1070 TDP, RTX 2070 Max-Q - 224k at probably the current 1050 Ti or 1060 TDP. Vega Pro 20 does 80k-85k.

https://browser.geekbench.com/v4/compute/3312774
https://browser.geekbench.com/v4/compute/3329703
I'll believe those TDP figures when I see them.

Turing increased the power consumption of those GPUs, and yet people claim that the mobile versions will f***** magically use less power than their predecessors, and even than less complex, less thermally demanding GPUs with smaller die sizes.

Yeah, right.

What makes you think that the RTX 2070 Max-Q will be 50-60 watts? The Turing GPUs so far have higher TDPs than the corresponding Pascal cards, and the perf-per-watt increases are moderate at best. This makes me doubtful that a 50-watt Turing will perform that much better. But we will see next year.
Because Nvidia has such brand perception. The end.
 
What makes you think that the RTX 2070 Max-Q will be 50-60 watts? The Turing GPUs so far have higher TDPs than the corresponding Pascal cards, and the perf-per-watt increases are moderate at best. This makes me doubtful that a 50-watt Turing will perform that much better. But we will see next year.
Because Lenovo doesn't have anything in their current lineup higher than a 1050 Ti, and the maximum they ever got to was a 1060. And Nvidia always steps their models down with regard to power requirements.
I'll believe those TDP figures when I see them.

Turing increased the power consumption of those GPUs, and yet people claim that the mobile versions will f***** magically use less power than their predecessors, and even than less complex, less thermally demanding GPUs with smaller die sizes.

Yeah, right.

Because Nvidia has such brand perception. The end.

You're kidding, right? You were the one preaching a 35W Vega Pro 20 for two months and cursing nonbelievers, blaming it on brand perception. You almost got me convinced. Well, guess what, it didn't happen. And when comparing power consumption, compare SKUs with the same performance, not the same markings. RTX is more power efficient.

And I bet neither you nor leman ever looked at RTX power requirements or benchmarks. Because if you did, you would notice immediately that RTX has an unexplainable advantage in Geekbench OpenCL that doesn't correlate with the performance improvements in other tests. And the numbers look good, half of what the desktop does. It will happen; it's just most likely not going to be 3x in real life.
 
You're kidding, right? You were the one preaching a 35W Vega Pro 20 for two months and cursing nonbelievers, blaming it on brand perception. You almost got me convinced. Well, guess what, it didn't happen. And when comparing power consumption, compare SKUs with the same performance, not the same markings. RTX is more power efficient.
power_average.png

Yep. It is for sure more power efficient than Pascal.

The RTX 2070 uses 50W more than the GTX 1070, the RTX 2080 uses 50W more than the GTX 1080. Yeah. More power efficient.
 
Yep. It is for sure more power efficient than Pascal.

The RTX 2070 uses 50W more than the GTX 1070, the RTX 2080 uses 50W more than the GTX 1080. Yeah. More power efficient.

The word 'efficiency' requires two components: power and performance. The RTX 2070 is a lot faster than the 1070, the 2080 a lot faster than the 1080. Granted, the efficiency improvement is low by Nvidia standards. But it is there anyway; for example, in Geekbench OpenCL (did you read my previous post completely? just checking), the 2080 has a 412k score and the 1080 a meager 200k.
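
To put the 'two components' in numbers, here's a rough perf-per-watt sketch. The scores are the Geekbench figures above; the TDPs are approximate desktop board powers I'm assuming for illustration, not figures from this thread:

```python
# Rough perf-per-watt sketch: efficiency = performance / power.
# OpenCL scores are the ones quoted above; the TDPs are approximate
# desktop figures assumed here for illustration only.
cards = {
    "GTX 1080": {"opencl_score": 200_000, "tdp_w": 180},  # assumed ~180W
    "RTX 2080": {"opencl_score": 412_000, "tdp_w": 215},  # assumed ~215W
}

for name, c in cards.items():
    print(f"{name}: {c['opencl_score'] / c['tdp_w']:.0f} points per watt")
# -> GTX 1080 ~1111 pts/W, RTX 2080 ~1916 pts/W
```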

It's really funny how you pulled a 180 just because the talk is about an AMD competitor, and suddenly unsubstantiated claims based on a single leaked benchmark become irrelevant. I think I'll start digging up your old enthusiastic posts explaining the 35W Vega 20 and copying them here word for word as responses.
 
The word 'efficiency' requires two components: power and performance. The RTX 2070 is a lot faster than the 1070, the 2080 a lot faster than the 1080. Granted, the efficiency improvement is low by Nvidia standards. But it is there anyway; for example, in Geekbench OpenCL (did you read my previous post completely? just checking), the 2080 has a 412k score and the 1080 a meager 200k.

It's really funny how you pulled a 180 just because the talk is about an AMD competitor, and suddenly unsubstantiated claims based on a single leaked benchmark become irrelevant. I think I'll start digging up your old enthusiastic posts explaining the 35W Vega 20 and copying them here word for word as responses.
I think you clearly have no idea what you are talking about.

Let me quote your post:
Thysanoptera said:
RTX 2070 Max-Q - 224k at probably the current 1050 Ti or 1060 TDP
I showed you directly that it is impossible for a 445 mm² die to consume less than 60-75W as you claimed, because it already uses more power than the GTX 1070 ever used on desktop. How, in your mind, can you fit a 445 mm², 256-bit memory bus GPU with a 195W power draw into 60W without clocking it at 500 MHz? The memory ITSELF will use half of your power budget.

In laptops, efficiency means total power draw. The RTX 2070 WILL NOT EVER USE LESS POWER THAN THE GTX 1060 Max-Q. At best, expect a 100W TDP with relatively medium boost clocks.

P.S. Tell me, how much more power does the Vega Pro 20 use than the RP 560X as a whole, GPU + memory?
 
I think you clearly have no idea what you are talking about.

You should keep "no idea what you're talking about" in your clipboard; you use it so often it will speed up your postings.

I showed you directly that it is impossible for a 445 mm² die to consume less than 60-75W as you claimed, because it already uses more power than the GTX 1070 ever used on desktop. How, in your mind, can you fit a 445 mm², 256-bit memory bus GPU with a 195W power draw into 60W without clocking it at 500 MHz? The memory ITSELF will use half of your power budget.

In laptops, efficiency means total power draw. The RTX 2070 WILL NOT EVER USE LESS POWER THAN THE GTX 1060 Max-Q. At best, expect a 100W TDP with relatively medium boost clocks.

"Have you ever thought that Vega RTX GPU in MBP Lenovo will have 3560W Power Limit? The Vega RTX GPU you are talking about, has very little to do with Vega RTX which will land in MBP Lenovo. It is the same architecture. But apart from that - it is different chip"
"GPU will not exceed 3560W Power Limit state, because that is its power limit." "What I have written, about Vega RTX GPU is not an opinion. I am not speculating about this chip. That is the difference."
P.S. Tell me, how much more power does the Vega Pro 20 use than the RP 560X as a whole, GPU + memory?
It's me again - that's easy. 21W. Just be a man and admit you were wrong.
 
It's me again - that's easy. 21W. Just be a man and admit you were wrong.

Show me the data. 60W? Where did you get this number?

Edit: I see now. The dynamic TDP power limit is up to 50W. Can you tell me what the static TDP of the GPU is when both the CPU and GPU are loaded? ;)

The only thing Apple changed is adding a dynamic, shared TDP power limit to the power states; at least that is what it appears to be at first glance. I would have to get a new MBP, or get the BIOS of the system from that computer, to confirm or deny this. The static TDP power limit is still 30W for Vega, as it was for the Radeon Pro 560X.

I actually did not expect that. I'm not sure it is a good thing, though.
 
Show me the data. 60W? Where did you get this number?
rxvega.jpg

That is Heaven run 4 times, with the screenshot taken at the end of the last run. Unfortunately there was only one person in the whole forum with a Vega 20 willing to do it, and you have to compare it to my 555X; just add 5W to get the 560X number. Take the system total, subtract the CPU stuff, and, all other things being equal, you will get the difference between GPU + memory.
 
So it is a 50W TDP. You still have to add the memory power on the 555X, not the GPU alone. The difference will be 5-6W of power, more for Vega 20. Vega 20 is counted as the whole package, GPU + HBM2, and it is powered through one high side. For the Radeon Pro 555X you have two high sides: one for the GPU and one for the GDDR5. And a 128-bit memory bus consumes 12-16W, depending on clocks and voltage.
 
So it is a 50W TDP. You still have to add the memory power on the 555X, not the GPU alone. The difference will be 5-6W of power, more for Vega 20. Vega 20 is counted as the whole package, GPU + HBM2, and it is powered through one high side. For the Radeon Pro 555X you have two high sides: one for the GPU and one for the GDDR5. And a 128-bit memory bus consumes 12-16W, depending on clocks and voltage.
Your math is wrong; my 21W number was without using the Radeon high side at all. If you add 16W to the 555X you'll get a perpetuum mobile. You'd rather propose that the 555X breaks fundamental laws of physics than admit you were wrong. Just look at the system total: the system with Vega needs 33W more. Why?
 
Your math is wrong; my 21W number was without using the Radeon high side at all. If you add 16W to the 555X you'll get a perpetuum mobile. You'd rather propose that the 555X breaks fundamental laws of physics than admit you were wrong. Just look at the system total: the system with Vega needs 33W more. Why?
There can be a lot of reasons for this. Again, you appear not to understand that you are talking about high-side power.

Tell me then: why is a system built from a Ryzen 7 1800X with a Gigabyte B450i Aorus WiFi Pro drawing 5W less under load than a Core i7 8700 + MSI Z370 PC Pro system, despite the fact that the CPU package power shows the Ryzen 7 drawing 22W more power than the Core i7 under load?

Because you have to account for VRM and power delivery design inefficiencies. I asked you a question in another thread: how do you know that the power delivery design is the same on the Vega laptops? It is not, because you have a dynamic, shared TDP between CPU and GPU, according to those MacBook users.

P.S. The screenshot of your computer actually proves that the power draw of the memory in the MBP is equal to 12W.
 
There can be a lot of reasons for this. Again, you appear not to understand that you are talking about high-side power.

Tell me then: why is a system built from a Ryzen 7 1800X with a Gigabyte B450i Aorus WiFi Pro drawing 5W less under load than a Core i7 8700 + MSI Z370 PC Pro system, despite the fact that the CPU package power shows the Ryzen 7 drawing 22W more power than the Core i7 under load?

Because you have to account for VRM and power delivery design inefficiencies. I asked you a question in another thread: how do you know that the power delivery design is the same on the Vega laptops? It is not, because you have a dynamic, shared TDP between CPU and GPU, according to those MacBook users.

P.S. The screenshot of your computer actually proves that the power draw of the memory in the MBP is equal to 12W.
I don't care what the Radeon high side is. I'm not looking at it. Why is the system total 33W higher? Why are the temps 30 degrees higher? So it's the VRMs that are so messed up in Vega that they need 33W more to supply the same power to the GPU? You can't possibly be serious.
 
I don't care what the Radeon high side is. I'm not looking at it. Why is the system total 33W higher? Why are the temps 30 degrees higher? So it's the VRMs that are so messed up in Vega that they need 33W more to supply the same power to the GPU? You can't possibly be serious.
The temperature reported on your laptop is for the Intel iGPU, and there is no report for the Radeon GPU proximity temps. That is why there "is" a 30-degree difference in temps.

On your laptop: battery 0.09W, CPU computing high side 8.07W, Radeon high side 30.75W, total 36W. System power consumption: 47.15W, DC in: 48.12W.

11W is missing.

On the other one: battery current 6.81W, CPU computing high side 15.1W, Radeon high side 51.9W, total 74W. System total: 80W, and DC in: 78W. What is the difference?
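
For reference, the deltas those two readouts imply; this is just arithmetic on the numbers listed above:

```python
# Deltas between the two sensor readouts listed above (all figures in watts).
rp555x = {"cpu_highside": 8.07, "radeon_highside": 30.75, "system_total": 47.15}
vega20 = {"cpu_highside": 15.1, "radeon_highside": 51.9,  "system_total": 80.0}

system_delta = vega20["system_total"] - rp555x["system_total"]
non_cpu_delta = (vega20["system_total"] - vega20["cpu_highside"]) - \
                (rp555x["system_total"] - rp555x["cpu_highside"])
radeon_delta = vega20["radeon_highside"] - rp555x["radeon_highside"]

print(f"System total delta:         {system_delta:.1f} W")   # ~32.9 W
print(f"Delta not explained by CPU: {non_cpu_delta:.1f} W")  # ~25.8 W
print(f"Radeon high-side delta:     {radeon_delta:.1f} W")   # ~21.2 W
```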

Why is the GPU power draw higher? Simple reason: a dynamic power limit shared between CPU and GPU, which does not occur in Polaris-based Radeon Pro GPUs.

Why is that? My theory is that it is because you have a separate VRM for the GDDR5 on the Polaris-based Radeon Pro, and you cannot control power across both VRMs. With Vega you have a single VRM and you can control both, because it is one system-on-package.

The static TDP for the GPU, when both CPU and GPU are simultaneously loaded, appears to be 30-35W. Which fits perfectly with what I was claiming.

We can settle this debate. I suggest someone install Windows 10 on their MBP, download MSI Afterburner, and in its settings set the server to report the power draw while gaming. That way we can observe in real time how the power draw is balanced between the CPU and GPU.
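
A minimal sketch of how I'd eyeball that once there's a log; this assumes the monitoring data has been exported to a simple CSV with per-sample CPU and GPU power columns (the column names below are placeholders, not MSI Afterburner's actual log format):

```python
# Minimal sketch for checking CPU/GPU power balancing from a monitoring log.
# Assumes a CSV export with "cpu_package_w" and "gpu_power_w" columns --
# placeholder names, adjust to whatever your monitoring tool actually writes.
import csv

def summarize(path):
    cpu, gpu = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            cpu.append(float(row["cpu_package_w"]))
            gpu.append(float(row["gpu_power_w"]))
    combined = [c + g for c, g in zip(cpu, gpu)]
    print(f"CPU avg: {sum(cpu)/len(cpu):.1f} W, GPU avg: {sum(gpu)/len(gpu):.1f} W, "
          f"combined peak: {max(combined):.1f} W")

summarize("gaming_session.csv")  # hypothetical log file name
```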
 
Nicely done.

If you have an iMac then just try to use it exclusively for some time with Windows loaded as a VM and see if it works for you. Forget about Boot Camp for anything work related: the Mac and Windows partitions don't see each other, at least when they are encrypted. And the Boot Camp drivers are bad; the touchpad goes from the best in the business to the worst (I just disable it and use a mouse), and the dGPU is always active, draining the battery and running hot. I wouldn't pay much attention to iOS/macOS integration. Handoff is a nice feature but it is limited to Apple's native apps and some third-party apps, and I couldn't get it to work reliably every time, especially in the third-party ones. So it all ended up in cloud storage anyway, which is system agnostic.
 
The temperature reported on your laptop is for the Intel iGPU, and there is no report for the Radeon GPU proximity temps. That is why there "is" a 30-degree difference in temps.
No, this is the Radeon proximity temp. I have the Apple Store version; he has the one from the web. The fact that the CPU is 25 degrees higher didn't ring a bell? The fin stack temps, no, nothing? The fact that there is another, separate iGPU temp which is over 20°C higher on Vega also doesn't mean anything? That every bloody sensor shows a 20+ degree difference?
Why is the GPU power draw higher? Simple reason: a dynamic power limit shared between CPU and GPU, which does not occur in Polaris-based Radeon Pro GPUs.

Why is that? My theory is that it is because you have a separate VRM for the GDDR5 on the Polaris-based Radeon Pro, and you cannot control power across both VRMs. With Vega you have a single VRM and you can control both, because it is one system-on-package.
So we went from perpetuum mobile, through 50%-efficiency VRMs, to a dynamic power limit shared between CPU and GPU, where an algorithm set by the OEM, not a physical component, is responsible for the missing 26 watts between those two machines. And which, by the way, is present on the Polaris-equipped MBP too, and on every notebook that has a dGPU... And as for the single-VRM thing: count them on pictures of the internals. The number is the same; the chokes are just bigger on Vega because they have to deal with much higher current.
The static TDP for the GPU, when both CPU and GPU are simultaneously loaded, appears to be 30-35W. Which fits perfectly with what I was claiming.
LOL. So the TDP of a GPU is only measured when the CPU is simultaneously loaded, and we call it static TDP. Poor manufacturers, how do they set comparable test conditions? Do you make this stuff up as you go, or do you have a list? I don't even... What is going to happen when the CPU package is, oh, I don't know, 10W? Now this goes up to 50W and we call it dynamic TDP, and it is totally not your 35W claim. And on Polaris, dynamic TDP = static TDP. There, I have an explanation for you.
We can settle this debate.
There was never a debate. My 4-year-old can see it, my dog can see it, but it's a border collie so it sets a pretty high standard. I'm 100% sure you see it too, so the only logical explanation I have is that you spread BS for the sake of spreading BS. I was fully expecting you to be just an over-optimistic AMD enthusiast who was going to say 'sorry, I was wrong'. But you have a different agenda, which is OK, I just need to account for that. I'm done; discussing this with you and all the fantasy arguments is pointless. I will react only if somebody else is going to fall for your stories. Oh, btw, all the sensors are locked on the Windows side.
Nicely done.
Thanks again for providing the Vega 20 sensor numbers. You're still the only one who did it.
 