Doing anything that relies on direct 3D acceleration (e.g. 3D modelling, CAD, etc.), employing workflows that use the GPU for number-crunching (a lot of scientific applications), training ML models, processing video (if your software is good at GPU acceleration, e.g. Final Cut Pro X).

And of course, playing games ;)
I have experience in all those fields, and mind you, in low level GPU programming as well (both for 3D graphics and compute). The only one where I can agree that you get a clear 50% benefit is games.

For the other workflows... the majority of the time is going to be spent setting up the model. That wouldn't necessarily be true for a desktop, but a laptop you wouldn't want to use for 8+ hour compute sessions anyway. For an 8 hour work day, I bet it's 1 hour GPU time (at best) and 7 hours non-GPU time. So out of 8 hours you save 20 minutes. That's not 50% in my book.
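To put rough numbers on that, here's a back-of-the-envelope sketch using the assumptions above (1 GPU-bound hour out of 8, a GPU that is 50% faster); nothing here is measured:

```python
# Amdahl-style estimate using the assumptions above; purely illustrative.
def overall_speedup(gpu_fraction, gpu_speedup):
    """Overall speedup when only gpu_fraction of the time benefits from the GPU."""
    return 1.0 / ((1.0 - gpu_fraction) + gpu_fraction / gpu_speedup)

day_minutes = 8 * 60
gpu_fraction = 1 / 8                      # assumed: 1 hour of GPU time per 8-hour day
speedup = overall_speedup(gpu_fraction, 1.5)
saved = day_minutes * (1 - 1 / speedup)
print(f"overall speedup {speedup:.2f}x, about {saved:.0f} minutes saved per day")
```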

Obviously there are exceptions. But it's got to be a case where you're prepared to pay $550 for $200 worth of performance, and it's somehow better than choosing an eGPU for example. For those who have that case, they already know and don't need to ask.
In ML you will see much more than 50% performance increase over the 560X.
Theoretically quite a bit more. For real-world performance, who knows? I wish reviewers would run some ML benchmarks instead of just rerunning BruceX in FCPX or whatever.
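For what it's worth, the kind of benchmark I mean is trivial to script. A minimal sketch, assuming a working Keras install with whatever GPU backend the machine supports (e.g. PlaidML on AMD); the model and data are arbitrary toy choices:

```python
import time
import numpy as np
import keras
from keras import layers

# Arbitrary toy data and model; the only point is to time a few training epochs.
x = np.random.rand(2048, 32, 32, 3).astype("float32")
y = np.random.randint(0, 10, size=2048)

model = keras.Sequential([
    layers.Conv2D(64, 3, activation="relu", input_shape=(32, 32, 3)),
    layers.Conv2D(64, 3, activation="relu"),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

start = time.time()
model.fit(x, y, batch_size=64, epochs=3, verbose=0)
print(f"3 training epochs: {time.time() - start:.1f}s")
```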
 
If it helps with timeline smoothness in FCPX I am happy to buy one of the Vegas. As soon as I throw heavy effects on my footage, that is the thing that bothers me most on my 2.2 i7 / 560X.
 
I have experience in all those fields, and mind you, in low level GPU programming as well (both for 3D graphics and compute). The only one where I can agree that you get a clear 50% benefit is games.

For the other workflows... the majority of the time is going to be spent setting up the model. That wouldn't necessarily be true for a desktop, but a laptop you wouldn't want to use for 8+ hour compute sessions anyway. For an 8 hour work day, I bet it's 1 hour GPU time (at best) and 7 hours non-GPU time. So out of 8 hours you save 20 minutes. That's not 50% in my book.

Obviously there are exceptions. But it's got to be a case where you're prepared to pay $550 for $200 worth of performance, and it's somehow better than choosing an eGPU for example. For those who have that case, they already know and don't need to ask.
Theoretically quite a bit more. For real-world performance, who knows? I wish reviewers would run some ML benchmarks instead of just rerunning BruceX in FCPX or whatever.

You obviously don't know enough. In any video editing software the video card comes into play heavily to allow smooth playback, editing, timeline movement, etc. The video below shows how badly the Mac mini handles this on its integrated graphics. Games will definitely show the difference no matter what. But for you to say it's very little difference is flat out wrong.

 


There are hundreds of video editing apps out there. I can bet you that the majority of them do not use GPU acceleration at all.

But maybe you have specific software in mind? FCPX? I don't use it, so I'm not saying anything about how it performs. I do understand that it's heavily optimised using GPU code, so obviously any GPU is going to give you a more pleasant experience. But the figure here was 50%. Is a Vega GPU going to make the editing process 50% faster from start to finish? Well, wait for benchmarks. Subjectively, then, will you have a 50% more pleasant experience with a Vega than with a 560? Wait for benchmarks.

But I don't even know what you're arguing here. I've said many times that the Vega is a great architecture and a great chip. I have a couple myself, so I have a fair idea of how it performs. But to put it in a laptop, and then pay $550 for it -- it can be worth it for a minority, but for the majority of users who are uncertain about the upgrade, it's just not going to deliver to their expectations. Maybe video editing is one of those minority use cases that benefit more than enough to justify the cost. But it's going to depend on the software, and it's going to depend on the video formats used, and it's going to depend on the workflow. It's a very thin slice of the user base, and you're doing yourself no favours by pretending that what's true for that highly specific case is generally true for everyone. It's not.
 
In ML you will see much more than 50% performance increase over the 560X.
Except that non-CUDA alternatives for ML aren't really yet at a level that is suitable for professional workflows. There are a couple of good options, yes (PlaidML comes to mind), but they're limited in the number of frameworks they support. So even this advantage might remain theoretical for most people.

I work in the field and know literally no one who uses AMD cards in practice. Everyone at the office complains about the difficulty of using the Mac for ML/AI. Hardware power is there with AMD, it’s the software and library support that is the problem.
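For anyone curious, this is roughly what the PlaidML route looks like as a drop-in Keras backend on an AMD GPU. A hedged sketch; the exact setup can differ between PlaidML versions:

```python
# PlaidML as a Keras backend (run `plaidml-setup` once to pick the device).
import plaidml.keras
plaidml.keras.install_backend()   # must run before importing keras

import keras
from keras import layers

model = keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=(784,)),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
# model.fit(...) now executes on the device selected with plaidml-setup.
```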
 

But the AMD Pro series is great for CAD/CAM and 3D rendering, and much cheaper than NVIDIA Quadro.
 
Sure, AMD is great for some stuff. I’m not saying that everyone should prefer Nvidia, on the contrary. Just pointing out that the ML niche doesn’t have much choice at the moment (and btw nobody really likes this situation, monopolies are always bad and imho it’s crazy that an entire scientific field has to depend on a specific hardware vendor).
 
What program are you using that exceeds 16GB?

I work in the creative industry, so basically FCPX and Adobe CC. I usually can't have more than 2-3 programs open without some noticeable stuttering, or crashing on rare occasions. My Mac Pro with 64GB doesn't usually have this issue.
 
True, but it would probably take another year until the production ramps up to satisfy the demand.

First: you seem to forget that the enclosure costs the same or more. Second: not everybody would want to carry an additional box around with them. Third: did you just call a 580 and Vega 56 "lower end GPUs"??

Of course it's worth it. Around 50% more performance for an extra $400? It beats any other upgrade you can get on the MBP...

I mean it's better value. At the end of the day, even at 50% more power they are still around 50% slower than a Vega 56. With a £200 enclosure you can swap that GPU out for a new one down the line. Anyway, we have no actual numbers to support anything, but I very much doubt it will give a 50% increase because of the constraints of the form factor. The fact is the current MacBook Pro is a compromise in all respects, yet it's one of the most expensive portable machines you can buy; that's the issue. Will you actually see the possible gains?

It's all well and good adding these components, but the processors in this machine run slower than in the majority of other machines with the same parts because of the form factor. What's to say that spending another £400 will get you 50% more performance in real life? I doubt it. Apple are just taking the piss.

At the end of the day FCP works fine with the 560X: you can edit smoothly on the go and attach an eGPU when you get home to render. Either way, if you are outputting anything longer than 10 minutes it's worth doing at a desk; otherwise we're talking about a few minutes of difference, not hours.

The other elephant in the room nobody seems to mention is the cost of these things. Even as a professional, these machines are getting more and more expensive, and it's more of a ball ache to justify every iteration, meaning I tend to push mine further and further in terms of lifespan.

A modern middle-of-the-range config, for example: 2.6GHz i7, 1TB, Vega 16, 32GB is £3,644, then £399 for AppleCare... £4,043.

In all honesty you're better off buying the minimum you can get away with in the field and having a more powerful, capable desktop at home. The 2017 i7 iMac, even with the quad core, outperforms the i9 in almost every task. The benchmarks may say otherwise, but when you actually set these off for an extended period the real-life usage is completely different.

The actual performance differences between these processors in this form factor are pretty much insignificant, yet in a properly cooled system the differences are much larger.

It just annoys me: pay 30% more, get 75-80% of the performance.
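Using those rough ratios (my assumptions, not benchmarks), the value works out something like this:

```python
# Rough ratios assumed above: ~30% higher price for ~75-80% of the performance
# of a properly cooled system. Purely illustrative, not a benchmark.
price_ratio = 1.30
perf_ratio = 0.775
print(f"performance per pound vs the baseline: {perf_ratio / price_ratio:.2f}x")
```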

After all the issues with this 2016-2018 generation of MBP, they should be replacing the current GPUs with these, not adding another £400 option.
I work in the creative industry, so basically FCPX and Adobe CC. I usually can't have more than 2-3 programs open without some noticeable stuttering, or crashing on rare occasions. My Mac Pro with 64GB doesn't usually have this issue.

+1. I had Lightroom, InDesign and Safari open and I was hitting 22GB usage.

TBF macOS is pretty good at memory caching, and with the speed of the SSDs I doubt most would notice. If your system does have plenty of memory, the OS will use it.
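If you want to see what's actually still available rather than just "used", here's a quick sketch (psutil is my assumption here; any similar tool works):

```python
# Show how much memory is genuinely available versus merely "used" by caches.
# psutil is an assumed third-party dependency (pip install psutil).
import psutil

vm = psutil.virtual_memory()
gib = 1024 ** 3
print(f"total:     {vm.total / gib:5.1f} GiB")
print(f"used:      {vm.used / gib:5.1f} GiB")
print(f"available: {vm.available / gib:5.1f} GiB  ({100 - vm.percent:.0f}% free for new work)")
```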
 
I think we will have to see the benchmarks, but if it does make gaming a lot more playable it might become an attractive upgrade for people who like to play games. A lot of games which aren't that demanding still struggle to stay above 60FPS throughout.
 
Yes, it would be nice for games... BUT only if Apple works on the internals, because putting an even more powerful GPU, still on 14nm, inside the same cooling is not good in the long run.
The i9+560X combo is already too much.
 
Workloads for the upcoming 3-4 years will not outgrow 16 GB of RAM, and if you are a gamer, you are still perfectly fine with 8 GB of RAM.

16 GB is plenty right now, and will be for at least the upcoming 3 years. At least if you use Windows, which has much better memory management than macOS.
Some games are pushing 10-12 GiB of RAM.

A basic VM + Firefox will eat up 16 GiB of RAM.
 
Yes, it would be nice for games... BUT only if Apple works on the internals, because putting an even more powerful GPU, still on 14nm, inside the same cooling is not good in the long run.
The i9+560X combo is already too much.
And how come a 35W power limit on one GPU is different from a 35W power limit on the other...?
 
35W that can go up to 47W... you know that even the CPU, if you don't use a program to limit it, can also go beyond the "standard" 45W.
But it doesn't matter; with my own hands I felt/saw the differences between the i9+560X and the base i7+555X in heat and throttling.
 
The GPU will not exceed its 35W power limit state, because that is its power limit.
 
Like the CPU limit should be 45W? And that's why under heavy load the i9 reaches even 53-56W?
You will see... Vega 20 with the i9 will be a party for HEAT in the current design. I hope Apple redesigns the inside a little to sustain that, but I doubt it.
 
That's because the Core i9 has that power limit state set in its BIOS, in the EFI of the Apple computer.

GPUs in Apple computers usually have their maximum power limit set in their BIOS, and that is the case in the MBP. It HAS to be this way.

It amazes me that people are not able to put two and two together and see that it's the CPU which causes the power/thermal problems in MBPs, not the GPUs...
 
Sure, AMD is great for some stuff. I’m not saying that everyone should prefer Nvidia, on the contrary. Just pointing out that the ML niche doesn’t have much choice at the moment (and btw nobody really likes this situation, monopolies are always bad and imho it’s crazy that an entire scientific field has to depend on a specific hardware vendor).
Sadly, by deprecating OpenCL, Apple are actively worsening this situation.
In all honesty you're better off buying the minimum you can get away with in the field and having a more powerful, capable desktop at home. The 2017 i7 iMac, even with the quad core, outperforms the i9 in almost every task. The benchmarks may say otherwise, but when you actually set these off for an extended period the real-life usage is completely different.
Agree completely. It's sad that Apple are effectively pricing themselves out, when much of what they're putting into these laptops is actually decent hardware. With current prices, the best value is to get the minimum spec you can get away with, and then either add external devices or just get a desktop computer. You usually get quite a lot for the money you save. It seems completely opposite to what Apple is about, to make things simple and convenient.
 
That's because the Core i9 has that power limit state set in its BIOS, in the EFI of the Apple computer.

GPUs in Apple computers usually have their maximum power limit set in their BIOS, and that is the case in the MBP. It HAS to be this way.

It amazes me that people are not able to put two and two together and see that it's the CPU which causes the power/thermal problems in MBPs, not the GPUs...
The Core i9 has this power limit state set in its BIOS, in the EFI of the Apple computer - and that's why it draws up to 80W.
GPUs in Apple computers have their maximum power limit set in their BIOS, and that is the case in the MBP, it HAS to be this way - and that's why the 560X draws up to 56W. So for those of us using 2018 models it's clear we don't really have any limits, and the top high-end model can easily sit around 95-100C, which is doomed for a 3-4 year run.
It's like saying the base i7 and the i9 heat up the same way because both are rated at 45W. For me, reality spoke, not words or a brochure. The 560X also has its issues and reaches over 41W under load (you probably don't have the latest 2018 MBP), just like the 295X in the iMac ran too hot. A base i7 with the 555X will run cooler under the same work than the same i7 with the 560X. Like I said, everybody who buys the Vega 20 will see: that GPU will not work as advertised (Apple lost our trust in what they say on paper). People still defending Apple, that's smart; on paper everything looks good to some, but for us, the users, it isn't working anymore.
 
It's like saying the base i7 and the i9 heat up the same way because both are rated at 45W.

Depends on what you mean by "heat up". If you keep the CPU at 100% utilisation for a certain amount of time, yes, both CPUs will settle in at drawing approx. 45 watts of power, since that is what the TDP is all about. In a burst workload, the i9 will heat up faster than the i7, but it doesn't really matter; all this means is that it will settle around its TDP limit a bit earlier. You will also get individual variation, as no two CPUs are identical.
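If it helps, the burst-then-settle behaviour I mean looks roughly like this (a toy model with made-up numbers, not a measurement):

```python
# Toy illustration: under a burst the chip may draw well over its TDP, then
# settles back once an assumed short-term power budget is spent.
# All numbers here are made up for the sake of the example.
def package_power(t_seconds, burst_watts=80.0, tdp_watts=45.0,
                  burst_budget_joules=700.0):
    """Assumed package power draw t_seconds into a sustained load."""
    burst_time = burst_budget_joules / (burst_watts - tdp_watts)
    return burst_watts if t_seconds < burst_time else tdp_watts

for t in (0, 10, 20, 40, 120):
    print(f"{t:4d} s -> {package_power(t):.0f} W")
```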

A base i7 with the 555X will run cooler under the same work than the same i7 with the 560X.

Very much possible, since the 555X has fewer cores, so it's probably running under the 35 watts. But who cares? The 560X is going to be faster. Temperatures are meaningless.


Like I said, everybody who buys the Vega 20 will see: that GPU will not work as advertised (Apple lost our trust in what they say on paper).

This doesn't make any sense to me. How did Apple lose our trust? Why wouldn't that GPU work as advertised? Are you suggesting that HBM won't work somehow or what?
 
The Core i9 has this power limit state set in its BIOS, in the EFI of the Apple computer - and that's why it draws up to 80W.
GPUs in Apple computers have their maximum power limit set in their BIOS, and that is the case in the MBP, it HAS to be this way - and that's why the 560X draws up to 56W. So for those of us using 2018 models it's clear we don't really have any limits, and the top high-end model can easily sit around 95-100C, which is doomed for a 3-4 year run.
It's like saying the base i7 and the i9 heat up the same way because both are rated at 45W. For me, reality spoke, not words or a brochure. The 560X also has its issues and reaches over 41W under load (you probably don't have the latest 2018 MBP), just like the 295X in the iMac ran too hot. A base i7 with the 555X will run cooler under the same work than the same i7 with the 560X. Like I said, everybody who buys the Vega 20 will see: that GPU will not work as advertised (Apple lost our trust in what they say on paper). People still defending Apple, that's smart; on paper everything looks good to some, but for us, the users, it isn't working anymore.
No my friend ;). Funnily enough, you talk only about CPU power states, but nothing about the GPU power states.

It has nothing to do with defending Apple. No Apple GPU has exceeded its 35W power limit in recent history. You talk only about CPU power limit states. Yes, the six-core Core i9 and i7s will draw up to 80W of power if there is a need. When both the CPU and GPU are running, the power envelope becomes constrained for both of them. The GPU will not exceed the 35W power limit, because that is the ultimate limit. That is how the balance is struck in the BIOS settings of the CPU and GPU, set in the EFI.

You may have observed differences between the Core i7 + RP 555X and the Core i9 + RP 560X.

Why is that? Because the Core i7 runs cooler and is less power starved than the Core i9s are, because it clocks lower. If the CPU heats up faster while consuming more power, the GPU will also heat up faster and operate at higher temperatures, because they are connected to one heatsink.

What I am saying is that the 35W power limit on those GPUs was always the ultimate 35W power limit.
Very much possible, since the 555X has fewer cores, so it's probably running under the 35 watts. But who cares? The 560X is going to be faster. Temperatures are meaningless.
It should not be this way. Both GPUs have the same power limit set, but different voltage curves to fully utilise that power limit. IMO, what he describes is the difference in power and thermals of the Core i9 vs the Core i7, rather than of those GPUs.
Sadly, by deprecating OpenCL, Apple are actively worsening this situation.
Everybody is deprecating OpenCL ;). OpenCL, from this moment on, will become part of Vulkan. Vulkan and Metal will be very similar in their capabilities.
 
Very much possible, since the 555X has fewer cores, so it's probably running under the 35 watts. But who cares? The 560X is going to be faster. Temperatures are meaningless.
I disagree. I have the 555X and the keyboard is really uncomfortable while gaming, to the point where I just don't feel like doing it at all. I can mitigate it in macOS by cranking up the fans (Apple keeps temps to 75C on both, so you have about a 1000rpm buffer on the 555X, giving about 13C lower temps). I'm hoping to find a solution to do the same in Boot Camp.

What I am saying is that the 35W power limit on those GPUs was always the ultimate 35W power limit.

The 560X goes above 35W, at least that's what is reported under the "Radeon High Side" sensor.
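One way to cross-check the sensor reading is macOS's own powermetrics tool. A rough sketch; it needs sudo, and the gpu_power sampler name is an assumption that may vary between macOS versions:

```python
# Sample GPU power a few times via the built-in `powermetrics` tool and print
# the power-related lines, to compare against the supposed 35W limit.
import subprocess

result = subprocess.run(
    ["sudo", "powermetrics", "--samplers", "gpu_power", "-i", "1000", "-n", "5"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.splitlines():
    if "power" in line.lower():
        print(line)
```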
 
I've actually hit the point of needing to replace my notebook and am concerned about the thermal issues with the i7 or i9 coupled with the Vega 20. I'm holding off until I see what is reported. If it turns out badly, I may give up on having a portable gaming machine.
 