No, but they're called MacBook Pro, and the GPU in them is entry level at best.

Entry level is a) relative, b) a stretch, and c) "Pro" doesn't mean you can play the latest Call of Duty at 4K. I'm pretty sure I've been doing professional work on MBPs with far lesser GPUs for years, and so have millions of others, in technical, business, graphics and artwork fields. The attempt to hang "Pro" on a single component is weird at best and dishonest at worst.
 
It's more to do with Apple trying to kill CUDA on macOS; CUDA is not cross-platform across hardware vendors.

Apple doesn't want to be tied to a single vendor, ever. Right now they're in bed with Intel for CPUs and AMD for GPUs, but given the open standards they are promoting, they could change vendors overnight if required. OpenCL, Vulkan and Metal alternatives can also run on Apple's own GPUs, whereas CUDA cannot.

Continuing to have CUDA applications on the Mac ties them to Nvidia, so they don't want to encourage CUDA in any way by shipping Nvidia hardware.
Yes, Apple hates CUDA, but it isn't because they promote open standards. Apple is also against OpenCL/GL, as shown by their deliberate non-support of both. I was cheering on Apple's GPU decisions until I found that out.
 
Inflation exists in literally every country in the world, it's about 3% a year on average in the US.

Looks like you made a mistake. You must have meant to type "30%," because that's the only way to explain away the massive year-over-year price hike when the Touch Bar was introduced.
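As a sanity check, the compounding can be sketched in a few lines. The prices below are hypothetical placeholders chosen to illustrate the argument, not actual MacBook Pro pricing history:

```python
# Sketch: how many years of 3%/yr inflation does a given price jump imply?
# Prices here are illustrative assumptions, not real MacBook Pro pricing.

base_price = 1999.0   # hypothetical pre-Touch Bar 15" price
new_price = 2399.0    # hypothetical Touch Bar era price
inflation = 0.03      # ~3% US average, per the post above

years = 0
price = base_price
while price < new_price:
    price *= 1 + inflation
    years += 1

print(f"hike: {new_price / base_price - 1:.0%}")   # prints "hike: 20%"
print(f"years of 3% inflation needed: {years}")    # prints "years of 3% inflation needed: 7"
```

So even a 20% jump would take roughly seven years of ordinary inflation to explain, which is the poster's point: a one-year hike of that size is not an inflation story.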
 
It works without Quick Sync, as the Ryzen Hackintoshes out there can attest. Sure, it's slower, but they could replace that code if required, as I suspect it's provided to the application via an OS library.

CUDA is a third party library that Apple has no control over.
Unfortunately, being faster than Adobe Premiere Pro is pretty much the only advantage of FCPX. Premiere has a better feature set but also runs much slower on OS X.
Entry level is a) relative, b) a stretch, and c) "Pro" doesn't mean you can play the latest Call of Duty at 4K. I'm pretty sure I've been doing professional work on MBPs with far lesser GPUs for years, and so have millions of others, in technical, business, graphics and artwork fields. The attempt to hang "Pro" on a single component is weird at best and dishonest at worst.
1) The Polaris architecture is 3 years old.
2) At its introduction in the Mac in Q4 2016 it was THE slowest dGPU on the market. The Radeon Pro 450 was 1 TFLOPS, barely on par with a 940MX often found in sub-$1000 13" ultrabooks, and well below any Pascal chip. To be included in a $2400 15" laptop is a slap in the face.
3) You think a GPU is only for gaming? Sure, that's a big use, but there's also CAD, video rendering, and other workloads.
4) The whole computing industry is shifting towards GPU computing. GPUs are more important than ever as year-on-year CPU IPC gains have been minimal.
5) Sure, you might have done "Pro" work with garbage GPUs, but so did others with dual-core CPUs, 8 GB of RAM, and slower SSDs. Apple does not give the option of stronger GPUs. So should they delete the quad-core and 6-core options too? And the 32 GB RAM option? Because some "professionals" don't need them for their workflow?
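Point 2's TFLOPS comparison can be roughly sanity-checked from public shader counts and boost clocks, since theoretical FP32 throughput is shaders × 2 FLOPs per FMA × clock. The spec figures below are approximate public numbers, treated here as assumptions:

```python
# Rough sketch of theoretical FP32 throughput: shaders x 2 (FMA) x clock.
# Shader counts and boost clocks are approximate assumed figures.

def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

gpus = {
    "Radeon Pro 450": tflops(640, 0.80),   # ~1.0 TFLOPS, as the post says
    "GeForce 940MX":  tflops(384, 1.19),   # ~0.9 TFLOPS
    "GTX 1060":       tflops(1280, 1.71),  # ~4.4 TFLOPS (Pascal)
}
for name, t in gpus.items():
    print(f"{name}: ~{t:.1f} TFLOPS")
```

On those numbers the base 2016 dGPU really does land near a 940MX on paper, and a mid-range Pascal chip is roughly 4x ahead.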
 
Unfortunately, being faster than Adobe Premiere Pro is pretty much the only advantage of FCPX. Premiere has a better feature set but also runs much slower on OS X.
1) The Polaris architecture is 3 years old.
2) At its introduction in the Mac in Q4 2016 it was THE slowest dGPU on the market. The Radeon Pro 450 was 1 TFLOPS, barely on par with a 940MX often found in sub-$1000 13" ultrabooks, and well below any Pascal chip. To be included in a $2400 15" laptop is a slap in the face.
3) You think a GPU is only for gaming? Sure, that's a big use, but there's also CAD, video rendering, and other workloads.
4) The whole computing industry is shifting towards GPU computing. GPUs are more important than ever as year-on-year CPU IPC gains have been minimal.
5) Sure, you might have done "Pro" work with garbage GPUs, but so did others with dual-core CPUs, 8 GB of RAM, and slower SSDs. Apple does not give the option of stronger GPUs. So should they delete the quad-core and 6-core options too? And the 32 GB RAM option? Because some "professionals" don't need them for their workflow?

1) The Core architecture is 12 years old. Look at my big old strawman.

2) I travel a lot. I need a mobile solution. I don’t want, need or care about the Mac Pro. Others may but they should make decisions like actual professionals. See 6)

3) No, I don’t think it’s only for gaming. I also rarely see actual professionals select the wrong hardware for their use case, then cry about it.

4) Once again, if the GPU selection available does not meet your needs, you, as a professional, should evaluate your options.

5) The GPU is only there to provide UI fluidity for me. My work is done with terminals and VMware.

6) So, as a professional, if any of these things impacted my ability to make money, I have some choices:

Things I would do:
- Use Windows or Linux. In my profession either is viable, though I'd almost certainly select Linux, as it's a worthy second place for my use case.

Things I would not do:
- Spend hundreds of hours on a message board telling people who are content with the software and hardware they have (or can have) that they are wrong.

You know why? I’m an actual professional. Actual professionals who need to make a living learn, adapt and use whatever tools they need to do their job.
 
You need to get out of the house more...

MSI gaming laptop with an Nvidia 1060

We have some similar models at work as show laptops; they are about twice as thick as the first-gen 15" Touch Bar MBPs.

Sure, they have a bunch of gimmicky-looking crap, but I have been in the live production business a long time and these MSIs are by far the best ****ing Windows laptops I have ever worked with. And until 4K starts becoming affordable, native 1080p displays are actually better than some weird-ass Retina resolution.

This could easily be done if Jony wasn't so hell-bent on making them paper thin.

I am going to add that the MSIs we bought were less than $1500: 16GB of RAM, an SSD + 1 conventional drive, and a 4GB GPU.

A 17-inch 1080p plastic laptop with a 256GB M.2 SATA SSD + 1TB (7200RPM) drive and the cut-down 3GB version of the 1060 for $1800 doesn't really sound like a deal to me, and that's the best price on Amazon. It also weighs almost twice as much. When do we start comparing desktops with the MacBook Pro?
 
Unlucky, sure, but is it uncommon for people to get multiple bad replacements? It shouldn't be surprising to see. Credible-sounding estimates of the failure rate that I've seen passed around are between 5 and 12%. If it's anywhere around those numbers, there will be scores of people unlucky enough to get a bad copy multiple times.
If it is 5 to 12%, then yes. I was going by Apple's statement about "a small percentage of the keyboards" and interpreting that as something around 1 or 2%, but that might have just been me imagining Apple strongly emphasising the 'small' part in an oral statement.
It's probably BOTH. There's probably a flaw in the design, but the flaw is only fully exposed when there's slippage in the manufacturing process. If it were a design flaw alone, we'd see the failure rate trend toward 100% in year 1 and toward 0% after they adjust the design.
When I talked about a 'design flaw', I meant one that increases susceptibility to dust getting stuck and causing problems. Such a flaw doesn't need to produce a 100% failure rate: it still requires the right kind of dust (and probably other factors like humidity), and most importantly a large enough amount of dust plus some bad luck, as it's a stochastic process.
I do believe that it's rational to conclude that dust has something to do with it. I just don't think it's as simple as it looks. Dust is probably not the only culprit and may not even be the main culprit. It's merely the most noticeable one.
So you think there could be two failure modes, one completely independent of dust and one a function of dust ingress (modulated by manufacturing variations)?
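The multiple-replacement odds in this exchange can be sketched with an idealized model: assume each unit fails independently with probability p. That independence assumption is a simplification (it ignores batch effects), but it shows why repeat failures aren't surprising at a 5-12% rate:

```python
# Sketch: probability of receiving k bad units in a row, assuming each
# keyboard fails independently with probability p (idealized model).

def p_consecutive_failures(p: float, k: int) -> float:
    return p ** k

for p in (0.05, 0.12):
    for k in (2, 3):
        prob = p_consecutive_failures(p, k)
        print(f"p={p:.0%}, {k} bad units in a row: {prob:.2%}")
```

At a 12% rate, roughly 1.4% of buyers would see two bad units back to back: rare per person, but across millions of machines that's tens of thousands of very vocal customers.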
 
Why does Apple completely ignore GPU performance in the design of its laptops?

Answer: higher-performance GPUs require higher-capacity batteries. Higher-capacity batteries add more weight.

Yet Apple still will not spec higher-performance GPUs in its 15" models, where shaving weight is less of a priority, if a priority at all.

Higher-spec GPUs are also more expensive, but then again Apple already charges a ridiculous premium: couldn't they have rolled the price of the GPUs into the cost, ever so slightly? After all, video editing is, or was supposed to be, one of Apple's primary niches.

Apple's design decisions are absolutely baffling, unless you assume that Apple MUST obsessively pursue the thinnest design at all costs.
 
So you think there could be two failure modes, one completely independent of dust and one a function of dust ingress (modulated by manufacturing variations)?

Heh, I don't know anything. I'm just postulating that we could be seeing multiple failure modes that result in similar outcomes. There are quite a few people here who are adamant that they've kept their MBP clean and completely free of crumbs and debris, but still ended up with a keyboard failure. Dust and debris don't appear to always be at the center of the issue. Earlier on, some people thought their keyboard issues had something to do with heat.
 
The butterfly keyboard is a failed design. Apparently they're using a stronger alloy and added a silicone wrap around the edges of the keys, but the action is still much louder and far less comfortable than the previous design. The minimal key travel is a design flaw, period. It simply doesn't give a typist a good, tactile sense of whether they've actually actuated a key or not.

What's odd is that Apple has literally been buried in an avalanche of complaints for two full years, yet they still refuse to abandon a design that most of its users hate.
 
Given the strength of sales, it seems Apple made the right call.

Apple don’t use Nvidia.
What AMD mobile GPU is out there that is significantly faster?



Why does Apple completely ignore GPU performance in the design of its laptops?

Answer: higher-performance GPUs require higher-capacity batteries. Higher-capacity batteries add more weight.

Yet Apple still will not spec higher-performance GPUs in its 15" models, where shaving weight is less of a priority, if a priority at all.

Higher-spec GPUs are also more expensive, but then again Apple already charges a ridiculous premium: couldn't they have rolled the price of the GPUs into the cost, ever so slightly? After all, video editing is, or was supposed to be, one of Apple's primary niches.

Apple's design decisions are absolutely baffling, unless you assume that Apple MUST obsessively pursue the thinnest design at all costs.
 
A 17-inch 1080p plastic laptop with a 256GB M.2 SATA SSD + 1TB (7200RPM) drive and the cut-down 3GB version of the 1060 for $1800 doesn't really sound like a deal to me, and that's the best price on Amazon. It also weighs almost twice as much. When do we start comparing desktops with the MacBook Pro?

The ones we bought were 15", had 4GB of VRAM on the 1060, and were just over $1000. They're rental laptops; who cares if they are plastic? Their primary purpose is driving graphics.
 
The ones we bought were 15", had 4GB of VRAM on the 1060, and were just over $1000. They're rental laptops; who cares if they are plastic? Their primary purpose is driving graphics.

I can't really find those at that price; I found a 15" with a 128GB SSD, an i7-6700, 12GB of RAM and a 1080p display for almost $1700 discounted.

But I understand it may make sense for MSI to put some laptop parts in a plastic box and sell it at a small margin, and I also understand it makes sense for some people to buy those products. What I don't understand is comparing them with the MacBook Pro. They are simply not the same product; it's like saying a Fiat 500 is faster than a Ford Ranger and also cheaper, so it's better.
 
soooooooooooooo...

The last Apple laptop I bought was a MacBook Air in 2012, so I'm trying to get some help with this. Would you rather buy a:

  • 15" 2.6 i7 with 32GB of RAM, or...
  • 15" 2.9 i9 with 16GB of RAM?

They would be almost the same price. I'll use it as a mobile companion to my maxed-out iMac 5K with 40GB of RAM. I'll do some Ps, Lr, Ai, even some Ae work on it, as well as some FCPX editing and maybe basic (!) DaVinci Resolve grading. Don't get this wrong: I'll do my main work on my iMac. Only when I am out of the office do I have to be able to make basic modifications to existing projects.

I am talking future-proofing: CPU over RAM, or RAM over CPU?
 
soooooooooooooo...

The last Apple laptop I bought was a MacBook Air in 2012, so I'm trying to get some help with this. Would you rather buy a:

  • 15" 2.6 i7 with 32GB of RAM, or...
  • 15" 2.9 i9 with 16GB of RAM?

They would be almost the same price. I'll use it as a mobile companion to my maxed-out iMac 5K with 40GB of RAM. I'll do some Ps, Lr, Ai, even some Ae work on it, as well as some FCPX editing and maybe basic (!) DaVinci Resolve grading. Don't get this wrong: I'll do my main work on my iMac. Only when I am out of the office do I have to be able to make basic modifications to existing projects.

I am talking future-proofing: CPU over RAM, or RAM over CPU?

I would wait for real-world tests to see if the i9 can sustain a workload without throttling too much, and by real-world tests I don't mean random posts on this website...
 
I can't really find those at that price; I found a 15" with a 128GB SSD, an i7-6700, 12GB of RAM and a 1080p display for almost $1700 discounted.

But I understand it may make sense for MSI to put some laptop parts in a plastic box and sell it at a small margin, and I also understand it makes sense for some people to buy those products. What I don't understand is comparing them with the MacBook Pro. They are simply not the same product; it's like saying a Fiat 500 is faster than a Ford Ranger and also cheaper, so it's better.

I guess my beef is that if Apple is selling something marketed as a "Pro" machine, it should be putting the best available parts in it that it can, within thermal limits. Right now the best available GPU parts are from Nvidia. I'd be willing to sacrifice thinness and battery life to get those GPU options.

Also, Apple constantly markets the importance of GPU power, so from my perspective it's a big "**** you": "we don't like Nvidia because of... so we won't be putting the best available GPU parts in our machines."

At some point you need to put pride aside. The fact that Nvidia has been making its own drivers for OS X basically tells me they are taking the higher road; they could just as easily have said **** OS X.
 
I guess my beef is that if Apple is selling something marketed as a "Pro" machine, it should be putting the best available parts in it that it can, within thermal limits. Right now the best available GPU parts are from Nvidia. I'd be willing to sacrifice thinness and battery life to get those GPU options.

Also, Apple constantly markets the importance of GPU power, so from my perspective it's a big "**** you": "we don't like Nvidia because of... so we won't be putting the best available GPU parts in our machines."

At some point you need to put pride aside. The fact that Nvidia has been making its own drivers for OS X basically tells me they are taking the higher road; they could just as easily have said **** OS X.

I agree, I would prefer Nvidia, but for the same TDP you get a non-Ti 1050, so it's not that far off in performance. This is the price to pay for mobility; if you only work at a desk, then an external GPU makes much more sense.

I'm a Pro and I need something I can work on while mobile. Pro doesn't mean Arnold rendering, otherwise there would be no laptops at all.
I would not sacrifice weight and portability for GPU power, because then I would need two laptops. You can argue that they could bring back a 17" workstation as an additional product, but please let's not turn the MacBook into a power brick with 2 hours of battery life.
 
I agree, I would prefer Nvidia, but for the same TDP you get a non-Ti 1050, so it's not that far off in performance. This is the price to pay for mobility; if you only work at a desk, then an external GPU makes much more sense.

I'm a Pro and I need something I can work on while mobile. Pro doesn't mean Arnold rendering, otherwise there would be no laptops at all.
I would not sacrifice weight and portability for GPU power, because then I would need two laptops. You can argue that they could bring back a 17" workstation as an additional product, but please let's not turn the MacBook into a power brick with 2 hours of battery life.

I thought that was the whole reason for automatic GPU switching: good battery life on battery and better GPU performance when plugged in.

But yes, I would agree that bringing back the 17" for this purpose would be the way to do it. "Pro" is probably the worst word used to describe computer usage. I personally hate it.
 
The minimal key travel is a design flaw, period. It simply doesn't give a typist a good, tactile sense of whether they've actually actuated a key or not.

Read through the comments on this site. Quite a lot of people prefer the new keyboard's feel, and some of them didn't like it at first; I'd be one of them. I'm a touch typist and a keyboard snob with dozens of keyboards. Keyboards are "intimate" pieces of equipment. People have widely varying tastes in how they want them to feel, and you can adapt to a lot of different takes on the same device if you want to.
 
Why does Apple completely ignore GPU performance in the design of its laptops?

Answer: higher-performance GPUs require higher-capacity batteries. Higher-capacity batteries add more weight.

Actually, I would posit that running ANY discrete GPU on battery is pretty useless: they'll all drain your battery in an hour or two (and put a huge amount of wear on the battery doing it, due to the fast drain and heat). So having a garbage discrete GPU that doesn't do the job, in the hope of saving power to run on battery, is stupid. The battery excuse doesn't pass muster...

Far better would be to not run the discrete GPU on battery unless it's manually enabled. You don't want to run a heavy workload on a MacBook with the discrete GPU on battery with it on your lap anyway. You won't burn your legs, but it's close...

Then, when at a desk, on AC power, turn it on.
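The "hour or two" claim is easy to sanity-check with back-of-envelope numbers. The power draws below are assumed ballpark values, not measurements of any specific machine:

```python
# Back-of-envelope runtime for a dGPU under sustained load on battery.
# All figures are assumed ballpark values for a 2016-era 15" laptop.

battery_wh = 76.0          # ~76 Wh battery in the 2016 15" MBP
gpu_load_w = 35.0          # assumed dGPU package power under load
rest_of_system_w = 30.0    # assumed CPU + display + board under load

hours = battery_wh / (gpu_load_w + rest_of_system_w)
print(f"~{hours:.1f} h of heavy GPU work on battery")   # prints "~1.2 h ..."
```

Even with these modest assumed draws, sustained GPU work empties the battery in barely over an hour, which is the post's point: the dGPU is effectively a plugged-in feature regardless of how small Apple specs it.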

But hey, that would require:

- Apple building a thermal solution that can handle the power/heat
- Apple supplying a power adapter that can deliver an adequate amount of power (even on the current machines, the top-end power supply is marginal)
- Apple actually buying high-end GPUs

Top-end Polaris (e.g., the RX 480, on which the current RX 580 is based) was a mid-range GPU in 2016. I know, I own one.

Low-end Polaris (anything RX 560 or lower) in 2018 is a joke. These are supposedly high-end professional machines, remember, not budget mid-range machines.

I say that as a total AMD fanboy with an RX 480 (my old one) in the girl's PC and dual Vega 64s in mine...


Give the MacBook Pro a decent thermal solution. Give it a 150-200 watt power brick.
Give it a Vega 56 or Vega 64, down-clocked to fit inside 100 watts (it can be done by lowering clocks a bit, and it will run relatively cool doing it; you'll make up for the lower clocks with the CU count).


That's a MacBook with a discrete GPU I'd actually buy. It is well within Apple's ability to produce. But they are not interested.
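The down-clocking argument rests on dynamic power scaling roughly with frequency × voltage², so lowering clocks (which also permits lower voltage) cuts power much faster than it cuts throughput. A sketch with assumed ballpark figures for a Vega 56, not measured data:

```python
# Sketch of the down-clocking math: dynamic power ~ frequency x voltage^2.
# The desktop board power and the clock/voltage reductions are assumed
# ballpark figures, not measurements.

def scaled_power(p0_watts: float, freq_ratio: float, volt_ratio: float) -> float:
    return p0_watts * freq_ratio * volt_ratio ** 2

desktop_vega56_w = 210.0
# Drop clocks ~25% and voltage ~15%:
laptop_w = scaled_power(desktop_vega56_w, 0.75, 0.85)
print(f"~{laptop_w:.0f} W")   # prints "~114 W"
```

A 25% clock cut with a modest voltage drop nearly halves the power while giving up far less than half the performance, which is why a down-clocked big chip can beat a small chip at the same wattage.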
 
soooooooooooooo...

The last Apple laptop I bought was a MacBook Air in 2012, so I'm trying to get some help with this. Would you rather buy a:

  • 15" 2.6 i7 with 32GB of RAM, or...
  • 15" 2.9 i9 with 16GB of RAM?

They would be almost the same price. I'll use it as a mobile companion to my maxed-out iMac 5K with 40GB of RAM. I'll do some Ps, Lr, Ai, even some Ae work on it, as well as some FCPX editing and maybe basic (!) DaVinci Resolve grading. Don't get this wrong: I'll do my main work on my iMac. Only when I am out of the office do I have to be able to make basic modifications to existing projects.

I am talking future-proofing: CPU over RAM, or RAM over CPU?

I'd do the 2.6/32 over the i9/16.
 