Is it future-proof? Will it last a long time?

There's no such thing as future-proofing. By the time the 450 gets outdated, the entire computer will be outdated too, as new technologies arrive. The difference between the two GPUs won't matter much in the long run. Choose based on what you need now. Both GPUs will serve you well in the coming years, and when the time comes to upgrade, the faster one won't make much of a difference. Raw performance is becoming less and less of a reason to upgrade; it's the other technologies that will make you get a new computer, well before your GPU becomes incapable.

And it will last a while: for most tasks, expect three, four or even more years. Even the 2012 MBP is still a very capable machine today, for example.
 
Is it future-proof? Will it last a long time?
Future-proofing is a myth. Unless you're the kind of person who uses a computer until it physically breaks or can no longer run any website smoothly (which takes about 10+ years, give or take), the machine will be "obsolete" long before then, even with the highest-end options ticked, and you'll have to buy a replacement anyway.
 
Is it future-proof? Will it last a long time?

It isn't even present-proof, given that Nvidia has already released some amazing desktop-level performance chips this year. It's a pretty significant leap as far as mobile GPUs go, totally outclassing the ones in the MBPs.

I really hope Apple uses Nvidia chips in the next iteration; I'd probably buy in then, and USB-C should be more mainstream by that point.
 
Computers last 5 years. End of story. If it's not broken or outdated by then, Apple will introduce a mandatory OS update that breaks it or makes it so slow as to be useless. You're buying a new computer in 5 years regardless of what spec you get now.
 
I really hope Apple uses Nvidia chips in the next iteration
Unless something changes in management, this is highly unlikely (at least for MBPs), because Apple always prefers energy efficiency over raw power. None of the Nvidia 10-series GPUs operates at a TDP close to the Radeon Pro 460's, and I don't think Nvidia cares as much about making its chips power-efficient as about making them powerful.
 
It isn't even present-proof, given that Nvidia has already released some amazing desktop-level performance chips this year. It's a pretty significant leap as far as mobile GPUs go, totally outclassing the ones in the MBPs.

I really hope Apple uses Nvidia chips in the next iteration; I'd probably buy in then, and USB-C should be more mainstream by that point.


Outclassing at what, exactly? For 35W cards, I don't think you can find better. I don't hope Apple uses Nvidia chips in the next iteration; I hope they use the most efficient ones (whether that's Nvidia or AMD) with the best OpenCL/Metal performance, like they did this year.

I could be wrong, but whenever someone says "I wish they used Nvidia" on these forums, I always imagine they are talking about gaming performance.
 
Maybe it's not clear enough that a lot of people never really cared about energy efficiency, for the simple reason that they belong in the "the laptop is plugged in most of the time" category rather than the "I'm running on battery" one.

They care about horsepower, not fuel savings, and they were asking for it.
They still do. WoW, Dota 2 and other games are just software that runs occasionally on their machines, while the same can't be said for AE, PR, PS and C4D.

People should realize that a certain segment of professionals (quite a wide one, face it) needs power, not efficiency.
 
Maybe it's not clear enough that a lot of people never really cared about energy efficiency, for the simple reason that they belong in the "the laptop is plugged in most of the time" category rather than the "I'm running on battery" one.

They care about horsepower, not fuel savings, and they were asking for it.
They still do. WoW, Dota 2 and other games are just software that runs occasionally on their machines, while the same can't be said for AE, PR, PS and C4D.

People should realize that a certain segment of professionals (quite a wide one, face it) needs power, not efficiency.
And that's why I said Apple should officially give a thumbs-up to third-party eGPU solutions, probably by opening up APIs in OS X to make it possible, while driver support and eGPU-related support would fall on the manufacturer and not on Apple.
 
Computers last 5 years. End of story. If it's not broken or outdated by then, Apple will introduce a mandatory OS update that breaks it or makes it so slow as to be useless. You're buying a new computer in 5 years regardless of what spec you get now.

[attached screenshot]
 
Is it future-proof? Will it last a long time?
Outclassing at what, exactly? For 35W cards, I don't think you can find better. I don't hope Apple uses Nvidia chips in the next iteration; I hope they use the most efficient ones (whether that's Nvidia or AMD) with the best OpenCL/Metal performance, like they did this year.

I could be wrong, but whenever someone says "I wish they used Nvidia" on these forums, I always imagine they are talking about gaming performance.

Why does it need to be 35W? Are the new Nvidia mobile chips in the Razer, for example, really so terribly inefficient that they can't function well in a slim machine?

Apple could easily have gone with Nvidia chips; they might have had to retain the form factor of the previous model, but it would have meant much better performance overall, especially in Adobe software and games.
 
Oh rly?

And how would you define the one I have in my signature? Gaming-oriented?

C'mon, let us have a good laugh, it's needed.

It's not gaming-oriented, but built for efficiency. The 750M was not the most powerful laptop GPU of its time. I don't understand; what's your point?
 
Why does it need to be 35W? Are the new Nvidia mobile chips in the Razer, for example, really so terribly inefficient that they can't function well in a slim machine?

It needs to be 35W to fit the total TDP budget Apple set for the machine. That's driven both by the cooling capacity of the chassis and by the power consumption of the device.

If you look at the Razer Blade, which has the Nvidia GTX 1060, it comes with a 165W power supply. The 15" MBP has an 85W one. The GTX 1060 takes 120W of power, more than the entire MBP.
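To make the arithmetic concrete, here's a minimal back-of-the-envelope sketch using the wattages quoted above; these are the rough figures from this post, not official specs, and the real budget also has to cover the CPU, display, SSD and battery charging.

```python
# Back-of-the-envelope power-budget sketch using the figures quoted above.
# These are rough forum numbers, not official specifications.

MBP_ADAPTER_W = 85        # 15" MacBook Pro power adapter, as quoted above
RAZER_ADAPTER_W = 165     # Razer Blade power adapter, as quoted above
GTX_1060_W = 120          # GTX 1060 board power, as quoted above
RADEON_PRO_460_W = 35     # the 35W class discussed in this thread

def headroom(adapter_w: int, gpu_w: int) -> int:
    """Watts left for CPU, display, SSD, charging, etc. after the GPU."""
    return adapter_w - gpu_w

print("MBP with a 35W Radeon Pro 460:", headroom(MBP_ADAPTER_W, RADEON_PRO_460_W), "W left")
print("MBP with a GTX 1060:          ", headroom(MBP_ADAPTER_W, GTX_1060_W), "W left")  # negative
print("Razer Blade with a GTX 1060:  ", headroom(RAZER_ADAPTER_W, GTX_1060_W), "W left")
```

Under those numbers the 1060 alone would overdraw the MBP's adapter before the CPU, display or SSD take a single watt, which is the point being made here.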
 
Why does it need to be 35W? Are the new Nvidia mobile chips in the Razer, for example, really so terribly inefficient that they can't function well in a slim machine?

Apple could easily have gone with Nvidia chips; they might have had to retain the form factor of the previous model, but it would have meant much better performance overall, especially in Adobe software and games.

So that they could make it as lightweight as it is. There are also certain other advantages to AMD. Look, I'm not criticizing anyone who prefers performance; I'm just saying Apple is not aiming for that.
 
Look, I'm not criticizing anyone who prefers performance; I'm just saying Apple is not aiming for that.
And what we're desperately trying to tell you is that quite a wide audience of users (the ones actually complaining) is disappointed, because slim and energy-efficient is not what they needed and, above all, not what their laptops were when they bought them.

The 750M wasn't the most powerful? True, but it was light-years ahead of AMD in real-world performance because of CUDA. From the way you talk, you've never, NEVER seen the impact CUDA has (compared to OpenCL) on tasks in AE and PR, and in the Adobe suite in general.

Would you mind telling us what kind of user you are? What software do you run, what kind of work do you do, etc.?
Because you really sound like the prosumer that is the main target of this MBP.
 
Would you mind telling us what kind of user you are? What software do you run, what kind of work do you do, etc.?
Because you really sound like the prosumer that is the main target of this MBP.

This has come up a million times (earlier in this very thread even), but to summarize:

The new MBP is the best laptop out there for:
- Final Cut Pro X
- Photoshop
- Lightroom
- Illustrator (arguably the Surface Book is better here, unless you use Wacom)
- InDesign
- Presenting WIP in small collaborative creative settings

...due to a combination of excellent GPU acceleration, unmatched SSD speeds and screen colour gamut, sound, form factor and coolness (of both thermal and cultural natures).

That's a ton of pros. Well-paid ones, too. The pros it does not serve as well (presumably you?) are those editing 4K video in Adobe Premiere [or Avid, Flame, Smoke, etc.], animators with heavy After Effects or 3D animation needs, web devs testing on multiple virtual machines at once, video game developers, and gamers themselves.

So basically it's meh for devs and studios, and great for creatives and agencies. And it's, like, mind-blowing for marketing people. They're all going to want one.
 
And what we're desperately trying to tell you is that quite a wide audience of users (the ones actually complaining) is disappointed, because slim and energy-efficient is not what they needed and, above all, not what their laptops were when they bought them.

The 750M wasn't the most powerful? True, but it was light-years ahead of AMD in real-world performance because of CUDA. From the way you talk, you've never, NEVER seen the impact CUDA has (compared to OpenCL) on tasks in AE and PR, and in the Adobe suite in general.

Would you mind telling us what kind of user you are? What software do you run, what kind of work do you do, etc.?
Because you really sound like the prosumer that is the main target of this MBP.


Well, first of all, I am sorry, but you are not a wide audience. It may feel that way because a lot of people here are into this whole "pro laptops must be workstations" thing, but for the vast majority of people, pro or not, the new MBPs are a perfect balance of power and portability. There is a small group of people who need more, and these people, sadly, need to look elsewhere. I get it. I'm not saying you guys don't matter. But you are a minority, no matter how much it may seem otherwise to you. In reality, the idea that you are a "wide audience" is a cognitive bias.

As for me, well, I've mentioned it several times: I'm an illustrator working in Photoshop, ZBrush and, lately, 3D Coat. I also use 3ds Max in Boot Camp. I don't consider these prosumer apps, nor lightweight ones.

I am well aware that CUDA helps certain apps like Adobe Premiere, but it's not as if Premiere is unusable without it. Still, I understand that some of you would prefer Nvidia, more RAM, etc. The problem, for me, is that everyone here (and on YouTube, it seems) measures performance by video editing.

Now, I'm not doing video, but a lot of the girls and guys from work create stunning stuff in AE, and a while back we used to work on underpowered computers: we were a struggling developer and, like a lot of game studios, worked on some really low-end hardware. You'd be amazed what video work can be done in AE on computers people here wouldn't use to open Word documents.
This is why I find it funny when I read people here saying they cannot work with anything lower than some hardware level that wasn't even available last year, and that "no pro can work without _____".

Sometimes, according to the posts here, it feels like:

1. For video, only Nvidia will do, and for 4K, only a GeForce 1060 and up will do. Apparently, 4K video editing was impossible before the 10-series GeForce GPUs. Also, 16K video, running a billion VMs and stuff like that is done on laptops now. Desktops are for old people.

2. The only requirements that matter are those for video. Only video editing makes you a pro, and the only benchmark for anything that is not an office machine or a games console is Adobe Premiere. People earning money using FCPX are frauds. And everyone not doing video is a prosumer.

3. The most important thing for a video professional is the GPU/CPU temperature when running Mafia 3.


I'm not saying everyone is like this, but these people are the most vocal.
 
And I like how Apple gets the blame for Adobe software not working as well as it should. Adobe should know by now that Apple will push OpenCL over CUDA (the Late 2013 Mac Pro could easily have come with Nvidia cards, but Apple chose not to) and should start optimising its Mac apps for OpenCL. So instead of pressuring Apple, Adobe users should actually pressure Adobe.

And it's funny that some reviews knock Apple for not using Kaby Lake when the appropriate Kaby Lake CPUs for the MBPs don't even exist yet.
 
It may feel that way...
CUT
I'll sum it up: prosumers and professionals will be fine with most of the specs and options, but the top specs don't fit the really demanding users, and those are the ones who would have loved something more (like 32GB of RAM) that others couldn't care less about (the ones who are fine with 16GB) but that for them is crucial.

And regarding the GPU, it's totally clear that AMD sips way less fuel than Nvidia, which is like a six-litre V8 muscle car straight out of the seventies, but hey, people were ready to pay for it because they wanted it. And it's not a small audience, pro or not. I wanted to write more and answer you point by point, but I'll refrain from setting up an e-peen contest.

The funny thing is that you should be using a Windows-based system with such applications, not a Mac. And you know it :)
 
The funny thing is that you should be using a Windows-based system with such applications, not a Mac. And you know it :)

No, I don't know it. Why? Because I could have a 32GB Nvidia laptop? This is exactly the attitude that always gets me. The idea that a Windows PC is better for "demanding workflows" is exclusive to these Mac-hater forums. In fact, most of the illustrators I know and follow online are Mac people.

First of all, I've used a lot of Windows devices and several Macs. Photoshop and ZBrush work equally well on a Mac performance-wise, but the advantages of the system for someone like me are really substantial; Photoshop works better on a Mac overall. Again, a lot of people here would probably just look at some benchmarks and say it's the same (for less $$$), but as I said, there's more to modern computing than raw specs. The fact is, there are some really great Mac-exclusive things in my workflow: Spotlight search of files that actually works, smart folders, Tags, PiP (for reference videos), turning off the Application Frame in Photoshop, a scalable Photoshop UI (not just the 100% and 200% modes like in the Windows version of PS), etc.

And that's not to mention universal copy-paste, so I can just grab my iPad Pro and paste stuff from Procreate directly into Photoshop. A-m-a-z-i-n-g.

And, of course, there's the whole feel of the system. I have this great third-party color picker app called Frank DeLoupe (whimsical name, what can I say) that integrates directly with Photoshop and is Mac-only. I have this great app for collecting references, Notability, that is Mac-only. When I collect reference material, I just copy it all, a great little Mac-only app called Paste collects it, and then I drag it all to Evernote or Notability. Or I just temporarily drag it to Yoink, another great Mac-only app. I also can't work without the region-select screen capture built into the Mac (which automatically sends captures to both my Macs and my iPad thanks to that great universal copy-paste). I automate my references workflow with Mac-only apps like Alfred and Hazel. I write my design ideas and notes in a beautiful app called Bear, which also syncs to all my iOS devices. And I can't even function any more without the Mac/iOS-only Fantastical 2.

And I can already hear Windows fans saying "you can get all that on Windows". No, I can't. I tried. I used to work on my iMac at home and on Windows 10 at work, but eventually I had to get an MBP just because I really wanted to use macOS.


So, why should I be using Windows again? Because Nvidia?
 
So you put me in the category of the haters. The best definition of myself, having worked a lot on both systems and also in other OS environments (for example IRIX, though I'm pretty sure you don't know what it is, what machines it ran on, or what kind of software it was used for), is "I express criticism where needed because I use this for a living". And guess what? In the past six years OS X was THE choice for me, for a lot of reasons, and because, yes, there are exclusives in the Mac workflow, and the OS itself is something that keeps me away from Windows.

Now that I've clarified that:

- You named 3ds Max and why you have it in Boot Camp. If you use it OCCASIONALLY, then make sure you write "OCCASIONALLY", because if it were your primary app you'd be pretty stupid to run it in Boot Camp 24/7. Fact;
- For certain things (such as heavy 3ds Max use and a plethora of other 3D software), Windows has always offered better performance, period;
- If PS and ZBrush are what you use most of the time, then yes, you're good on a Mac. I wonder why you don't run Maya instead of Boot Camping Max, but no need to answer; there must be something I'm missing.

That said, if you're fired up, go whine at and flame the people who actually say Windows is better for all workflows; I'm not one of them. You don't have to school me on all the beautiful and "magic" things you can do within the OS X ecosystem, because I experience the same benefits; on the other hand, it could be me schooling you about what's best depending on the type of software you run.

I repeat:
"Prosumers and professionals will be fine with most of the specs and options, but the top specs don't fit the really demanding users, and those are the ones who would have loved something more (like 32GB of RAM) that others couldn't care less about (the ones who are fine with 16GB) but that for them is crucial."

And regarding AMD vs Nvidia, the greatest forum war of our time, you might want to investigate why people keep putting Nvidia GPUs in their Mac Pro 5,1s rather than AMD ones. You should really be wondering why they need all that power saving!

jackoatmon said:
So basically it's meh for devs and studios, and great for creatives and agencies. And it's, like, mind-blowing for marketing people. They're all going to want one.

It's going to be "OOOOHHHH, IT'S MAGIC!" for quite a bunch of people, but those of us who voice criticism look at it the way a woodcutter looks at his axe: how sharp it is, its weight, its handle and things like that, not whether there are inscriptions or decorations on the steel or the handle.

Hope I made that clear enough.

They wanted to make a lighter machine that consumes less power. Some people actually love this because, hey, the previous one was HEAVY AS HELL and consumed too much power. They love it! On the other hand, some people wanted it to stay the same in terms of weight and shape but with more power (and with good reason). Those are the heretics who need to burn in hell, apparently. Ooooook.

Gents, being able to weigh the pros and cons is a sign of maturity. A fatter-spec option wouldn't have hurt and would have made the "others" happier, period. This derailment has lasted enough posts; sorry to the other readers.

Have a nice day, everyone.
 
So you put me in the category of the haters. The best definition of myself, having worked a lot on both systems and also in other OS environments (for example IRIX, though I'm pretty sure you don't know what it is, what machines it ran on, or what kind of software it was used for),

Oh, I miss IRIX! With its customized look for Motif, it was easily the best desktop experience on Unix.
 