A discrete GPU is not any more "real" than an integrated GPU. Your desire for the GPU to be discrete doesn't make any more sense than any of the following:

"I won't pay 2000 bucks for a laptop unless it has a discrete memory controller, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless it has a discrete north bridge, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless it has a discrete L2 cache, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless it has a discrete floating point processor, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless every core has its own die, not all the cores integrated into one CPU."

Integration has always been the way forward since the beginning of integrated circuits, is the way forward today, and will continue to be the way forward long after discrete GPUs have joined 14-inch hard drives in the dustbin of history.

And even so, is that a problem for you? A real one? Really?
Are you kidding?
Are you actually serious?
Do you really want to change the way I spend my money?
Really? With all this "integration blah blah, nanometers blah blah, floating point blah blah, cache blah blah"? That's so boring, and those are words YOU put so gently in my mouth, because they're your arguments, not mine, and I'm asking you to stop, because I NEVER TALKED ABOUT THINGS like the NORTH BRIDGE!
Your statements/visions/opinions are just like the ones I saw last year about the iMacs, and to be clear, it's as if you want me to swallow the idea that buying an iMac is the same as buying a Mac Pro (the last one with PCI Express slots, I mean). (I did buy an iMac, but only because of the Mac Pro delay, thinking I'd use it as a display. As for the new Mac Pro, to be honest, I'm not sure it's the one I was waiting for; it's a really powerful machine, but what about the GPUs? I mean, without replaceable GPUs? Next fall I'll know more. A little off topic here, sorry; back on track.)
Now, keep calm, because it's really simple: for some pros the "Air Pro with quad core, Retina, and 5200 turbo" will be worth the price; for several other kinds of pros it won't be.
When an Air MacBook Pro comes down to 1200 I could come back and consider it again;
until then I won't buy it. By the way, I'll wait until next fall to make my final decision,
but in any case I won't pay 2000 or more for an Air MacBook Pro.
Deal with it, because it's not a problem for me.
I didn't say Apple has no right to make it. Besides, I suppose all we can do is guess at what Apple has already done but not yet finalized for public announcement; there isn't even a chance to change anything now.
I just said that an Air MacBook Pro is not the right product for me. I have several other devices, and I'm planning to buy a 5S and a Retina mini to sit beside my 27" iMac with its GTX 680MX.
It's simple: I want/need a real GPU with no shared memory.
For work. For games I will get a PS4, because I'm too lazy to install games on my Boot Camp partition, on my Windows 8 installation, with the right drivers and so on.
Macs, in my case, are for heavy professional duties, while in an active music therapy session I use apps on my iPad or iPhone.
I also personally think that what kind of pro I am, and what kind of machine I need, is not your business, even if I appreciate your attention and your time. Sorry.
Have a nice day.
 
You're missing the point: not everyone can afford (or has space for) a desktop and a laptop together, but they still need to work everywhere, as professionally as possible. The MacBook Pro was a good compromise for that. Take it away and good luck, Apple, because the old classic/Retina Ivy Bridge 650M models will skyrocket! Without at least a quad-core BTO option for the 13" and a discrete-GPU BTO for the 15", it stays too expensive and too close to the Air.
P.S.
Are you really suggesting people haul a dustbin Mac Pro from desk to desk? Careful not to grab an actual trash can when you're in a rush, then...

Dear god... enough... with the... overuse of the... ellipsis...
 
Nobody tells you to use a mouse and a touch screen at the same time. Just switch between them according to your needs.
I do. I switch between MacBook and iPad according to my needs. I watch movies on my iMac and use my iPhone on the go. If I had the money to buy a car and a boat, I would acquire both: one for land and one for sea travel. Whatever I need at the moment.
Is that REALLY a mental overhead? One would have to have below-average cognitive capacity to get overwhelmed by just choosing between mouse and touch input.
Yes, it really is. Double Desktop = Cognitive Overhead and Added Memory Load
That's not what I'm saying; that's the finding of usability researchers:

Windows 8 — Disappointing Usability for Both Novice and Power Users *bäm*
People aren't buying it because customers are not exactly well informed about its uses; it can be a great laptop and a great tablet as well.
Customers also did not understand how great the Amphicar was. Instead they insisted on buying boats and cars separately. And they will continue to do so, because they are so uninformed and stupid.
The 2002 Microsoft Tablet PC is a bad design: it's huge, heavy, and not ergonomic. But the Surface is different; it runs on a machine good enough to be considered a laptop and is lightweight enough to be held as a tablet.
The 2012 Microsoft Surface is a bad design: it's huge, heavy, and not ergonomic. Sure, it is way smaller, lighter, and easier to use than 2002's Tablet PCs. But now we live in a world with iPad minis in it, and there is a newer, higher standard of small, light, and usable.
Do you even own a Surface, to judge its design?
No!! And I will never even touch one. I have used every Windows from Windows for Workgroups 3.11 to Windows 7, so I have an idea of what Microsoft the company is capable of, and I entirely understand the market motivation behind combining their desktop monopoly with their tablet attempts. But that's not what I want as a user and customer.
Great job guessing the minds of the rest of the population...
Thank you, but it wasn't me alone.

"Almost everybody in the tech industry already knows the [Windows 8] experience is suboptimal," said Patrick Moorhead, president of research firm Moor Insights & Strategy. "Microsoft was the last one…in the room to realize this was the case." (wsj)

Good luck, Microsoft. Apple and Google will have made you irrelevant as a consumer electronics company within the next three years of ever-falling PC sales. And Tablet PCs are PCs, so keep on dying, Surface.
 
Anyone who relies heavily on GPGPU tasks on their MacBook Pro will find the Iris Pro in the newest rev a downgrade from the previous generation.

While this is probably true, I really have a hard time believing that Apple would opt for less power and performance. Relying strictly on an iGPU would hobble FCS3 (including Motion and FCP 7), Adobe After Effects, and even the new Logic X, which I'm under the impression relies on a dGPU for its plug-in structure.

I mean, it's possible they'd do this. But they'd literally be cutting adrift almost all Adobe users, everyone on FCP X and Logic X, and everyone stranded on FCS3 (for work reasons). These are all programs designed to utilize the dGPU, and in Adobe's case, specifically the NVIDIA GPU. Apple has already lost a lot of market share in the post-production editorial world; do they really want to just 'kill the baby' now?

Is it possible that the new MacBook Pro will have two separate GPUs like the current model does? That would make more sense to me. Walking away from an Nvidia dGPU seems suicidal at this point. Historically, when Apple makes huge moves like that, they alert third-party developers with plenty of time, don't they?

I don't know... I just can't see Tim Cook onstage introducing a MacBook Pro that's slower than the last version. I could be wrong though. Apple's been pretty weird the last two years.
 
While this is probably true, I really have a hard time believing that Apple would opt for less power and performance. Relying strictly on an iGPU would hobble FCS3 (including Motion and FCP 7), Adobe After Effects, and even the new Logic X, which I'm under the impression relies on a dGPU for its plug-in structure.

I mean, it's possible they'd do this. But they'd literally be cutting adrift almost all Adobe users, everyone on FCP X and Logic X, and everyone stranded on FCS3 (for work reasons). These are all programs designed to utilize the dGPU, and in Adobe's case, specifically the NVIDIA GPU. Apple has already lost a lot of market share in the post-production editorial world; do they really want to just 'kill the baby' now?

Is it possible that the new MacBook Pro will have two separate GPUs like the current model does? That would make more sense to me. Walking away from an Nvidia dGPU seems suicidal at this point. Historically, when Apple makes huge moves like that, they alert third-party developers with plenty of time, don't they?

I don't know... I just can't see Tim Cook onstage introducing a MacBook Pro that's slower than the last version. I could be wrong though. Apple's been pretty weird the last two years.

I've since found out I was wrong about integrated GPU OpenCL performance, and posted a link showing it's just about on par with the 650m in the current rMBP. It's a lateral upgrade in performance, but it will run cooler, and give you a decent boost to battery life.

Yeah, a high end discrete GPU will always be better if you need raw power, but it's looking more and more like integrated chips will be the nice midline option. Since the MBP has always been about balancing the most power in the smallest package with the longest battery life, I could see Apple going integrated only from here on out.

Though how good it ultimately is all depends on Broadwell, and on whether Intel is able to stay competitive with Nvidia's and AMD's midline GPUs past that. The next two rMBPs look pretty good, but I doubt everyone else is gonna sit around and let Intel take the lead here.
 
Thunderbolt 1 is 10Gb/s
Thunderbolt 2 is 20Gb/s
PCIe x8 v2.0 like the Mac Pro you referenced is 32Gb/s
PCIe x16 v2.0 is 64Gb/s
PCIe x16 v3.0 is 128Gb/s

Keep in mind that when Intel and Apple discuss Thunderbolt, they are talking in gigabits, not gigabytes as on the PCIe spec pages. To make it simpler I have converted all gigabyte speeds into gigabits (GB -> Gb).
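
If it helps, here is a minimal Python sketch of that conversion (the PCIe figures are nominal per-direction spec values; real-world throughput is lower due to encoding and protocol overhead):

```python
# Bits per byte: PCIe specs are usually quoted in GB/s (gigabytes),
# Thunderbolt in Gb/s (gigabits).
BITS_PER_BYTE = 8

# Nominal per-direction bandwidths in GB/s (PCIe 2.0 = 0.5 GB/s per lane,
# PCIe 3.0 ~ 1 GB/s per lane, rounded).
pcie_gigabytes = {
    "PCIe 2.0 x4": 2.0,
    "PCIe 2.0 x8": 4.0,
    "PCIe 2.0 x16": 8.0,
    "PCIe 3.0 x16": 16.0,
}

links_gigabits = {"Thunderbolt 1": 10.0, "Thunderbolt 2": 20.0}
links_gigabits.update(
    {name: gb * BITS_PER_BYTE for name, gb in pcie_gigabytes.items()}
)

for name, gbits in sorted(links_gigabits.items(), key=lambda kv: kv[1]):
    print(f"{name:14s} {gbits:5.0f} Gb/s")

# Thunderbolt 2 (20 Gb/s) lands between PCIe 2.0 x4 (16 Gb/s) and
# PCIe 2.0 x8 (32 Gb/s), matching the comparison below.
```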

Thunderbolt 2 doesn't even reach the performance of PCIe 2.0 x8; it is closer to x4 (16Gb/s), and many sites have shown that modern graphics chips greatly diminish in performance when run at PCIe 2.0 x4 speeds. The problem is compounded for GPGPU workloads like those created by OpenCL and CUDA, as those technologies exchange data heavily with the CPU and system memory.

Here is one benchmark showing an AMD HD 5870. This card launched in September 2009, which makes it almost four years old. Now take a look at the performance numbers:

[Benchmark chart: HD 5870 average frames per second at PCIe x16 vs. x4]

As you can see, dropping down from PCIe x16 to x4 cut the average frames per second by 13%. That may not seem like a lot, but remember this testing was done with a four-year-old graphics card that is much slower than the modern GPUs one might wish to connect over Thunderbolt in an external chassis.

In fact, my own testing with my GTX 780s has confirmed this hypothesis, and not at PCIe 2.0 x8 but at x16. I saw a 300-point increase in the Unigine benchmark just by changing from PCIe 2.0 x16 to 3.0 x16, which worked out to a 7% increase in graphics performance.
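
For reference, those percentages are just relative changes. A minimal sketch (the underlying FPS and Unigine scores here are hypothetical placeholders; the post only gives the percentages):

```python
def percent_change(before: float, after: float) -> float:
    """Relative change from `before` to `after`, in percent."""
    return (after - before) / before * 100.0

# Hypothetical placeholder numbers; the post reports only the percentages.
fps_x16, fps_x4 = 100.0, 87.0
print(f"PCIe x16 -> x4: {percent_change(fps_x16, fps_x4):+.0f}% FPS")  # -13%

# A 300-point Unigine gain that equals ~7% implies a baseline around 4300.
score_pcie2, score_pcie3 = 4300.0, 4600.0
print(f"PCIe 2.0 -> 3.0: {percent_change(score_pcie2, score_pcie3):+.0f}%")  # ~+7%
```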

Thank you for the excellent post with real test numbers. I would be interested in seeing how the bus width/speed affects more recent GPUs such as the AMD 7970 and Nvidia GTX 780.
 
I could see Apple going integrated only from here on out.

I don't disagree with this. But now? Hardcore, without warning of a transition? Without an Nvidia dGPU they would essentially be backing every potential purchaser into one of two corners:

A) Buying the new MacBook Pro and NOT using Adobe software (as their software is written to depend on the CUDA drivers to function), or...

B) NOT upgrading on an Apple path. Possibly even cross-grading to a Windows platform that suits Adobe software better.

Don't laugh at option B. You would be surprised how many Apple-only shops I've worked at that are slowly switching over to PC again after the FCP X debacle.

If they went with an iGPU that doesn't support Adobe... oy vey. What a mess. In that scenario I'm sure Apple would have already updated their own apps to work with an iGPU and would just release them concurrently. But I'm also sure Adobe would update ONLY their Creative Cloud apps and nothing backward. Those of us on standalone Creative Suite would, essentially, be boned.

I'm trying to be optimistic--I need a new MacBook Pro badly--but Apple has been drawing lines in the sand a little too much the last couple of years, mostly to their detriment. I mean, does anybody seriously think Apple wants to keep selling the old disc-drive MBP? In the words of Will Smith, "Aw, hell no!" Going iGPU-only cold turkey would, I think, put them in the same position of having to carry the last version of the hardware concurrently. All this talk of a 'special chip' for them must be an attempt to bridge that technical gap.

I hope.
 
Haven't we heard for decades that Apple users can't deal with the complexity of a two-button mouse?

How could Apple users possibly deal with choosing between a three-button mouse and touch input?

;)

Don't you see how it works? They used terrible slot-loading optical drives as an excuse to drop them as early as possible, and single-button mice for the same reason :rolleyes:.
 
I've since found out I was wrong about integrated GPU OpenCL performance, and posted a link showing it's just about on par with the 650m in the current rMBP. It's a lateral upgrade in performance, but it will run cooler, and give you a decent boost to battery life.

Yeah, a high end discrete GPU will always be better if you need raw power, but it's looking more and more like integrated chips will be the nice midline option. Since the MBP has always been about balancing the most power in the smallest package with the longest battery life, I could see Apple going integrated only from here on out.

Though how good it ultimately is all depends on Broadwell, and on whether Intel is able to stay competitive with Nvidia's and AMD's midline GPUs past that. The next two rMBPs look pretty good, but I doubt everyone else is gonna sit around and let Intel take the lead here.

Unfortunately, the OpenCL benchmark site below apparently does not have any OS X results, which is especially unfortunate considering that they have developed OpenCL benchmarks that should be easy to port to OS X. But just for a hardware comparison, you might want to check this out:

http://www.clbenchmark.com/result.jsp

For Windows, they do have the latest Haswell results.
 
Unfortunately, the OpenCL benchmark site below apparently does not have any OS X results, which is especially unfortunate considering that they have developed OpenCL benchmarks that should be easy to port to OS X. But just for a hardware comparison, you might want to check this out:

http://www.clbenchmark.com/result.jsp

For Windows, they do have the latest Haswell results.

Wow. I never expected it to bench as well as the high-end mobile cards, but some of those results aren't even half as high as the old HD 4000's. Think it could be a case of immature drivers?

edit: it's really hard to find OpenCL performance benchmarks for the Iris Pro. All I can tell from reading reviews and benchmarks elsewhere is that it just about matches the 650M, which doesn't seem to gel with what I just read above. Maybe I'm looking at the wrong model or something.
 
Wow. I never expected it to bench as well as the high-end mobile cards, but some of those results aren't even half as high as the old HD 4000's. Think it could be a case of immature drivers?

edit: it's really hard to find OpenCL performance benchmarks for the Iris Pro. All I can tell from reading reviews and benchmarks elsewhere is that it just about matches the 650M, which doesn't seem to gel with what I just read above. Maybe I'm looking at the wrong model or something.

These are no doubt early results, and the drivers are no doubt immature. It is probably more than drivers, though; it may also be that the new functionality is not being used optimally by OpenCL. Give it a few months. Even so, it will never look like an AMD 7970. It is all about giving most people enough, most of the time, for a given battery life.
 
So if the new Haswell Retina's improved battery lets me keep gaming for something like six hours without needing to charge, then waiting could be worth it.
 
Worry is wasted energy

All this angst and teeth-gnashing is pointless. NO ONE except Apple knows precisely what chip configuration will be in the coming MacBook Pro. Maybe the new Mac Pro is an indicator of the path Apple is taking, or maybe not. The fact remains we'll just have to wait until they announce the new MBPs to find out. (Apple's continued success leads me to believe that those running the company are not as clueless as some here are suggesting.)

"Fear is in the future." As for the now, summer is fleeting, enjoy it while it lasts!
We'll deal with the fall when it gets here.
 
Well, a dGPU would probably be better unless Intel did something amazing. Is it really a 2.3? Or is it a 2.4? Most Best Buys don't sell the 2.3, since it was updated to the 2.4 several months ago. And just recently, any 2.3/2.6GHz "2012" models (not the "newer" 2.4/2.7/2.8s) were put on clearance for just $1299 in any store that still had them (not that many). If you just got it and it is the older 2.3, maybe you can go back to Best Buy and see if you can get some of your money back...

Haswell should bring better battery life (but probably no dGPU), faster SSDs, hopefully fixes to things like image retention, and 802.11ac.

I personally have given in and ordered one (a 2.6/512). Unfortunately I couldn't get the Best Buy clearance price (around $1650 with tax; only a few stores had it), so I ended up getting it from B&H ($2149, no tax in my state).

That being said, unless Apple lowers prices you still saved money: a Haswell model would be around $2500 with a 3-year warranty. And again, it seems more likely that Apple will drop the dGPU for Iris, which isn't as good for gaming.

Nope. I checked again to make sure, and it is the 2.3GHz version. Was I ripped off? Should I go back and say something? What's the difference between the 2.3/2.6 2012 models and the 2.4/2.7/2.8s? Is the only difference the clock speed? It is beautiful, though! I'm loving the trackpad, especially the multi-finger touch functions!
 
Intel has always been saying things like:

2011: "If you want good performance, wait for Ivy Bridge."

2012: "If you want good performance, Haswell will knock your socks off."

2013: "If you care about GPU performance, wait for Broadwell."

----------



But someone will have to come forward and bring OpenCL up to the level of CUDA. That hasn't happened yet, but I'm glad there are companies like Apple pushing for it.

There will always be PR spin saying the next thing will be so much better (then again, they never actually say that; they want you to buy something now). However, if you read a roadmap, as well as some leaks, you can learn to identify clear distinctions between chips. For example, Haswell has been known to be HUGE for battery life for months. If someone wanted a laptop and needed long battery life, I would have told them to wait for it (and the new Haswell MacBook Airs prove it was worth the wait). Broadwell is similar with graphics performance. Intel will be completely redoing the architecture of its execution units, which hasn't been done since Clarkdale in 2010, and combined with a 14nm process we should see a huge performance boost compared to even the Intel HD 5200 we know will show up in this year's Retina MacBook Pro (and when the numbers come out, everyone will call that one a sidegrade at best, maybe a minor bump compared to the GT 650M).

Only wait if you know what you are waiting for.

----------

While this is probably true, I really have a hard time believing that Apple would opt for less power and performance. Relying strictly on an iGPU would hobble FCS3 (including Motion and FCP 7), Adobe After Effects, and even the new Logic X, which I'm under the impression relies on a dGPU for its plug-in structure.

I mean, it's possible they'd do this. But they'd literally be cutting adrift almost all Adobe users, everyone on FCP X and Logic X, and everyone stranded on FCS3 (for work reasons). These are all programs designed to utilize the dGPU, and in Adobe's case, specifically the NVIDIA GPU. Apple has already lost a lot of market share in the post-production editorial world; do they really want to just 'kill the baby' now?

Is it possible that the new MacBook Pro will have two separate GPUs like the current model does? That would make more sense to me. Walking away from an Nvidia dGPU seems suicidal at this point. Historically, when Apple makes huge moves like that, they alert third-party developers with plenty of time, don't they?

I don't know... I just can't see Tim Cook onstage introducing a MacBook Pro that's slower than the last version. I could be wrong though. Apple's been pretty weird the last two years.

Considering the 2011 models used ATI GPUs, Apple is perfectly fine alienating people who rely on CUDA. Apple doesn't want to be beholden to Nvidia chips, simple as that.

I wouldn't be surprised if there isn't a dedicated GPU in this year's rMBP. Cutting that chip removes a lot of the added cooling needed to dissipate a maximum of 90W of heat from the system (45W CPU + 45W GPU). It also likely means even MORE room for battery, on top of Haswell's CPU improvements.
 
All this angst and teeth-gnashing is pointless.

Not when you're up against a wall like me, needing a new MacBook Pro in the weeks before a new hardware transition. And it gets worse if they go all in on an iGPU and the majority of my professional video editing and graphics software is made 'incompatible' in a finger snap.

NO ONE except Apple knows precisely what chip configuration will be in the coming MacBook Pro.

Exactly. Secrecy has proven itself a powerful sales tool for Apple, but sometimes it can be a real problem, especially in the professional market. We're not talking about a prosumer setup at home, but a multi-seat, collaborative workflow that gets threatened every time Apple makes a hard left or right in their hardware. Why do you think they announced the Mac Pro six months before it will be available? Because any pro shop will have to plan for these machines and start prepping for them in every way: hardware, software, and workflow. I don't think it's too much to ask that they treat the MacBook Pro the same way.

(Apple's continued success leads me to believe that those running the company are not as clueless as some here are suggesting.)

Well, in the consumer space. They seem hellbent on destroying their professional market (the one that kept Apple afloat in the '90s and early 2000s) and have left a wake of bad feelings behind them. In my business I don't know anyone who's as positive about Apple as they were five years ago. Pretty much everybody feels burned by the company after the X-Serve and FCP X debacles, and more importantly, by their investment in Apple. Most people are unaware that Apple had sales teams going around Hollywood pushing FCP three years ago, trying to get their foot in the door. There is a real and strongly emotional sense of betrayal there.

Look, Apple can continue with their 'we know what's best for you' approach. They just shouldn't be surprised if their 'take it or leave it' attitude makes some professionals leave it.
 
Not when you're up against a wall like me, needing a new MacBook Pro in the weeks before a new hardware transition. And it gets worse if they go all in on an iGPU and the majority of my professional video editing and graphics software is made 'incompatible' in a finger snap.

If you're up against a wall like you are and Apple weren't going to do an upgrade, then what would you do? If you wait for everything, then sooner or later you'll end up never buying anything.
 
Not when you're up against a wall like me, needing a new MacBook Pro in the weeks before a new hardware transition.

The big improvement coming is in battery life, not performance. So, if you don't care that much about battery life, go with a current model, or even an earlier used model.
 
The big improvement coming is in battery life, not performance. So, if you don't care that much about battery life, go with a current model, or even an earlier used model.

Be very careful with the "battery life" expectations.

If the TDP of a new CPU is much lower than the old one's, you can reasonably expect correspondingly better battery life.

If the new TDP is about the same as the old TDP, you should assume that YMMV (Your Mileage May Vary).

Many of the power saving improvements in the last few generations of CPUs have focused on improving the power consumption when the system is idle or lightly used.

While this is very important, and for light users most important, power users (those that keep CPUs busy) won't see the same improvements.
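
To make that concrete, here's a rough back-of-the-envelope sketch (a minimal Python example; the battery capacity and power draws are illustrative assumptions, not measurements): runtime is roughly battery energy divided by average draw, so halving idle power doubles light-use life while leaving full-load life untouched.

```python
def battery_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Rough runtime estimate: energy capacity divided by average draw."""
    return battery_wh / avg_draw_w

BATTERY_WH = 95.0  # assumed capacity, in the ballpark of a 15" rMBP pack

# Hypothetical scenario: a new chip halves light-use power (10 W -> 5 W)
# but full-load system power stays the same (~85 W for CPU + GPU + rest).
scenarios = [("light use", 10.0, 5.0), ("full load", 85.0, 85.0)]
for label, old_w, new_w in scenarios:
    print(f"{label}: {battery_hours(BATTERY_WH, old_w):.1f} h -> "
          f"{battery_hours(BATTERY_WH, new_w):.1f} h")

# light use: 9.5 h -> 19.0 h   (big improvement)
# full load: 1.1 h -> 1.1 h    (no change: the YMMV case above)
```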
 
Be very careful with the "battery life" expectations.

If the TDP of a new CPU is much lower than the old one's, you can reasonably expect correspondingly better battery life.

If the new TDP is about the same as the old TDP, you should assume that YMMV (Your Mileage May Vary).

Many of the power saving improvements in the last few generations of CPUs have focused on improving the power consumption when the system is idle or lightly used.

While this is very important, and for light users most important, power users (those that keep CPUs busy) won't see the same improvements.

Agree 100%. My unstated assumption is that most users have a relatively low duty cycle, and that is the group Apple is aiming to please with the Air and now the MBP: thin, light, long battery life, fast when you need it (for a little while). Power users who have a high duty cycle will, I think, generally be underwhelmed.
 
Many of the power saving improvements in the last few generations of CPUs have focused on improving the power consumption when the system is idle or lightly used.

While this is very important, and for light users most important, power users (those that keep CPUs busy) won't see the same improvements.

Let's talk about that "lightly used" part for a moment, along with battery life. As cores/threads and available memory have increased (e.g., on my early 2011 MBP with ML), I have gotten into the habit of leaving lots of apps running and (idle) browser tabs open. CPU usage remains very light, usually 6-9%, and yet battery life is far worse than expected based on that (usually less than one hyperthread on one core's worth of usage). And the fan sometimes spins way up. Does anyone know why this is? Certainly there are a lot of tabs going in the browsers, but I think it might be more than that. I assume the "idle" cores are getting turned on all the time. There is probably a good reason for it; I would just like to be able to predict and/or measure whatever it is, since sometimes I would actually like long battery life.
 
The complaints about iGPU vs. dGPU had me thinking, and I googled it a little.

Now, I am not doing 3D animation or HD video, and I don't really know how relevant this is, but according to the PassMark bench test, the iGPU in my new MacBook Air (Intel HD 5000) outperforms the dGPU in my 2009 Mac Pro (Nvidia GT 120):

http://www.videocardbenchmark.net/gpu.php?gpu=Intel+HD+5000&id=2552
http://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GT+120&id=1402

They score 562 (HD 5000) and 311 (GT 120) respectively. For an audio dude like me, I'd say there is little to no reason to want a dGPU, no?
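
Just doing the division on those two scores (as quoted from the linked pages):

```python
# PassMark G3D scores as quoted above.
hd5000, gt120 = 562, 311
print(f"HD 5000 vs GT 120: {hd5000 / gt120:.1f}x")  # ~1.8x the score
```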

As for the heavy-lifters, shouldn't you be on a desktop machine anyway?
 
:D Intel has the best PC processors; it wouldn't have made any sense to use any other company's processors. Really excited.
 