Status
Not open for further replies.
eGPUs are massive, and not portable at all. Some people have to bring their laptops other places besides their desk.

Well, it's certainly a question of what you think is massive, but to my mind the eGPU solution for the Razer is not something I'd consider "massive":
[Image: Razer eGPU enclosure]


It might not be for everyone, but for on the road I'd say that you'd manage with the 580. Of course, if you're a video professional you'd need to bring the eGPU with you, but getting a bag to carry that together with the laptop would not be a big deal, and probably nothing that would happen very often. If you're a regular traveling video professional, I think you have a lot of other heavy duty equipment that is more of a burden.

Can't really think of a situation where the loss of the dGPU would be a constraint. Working from a café I don't think you'd do very much (professional) video work. At home or at the office an eGPU would be better and with tenfold performance compared to the dGPU.
 
In mainland Europe, for example, MBP prices have already been raised because of earlier currency effects. It doesn't surprise me that they're considering lowering them again.
I would appreciate it if you were less antagonistic whenever someone shares new info. You'll have to decide for yourself what to believe and what not. I can't share my sources.

Thing is, even today Apple itself isn't sure which way to go with the next MBP. There's internal discussion on a lot of the things you're covering here, from competitiveness to i/d/eGPU trade-offs. They have different teams working on different options right now, which explains why multiple contradictory streams of rumors persist. When the process is streamlined and no contradictions reach this forum, that's when you know to tune in.
Cheers
 
Are you saying that even 13" rMBPs with iGPUs can get dGPU-level performance from an external GPU unit? What kind of prices are we talking about, if true?

I think the 13" should stay with iGPU to keep the price low since this is still the low end option entry option (hope they phase out the non retina MacBook Pro), although there are companies offering 13" laptops with gtx960s. The razer blade 14" has a gtx 970 and it sure can game, but off course when the fans kick in it is incredibly loud, but in such a small package there was no other way.
 
I think some people have no need for a dGPU, and that's fine, which is why Apple offers both options. But there is a reason dGPUs exist, there is a market for them, and whether it's AMD or Nvidia (I personally favor Nvidia), I think they have their place in a pro laptop.

Whatever Intel says (and has been saying about all of their iGPUs), they will never match dGPUs, especially high-end parts like the GTX 970 and Radeon R9 390. Maybe something like a 940M, and even then the 940M will most likely still be more powerful. Some people might say they don't notice the dGPU's advantage in day-to-day tasks like web surfing, but boot up the occasional World of Warcraft or Dota 2 and it'll definitely be obvious! And while gaming is not Apple's biggest market, a look at all the YouTube videos benchmarking brand-new Macs at every refresh shows there is still a market for it, and people want to know about that performance before buying. If you're going to offer the full package of awesome laptop design, incredible screen, great CPU performance, and best-in-class battery life, you'd better include the dGPU, especially at the price they are asking!

I'm personally awaiting a refresh for Thunderbolt 3/external GPU but I'll still get the dGPU for on-the-go performance.
Intel says its iGPUs are currently faster than 80% of the GPUs out there. And they may have a point. The HD 540 already performs at GT 940M level. What about the Iris Pro HD 580?
60% faster than that? That puts it between the R9 M370X and the GTX 950M. And the GTX 950M is a desktop GTX 750 Ti.
What about the Kaby Lake Iris Pro, which brings not only a new architecture but also a new memory controller for the iGPU with dual-channel eDRAM, which should provide 100 GB/s of bandwidth? 50% faster than the HD 580? No. First rumors say it might be 80% faster than the HD 580.
And the Cannonlake Iris Pro will be 50-60% faster than Kaby Lake. Which lands right at GTX 970 performance levels.
Intel delivers more and more. You should have learned this lesson by now: they have the hardware, and the design, to make it really powerful.
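Taking those claimed multipliers at face value, the compounding works out like this. A quick sketch; every figure below is a rumor quoted in this post, not a measured benchmark, and the 940M baseline is the poster's own claim:

```python
# Compound the rumored generation-over-generation gains quoted above.
# Baseline: HD 540 ~ GT 940M performance, normalized to 1.0.
# Every multiplier here is a rumored figure, not a measured benchmark.
hd540 = 1.0                    # claimed ~GT 940M level
hd580 = hd540 * 1.6            # Iris Pro HD 580: "60% faster"
kaby_lake = hd580 * 1.8        # Kaby Lake Iris Pro: "80% faster than HD580"
cannonlake = kaby_lake * 1.55  # Cannonlake: "50-60% faster", midpoint used

print(f"HD 580:     {hd580:.2f}x the 940M baseline")
print(f"Kaby Lake:  {kaby_lake:.2f}x")
print(f"Cannonlake: {cannonlake:.2f}x")
```

Compounded, the rumored chain puts Cannonlake around 4.5x the 940M baseline, which is in the same ballpark as the raw-throughput gap between a 940M and a GTX 970, so the arithmetic behind the claim is at least internally consistent.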
 
There is no way that is possible. I don't know if you could show me some benchmarks to back up your claims, but Intel iGPUs won't match those GPUs for years!
Why do you think so?

From a technical perspective Intel is in a favourable position for mobile, other things being equal (TDP, process node, technological advances), because, well, they are right there in the CPU while a dGPU is an external unit. Why do you think SoCs are so popular in mobile nowadays, and why has the repairability and extendability of MacBooks sat at 1 out of 10 on iFixit for years?

If Intel goes all in, it may very well create an 80W CPU with a huge die and a large share of the TDP dedicated to the iGPU, and it would very well outperform any dGPU suitable for mobile. It's just that mobility dictates a lower TDP, not a higher one.
Whatever Intel says (and has been saying about all of their iGPUs) they will never match dGPUs, especially the high end like the GTX 970 and Radeon R9 390.
First, it depends on whether they can't or simply don't want to (yet). I favour the latter.
Second, comparing 145W and 275W GPUs with an iGPU is out of the question. You may say that an Intel iGPU will never outperform them. I may say that there is no way you can put them in a mobile machine without something like a liquid-cooling dock or noisy fans anyway. Technically we're both correct. But practically there is little sense in this comparison.
Are you saying that even 13" rMBPs with iGPUs can get dGPU-level performance from an external GPU unit? What kind of prices are we talking about, if true?
Figure on something like an empty eGPU box for $400-600, plus the cost of the GPU. It may get cheaper if a lot of companies sell them and competition drives the price down, because from a parts perspective they should cost more like $200-250, IMHO.
Can't really think of a situation where the loss of the dGPU would be a constraint. Working from a café I don't think you'd do very much (professional) video work. At home or at the office an eGPU would be better and with tenfold performance compared to the dGPU.
It will be inconvenient if you edit video both at home and at work and play games at home. That way you either have to own two eGPUs or carry one between home and work every day. Still, I'd rather have two eGPUs than waste nearly half the TDP on a dGPU that is only slightly faster than the iGPU.
In mainland Europe, for example, MBP prices have already been raised because of earlier currency effects. It doesn't surprise me that they're considering lowering them again.
Unfortunately, Apple officially said on their last investor relations call that they will counter exchange-rate problems with higher prices, and if profits fall because fewer devices are bought, they hope to make up the difference with newer categories of devices rather than by lowering prices. That was sad to hear in a region whose currency is not getting stronger against the dollar.
 
The 2010 13" had an NVidia 320M DGPU. That was done because the only Arrandale CPU with an iGPU was the 45W part. The next generation Sandybridge chips onwards, Apple abandoned the DGPU in favor of the HD graphics.

The classification of integrated vs discrete GPU on the basis of system RAM sharing is a questionable one, and not something with universal agreement. Strictly speaking, anything that's not on die is not integrated, though you can argue differently.

The 320m was not a dedicated GPU. It was integrated and shared system RAM. It's not really up for debate what "integrated GPU" means since the exact definition is a GPU that doesn't have its own RAM. (https://en.wikipedia.org/wiki/Graphics_processing_unit#Integrated_graphics_solutions)

The only difference compared with Intel's integrated GPUs is that it wasn't on the same die as the CPU; it was a custom Nvidia chipset that Apple used. They did the same thing with the 9400M the previous year in the 13". The only reason Apple stopped using Nvidia integrated GPUs is that Intel changed its licensing requirements, making it impossible for Nvidia to develop third-party chipsets that integrated their own GPUs.

There were Arrandale chips suitable for the 13", but Apple did not use them because the GPU performance was so poor. So they stuck with the Core 2 Duo in the 13" with a custom GPU solution, and only updated the 15" to Arrandale. Notice how all 15" MBPs came with dedicated graphics in 2010 because of the extremely poor Intel offering.
 
Intel says its iGPUs are currently faster than 80% of the GPUs out there. And they may have a point. The HD 540 already performs at GT 940M level. What about the Iris Pro HD 580?
60% faster than that? That puts it between the R9 M370X and the GTX 950M. And the GTX 950M is a desktop GTX 750 Ti.
What about the Kaby Lake Iris Pro, which brings not only a new architecture but also a new memory controller for the iGPU with dual-channel eDRAM, which should provide 100 GB/s of bandwidth? 50% faster than the HD 580? No. First rumors say it might be 80% faster than the HD 580.
And the Cannonlake Iris Pro will be 50-60% faster than Kaby Lake. Which lands right at GTX 970 performance levels.
Intel delivers more and more. You should have learned this lesson by now: they have the hardware, and the design, to make it really powerful.
And how far off is Cannonlake? You don't think that Nvidia or AMD would have faster mobile chips by then?
 
And how far off is Cannonlake? You don't think that Nvidia or AMD would have faster mobile chips by then?
Oh, they totally will. And it will always be the case that a dedicated GPU with dedicated thermal headroom will beat an integrated solution in a similar TDP window.

But still, there's a strong element of "good enough" involved here. And Intel is very rapidly approaching it.
 
And how far off is Cannonlake? You don't think that Nvidia or AMD would have faster mobile chips by then?
The question is not how fast high-end mobile chips will be, but how fast they will be in a thermally constrained environment. How much faster will 20-35W GPUs be than an iGPU? Would they be a viable option then? Cost of silicon, cost of the GPU, cost of power in a laptop: that's what you have to account for here.

Cannonlake chips will come in 2017. 14nm GPUs will stay with us for much longer than 28nm lived; 5-6 years is not impossible. TSMC will have 10nm in 2017, but it won't be viable for mass production of any consumer chips because of the price of silicon wafers ($300,000 per wafer). Of course, that applies to the first batches of a fab process; prices will come down eventually.

That is why integrated GPUs will be the go-to solution really soon. I do not know why people are so negative about them. In fact, you will get the performance of the iGPU AND the performance of an external GPU like AMD Vega 10. What could possibly be wrong with that?

I am not questioning that the Polaris GPUs will be extremely fast and efficient. The question is: how fast, and for how long? The biggest leaps in GPU performance were made not by Nvidia and AMD, but by Intel itself. And they will not stop this growth.
 
Well, it's certainly a question of what you think is massive, but to my mind the eGPU solution for the Razer is not something I'd consider "massive":
[Image: Razer eGPU enclosure]


It might not be for everyone, but for on the road I'd say that you'd manage with the 580. Of course, if you're a video professional you'd need to bring the eGPU with you, but getting a bag to carry that together with the laptop would not be a big deal, and probably nothing that would happen very often. If you're a regular traveling video professional, I think you have a lot of other heavy duty equipment that is more of a burden.

Can't really think of a situation where the loss of the dGPU would be a constraint. Working from a café I don't think you'd do very much (professional) video work. At home or at the office an eGPU would be better and with tenfold performance compared to the dGPU.

Why not just carry an iMac or Mac Pro and monitor with you? It weighs about the same as that combo.
 
Why not just carry an iMac or Mac Pro and monitor with you? It weighs about the same as that combo.

Can you use the iMac or Mac Pro unplugged? The Blade Stealth has a battery...

I don't get the logic of people who say "why get the Stealth / external GPU if you have a desktop"...
 
I genuinely suggest waiting to see what manufacturers bring to the table with eGPUs ;).

There will be a lot of surprises ;).
 
The question is not how fast high-end mobile chips will be, but how fast they will be in a thermally constrained environment. How much faster will 20-35W GPUs be than an iGPU? Would they be a viable option then? Cost of silicon, cost of the GPU, cost of power in a laptop: that's what you have to account for here.

Cannonlake chips will come in 2017. 14nm GPUs will stay with us for much longer than 28nm lived; 5-6 years is not impossible. TSMC will have 10nm in 2017, but it won't be viable for mass production of any consumer chips because of the price of silicon wafers ($300,000 per wafer). Of course, that applies to the first batches of a fab process; prices will come down eventually.

That is why integrated GPUs will be the go-to solution really soon. I do not know why people are so negative about them. In fact, you will get the performance of the iGPU AND the performance of an external GPU like AMD Vega 10. What could possibly be wrong with that?

I am not questioning that the Polaris GPUs will be extremely fast and efficient. The question is: how fast, and for how long? The biggest leaps in GPU performance were made not by Nvidia and AMD, but by Intel itself. And they will not stop this growth.
28nm was the longest-lived process in modern GPU history, but 2016 brings 16nm and 14nm for big GPUs, and they will increase performance by about 1.6x. The 14nm Iris Pro with eDRAM is at the performance level of the 940M (28nm) while using much less power, but what is the 940M? It's a 64-bit-bus part with prehistoric GDDR3, a piece of **** graphics chip that can't even hold modern games at medium details :) For gamers discrete graphics are a must, and I don't know why Apple uses such an expensive CPU with this Iris eDRAM if these GPUs are still useless.
 
Does the eGPU?

This.

Yeah, it doesn't have a battery, but the laptop has a keyboard, a trackpad, and the convenience of just plugging in the eGPU to get more graphics power... and you don't have to transfer data back and forth. Remove the eGPU and it still works as a laptop.
 
Yeah, it doesn't have a battery, but the laptop has a keyboard, a trackpad, and the convenience of just plugging in the eGPU to get more graphics power... and you don't have to transfer data back and forth. Remove the eGPU and it still works as a laptop.

And if you know you're going to game, leave the power adapter at home and just bring the eGPU, since most of the TB3 models will pass through power. I'm sure there will be more compact enclosures for smaller cards. We can't look at the Razer box, an enthusiast solution with a large footprint and lots of neon lights, as the design standard for eGPUs. It's designed to look "cool". Something like the Akitio box would be better.
 
28nm was the longest-lived process in modern GPU history, but 2016 brings 16nm and 14nm for big GPUs, and they will increase performance by about 1.6x. The 14nm Iris Pro with eDRAM is at the performance level of the 940M (28nm) while using much less power, but what is the 940M? It's a 64-bit-bus part with prehistoric GDDR3, a piece of **** graphics chip that can't even hold modern games at medium details :) For gamers discrete graphics are a must, and I don't know why Apple uses such an expensive CPU with this Iris eDRAM if these GPUs are still useless.
No. The HD 540 (from the 15W MacBook Air CPU) is on par with the GT 940M. The HD 580, which is Iris Pro, will be 60-70% faster than that. That puts it between the R9 M370X and the GTX 950M.
 
No. The HD 540 (from the 15W MacBook Air CPU) is on par with the GT 940M. The HD 580, which is Iris Pro, will be 60-70% faster than that. That puts it between the R9 M370X and the GTX 950M.
Give me a link to some benchmark, because on paper the Iris 580 is about 1.5x the Iris Pro 540, so you're predicting it scales better than linearly? And the 950M is 1.9x the 940M in the 3DMark 11 benchmark. My prediction for the Iris 580: 1.3x a piece of ****, still useless.
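The disagreement between the two posts lines up numerically like this. A quick sketch using only the figures claimed in this exchange, none of which are verified benchmarks:

```python
# Compare the two scaling claims from this exchange, with GT 940M = 1.0.
# All figures come from the posts above, not from measured results.
gt940m = 1.0
iris_540 = 1.0                        # claimed on par with the 940M
gtx950m = 1.9 * gt940m                # "950M is 1.9x the 940M in 3DMark 11"
iris580_on_paper = 1.5 * iris_540     # on-paper spec scaling claim
iris580_optimistic = 1.65 * iris_540  # "60-70% faster", midpoint

# On-paper scaling leaves the Iris 580 below the 950M; even the
# optimistic claim only narrows the gap rather than closing it.
print(iris580_on_paper < gtx950m, iris580_optimistic < gtx950m)
```

Either way, both sets of claimed numbers put the Iris 580 below the GTX 950M, which is consistent with the earlier "between R9 M370X and GTX 950M" estimate; the dispute is really about how far below.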
 
After much thought, I don't think the 15" rMBPs will see an update until June.

In March, I expect the rMB to receive a silent update to Skylake and maybe also the 13" rMBP, though I find this less plausible. I also expect a large amount of whining about the injustice of Apple's decision.

Then in June I expect a full realignment in Apple's laptop line:
--MBAs discontinued
--13" rMBPs discontinued
--15" rMBPs discontinued
--rMB line with 12" and 14" models (iGPUs only)
--rMBP line with 14" and 16" models. New form factor. TB3, USB 3. (OLED screens? I'm not even sure that's possible.) Introduced as a workstation for professionals ("we did not forget those who got us here" mantra). Xeon processors. GT4e on the 14"; dGPU for the 16". The wait until June will become clear when Apple introduces Polaris (or Pascal) dGPUs at 14nm (or 16nm). The transition from Haswell CPUs with GT3e to Xeon CPUs with GT4e has already been proposed by Intel. The Xeon is really just a Skylake-H with GT4e and a few other instructions enabled.

Apple may simply converge the two lines but I think they will continue to segment their machines between consumer and prosumer/professional lines.

My only concern is whether the glitch in the Skylake CPUs which causes them to crash under heavy workloads will affect these new machines.
 
We can't look at the Razer box, an enthusiast solution with a large footprint and lots of neon lights

Sure, you could ditch the neon lights, but the footprint won't get much smaller than that. It's probably quite similar to the Akitio in size.
 