
mcarling

macrumors 65816
Oct 22, 2009
1,292
180
So you think Apple will offer nothing higher than their current $1999 model this year, and that Iris Pro by itself will drive a 4K+ workflow fine for years to come? One that will likely include external monitors of equally high native resolution and increasingly large media files.

I don't know if you even understand what performance means, but simply driving a display is not enough. If this 2015 IGP had shipped back in 2013, it'd be a different story. But going into 2016 and beyond? Demanding users would be very disappointed.

I believe that Intel integrated graphics will continue to improve faster than discrete graphics, and it's easy to understand why. Each time the transistor budget doubles, a discrete GPU can only get twice as many transistors. The integrated GPU gets more than twice as many, because Intel allocate less than double to the CPU side of the combined CPU/GPU die, leaving more than double available for the GPU. It's been this way for several years already and will continue until Intel have killed off the discrete GPU business, first in laptops and then in desktops.
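To make the arithmetic behind that claim concrete, here is a minimal sketch of the argument; the starting transistor split and the 1.6x CPU growth factor are made-up illustrative numbers, not real die statistics for any Intel or Nvidia part:

```python
# Back-of-envelope sketch of the "integrated GPU gains share" argument.
# All transistor counts here are invented for illustration only.

def next_generation(cpu, igpu, cpu_growth=1.6):
    """Double the total transistor budget, grow the CPU side by less than 2x,
    and give whatever is left over to the integrated GPU."""
    total_next = 2 * (cpu + igpu)
    cpu_next = cpu_growth * cpu          # < 2x goes to the CPU cores
    igpu_next = total_next - cpu_next    # > 2x is left for the iGPU
    return cpu_next, igpu_next

cpu, igpu = 1.0, 0.5   # arbitrary starting split (in billions of transistors)
dgpu = 1.5             # a discrete GPU starting from the same total budget

for gen in range(1, 4):
    cpu, igpu = next_generation(cpu, igpu)
    dgpu *= 2          # the discrete GPU can "only" double each generation
    print(f"gen {gen}: iGPU share of die = {igpu / (cpu + igpu):.0%}, "
          f"iGPU vs dGPU transistors = {igpu / dgpu:.2f}")
```

Under these assumptions the integrated GPU's share of the die keeps rising and its transistor count closes in on the discrete part, which is the trend being described; whether it ever overtakes depends entirely on the numbers you plug in.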
 

plastictoy

macrumors member
Jan 20, 2014
59
0
a discrete GPU can only get twice as many transistors. The integrated GPU gets more than twice as many, because Intel allocate less than double to the CPU side of the combined CPU/GPU die, leaving more than double available for the GPU

It's really hard to parse your words. Do you mean Intel allocates fewer transistors to their CPUs and thus increases the space available for GPUs? I don't know the technicals but so far, it's been mainly the same number of EU cores with architecture changes in Broadwell. They do allocate a lot of physical space for the GPU but it hasn't been a radical improvement year after year. And to do the job properly, there is no viable on-chip solution that would justify the increased cost and heat output.

Without eDRAM, Intel's graphics doesn't scale up all that well. I'm not going to rewrite what I've already explained several times but the simple conclusion is there's no way Intel can match what AMD or Nvidia has in the pipelines. Not on Skylake or Cannonlake.

If your argument is relying on the "good enough" rationale, then you clearly are missing the whole point. Why didn't Apple settle for the D300s in the Mac Pro instead of offering higher-end cards? Partly it's more money, but mostly those upgraded cards are a huge boost in performance. You might not need it, heck, 99% of the consumer base won't, but for that 1% or whatever percentage, a faster workflow depends on the GPU. The same reasoning applies to the iMacs and everything else offered beyond the stock IGP configurations. Why offer discrete at all? This is largely a rhetorical question because the answer is clear: performance. Intel is good, but it's not offering a reassuring replacement for content creators anytime soon.

What you believe about Intel is irrelevant. Time after time, manufacturers still offer discrete options because they know their target market will require more. More in driver support, hardware acceleration, or multi-display output, from a part made solely to do one main task well.
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
The biggest indication that Apple wants to drop dGPUs from all Apple notebooks is the simple fact that they didn't upgrade to the 850M, which has been out for months and is about twice as fast as the really bad (by today's standards) 750M.

By the way, since it has come up: 4K support is already a given on all Intel GPUs of the current generation. A lot of resolution doesn't require that much performance just for the pixels; ROP performance matters most there, and it is actually really good on the Iris Pro. For high resolution alone, the 750M has no reason to exist in those notebooks. It is in other areas, shader performance and drivers, where it has an edge, but support for 4K external screens is no better or worse with it.
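To put rough numbers on "just the pixels", here is a quick back-of-envelope calculation; the Iris Pro fill-rate figure is an approximate theoretical peak used only for scale, not a measured value:

```python
# How much raw fill rate does merely driving a 4K desktop need?

width, height, refresh_hz = 3840, 2160, 60
pixels_per_second = width * height * refresh_hz        # ~0.5 Gpixel/s

# Iris Pro 5200 is on the order of ~10 Gpixel/s theoretical peak fill rate
# (approximate; real-world throughput is lower).
assumed_iris_pro_fill_rate = 10e9

print(f"4K at 60 Hz needs about {pixels_per_second / 1e9:.2f} Gpixel/s")
print(f"That is roughly {pixels_per_second / assumed_iris_pro_fill_rate:.0%} "
      f"of the assumed peak, so resolution alone is not the bottleneck")
```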

Apple wants to kill off dGPUs, so much is certain. The question is whether they dare do it when the Maxwell GPUs are so much better now and Intel's next generation is not expected to make an equal jump in efficiency to keep up.
Integrated GPUs are fast enough if anybody considers the 750M fast enough. Anyone who still says Iris Pro is only good enough for low expectations must, to be consistent, also complain that the 750M is a crap GPU and want notebooks with much bigger GPUs, because the difference between the two is small.
 

plastictoy

macrumors member
Jan 20, 2014
59
0
It's inevitable the 750M will go, but to expect Iris Pro to fully fill the void it leaves is absurd. So what is the end result, 2013 performance in a 2015/2016 model at lower power consumption? Nothing more? It'll sell to those who simply buy the Pro line for that social badge but it doesn't sound like any model cut out for future workloads. Real workloads for those who push all the cores and pixels.

Processors don't change workflows all that much if chosen correctly. New-generation IPC gains are only 10-15%, ignorable seconds and minutes here and there. Memory is locked. But GPUs do show their age fast, so it'd be unfortunate to put down $2000+ on a machine meant for a different, earlier time.
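To put a rough number on "ignorable seconds and minutes", a trivial sketch; the task durations are invented for illustration and the 10-15% uplift is the figure claimed above, not a measurement:

```python
# What a 10-15% IPC gain buys you on a few made-up task lengths.

tasks_seconds = {"project compile": 90, "RAW batch export": 300, "video encode": 3600}

for uplift in (0.10, 0.15):
    print(f"--- assuming a {uplift:.0%} IPC gain ---")
    for name, seconds in tasks_seconds.items():
        new_time = seconds / (1 + uplift)
        print(f"{name}: {seconds} s -> {new_time:.0f} s "
              f"(saves {seconds - new_time:.0f} s)")
```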

I don't think Apple wants to kill dedicated GPUs as much as they want to eliminate another potential point of failure. Bad history with them, bad for battery life, heat production, three points to engineer neurotically over. But there's no way around the performance hurdle without the aid of Nvidia/AMD.
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
Processors don't change workflows all that much if chosen correctly. New-generation IPC gains are only 10-15%, ignorable seconds and minutes here and there. Memory is locked. But GPUs do show their age fast, so it'd be unfortunate to put down $2000+ on a machine meant for a different, earlier time.
For the majority of people the processor is all that matters. The GPU is only ever really needed for the game on the side. There are far more writers, teachers, programmers, engineers (those who don't need CAD, which would be better served by a non-Apple machine anyway) and musicians than people who work in media production that requires GPU speed. One can also ignore almost all the people who need GPUs for Photoshop: for 90% of the specific things that are accelerated, you don't need a huge GPU, just a GPU, lots of operations depend only on the CPU anyway, and most people don't work with files big enough to really push it.
Just subtract all those who rely on the CPU and enough RAM and you will see the ones left over are a minuscule number. Maybe a vocal minority, but still a small minority.
I am a programmer, I like OS X, but I only ever need a GPU for the game on the side.

The thing is that most people don't need the strong GPUs, and Intel's can serve the rest just fine. It is the same discussion as the 17" MBP. Sure, some would love it, but in a survey of all Mac buyers they are in the single digits, if that. They just don't matter to Apple. In a way it's like the Mac Pro, which is more for prestige than anything else.
I think Apple should bring back the 17" notebook, put a serious dGPU in it like a 965M, and then they can kill the dGPU in the 15".
The 15" with its pathetic 750M doesn't even serve prestige. Apple used to bring updates like the 6770 from the 6750 in the Christmas season. This time nothing, even though the 850M would have been a huge upgrade and was out all summer long. More than enough time, but nothing. The next Iris Pro will be slightly faster than the 750M, which makes it easier for the marketing department to explain the lack of a dGPU next iteration. It is the only reason I can see for why they still haven't bothered with the 850M.
 

plastictoy

macrumors member
Jan 20, 2014
59
0
It's like talking to a brick wall.

Can you ever get past the specious argument of "good enough for most"? That isn't the issue. I'm sure the majority will be buying the 13" model of either the Air or the Pro, not dropping money on the 15", and certainly not on the $2500 high-end configuration. But for those who are, Apple would be stupid not to toss in a dedicated option. Look online and notice the two different configurations sold: one for people who don't need much, the other a basically outdated top-end model which might yet be updated with something better. What Apple chooses not to update changes nothing about real-world demands. As a programmer, I'm surprised you understand none of this, but then again anyone can claim to be anything online.

It's not the same discussion as the 17". That machine was physically larger, affecting portability and usage. It's a different form factor, unlike the 15" with simply internal modifications and the corresponding thermal limits. Now that most people expecting a larger replacement have moved down to the next-fastest mobile machine, there is a need to keep its performance up.

These Pro machines are not supposed to be gimmicks, although what you're proposing fits the goal of turning them into petty badges of honor. Plenty of professionals max out the Mac Pro, plenty use the one mobile machine that offers a dedicated GPU option right now, and almost every time a conscientious buyer picks the 15", they'll ask the question: discrete or integrated? Because anyone who knows what they might be doing (hint: it's not the casual bunch you mentioned squandering money) understands the importance of a dedicated GPU. And they will notice the removal of one, no matter how much Apple tries to spin it.

EDIT: Tell me, why'd you opt for the discrete? The GT3e Iris Pro certainly works fine enough, as sanctioned by Apple, so what's the upgrade for?
 
Last edited:

mcarling

macrumors 65816
Oct 22, 2009
1,292
180
It's really hard to parse your words. Do you mean Intel allocates fewer transistors to their CPUs and thus increases the space available for GPUs?
If you prefer to discuss this in terms of space rather than number of transistors, that's fine. The portion of the total die area occupied by the integrated GPU has been increasing and will continue to increase until the discrete GPU market is dead. For the mobile segment, that is just a few more years away. For the desktop segment, the discrete GPU market will struggle on for several more years.

I don't know the technicals but so far, it's been mainly the same number of EU cores with architecture changes in Broadwell. They do allocate a lot of physical space for the GPU but it hasn't been a radical improvement year after year.
Intel have incrementally increased the percentage of die area allocated to the integrated GPU every year. Contrast that to discrete GPUs where the percentage of the die area allocated to the GPU remains fixed. In other ways, integrated and discrete GPUs have been advancing at similar rates, but the increasing die area available to the integrated graphics is helping integrated graphics to close the performance gap.

Without eDRAM, Intel's graphics doesn't scale up all that well.
<sarcasm mode on>
Well, then it's a good thing for discrete graphics that Intel's integrated graphics doesn't have eDRAM.
<sarcasm mode off>

I'm not going to rewrite what I've already explained several times but the simple conclusion is there's no way Intel can match what AMD or Nvidia has in the pipelines. Not on Skylake or Cannonlake.
Intel have no need to match what AMD or Nvidia have in the pipelines. Intel have closed the gap enough that the other advantages of integrated graphics more than compensate.

If your argument is relying on the "good enough" rationale, then you clearly are missing the whole point.
That's what discrete FPU fans were saying 25 years ago as integrated FPUs were killing off the discrete FPU market.

Partly it's more money, but mostly those upgraded cards are a huge boost in performance. You might not need it, heck, 99% of the consumer base won't, but for that 1% or whatever percentage, a faster workflow depends on the GPU.
So Apple should screw over 99% of the customers in order to please 1% of the customers? Sigh ....

What you believe about Intel is irrelevant. Time after time, manufacturers still offer discrete options because they know their target market will require more.
Exactly the same argument we heard from the discrete FPU fans 25 years ago. History proved the argument wrong then and it's again proving the argument wrong now.

It's inevitable the 750M will go, but to expect Iris Pro to fully fill the void it leaves is absurd. So what is the end result, 2013 performance in a 2015/2016 model at lower power consumption? Nothing more?
Nothing more? Integrated graphics offers much more than just adequate performance. Remember, we have to look at this from Apple's perspective. Upgrading from discrete graphics to integrated graphics (once the performance is tolerable) allows for a massive reduction in upfront cost and (in expectation) a massive reduction in warranty costs in the future. It also frees up space for other things like more DRAM chips or a smaller motherboard to accommodate a large battery.

It'll sell to those who simply buy the Pro line for that social badge but it doesn't sound like any model cut out for future workloads. Real workloads for those who push all the cores and pixels.
A high-end 15" rMBP with Iris Pro 7200 integrated graphics will outperform the 750M and sell to those with heavy workloads who want mobility.

I don't think Apple wants to kill dedicated GPUs as much as they want to eliminate another potential point of failure.
This sentence is probably your strongest argument. Killing off the discrete GPU market is a strategic priority for Intel, not Apple. Apple want to eliminate a common point of failure and eliminate an expensive part which they don't need.

Bad history with them, bad for battery life, heat production, three points to engineer neurotically over.
The best engineering solution to that set of problems is integrated graphics.

But there's no way around the performance hurdle without the aid of Nvidia/AMD.
The only performance hurdle of relevance is that each generation of MBP has graphics performance at least as good as the previous generation, and that overall performance of the MBP is better, especially with a price reduction of several hundred dollars/euros.
 

plastictoy

macrumors member
Jan 20, 2014
59
0
Are you a shill of Intel's or something? Try actually looking at their product announcements and comparing what is being offered in most devices, instead of assuming Intel is on track to some sort of graphics domination. Their chips shrink, but the GPU is taking up what could otherwise be a much smaller processor or space for additional cores. And since an SoC solution is needed for the mobile market Intel is aiming at, graphics has been given priority these recent generations while the processors have mainly been shrunk for power reasons, not performance. Performance is hampered by TDP, so these HDXXXX specs mean little when the chip bogs down under heavy use.

There's no struggle in desktops. Businesses that need performance get a Quadro for the drivers, or gamble on a GeForce or Radeon for professional applications. For high performance, they take a Tesla or Titan, or outsource the work to the cloud. Guess what cloud providers run? Solutions provided by Nvidia and/or AMD, mostly the former.

You're looking at very basic needs which, as I have reiterated several times, have already been fulfilled. Intel's real problem is getting those basic users to actually upgrade, since day-to-day performance hasn't been an issue since 45nm and the demise of AMD's processors in the enthusiast arena.

Forget it, there's no point explaining this any further.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
A high-end 15" rMBP with Iris Pro 7200 integrated graphics will outperform the 750M and sell to those with heavy workloads who want mobility.

Based on the numbers we know about the new GPUs from Intel, we can say that the Broadwell Iris Pro should be on par with the Nvidia GTX 670MX, which is a pretty nice GPU.

The HD7200 from Skylake should be 50% faster than the Broadwell one. At least 50%, because there will also be a small advantage from faster DDR4 memory.

I really don't think there will be a need for dGPUs in laptops, not a few years from now, but as soon as Skylake is out...
 

Count Blah

macrumors 68040
Jan 6, 2004
3,192
2,748
US of A
The biggest indication that Apple wants to drop dGPUs from all Apple notebooks is the simple fact that they didn't upgrade to the 850M, which has been out for months and is about twice as fast as the really bad (by today's standards) 750M.

Months?!?!? Sorry, but Apple doesn't bump dGPUs mid-year because something better came along. We're lucky they even offered a 2GB dGPU. I would have figured they'd hold out until the rest of the PC world was at 4GB or more across the board before we would have seen 2GB dGPUs.
 

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
Months?!?!? Sorry, but Apple doesn't bump dGPUs mid-year because something better came along. We're lucky they even offered a 2GB dGPU. I would have figured they'd hold out until the rest of the PC world was at 4GB or more across the board before we would have seen 2GB dGPUs.

It has been out since March 2014... so almost a year.

Apple are just lazy.
 

ixxx69

macrumors 65816
Jul 31, 2009
1,294
878
United States
Intel have no need to match what AMD or Nvidia have in the pipelines. Intel have closed the gap enough that the other advantages of integrated graphics more than compensate.
Not for a number of application areas and display demands. For the time being, Intel graphics are a great compromise for the majority, but there's still a significant market that wants better graphics than Intel can currently offer. And I need to get work done now, not in five years.

That's what discrete FPU fans were saying 25 years ago as integrated FPUs were killing off the discrete FPU market.
I don't think that's a good example because integrating the FPU into the CPU was a win-win... it's not like there was a loss in performance when the FPU was integrated.

So Apple should screw over 99% of the customers in order to please 1% of the customers? Sigh ....
Who exactly is Apple screwing over by keeping a dGPU?

Upgrading from discrete graphics to integrated graphics (once the performance is tolerable)...
Well, see, that's the key. No one would argue against Intel's integrated solution if it offered the same performance as dGPU solutions. The question is, when will that happen? When will Iris drive two 5K displays at the same time with enough performance left over to play a video game?

A high-end 15" rMBP with Iris Pro 7200 integrated graphics will outperform the 750M and sell to those with heavy workloads who want mobility.
You speak of the 750M as though it's the benchmark standard that once equaled, they're done.

The only performance hurdle of relevance is that each generation of MBP has graphics performance at least as good as the previous generation, and that overall performance of the MBP is better, especially with a price reduction of several hundred dollars/euros.
Am I understanding correctly that you're saying that GPU performance never has to get better from here on out?

It's not like we're irrationally holding onto this notion of the dGPU. You know, we'd all think it's totally awesome if Intel was able to pack all the performance of AMD/Nvidia's top GPUs into their CPU. And maybe some day that will happen... but it's not happening by Skylake. Don't get me wrong, I'm rooting for Intel... but let's check back on that around 2020.

As far as what Apple chooses to do, who knows. But they would be absolute idiots to abandon the dGPU market at this point. Once you think your only market is the "average user", it's just a race to the bottom.
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
It's like talking to a brick wall.
Sorry if it feels that way. The point I am trying to get across is that most people, regardless of the price they pay, unless they need it for a game on the side, pay for the form factor, the OS, the build quality, the screen quality, the keyboard, the touchpad, everything but the dGPU. A dGPU is a bonus, only needed for non-work-related things for these people. Not having a dGPU doesn't make it less professional; it makes it less of a toy, aka a gaming console.
I think it is valid to hope for the added utility of gaming, or having something useful for when you attend a LAN party once a year, but for professional purposes the dGPU is not relevant (for most people). It is foolish to equate professionalism with dGPUs when that only holds true for a small minority.
EDIT: Tell me, why'd you opt for the discrete? The GT3e Iris Pro certainly works fine enough, as sanctioned by Apple, so what's the upgrade for?
As I said, occasionally I do want to play a game and the notebook should handle it as well as possible. The upgrade is for joy, not work.

----------

Months?!?!? Sorry, but Apple doesn't bump dGPUs mid-year because something better came along. We're lucky they even offered a 2GB dGPU. I would have figured they'd hold out until the rest of the PC world was at 4GB or more across the board before we would have seen 2GB dGPUs.
They did in the past.
They bump the CPUs when the 100MHz mid-year speed bumps show up, like the 4850HQ to the 4860HQ.
They did bump GPUs too on such small refreshes, and the 850M is not a small bump, it is twice as fast. There should have been a refresh before Christmas shopping time. Such as this time.
 

FrozenDarkness

macrumors 68000
Mar 21, 2009
1,727
968
There will never be a day when Intel graphics will beat discrete graphics. You can't battle the laws of physics. More area means more power. That's really all there is to it.

You can only argue over whether that power is necessary, and honestly we have only hit that question with CPUs, definitely not with GPUs.
 

Count Blah

macrumors 68040
Jan 6, 2004
3,192
2,748
US of A
It has been out since March 2014... so almost a year.

Apple are just lazy.

Doesn't fit into Apple's plans - plain and simple. It's sad, but true. Apple doesn't release a new model/version every other month, like some manufacturers.

Accept it, or move on. It's one of the many things apple users have to deal with - plan accordingly.

----------

Sorry if it feels that way. The point I am trying to get across is that most people, regardless of the price they pay, unless they need it for a game on the side, pay for the form factor, the OS, the build quality, the screen quality, the keyboard, the touchpad, everything but the dGPU. A dGPU is a bonus, only needed for non-work-related things for these people. Not having a dGPU doesn't make it less professional; it makes it less of a toy, aka a gaming console.
I think it is valid to hope for the added utility of gaming, or having something useful for when you attend a LAN party once a year, but for professional purposes the dGPU is not relevant (for most people). It is foolish to equate professionalism with dGPUs when that only holds true for a small minority.

As I said, occasionally I do want to play a game and the notebook should handle it as well as possible. The upgrade is for joy, not work.

----------

They did in the past.
They bump the CPUs when the 100MHz mid-year speed bumps show up, like the 4850HQ to the 4860HQ.
They did bump GPUs too on such small refreshes, and the 850M is not a small bump, it is twice as fast. There should have been a refresh before Christmas shopping time. Such as this time.

CPUs are very much different from dGPUs. If it's the same footprint and instruction set, there is little to nothing in the way of making the switch. But for a GPU, you have many more factors involved in a switch. Apples and oranges.
 

t0mat0

macrumors 603
Aug 29, 2006
5,473
284
Home
http://www.kitguru.net/components/c...ake-processors-for-desktops-to-third-quarter/

Has the MBA been the only case where Apple used a CPU that hadn't yet been announced, or shipped a product with a CPU that was launching simultaneously?

A source with knowledge of Intel’s plans said that the chipmaker will delay the commercial launch of the “Skylake” processors to the third quarter of 2015. Traditionally Intel released new microprocessors in the first half of the year (it did so with Westmere, Sandy Bridge, Ivy Bridge, Haswell and Haswell Refresh), but this time the company will not ship its new chips earlier than sometime in Q3 2015.
 

mcarling

macrumors 65816
Oct 22, 2009
1,292
180
There will never be a day when Intel graphics will beat discrete graphics. You can't battle the laws of physics. More area means more power. That's really all there is to it.
You're ignoring the speed of communication between the CPU and GPU, which is much faster in the integrated case than the discrete case.
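For a rough sense of scale of that difference, a small sketch; the bandwidth figures are approximate, era-appropriate ballpark numbers (PCIe 3.0 x16 and dual-channel DDR3-1600), not measurements of any particular MacBook Pro:

```python
# Crude comparison: shipping a buffer to a discrete GPU over PCIe versus an
# integrated GPU that already shares the CPU's memory and last-level cache.

buffer_gb = 0.25               # e.g. a 256 MB texture atlas or working set
pcie3_x16_gbps = 15.75         # ~theoretical peak, one direction
ddr3_1600_dual_gbps = 25.6     # memory an integrated GPU reads directly

pcie_ms = buffer_gb / pcie3_x16_gbps * 1000
dram_ms = buffer_gb / ddr3_1600_dual_gbps * 1000

print(f"Discrete: copying {buffer_gb * 1024:.0f} MB over PCIe takes ~{pcie_ms:.1f} ms "
      f"each way, plus driver and latency overhead, before any work starts")
print(f"Integrated: the same data already sits in shared DRAM/LLC; worst case "
      f"it is re-read in ~{dram_ms:.1f} ms, best case it is zero-copy")
```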

You can only argue over whether that power is necessary, and honestly we have only hit that question with CPUs, definitely not with GPUs.
Both CPUs and GPUs are doing computations. Your assertion that there is an upper limit for one and not the other requires some kind of an argument to support it.
 

leman

macrumors Core
Oct 14, 2008
19,183
19,029
There will never be a day when Intel graphics will beat discrete graphics. You can't battle the laws of physics. More area means more power. That's really all there is to it.

That might very well be true, but you are not taking into account that one day integrated graphics will be fast enough for all practical purposes of personal computing. As technology progresses, there is a call for higher integration. This is why specialised accelerators (coprocessors) have always ended up as part of the CPU. It happened to math coprocessors and specialised sound cards, and I have no doubt it will happen to GPUs as well. Modern CPUs are taking on a lot of GPU traits anyway, with ultra-wide registers, FMA instructions, etc. Once we get even wider registers and high-bandwidth memory, both things that are not far off, GPUs as a class of devices should start losing their relevance. I am quite sure that in the not-so-distant future, 3D graphics will be done on the CPU using SIMD instructions.
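As a toy illustration of that last point, here is a sketch of the kind of wide, FMA-heavy arithmetic a GPU spends its time on (transforming a batch of vertices by a projection matrix) expressed as plain vectorised CPU code; NumPy is just a convenient stand-in, since its array loops run on the CPU's SIMD units:

```python
import numpy as np

def perspective(fov_y, aspect, near, far):
    """Build a standard right-handed perspective projection matrix."""
    f = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [f / aspect, 0.0,  0.0,                          0.0],
        [0.0,        f,    0.0,                          0.0],
        [0.0,        0.0,  (far + near) / (near - far),  2 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                          0.0],
    ])

rng = np.random.default_rng(0)
vertices = rng.uniform(-1.0, 1.0, size=(1_000_000, 3))   # a million points
homogeneous = np.c_[vertices, np.ones(len(vertices))]    # append w = 1

mvp = perspective(np.radians(60.0), 16 / 9, 0.1, 100.0)
clip_space = homogeneous @ mvp.T                          # one wide matrix multiply
ndc = clip_space[:, :3] / clip_space[:, 3:4]              # perspective divide

print("Transformed", len(ndc), "vertices on the CPU;",
      "first vertex in NDC:", np.round(ndc[0], 3))
```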
 

Pelea

Suspended
Original poster
Oct 5, 2014
512
1,444
How will integrated graphics ever be as good as having dedicated ones? Developers are still adding more and more realistic effects in games, which require ever more graphics power.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
How will integrated graphics ever be as good as having dedicated ones? Developers are still adding more and more realistic effects in games, which require ever more graphics power.

And what is there in games right now that can't run on a four-year-old card, e.g. the Radeon HD 7870?

For 1080p it's completely enough, and soon integrated GPUs will be even faster than that card.
 

Loops

macrumors regular
Oct 5, 2010
104
8
Ah... Intel.

Announce to the world six days ago that Skylake will ship in Q2.

Then announce three days later it will ship in Q3.
 

Pelea

Suspended
Original poster
Oct 5, 2014
512
1,444
Ah... Intel.

Announce to the world six days ago that Skylake will ship in Q2.

Then announce three days later it will ship in Q3.

so it's gonna be delayed till 2095 at that rate >.<
 