
MacDevMike

macrumors regular
Jun 13, 2012
122
39
Discovery Bay, Ca
Unless the 850M costs less than the 750M, it's difficult to imagine any reason why Apple would want to go through the engineering and production expense and risk of switching. All this talk of an 850M this year seems to be based on nothing but wishful thinking.

Definitely might be wishful thinking, but they have updated the card every year for the last two years: the 650 was in 2012, the 750 in 2013, and putting the 850 in 2014 just makes sense. At some point, Nvidia doesn't want to sell 750s anymore; they have moved manufacturing to this year's cards. And with the quantity of units Apple requires, they will move to what Nvidia is offering.
 

mcarling

macrumors 65816
Oct 22, 2009
1,292
180
Definitely might be wishful thinking, but they have updated the card every year for the last two years: the 650 was in 2012, the 750 in 2013, and putting the 850 in 2014 just makes sense.

Apple have changed the GPU when and only when changing the CPU, because that is when a thorough redesign of the motherboard is necessary anyway. That has just happened to be once per year since the switch to Intel CPUs.
 

Narcaz

macrumors 6502
Jul 18, 2013
419
558
History repeats itself. Last year everyone hoped for a big refresh and most people got disappointed. I don't know why this should be different this time. Anything other than an incremental update seems unlikely.

Processors maybe get a 0.1 GHz bump. What else is possible? IGZO displays, of course. Maybe they're finally ready for mass production. The 850M is possible for the top model, but besides the point mcarling made, there are still no real-world benchmarks. I bet Nvidia's marketing used the 750M with DDR3 for the comparison, so the claimed 50-80% advantage could shrink to 20-40%. Why should Apple invest in implementing this small upgrade when they plan to kill the dGPU in the long run? If the DigiTimes report from October is true, Apple only wanted the model with the Iris Pro, but they were surprised that it was slower than expected, so they implemented the 750M at the last minute.

My point is: lower your expectations and be pleasantly surprised if they introduce something better than expected. I think last year's refresh was a pleasant surprise. Some people on this forum underestimate the late 2013 rMBP, but I like it: faster PCI-Express SSD, brighter screen, beefier processor, more battery life, 802.11ac Wi-Fi, and TB2 with 4K/60Hz support and the 750M. That was way more than I expected for an incremental update, and so I bought it.
 

Teuthos

macrumors member
Mar 21, 2014
45
0
History repeats itself. Last year everyone hoped for a big refresh and most people got disappointed. I don't know why this should be different this time. Anything other than an incremental update seems unlikely.

Processors maybe get a 0.1 GHz bump. What else is possible? IGZO displays, of course. Maybe they're finally ready for mass production. The 850M is possible for the top model, but besides the point mcarling made, there are still no real-world benchmarks. I bet Nvidia's marketing used the 750M with DDR3 for the comparison, so the claimed 50-80% advantage could shrink to 20-40%. Why should Apple invest in implementing this small upgrade when they plan to kill the dGPU in the long run? If the DigiTimes report from October is true, Apple only wanted the model with the Iris Pro, but they were surprised that it was slower than expected, so they implemented the 750M at the last minute.

My point is: lower your expectations and be pleasantly surprised if they introduce something better than expected. I think last year's refresh was a pleasant surprise. Some people on this forum underestimate the late 2013 rMBP, but I like it: faster PCI-Express SSD, brighter screen, beefier processor, more battery life, 802.11ac Wi-Fi, and TB2 with 4K/60Hz support and the 750M. That was way more than I expected for an incremental update, and so I bought it.

Good point about Nvidia using the DDR3 version for the comparison. But is the GDDR5 chip really that much faster?

Also, has Intel given any release dates for when it might release a new Iris chip? Judging by the gains in the 800M series, Intel has even further to go before its iGPUs actually manage to catch up.

About the upgrades - I agree with you; the late 2013 refresh was, in my opinion, a good and significant upgrade, especially moving the 13-inch model from the HD 4000 to Iris 5100.
 

Teuthos

macrumors member
Mar 21, 2014
45
0
@Teuthos: 750M DDR3 vs. 750M GDDR5 version (which is ~33-80% faster): http://www.gaminglaptopsjunky.com/gt-750m-gddr5-vs-gt-750m-ddr3-gaming-performance-tested/ Just speculation, but if Nvidia's marketing used the 750M DDR3 for the 850M benchmark, Apple's 750M with 2GB GDDR5 won't look that bad (still slower, but not by 50-80%) in comparison.

Thanks for the link.

I did some very rough calculations on the GDDR5 750M versus the 850M in terms of speed.

In the link you posted, the average difference between DDR3 and the overclocked GDDR5 version (I believe Apple uses a slightly overclocked version?) is a staggering 57% frame rate difference in the games tested.

While I couldn't find any exact benchmarks for the 850M, it is said to be comparable to the 765M in terms of performance, so I essentially compared the 765M to the 750M.

For the new game Titanfall, the difference on high settings between the 765M frame rate (54.3 fps) and the 750M frame rate (33.3 fps) is 63.1%. Using the average increase of 57% in frame rate, the slightly overclocked GDDR5 model would land somewhere around 52 fps. The lowest difference for any game with stats in the link was 43.75%, so even if frame rates only increased by that much with the GDDR5 variant, the 750M would mathematically still achieve about 48 frames per second. That's roughly 13% less than the 765M.

So if games really are, on average, that much faster with the GDDR5 version, if the 765M is actually close to the 850M, and if I'm not missing something, there's not that much of a difference.

And yes, I checked - the 33.3 frames per second for the 750M on Titanfall (chosen not because I play it but because it's a new game) is from a 967 MHz 750M, or in other words, the DDR3 variant, not the GDDR5.

(Frame rates not listed in the link were taken from Notebook Check)
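
For anyone who wants to sanity-check that back-of-the-envelope estimate, here is a minimal sketch in Python. The fps figures are the ones quoted above, and treating the 850M as roughly equal to the 765M is only an assumption on my part:

Code:
# Rough re-run of the estimate above. Figures come from the linked review
# and Notebook Check; the 765M-as-850M stand-in is just an assumption.
ddr3_750m = 33.3        # Titanfall, high settings, 967 MHz DDR3 GT 750M (fps)
gtx_765m  = 54.3        # Titanfall, high settings, GTX 765M (fps)

avg_gddr5_gain = 0.57   # average DDR3 -> GDDR5 uplift in the linked tests
min_gddr5_gain = 0.4375 # smallest uplift seen in those tests

best_case  = ddr3_750m * (1 + avg_gddr5_gain)   # ~52 fps
worst_case = ddr3_750m * (1 + min_gddr5_gain)   # ~48 fps

print(f"GDDR5 750M estimate: {worst_case:.0f}-{best_case:.0f} fps")
print(f"Worst-case gap to the 765M: {(gtx_765m / worst_case - 1) * 100:.0f}%")  # ~13%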
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
The GT 750M in the MBP is underclocked (925 MHz on the core vs. 967 MHz plus Turbo Boost; Apple disables that feature in the MBP).

The GTX 860M's TDP will be 5-7 W lower than the GT 750M's, but again, Apple will set the GPU's TDP on its own by turning off the Turbo Boost option.

Looking at this, it's easy to imagine Apple overclocking the GTX 850M to GTX 860M levels (1100 MHz on the core? 6000 MHz on the memory? That would be awesome). And it could still come in at a 25 W TDP.

The question is: what TDPs will Intel bring with Broadwell, because opinions are split in here... ;)


Edit: P.S. The GTX 850M at stock clocks is roughly comparable to the GTX 770M. At the clocks I posted above, it would be faster than the GTX 680M, which is mind-blowing.
 

Narcaz

macrumors 6502
Jul 18, 2013
419
558
It is amazing that Nvidia manages to squeeze so much out of the 28 nm process with Maxwell. But an overclocked 850M might be just wishful thinking for gamers. It would be nice, but for a lot of reasons (product development, heat, power drain, etc.) this is unrealistic. We will know for sure when the first real-world benchmarks arrive.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
It is amazing that Nvidia manages to squeeze so much out of the 28 nm process with Maxwell. But an overclocked 850M might be just wishful thinking for gamers. It would be nice, but for a lot of reasons (product development, heat, power drain, etc.) this is unrealistic. We will know for sure when the first real-world benchmarks arrive.

The OC'd version of the GTX 850M is the GTX 860M.

In the rMBP 2012, Apple OC'd the GT 650M from 775 MHz on the core and 1000 MHz on the memory (4000 MHz effective) to 900 MHz on the core and 1254 MHz (5016 MHz effective) on the memory.

That's what I'm referring to.

And looking at how Apple has been working with GPUs lately, they will tinker with the GPU's clocks somewhat and handle its Turbo Boost themselves.
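
Just to spell out where those "effective" numbers come from (a minimal sketch; GDDR5 transfers data four times per memory clock, and the clock figures are the ones I listed above):

Code:
# GDDR5 moves data 4x per memory clock, so the quoted "effective" rate
# is simply 4x the real clock. Clocks below are the ones from this post.
def effective_gddr5(mem_clock_mhz):
    return mem_clock_mhz * 4

stock_core, stock_mem = 775, 1000   # reference GT 650M
rmbp_core,  rmbp_mem  = 900, 1254   # GT 650M in the rMBP 2012

print(effective_gddr5(stock_mem))   # 4000 MHz effective
print(effective_gddr5(rmbp_mem))    # 5016 MHz effective
print(f"Core overclock: {(rmbp_core / stock_core - 1) * 100:.0f}%")  # ~16%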
 

MacDevMike

macrumors regular
Jun 13, 2012
122
39
Discovery Bay, Ca
Thanks for the link.

I did some very rough calculations on the GDDR5 750M versus the 850M in terms of speed.

In the link you posted, the average difference between DDR3 and the overclocked GDDR5 version (I believe Apple uses a slightly overclocked version?) is a staggering 57% frame rate difference in the games tested.

While I couldn't find any exact benchmarks for the 850M, it is said to be comparable to the 765M in terms of performance, so I essentially compared the 765M to the 750M.

For the new game Titanfall, the difference on high settings between the 765M frame rate (54.3 fps) and the 750M frame rate (33.3 fps) is 63.1%. Using the average increase of 57% in frame rate, the slightly overclocked GDDR5 model would land somewhere around 52 fps. The lowest difference for any game with stats in the link was 43.75%, so even if frame rates only increased by that much with the GDDR5 variant, the 750M would mathematically still achieve about 48 frames per second. That's roughly 13% less than the 765M.

So if games really are, on average, that much faster with the GDDR5 version, if the 765M is actually close to the 850M, and if I'm not missing something, there's not that much of a difference.

And yes, I checked - the 33.3 frames per second for the 750M on Titanfall (chosen not because I play it but because it's a new game) is from a 967 MHz 750M, or in other words, the DDR3 variant, not the GDDR5.

(Frame rates not listed in the link were taken from Notebook Check)

Just go look at the 850M performance link on Nvidia's website. The current-gen 750M gets a 3DMark Vantage score of about 11,300. The 850M with an i7 scores almost 17,000 on Vantage. Major difference.

http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-850m/performance
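
A quick check of that gap, taking the two scores exactly as quoted (Nvidia's page doesn't say which 750M variant or drivers were used, so treat this as a rough figure):

Code:
# Relative difference between the two Vantage GPU scores quoted above.
vantage_750m = 11_300
vantage_850m = 17_000

gain = (vantage_850m / vantage_750m - 1) * 100
print(f"850M is ~{gain:.0f}% ahead of the 750M on Vantage")  # ~50%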
 

simon48

macrumors 65816
Sep 1, 2010
1,315
88
Not only is he probably right, but in ten years you'll come home and attach your cell phone to an external monitor, keyboard, and mouse. It will be that powerful.

But why? I've never understood this view. Yes, you could do it (and they're powerful enough now), but desktops and laptops will still get faster over time. A smartphone in 10 years will still be less powerful than a computer is then.

----------

Just go look at the 850M performance link on Nvidia's website. The current-gen 750M gets a 3DMark Vantage score of about 11,300. The 850M with an i7 scores almost 17,000 on Vantage. Major difference.

http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-850m/performance

But Nvidia is sure to show the DDR3 scores for the 750m.
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
But why? I've never understood this view. Yes, you could do it (and they're powerful enough now), but desktops and laptops will still get faster over time. A smartphone in 10 years will still be less powerful than a computer is then.

This is true, but it's also quite likely that a smartphone will be as fast as any normal user would need.

----------

But Nvidia is sure to show the DDR3 scores for the 750m.

Have you even looked at the link? It doesn't contain any scores for the 750M at all. The roughly 11K 3DMark Vantage score for the 750M is indeed the value for the GDDR5 model, as can be seen in various independent benchmarks. For instance, here: http://www.notebookcheck.net/NVIDIA-GeForce-GT-750M.90245.0.html
 

simon48

macrumors 65816
Sep 1, 2010
1,315
88
This is true, but it's also quite likely that a smartphone will be as fast as any normal user would need.

The power needed for the "normal user" is a constantly moving target. Power usage drives hardware and hardware drives power usage; it's a two-way push-pull relationship.

I don't see smartphones becoming everyone's "God" device once their power hits some "magic" point. Maybe the smartphone will become a "God" device, but I don't see speed being "the" factor in it happening. It's more about making the entire docking system seamless and making people want to do it. I don't see smartphones reaching a speed that, by itself, makes docking make sense and appeal to everyone.

----------

http://www.tomshardware.com/news/in...tium-anniversary-edition-broadwell,26326.html

I'm guessing that this launch of new Haswell CPUs in July will power refreshed MBPs.

"enthusiast-class CPU models" don't sound like rMBP CPUs.
 

Narcaz

macrumors 6502
Jul 18, 2013
419
558
Sounds like a September refresh, like the Chinese forum suggested. Would be nice if IGZO is ready.

@Nvidia debate: I wanted to point out that this 850M/Maxwell hype is based on nothing but some charts from Nvidia's marketing. Nothing objective; they can pick whatever best-case scenario they want. And don't forget that Apple moved from an overclocked 650M in 2012 to the Iris Pro in the base model and a crippled 750M in the top model due to heat and power problems. I wouldn't be surprised if they skip Maxwell or only implement it in the top model.

But as mcarling pointed out, it seems unlikely that Apple would develop and test a new mainboard design for half a year when they plan to kill the dGPU with Broadwell anyway. And I know that at this point this is just speculation, but the driving force behind Apple's moves is sales and margins, not selling the best hardware. If the sales numbers of the 15'' rMBP with Iris Pro are good for them and the 750M falls behind, I don't see a reason to keep the dGPU model around. It would hurt me as a hobby gamer, but from an economic standpoint it would be totally reasonable to lose it, like the 17'' MacBook Pro.
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
The power needed for the "normal user" is a constantly moving target. Power usage drives hardware and hardware drives power usage, it's a two-way push, pull relationship.

Not quite. The momentum is provided by the demand for better content. We are now approaching the point where content quality is close to being 'perfect'. If I can supply a user with content of such high quality that any further improvement will not be perceivable, this constitutes a 'hard limit' for my reasonable performance demands. There is simply a point beyond which any performance increase is unnecessary - given the specific tasks the device is aimed at. Sure, we still have a way to go - screen resolution is still not where it should be (I want to see at least 350-400 ppi monitors); the same goes for the dynamic range of monitors. But this is not that far off.

Or, looking at it from another angle: say you work with an office suite. 'Fast enough' for you would mean an instant response while working with multiple complex layouts. Performance increases faster than the complexity of those layouts does. In fact, a modern ULV CPU already gives enough performance for these tasks. Do you really believe that an office suite will need that much more performance in 5 or 10 years? I don't think so - not unless the whole paradigm changes.

I don't see smartphones becoming everyone's "God" device once their power hits some "magic" point. Maybe the smartphone will become a "God" device, but I don't see speed being "the" factor in it happening. It's more about making the entire docking system seamless and making people want to do it. I don't see smartphones reaching a speed that, by itself, makes docking make sense and appeal to everyone.

Modern smartphones are already approaching or even surpassing laptops released 5-6 years ago. If this trend continues, they will surely reach 'fast enough' status in the not-so-distant future. Again, this mostly depends on whether we will see some sort of fundamentally new kind of content, or whether the current paradigm continues for the foreseeable future.


"enthusiast-class CPU models" don't sound like rMBP CPUs.

There will be plenty of mobile processors in the refresh as well.

http://wccftech.com/intel-haswell-r...nching-mid-2014-unlocked-design-improved-tim/
 

leman

macrumors Core
Oct 14, 2008
19,202
19,062
@Nvidia debate: I wanted to point out that this 850M/Maxwell hype is based on nothing but some charts from Nvidia's marketing. Nothing objective; they can pick whatever best-case scenario they want.

The chips are out and they have been benchmarked by independent reviewers. The 'hype' is certainly real.
 

mr.bee

macrumors 6502a
May 24, 2007
750
468
Antwerp, belgium
And I know that at this point this is just speculation, but the driving force behind Apple's moves IS CONSUMER EXPERIENCE (and strategy), not selling the best hardware. If the CONSUMER EXPERIENCE of the 15'' rMBP with Iris Pro is good for them and the 750M falls behind, I don't see a reason to keep the dGPU model around. It would hurt me as a hobby gamer, but from a STRATEGIC standpoint it would be totally reasonable to lose it, like the 17'' MacBook Pro.

/fixed :D

The Iris Pro gives a very good consumer experience for most consumers and professionals. (CS sees only a very marginal advantage with the 750M.)
A dGPU will only serve niche high-end gaming and 3D/video rendering,
so a dGPU will only be offered in a high-end configuration for as long as Apple deems it necessary.

Weight, noise, and power consumption are all part of the consumer experience, and I don't want those downsides when I use my laptop 80% of the time for work purposes. Technology-wise, you can't deliver a good consumer experience and a high-end gaming laptop at the same time.

On top of that, they are working on a new Apple TV with a focus on gaming. Apple would rather sell you a gaming console than focus on cramming a dGPU into their MacBooks.
 

Narcaz

macrumors 6502
Jul 18, 2013
419
558
@mr.bee Thanks for the support, but you make it sound like Apple is really interested in your consumer experience and not in your money. In this case, a good consumer experience is just a euphemism for their interest in high margins and sales numbers. Apple did not care about the consumer experience of the uninformed people who bought an iPad 2 last month. They only cared about their money.
 

xlii

macrumors 68000
Sep 19, 2006
1,867
121
Millis, Massachusetts
But why? I've never understood this view. Yes, you could do it (and they're powerful enough now), but desktops and laptops will still get faster over time. A smart phone's power in 10 years will still be less powerful then a computer is then.



As all three get more powerful, a smartphone-sized device will be powerful enough to satisfy the average computer user's processing needs. There will always be those few who need the cutting edge...
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Sounds like a September refresh, like the Chinese forum suggested. Would be nice if IGZO is ready.

@Nvidia debate: I wanted to point out that this 850M/Maxwell hype is based on nothing but some charts from Nvidia's marketing. Nothing objective; they can pick whatever best-case scenario they want. And don't forget that Apple moved from an overclocked 650M in 2012 to the Iris Pro in the base model and a crippled 750M in the top model due to heat and power problems. I wouldn't be surprised if they skip Maxwell or only implement it in the top model.

But as mcarling pointed out, it seems unlikely that Apple would develop and test a new mainboard design for half a year when they plan to kill the dGPU with Broadwell anyway. And I know that at this point this is just speculation, but the driving force behind Apple's moves is sales and margins, not selling the best hardware. If the sales numbers of the 15'' rMBP with Iris Pro are good for them and the 750M falls behind, I don't see a reason to keep the dGPU model around. It would hurt me as a hobby gamer, but from an economic standpoint it would be totally reasonable to lose it, like the 17'' MacBook Pro.
The problem is: 90% of all the people who bought a rMBP this year bought the higher-end model with the GT 750M. Just look at the signatures in this forum; 90% of them are models with the GT 750M.

Secondly: no, we are not debating the Maxwell GPUs based on Nvidia's marketing materials. We are debating because there are tests of the new Maxwell GPUs, and they say this:

The GTX 860M is twice as fast as the GTX 760M, with a TDP 5-7 W lower than the GT 750M's.

Apple will not kill the dGPU in the MBP like you guys seem to want. There are far more people who would not buy a 15-inch computer without a decent GPU than people who would buy a 15-inch computer with only an iGPU, even if it is quite fast.

But Maxwell will still be 70-100% faster than the Iris Pro in Broadwell CPUs.
 

brdeveloper

macrumors 68030
Apr 21, 2010
2,629
313
Brasil
A newer chipset and that's all. They're too busy developing the 12" (IGZO) Retina Air. I would expect 32GB in 2015.
 