
TallGuyGT

macrumors 6502
Aug 8, 2011
375
950
NYC
Still haven't seen a definitive release date for the Broadwell chips that will be used in the rMBP. I wonder if GPU (and maybe a screen resolution increase) will be the upgrades for the 2014 models. Maybe lower prices or higher specs for the same price too (base RAM and SSD size is very low on some models for the price).
 

TechZeke

macrumors 68020
Jul 29, 2012
2,441
2,238
Dallas, TX
Nvidia just announced the 800M line of GPUs, and the x50 has moved from the GT line to the GTX line, which means it is basically a slower x60. The GTX 850M is a decent amount faster than the GTX 760M and should be a huge improvement over the GT 750M in the late 2013 rMBP.

http://www.anandtech.com/show/7834/nvidia-geforce-800m-lineup-battery-boost

Things look very interesting for the refresh.

GT 650M/750M should have been GTX in the first place. The desktop 650/750 is GTX.

IIRC, the 650M was only like 10% slower than the GTX 660M.
 

dusk007

macrumors 68040
Dec 5, 2009
3,409
98
Even an 860M shouldn't be a GTX. GTX used to be the extreme version and only referred to the high end, like the 7800 GTX and maybe its dual-GPU cousins in the x90 range.

The 850M, 860M and even the 870M are mainstream to lower high end and should be just GT. Especially now that the low end is dying out, it becomes even more ridiculous to dilute the GTX brand with even lower GPUs. It is just a name.

The 850M, according to Nvidia, is only meant to be used with DDR3. Companies have the option to run it with GDDR5, but that makes the whole naming scheme confusing, with different 850Ms not being equal. I hope Apple just moves up and uses an 860M, with the rest of the industry using the GPUs as they are meant to be used, so the name actually means something again. Compared to that, the status quo is weird.

Apple should just use an 860M, and clock it a little lower if they have to. Ultimately it is the same chip as long as it's used with GDDR5, but the 850M should stay a cheap mainstream part with DDR3 and the GDDR5 version should be an 860M. Much less confusing for buyers.
 

theromz

macrumors regular
Original poster
Aug 22, 2013
116
0
Apple should just use an 860M, and clock it a little lower if they have to. Ultimately it is the same chip as long as it's used with GDDR5, but the 850M should stay a cheap mainstream part with DDR3 and the GDDR5 version should be an 860M. Much less confusing for buyers.

You're explaining the GTX 850M there. Quoting the AnandTech article:

One nice benefit of moving to the GTX class is that the 850M will require the use of GDDR5.

The only difference between the GTX 850M and GTX 860M now is clock speed.
 

leman

macrumors P6
Oct 14, 2008
18,452
17,085
850M according to Nvidia is only meant to be used with DDR3.

You are wrong. Check Nvidia's website for product details. It was certainly the case with 650M and 750M, true, but it seems to be changing now.

Apple should just use an 860M, and clock it a little lower if they have to. Ultimately it is the same chip as long as it's used with GDDR5, but the 850M should stay a cheap mainstream part with DDR3 and the GDDR5 version should be an 860M. Much less confusing for buyers.

But the 860M is more expensive to buy. What Apple does is buy the (cheaper) x50M GPUs and overclock them to x60M levels. Of course, they have to refer to the result as an 850M to avoid legal problems.
 

dusk007

macrumors 68040
Dec 5, 2009
3,409
98
The 650M was used quite often with GDDR5. With the 750M, there are only about two notebooks left that I know of that still use GDDR5; all others either go for different GPUs or DDR3.

But the 860M is more expensive to buy. What Apple does is buy the (cheaper) x50M GPUs and overclock them to x60M levels. Of course, they have to refer to the result as an 850M to avoid legal problems.

Sure, but they only overclocked the 650M, and so did pretty much everyone else in the market. It is still weird that Apple is so cheap when they charge the prices they charge customers. There is maybe a $10 difference between the chips for the manufacturer, if there is a difference at all.
The 750M has Turbo disabled and is clocked below even stock 750M specs. The 760M was a different chip, so they could not reach it anyway, but they did go the opposite way. It suggests the TDP at 650M stock and 750M stock settings changed quite a bit.
I am guessing, though, that Nvidia aims the 860M at 18-22mm notebooks and thus at the space that was occupied by 650M GDDR5 models.
I think the 850M actually has a lower TDP target, and used with GDDR5 it runs up into 860M territory anyway.
There is a reason GPU manufacturers keep the TDP so secret. They always want to be able to carry their known names to wherever they can get the most design wins.

I wish Nvidia would stop that nonsense. It would even make economic sense for them to force manufacturers to pair a given GPU with a given memory type. The difference in performance is there, and selling under the same brand name just deceives customers.
 

827538

Cancelled
Jul 3, 2013
2,322
2,833
I think people need to read up on these chips.

Apple will refresh the rMBP lineup with Broadwell and Maxwell, and will wait until those chips are ready (I'm talking 20nm TSMC Maxwell here). There will be no resolution bump, A - it's unnecessary, B - it would be minor at best, C - Apple will be busy improving the displays for their Thunderbolt/iMac/Air lineup. It will perhaps be an upgrade to IGZO with IPS but at current resolutions.

The only difference between the GTX 850M and GTX 860M is clock speed; their architecture and core counts are the same. Remember, Apple used overclocked 650Ms and 750Ms for their Ivy Bridge and Haswell rMBPs.

I believe Apple will probably use a slightly overclocked GTX 850M with 2GB of GDDR5 for the top-end rMBP. They will not want to increase the TDP of the chip, while with Maxwell on a new 20nm process (down from 28nm) they can keep the same TDP while boosting performance by up to around 60%. So, provided you are not CPU- or memory-bandwidth-constrained, you are potentially talking about a best-case FPS boost from 35 FPS to 56 FPS, which is fantastic!
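
That 60% figure is just back-of-the-envelope math, and a one-liner makes the arithmetic explicit (the 35 FPS baseline and the 60% uplift are this post's assumptions, not benchmark results):

```python
# Best-case FPS projection from a claimed ~60% Maxwell uplift at the same TDP.
# Both input numbers are the post's assumptions, not measured benchmarks.
baseline_fps = 35
max_uplift = 0.60  # claimed maximum improvement

projected_fps = round(baseline_fps * (1 + max_uplift))
print(projected_fps)  # 56
```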

Couple that with a cooler, less energy hungry Broadwell i7 chip and we will have one hell of a rMBP. I don't believe much else will change though, perhaps an IGZO IPS display will be used. Basically faster, cooler, longer battery life which is all good. I will definitely be upgrading simply for Maxwell as I do game a fair bit on my already very capable rMBP. Student discounts + solid resale value make it worth my while.

It all comes down to whether Intel can pump out enough Broadwell chips by late Q3/early Q4 and whether TSMC can provide enough 20nm Maxwell chips in time (most likely). Intel is the big variable; they've had trouble with yields, and who can blame them: at 14nm, FinFET lithography is about as cutting-edge a field as humanity has. So we will just have to be patient and wait and see.

Best case, we will get this upgrade at the end of October or the start of November; worst case, we will have to wait one or two months more.
 

ha1o2surfer

macrumors 6502
Sep 24, 2013
424
46
The GDDR5 variant of the GT750M that Apple uses actually outperforms the GTX660M.

Not sure it can outperform my GTX 660M. It would need to not be throttling AT ALL to even come close.

In another thread I see a Haswell MacBook with a 4850HQ and a 750M getting 650-ish points in LuxMark. That is very low compared to a GTX 660M and a 3840QM (similar CPU performance).
 

yjchua95

macrumors 604
Apr 23, 2011
6,725
233
GVA, KUL, MEL (current), ZQN
Not sure it can outperform my GTX 660M. It would need to not be throttling AT ALL to even come close.

In another thread I see a Haswell MacBook with a 4850HQ and a 750M getting 650-ish points in LuxMark. That is very low compared to a GTX 660M and a 3840QM (similar CPU performance).


I was quoting directly from NotebookCheck.com, but you raise a good point. Considering that the 750M is in a rMBP, throttling would prevent it from reaching its full potential.
 

ha1o2surfer

macrumors 6502
Sep 24, 2013
424
46
I was quoting directly from NotebookCheck.com, but you raise a good point. Considering that the 750M is in a rMBP, throttling would prevent it from reaching its full potential.

And you could also argue LuxMark is just a synthetic benchmark. I scored a good 200+ points above the rMBP config, which is significant in LuxMark, IMO. Compared with the Intel HD 4000 in the 3840QM, the Iris Pro pretty much wipes the floor; no question about that at all.
 

dusk007

macrumors 68040
Dec 5, 2009
3,409
98
The only difference between the GTX 850M and GTX 860M is clock speed; their architecture and core counts are the same. Remember, Apple used overclocked 650Ms and 750Ms for their Ivy Bridge and Haswell rMBPs.
The 650M was overclocked. I don't understand how everybody tries to sweep that under the rug, because the 750M is actually substantially underclocked: it has Turbo disabled and is clocked at 925 MHz. The Dell XPS 15's 750M runs at about 1158 MHz in games with Turbo and starts out at a 950 MHz minimum clock. Even if you use all the overclocking headroom possible, you just get to the level of a stock 750M, as in the Dell and as specced by Nvidia.

As for the rest of the post: Intel's latest roadmaps delay Broadwell quite a lot, such that it is very unlikely a 15" MBP will show up in 2014 at all. Those will come in 2015. All that is still in the roadmaps for Q4 2014 is a couple of dual cores with the small GPUs.

Given the total silence on all things 20nm, with everybody apparently skipping it, it also remains to be seen how far out a 20nm Maxwell actually is. I think it is quite likely that by the time those parts show up, Nvidia will rename them to 950M or move to a new naming scheme, as they usually do when they hit the 9/10 numbers.

Maybe Apple will refresh the MBP before too long with the current 850M/860M, and the big refresh with 14nm/20nm will come in 2015.


The Razer is a really weird case. If those press releases aren't all wrong, they actually put the relatively inefficient, power-hungry 870M into the 14" model and the 860M into the 17" model. How does that make any sense?
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
For some reason Apple doesn't allow Turbo mode on GPUs. I think what we will see in the MBP when Broadwell comes out is a GTX 850M OC'ed from stock clocks to around 1000 MHz on the core, with the memory clock presumably remaining at stock (5000 MHz).


That would be just 29 MHz lower than the GTX 860M.
<imagination on>
But I imagine Apple putting in a GTX 860M and OC'ing the base core to 1100 MHz...

</imagination off>
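
The 29 MHz figure lines up with the GTX 860M's published base clock (1029 MHz per Nvidia's spec sheet); a tiny sketch, where the ~1000 MHz Apple clock is this post's speculation, not a confirmed spec:

```python
# Speculative Apple-overclocked GTX 850M core clock (this post's guess)
apple_850m_mhz = 1000
# GTX 860M base core clock, per Nvidia's published specs
gtx_860m_mhz = 1029

gap_mhz = gtx_860m_mhz - apple_850m_mhz
print(gap_mhz)  # 29
```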
 

Vanilla35

macrumors 68040
Apr 11, 2013
3,344
1,453
Washington D.C.
The Razer is really a weird case. If those press releases aren't all wrong they actually put the relatively inefficient power hungry 870M into a 14" and the 860M into the 17" model. How does that make any sense?

It's because the 14" one has Alien technology and the 17" one doesn't ;)

For some reason Apple doesn't allow Turbo mode on GPUs. I think what we will see in the MBP when Broadwell comes out is a GTX 850M OC'ed from stock clocks to around 1000 MHz on the core, with the memory clock presumably remaining at stock (5000 MHz).


That would be just 29 MHz lower than the GTX 860M.
<imagination on>
But I imagine Apple putting in a GTX 860M and OC'ing the base core to 1100 MHz...

</imagination off>

I agree they'll probably keep an 850M and overclock it. To be honest, they may not even do that. They could very easily put in an 850M and overclock it to performance between the 850M and 860M (closer to the 850M), but the stock 850M already performs well (the 750M wasn't as strong even at release), so they might just keep it stock, which would mean less heat. Then again, the stock 850M has a lower TDP and runs cooler anyway, so they might actually go through with the overclocking.
 

whitedragon101

macrumors 65816
Sep 11, 2008
1,336
332
The 850M, according to Nvidia, is only meant to be used with DDR3. Companies have the option to run it with GDDR5, but that makes the whole naming scheme confusing, with different 850Ms not being equal.

Quote from the AnandTech article:

"One nice benefit of moving to the GTX class is that the 850M will require the use of GDDR5. With previous generation mobile GPUs, NVIDIA often allowed OEMs to use either GDDR5 or DDR3. "

http://anandtech.com/show/7834/nvidia-geforce-800m-lineup-battery-boost

However, it is interesting that Nvidia's website shows the 850M as DDR3 or GDDR5 and the 860M as GDDR5 only. Either Nvidia's website or the AnandTech article has its wires crossed.
 

iKrivetko

macrumors 6502a
May 28, 2010
652
551
The Razer is really a weird case. If those press releases aren't all wrong they actually put the relatively inefficient power hungry 870M into a 14" and the 860M into the 17" model. How does that make any sense?

The 14" has a qhd+ display, whereas the 17" is only fhd, so it is was absolutely logical to give more juice to the 14".
 