None of those cards you mentioned are considered top-tier performance cards, which was my point. It would be nice to have a Titan in a MBP, though. I'm looking forward to Maxwell as well.

I'm not saying they're changing, but surely they have kept themselves in the mainstream area. There is such a gap in performance to the higher-end parts that considering this 650M anything other than mainstream is absurd.

So the point is Apple = mainstream performance, nothing more, nothing less.

Of course none of the GPUs are considered top of their tier. However, they have been getting higher and higher on the list. For example, on Notebookcheck.net the latest GPUs used by Apple have been in the bottom half of Class 1, whereas previously they were distinctly Class 2. No doubt hastened by the creation of Boot Camp, Apple are no longer designing MBPs to play just Mac games, but any Windows game too.

And the 6750M-6770M and 650M-660M are not Mainstream, nor are they mid-range. They are Performance cards that can play any computer game at reasonable settings. What they are not is Enthusiast-level cards, like the 7950M or 680M.
 
The problem is that that thing throttles even while the CPU and the GPU are still in the 80°C range. The problem is the lack of power provided. I don't know who the engineer is that gets to decide those things, but he should be put to the gallows.

Indeed. However, just because the Chronos throttles like a V12 with a clogged fuel filter doesn't mean an Apple-designed notebook with the same specs would do the same. After all, Apple have been running their hardware at over 90°C for years now, and with higher power draw than the power adapter can provide, without throttling (except with buggy firmware, like the rMBP).

That said, I think Apple need to think about providing a nice 120W power adapter with future models.
 
I don't really use Notebookcheck as a source for anything. They are not a good review site, despite their focus on notebooks.

Still, those are only the high tier of midrange, nothing more, nothing less.

Take a look at the difference from the 670MX to the 650M; it's astonishing. I discount the 660M because it is only a higher-clocked 650M with GDDR5 (and not all models of the 650M have GDDR5).

The problem is not the temps, but the power consumption. I agree they should put a 120W adapter on those things; the 85W should go to the 13".
 

I guess it is just semantics and perspective; however, I still consider the 650M performance class, or maybe "performance thin". I also think the 670MX is enthusiast level. I consider mainstream to be basically Intel integrated graphics, as it's the most common GPU, and midrange to be around the 20-30W range.
 

That actually works, but from the perspective of dGPUs I go as follows:

entry - 630M and downwards
mainstream - 640M to 660M
high end - 670MX to 680MX

You can put your choice of AMD GPUs in there as well.

For example, the 670MX is around the 675M/580M/485M (they are the same GPU), and that is a full 100% better performance than the 650M, or a bit less against the 660M, though not by much, since the difference between those two is only around 10-15%.

And that is why I draw the line at the 670MX: the performance gap is that large.

Using the same metric I put the HD 4000 as entry level, as it should be despite its mainstream adoption, given the very large performance gap to the 650M.
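
To put rough numbers on those gaps, here is a quick back-of-the-envelope sketch in Python using the approximate 3DMark 11 GPU scores thrown around in this thread (650M at ~P2300 stock, 660M about 10-15% above it, 670MX roughly double); these are the thread's figures, not my own measurements:

```python
# Rough relative standing of the GPUs discussed above, based on the approximate
# 3DMark 11 GPU scores quoted in this thread. All figures are thread-sourced
# estimates, not benchmarks I ran myself.

scores = {
    "GT 650M (stock)": 2300,           # quoted later in the thread for the rMBP
    "GT 660M (approx)": 2300 * 1.12,   # ~10-15% over the 650M
    "GTX 670MX (approx)": 2300 * 2.0,  # "a full 100% better performance"
}

baseline = scores["GT 650M (stock)"]
for name, score in scores.items():
    print(f"{name}: ~P{score:.0f} ({(score / baseline - 1) * 100:+.0f}% vs 650M)")
```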
 
http://www.techpowerup.com/gpudb/2224/.html

Looking at this, the GT750M will have a 33W TDP. That's 17W less than the Radeon HD8870M.

In fact, the GT750M will be a lot faster than the GT660M and will be very close to the 8870M: 967 MHz core clock, plus Boost, plus 5000 MHz GDDR5 memory clocks.

It will be a good GPU for the rMBP.
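
For what it's worth, a quick sketch of the memory bandwidth those clocks would give, assuming the usual 128-bit bus (my assumption, not something listed on that spec page):

```python
# Rough memory-bandwidth estimate for a GT 750M with 5000 MHz (effective) GDDR5,
# assuming a 128-bit bus. The bus width is an assumption on my part.

effective_rate_mtps = 5000   # mega-transfers per second (effective GDDR5 rate)
bus_width_bits = 128         # assumed bus width

bandwidth_gbps = effective_rate_mtps * 1e6 * bus_width_bits / 8 / 1e9
print(f"~{bandwidth_gbps:.0f} GB/s")   # ~80 GB/s
```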
 
A Boost clock on the 750M that you'll never see on the rMBP because of heat problems... and on paper, even counting Boost, the 750M is around 25% slower than the 8870M/8850M.
 

Actually, it depends.

That 33W is the nominal TDP, not the real TDP under load with GPU Boost 2.0 on.

That means the actual TDP of that GPU is going to be higher than the 660M's when it's under load with GPU Boost 2.0 turned on.

It's really simple: the 640M/650M/660M are the same GPU, and the 740M/750M are the same GPU; we always leave magic and unicorns out of the equation.

From the leaked numbers I've seen, the 3DMark 11 score for that GPU is P2800. If you look at the P4000 from the 8870M, there is still a good gap. And while that number may seem too low for your tastes, I have a 650M overclocked to 1037 MHz core and 1250 MHz memory that scores P2600, and those clocks are actually pretty close to the ones the 750M reaches with GPU Boost 2.0 turned on.

While you will say that GPU Boost 2.0 is the greatest thing ever, I will say that it's a sham: it uses a much lower TDP rating for the design, and in the end we have to see whether the card we actually get will make use of it. In other words, you can have the same performance as a 650M, or you can have 30% more than that, depending on whether the cooling was designed for the awful 33W figure or for the real draw with GPU Boost 2.0.
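
To make that gap concrete, here is the same arithmetic on the scores quoted above (all approximate or leaked numbers from this thread, not measurements of mine):

```python
# Relative gaps between the 3DMark 11 GPU scores quoted in the post above.
# All values are approximate/leaked figures from the thread.

scores = {
    "GT 750M (leaked)": 2800,
    "HD 8870M": 4000,
    "GT 650M OC @ 1037/1250": 2600,
}

ref = scores["GT 750M (leaked)"]
for name, score in scores.items():
    print(f"{name}: P{score} ({(score / ref - 1) * 100:+.0f}% vs the leaked 750M)")
```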
 
Looking at the PSU of the MBPs, the GPU for them MUST be at a max of 35W of power consumption.

I don't know if AMD, with the Radeon HD 8800M series, can get close to that number.
 

Not true. Apple have been allowing the internal battery to provide the shortfall from the power supply for years, which is why the battery slowly drains when gaming even while plugged in.
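
A minimal sketch of how that power budget works out; all the wattages here are illustrative assumptions, not measured figures:

```python
# Sketch of the point above: under a combined CPU+GPU load the system can draw
# more than the 85W adapter supplies, and the battery covers the difference.
# Every wattage below is an illustrative assumption, not a measurement.

adapter_w = 85
cpu_w = 47    # quad-core mobile CPU at full load (assumed)
gpu_w = 35    # dGPU at full load (assumed)
rest_w = 15   # display, SSD, RAM, conversion losses (assumed)

total_draw = cpu_w + gpu_w + rest_w
shortfall = max(0, total_draw - adapter_w)
print(f"Total draw ~{total_draw}W, adapter {adapter_w}W, "
      f"battery supplies ~{shortfall}W")   # ~12W drawn from the battery
```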
 
The new Samsung Chronos has a 90W total power draw.
I have read a review, and at stock clocks (725/1125, GDDR3) the 8870M reaches P3300 in 3DMark 11, and P3400 with a tiny overclock (god knows how far the limit on this card can be pushed).

On the other hand, the NVIDIA 650M reaches P2300 at stock clocks (900/1250, GDDR3 on the rMBP), but this card has LARGE overclocking potential, up to 1150/1555 core/memory, and at those clocks it reaches P3100.
Source: http://forum.notebookreview.com/app...p-overclocking-results-gaming-benchmarks.html

The NVIDIA 750M starts at P2900 on base clocks (967 core, with 1000-2500 memory for the GDDR3 or GDDR5 version), but considering that the hardware is pretty much the same and the frequencies are very close to an overclocked 650M, should I admit that this is basically a factory overclock by NVIDIA? Or can I hope there is still great overclocking potential left?

The conclusion could be that the 750M is better than a hypothetical 8870M with low overclocking potential.

----------

Consider also that every card shown inside the notebooks reviewed (such as the rMBP) is GDDR3 rather than GDDR5. If the rMBP gets the 2GB GDDR5 version of the 750M, I will buy!
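
For reference, the overclocking headroom those numbers imply, taking the quoted clocks and scores at face value:

```python
# Overclocking headroom implied by the figures quoted above for the rMBP's 650M,
# treating the stock/overclocked clocks and scores as given in the post.

core_stock, core_oc = 900, 1150      # MHz
mem_stock, mem_oc = 1250, 1555       # MHz
score_stock, score_oc = 2300, 3100   # 3DMark 11 GPU score

def gain(before, after):
    return (after / before - 1) * 100

print(f"core +{gain(core_stock, core_oc):.0f}%, "
      f"memory +{gain(mem_stock, mem_oc):.0f}%, "
      f"score +{gain(score_stock, score_oc):.0f}%")
# core +28%, memory +24%, score +35%
```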
 
I don't understand your point.

The power consumption of the Chronos is 90W because the adapter is 90W; if you give it more, it will use more, given that there is a throttling problem there.

The 750M is a 650M with higher clocks, and the 650M in the rMBP is GDDR5, not 3.

I would also take a look at the 7850M benchmarks that people did, and the M6000 as well.

Aside from that, what's your point?
 
I don't think so. What kind of GDDR5 would use a memory clock like that? Maybe the first ones produced by Qimonda or Samsung. Considering the overclock and double data rate, that's 1550x2 = 3100 MHz, while real GDDR5 runs between 1600x2 at base clock and 3500x2 overclocked; the GTX 680, for example, can hit 7 GHz.

Your rMBPs, I think, are GDDR3.

----------

You are right, though, that there are some really bad memory clocks out there even on GDDR5.

Look at the 8870M versions here: http://www.cpu-world.com/news_2013/2013010803_AMD_introduces_Radeon_HD_8000M_GPUs.html
 

The point being?

They are GDDR5. No 650M uses GDDR3; they can use DDR3 with higher core clocks.
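
A quick worked example of why a ~1250 MHz memory clock is still consistent with GDDR5: GDDR5 moves four bits per pin per command clock, so the effective rate is 4x the reported clock, versus 2x for DDR3/GDDR3. The 1254 MHz figure for the rMBP's 650M is the commonly reported value, used here as an assumption:

```python
# Effective data rate per pin: GDDR5 transfers 4 bits per command clock,
# DDR3/GDDR3 transfers 2. The 1254 MHz rMBP figure is an assumption on my part.

def effective_rate_mtps(clock_mhz, transfers_per_clock):
    return clock_mhz * transfers_per_clock

print(effective_rate_mtps(1254, 4))  # GDDR5 on the rMBP 650M -> ~5016 MT/s
print(effective_rate_mtps(1250, 2))  # the same clock read as DDR3 -> only 2500 MT/s
```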
 
Don't forget that Haswell's integrated GPU and the NVIDIA 650M get roughly the same FPS.

https://www.youtube.com/watch?v=VPrAKm7WtRk&feature=youtube_gdata_player

So what realistic FPS, maxed out at 2880 in GW2, should I expect if I wait until June? Up to 20? 25?

LOL! Without the FPS counter shown, if one runs at 25 fps and the other at 70 fps, it's not possible to see it in a video. Haswell has a 47W TDP, so 2W more than IVB. An iGPU within such a low TDP will never offer the same performance as a 30-35W dGPU with its own memory.
 