No, round 2 was the 9600M GT (for example in my Mid-2009 17" MBP). This is round 3.

More info regarding the 9600M GT disaster:
http://www.theinquirer.net/inquirer...s-apple-macbook-pros-nvidia-bad-bump-material

I'm confused. Other than that really old article, I've not heard of the 9600M being bad the way the 8600M was. I searched Google and found people whose cards overheated (in various laptops, not just Apple's), but they cooled them off and they worked just fine. The 8600M problem was a catastrophic, permanent failure of the card.

My 3 1/2-year-old MBP with the 9600M has never given me issues. It's never overheated, and I've played all kinds of games on it for long periods of time.

What problems are you having?
 
Two pages in, and almost every post seems to contradict the one before it; evidently there is a lot of misinformation out there regarding GPUs.
 
Wishful thinking here, but maybe Apple will give us the choice... AMD as standard and Nvidia as BTO. It would serve them best to offer both if they can implement it in the motherboard design... Too many users require one or the other for OpenCL or CUDA, so Apple stands to lose one type of customer or the other. Why not offer both and keep everyone happy?!

Although correct me if I am wrong, but Nvidia GPUs do support OpenCL; Apple just doesn't activate it in their drivers?
 
http://www.apple.com/macosx/specs.html

Where do you get the idea that nVidia GPUs don't support OpenCL in OS X? nVidia and Apple collaborated quite heavily to develop OpenCL before submitting it to Khronos for standardization, so OpenCL is actually closer to CUDA than to CTM/Brook+, which was ATI's competing proprietary GPGPU technology at the time. As such, even first-gen nVidia DX10 GPUs (the 8000 series) support OpenCL in Snow Leopard and up, whereas on the ATI side OpenCL requires the later second-gen DX10.1 GPUs (the HD 4000 series). Post-standardization, AMD's support for OpenCL has been stronger, though, abandoning their own proprietary APIs, whereas nVidia has continued to support CUDA alongside OpenCL.
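The generational cutoffs described above can be sketched as a tiny helper. To be clear, this just encodes the post's claims about OpenCL support in Snow Leopard and later; the function and its thresholds are illustrative, not an authoritative compatibility table:

```python
# Sketch encoding the post's claims: any nVidia DX10 part (GeForce 8000
# series and up) supports OpenCL on OS X 10.6+, while ATI/AMD needs the
# DX10.1-class Radeon HD 4000 series or later.
def supports_opencl(vendor: str, series: int) -> bool:
    """Return True if the given GPU family supports OpenCL on OS X 10.6+."""
    vendor = vendor.lower()
    if vendor == "nvidia":
        return series >= 8000   # GeForce 8000 series introduced CUDA/OpenCL
    if vendor in ("ati", "amd"):
        return series >= 4000   # Radeon HD 4000 series and later
    return False                # other vendors not covered by the post

print(supports_opencl("nvidia", 8600))  # True: even the 8600M qualifies
print(supports_opencl("ati", 2600))     # False: HD 2000 series predates support
```

Per this rule, the much-maligned 8600M actually does support OpenCL, while ATI's contemporaneous HD 2000/3000 parts do not.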
 
I'd rather have the option to choose the upcoming Nvidia Quadro Kepler parts (2100M, 4100M). That's why I am waiting for the HP or Lenovo high-end refresh.

But Retina is cool.

Woah, when is NVIDIA updating their Quadro line?!? I haven't heard any rumors on that yet.
 
As someone had mentioned earlier, probably the real purpose for going retina would be for text and GUI purposes, and not exactly for gaming.

For gaming, if someone were to play at the standard resolution rather than the retina resolution, I don't think any blockiness would be very noticeable to the human eye at normal viewing distance, but who knows for sure, right?

Time to wait and see :p
 
When are ARM-based MacBook Pros going to be released?

RISC is better than CISC, right?!
 
GT 650 2 GB, USB 3.0, Retina, Ivy Bridge, etc. One ultimate machine I'll be getting.

As long as X-Plane 10 runs better (> 20%) on this new iMac than a current 6 Core 3.33 Mac Pro w/ Radeon 5870, then this might be the first time I will consider an iMac over a new Mac Pro...
 
Where do you get the idea that nVidia GPUs don't support OpenCL in OS X?

Hmm, not quite sure where I stated that Nvidia GPUs don't support it in OS X?

I did say correct me if I am wrong, but as I understand it, both the GPU and the OS support it, but it is turned OFF by default in Mac OS X.
 
Whoa! So many Mac-related rumors today. These are making me really excited.

I just went to my last-ever college final today and I graduate on Friday. Hope I can still pull a student discount!! Fingers crossed…

If you go build-to-order and time it JUST right, there is normally a period where they stop making the BTOs because the new model is just around the corner (rather than risk getting a large batch of returns). Get the student discount in that window, and you will likely get a discounted new model.
 
as long as it's a decent card.... decent or "better" :)

But, importantly, I think the baseline models all need better graphics. The Air can get by without a discrete GPU (it would still be nice, though), but a MacBook "Pro" that costs $1000 with no decent GPU... that's not very good...
 
If Apple is going to nVidia, then it makes me happy that I bought the MacBook Pro and iMac while I had the chance with the AMD GPU inside. I swear Apple never learns from its past mistakes; it keeps doing the same stupid sh-t over and over again expecting different results each time. nVidia is beyond help and should be avoided at all costs, even if nVidia is marginally faster at a cheaper price.

Does this card exist yet? Any benchmark comparison to the current MBP gfx cards?

All you had to do was Google it:

http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html

Hmm, not quite sure where I stated that Nvidia GPUs don't support it in OS X?

I did say correct me if I am wrong, but as I understand it, both the GPU and the OS support it, but it is turned OFF by default in Mac OS X.

This is what you said originally:

Wishful thinking here but maybe Apple will give us the choice... AMD as standard and Nvidia as BTO. It would serve them best to offer both if they can implement it on the motherboard design... Too many users require one or the other in terms of OpenCL or CUDA, apple stand to lose one type of customer or another so why not offer both and keep everyone happy!!

Both AMD and nVidia support OpenCL, with CUDA being an nVidia-only technology. The push by Apple is towards OpenCL, which is loosely based on CUDA, so no one loses in the end. You, on the other hand, made the claim that OpenCL is exclusively AMD and that CUDA is exclusively nVidia, with no OpenCL support. Either correct your original statement or provide evidence that nVidia doesn't support OpenCL as you claimed.
 
This is good news. As an avid PC gamer who has been building my own PCs for many years, Nvidia has constantly been my choice for graphics cards.

I can understand the apprehension, since I also owned an old MBP with a faulty 8600M a few years ago, but you have to realize that stuff like that happens, and it was not common practice for Nvidia or they would not be where they are today, which is the #1 GPU company in the world.

The 650M is a great choice as it is a true mobile variant of Kepler, and is a top end mid range card. Realistically Apple could not use anything faster, at least from Nvidia in the MBP due to heat and power. I think as long as they give us the GDDR5 version we should be good to go!
 
Why does MacRumors (and the media in general) always seem to neglect the MBP13 in their upgrade discussions, leaving folks to determine for themselves exactly which of the three models is being talked about?

The MBP13 may indeed be the bastard child of the lineup, but it's still the best-selling MacBook of the three.

P.S. Don't blame me if the MBP13 shouldn't have the "Pro" moniker. I didn't name it.
 

It seems like there was a very specific, controlled leak about the 15-inch model, so that is what all the news is about. It would be really cool if the 13-inch Pro could get all these goodies, but I am not holding my breath. You'd think Apple would do something to distinguish the MBP13 from the MBA13, but I just don't know if they can.
 
Better buy AppleCare and plan on replacing it after AC expiry, or budget for a new logic board, given their history.
 
Anyone else concerned that quadrupling the number of pixels is going to do seriously bad things to video performance?

I'm not sure why Macbook Pros need retina screens. The effective resolution at typical viewing distance is *already* retina on the premium display 15" Macbook Pros!
 
Both AMD and nVidia support OpenCL, with CUDA being an nVidia-only technology. The push by Apple is towards OpenCL, which is loosely based on CUDA, so no one loses in the end. You, on the other hand, made the claim that OpenCL is exclusively AMD and that CUDA is exclusively nVidia, with no OpenCL support. Either correct your original statement or provide evidence that nVidia doesn't support OpenCL as you claimed.

Apologies, I should have made myself clearer...

What I meant was that AMD is stronger for OpenCL than Nvidia, and CUDA is non-existent on AMD... this is where the end user has a choice to make.

I personally would choose Nvidia, as I use After Effects primarily for my work... OpenCL does accelerate it, but nowhere near the acceleration that Nvidia's CUDA offers.

For users of FCPX, the obvious choice would be an AMD GPU, as AMD's implementation of OpenCL is stronger than Nvidia's.

There are many variants of this argument, but I think my example demonstrates that unfortunately users cannot get the best of both worlds, not with the current offerings from AMD and Nvidia.
 

That is more the result of the drivers than anything else. OpenGL is the same situation: a while ago people were reporting AMD GPUs' OpenGL performance being a whole lot better than nVidia GPUs', even though in theory nVidia should have outperformed them. The big question is whether Apple is going to put the hard word on nVidia and force them to produce drivers that don't royally suck, because given past experience I don't hold out much hope of improvement.

Personally, putting performance issues aside, the biggest problem I have is simply that nVidia drivers suck in terms of reliability, they never fix long-standing bugs, and most importantly nVidia has an atrocious record when it comes to product quality - the nVidia 8600 fiasco being the tip of a fairly large iceberg. If I am going to hand over several thousand for an iMac and a MacBook Pro (which I did recently - see signature), I want to be assured that the GPU isn't going to die and require replacing every so many months, as was the case with the defective 8600 GPUs, which got replaced with GPUs that have the same defect.
 
Actually that is only partially true; some of them are rebranded, but the 650M is Kepler.

http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html

Actually it is partially true on both sides. The lower-end 7000M series chips on the AMD side are also rebrands.

Both vendors have done this for the last couple of generational upgrades: the bottom end gets rebrands, and the mid-to-upper end gets the newer stuff.

For example:

A table of AMD offerings:
http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units#Radeon_HD_7xxxM_Series

The "Turks" / "Redwood" code names are from the last generation. The "Cape Verde" / "Pitcairn" parts are the new "Next Generation" updates that are true Southern Islands. You can also look at the fab process to separate "older" from "new": the newest ones are 28nm.


Nvidia is running the same game:
http://en.wikipedia.org/wiki/Compar...ocessing_units#GeForce_600M_.286xxM.29_series

Code names with "GFxxx" are older Fermi; names with "GKxxx" are newer Kepler. Likewise, the fab processes vary. Nvidia is a bit worse, since some model numbers have variants of both.

You really have to look at the very specific part numbers to tell, not just the first digit.
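The rule of thumb above - go by the silicon code name, not the marketing number - can be sketched as a quick classifier. The prefixes are the ones named in the post; the example code names in the comments are illustrative:

```python
# Sketch of the rule of thumb above: the silicon code name, not the
# marketing number, tells you the architecture. GF = Fermi, GK = Kepler.
def architecture(code_name: str) -> str:
    """Classify an nVidia GPU die by its code-name prefix."""
    name = code_name.upper()
    if name.startswith("GK"):
        return "Kepler"   # 28nm parts, e.g. the GK107 in the GT 650M
    if name.startswith("GF"):
        return "Fermi"    # 40nm rebrands among the low-end 600M parts
    return "unknown"      # not an nVidia GF/GK code name

print(architecture("GK107"))  # Kepler
print(architecture("GF108"))  # Fermi
```

The same idea applies on the AMD side, just with different markers: last-generation code names like "Turks" versus true Southern Islands names like "Cape Verde".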
 
I recently saw a Samsung MBP clone with an nVidia 650 and an ASUS gaming laptop with a 660, both with Ivy Bridge mobile CPUs. Seems like the most likely configuration.
 