Just to be clear, the GTX 5xx series is faster, cooler and more power efficient than the 6xxx series.

Add to that all the talk of open-source drivers: nouveau has always proven to be far more stable and powerful than the open ATi drivers.

Then we have closed drivers on Linux: fglrx is a joke, and doesn't get updated for new xorg-server versions unless Ubuntu uses them.

nVidia have come a long way since Fermi; ATi has remained static since the 4xxx series.

There are so many things wrong with that response, I hardly know where to begin. The GTX580m, for example, uses twice as much power as the 6970m for a 13% greater performance level (on average); the 580 (desktop) is a far larger GPU than the 6970, and thus costs more to produce and uses 40% more power than the 6970 for 10-15% greater performance. The 6970's competitor is the 570, and both have a comparable cooling mechanism that allows the 570 to reach the high 90s while keeping the 6970 squarely in the 80s (this is at equal fan speed).
And now the biggest laugh of your post: "NVidia have come a long way since Fermi." Funny that, since the 500 series IS Fermi, simply with the horrific yield and power/performance ratios FIXED (compared to the 400 series).

There is no way in hell Apple is going to be as moronic as the average fanboy on this matter, and hence they have stayed, and will continue to stay, the hell away from NVidia, at least in their mobile market. NVidia's last good series, to be brutally honest, was the 8000 series. End of story. The only cards in their present lineup that could be considered worth it are the GTX 460/560.
 

The next-gen AMD GPU isn't the 6xxx Northern Islands series, in case you haven't noticed. Rather, AMD now joins nVidia with a compute/graphics-balanced floating-point GPU. It's new, and no one here knows whether it is still more power efficient than nVidia's reworked Fermi on 28 nm.

And you would be a troll to claim all nVidia chips are more power hungry than AMD's. Some are actually good buys.

Whether nVidia or AMD shows up, we won't know till 2012. Even if Apple picks one now, they can change their minds later on. At least for the iMacs; after all, it's just an MXM swap if one company screws up. If the MBP takes unusually long to show up with unexpected graphics, the same might be true there.
 
Here's how it works: When you design a project in Xcode, you link frameworks to it. You do so in order to call various classes from those frameworks in your code. If you don't use it, you don't call it. It really is that simple. Apple uses OpenCL in CoreImage.

EDIT: Just to make you happy, they likely ALSO use GLSL, but GLSL cannot do the same things OpenCL can, so it's entirely plausible that they use both.
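
To make that concrete, here is a minimal, hypothetical OpenCL C kernel (my own sketch, not anything pulled from Apple's code) showing two things a GLSL fragment shader can't express: staging data in work-group shared (__local) memory, and a scattered write to an output location other than the work-item's "own" element:

```c
/* Hypothetical illustration only: permute each work-group's tile of a buffer.
 * Demonstrates __local (shared) memory with a barrier, and a scattered write,
 * neither of which a GLSL fragment shader can do. */
__kernel void permute_tile(__global const float *in,
                           __global float *out,
                           __local float *tile)   /* size set by the host */
{
    size_t gid  = get_global_id(0);
    size_t lid  = get_local_id(0);
    size_t lsz  = get_local_size(0);
    size_t base = get_group_id(0) * lsz;

    /* Stage this work-group's slice of the input in fast local memory. */
    tile[lid] = in[gid];
    barrier(CLK_LOCAL_MEM_FENCE);   /* make all staged values visible */

    /* Read a value staged by a *different* work-item (requires the barrier)... */
    float neighbour = tile[(lid + 1) % lsz];

    /* ...and write it somewhere other than this item's own element -- a
     * scattered write, which fragment shaders cannot perform. */
    out[base + (lsz - 1 - lid)] = neighbour;
}
```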

EDIT: Also, you might want to consider how people discover what Apple uses in their frameworks. Inspecting a framework's linkage (with otool, for instance) is actually one of the simplest ways to do so. It's an information source in and of itself.

Following the otool based argument you are putting yourself in a more and more ridiculous position, first needing to accept that CoreImage uses OpenCL on Mac, then that it does on iPhone (when there is indeed no evidence yet that the OpenCL implementation on iOS is even functioning yet). Maybe they plan to use it in the future and there is just some stub functionality. Maybe it's just so the framework can report "yup, this device supports OpenCL" and then proceed to do nothing with that information. Who knows why it is there? What we do know is the Apple docs show CoreImage as built on top of OpenGL. It's not just likely they use GLSL, it is their stated position that they do.

[Image: graphics_arch.gif - Core Image's place in the OS X graphics stack, from Apple's Core Image documentation]


Apple (Last updated: 2011-10-12) said:
Up until now OpenGL, the industry standard for high performance 2D and 3D graphics, has been the primary gateway to the graphics processing unit (GPU). If you wanted to use the GPU for image processing, you needed to know OpenGL Shading Language. Core Image changes all that. With Core Image, you don’t need to know the details of OpenGL to harness the power of the GPU for image processing. Core Image handles OpenGL buffers and state management for you automatically. If for some reason a GPU is not available, Core Image uses a CPU fallback to ensure that your application runs. Core Image operations are opaque to you; your software just works.

I'm aware that OpenCL can do things that GLSL cannot -- I've developed using both. In terms of image processing, however, there is not much use for OpenCL when you've already got a GLSL pathway, as you don't need to do scattered writes, don't need shared memory, etc. In fact some OpenCL-supporting Macs (Radeon 4000 series, and the Radeon 5000/6000 series prior to Lion) lack CLImage support, which means OpenCL lacks GLSL's image filtering, memory reads/writes cannot be cached, and OpenCL/OpenGL interoperability cannot be done efficiently, making GLSL a better choice for now.
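
As a concrete aside on the CLImage point, here's a hedged little C program (my own sketch, not Apple code) that queries the CL_DEVICE_IMAGE_SUPPORT flag on the first GPU it finds; that flag is essentially what those Radeons were missing:

```c
/* Sketch: report whether the first OpenCL GPU device advertises image
 * support (CL_DEVICE_IMAGE_SUPPORT). Compile on OS X with -framework OpenCL. */
#include <stdio.h>
#include <OpenCL/opencl.h>   /* use <CL/cl.h> on other platforms */

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    cl_bool has_images = CL_FALSE;
    char name[256] = "unknown";

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no OpenCL GPU device found\n");
        return 1;
    }

    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    clGetDeviceInfo(device, CL_DEVICE_IMAGE_SUPPORT,
                    sizeof(has_images), &has_images, NULL);

    printf("%s: image support: %s\n", name, has_images ? "yes" : "no");
    return 0;
}
```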

Could they provide an OpenCL renderer for CoreImage? Absolutely. But otool output is not evidence that it's there yet.

Anyway, if someone just provides me with a single reference from Apple that says "yes, we use OpenCL here to accelerate such and such" I'll be happy to accept that it's used. For now I don't believe it.
 
What great news. I couldn't believe my eyes when I first read the title of this article. These AMD chips have always felt wrong to me. With Ivy Bridge (tock cycle = refined, perfected), a bigger display and now an nVidia chip, these upcoming MacBook Pros / Airs are shaping up to be absolute perfection!
 
Why has no one else thought about the possibility that Apple may want to bring 3D displays? After all, Apple has always had media pros in mind, and 3D is a rising trend in the industry. Plus, maybe with 3D, Blu-ray is back in the game!
 
The next-gen AMD GPU isn't the 6xxx Northern Islands series, in case you haven't noticed. Rather, AMD now joins nVidia with a compute/graphics-balanced floating-point GPU. It's new, and no one here knows whether it is still more power efficient than nVidia's reworked Fermi on 28 nm.

And you would be a troll to claim all nVidia chips are more power hungry than AMD's. Some are actually good buys.

Whether nVidia or AMD shows up, we won't know till 2012. Even if Apple picks one now, they can change their minds later on. At least for the iMacs; after all, it's just an MXM swap if one company screws up. If the MBP takes unusually long to show up with unexpected graphics, the same might be true there.

*sigh* I'm fully aware of this, and I even stated examples of decent current-gen NVidia cards in my post. I'm a comp. science major after all :p The 7XXX series, AKA Southern Islands (originally Northern Islands; not sure why they flipped it), is for all intents and purposes slated to be the equivalent of going from the 4XXX to the 5XXX series. NVidia's 28nm part (6XX) is called Kepler, and to be quite honest it's being touted the same way the 4XX series was, so I'm hesitant to buy into its hype.

The fact of the matter is that, presently, AMD has far more experience in manufacturing/designing 40nm-and-below circuits, which is why the 5XXX series so completely and utterly demolished the 4XX series; the 4XX series was more expensive, slower, hotter and more power hungry when compared with the contemporary AMD card aimed at the same market. Even though the 5XX series improved upon the 4XX series, a refined Fermi if you will, AMD still continues to dominate each specific market point. The only place where NVidia is winning, an area that they have been desperate to keep winning at all costs by having GPUs twice the size of AMD's offerings and cramming MOAR transistors in, is the ultra high end disregarding price (if you think an extra $200 is worth an extra 10% performance).

In the mobile market, the difference is even more apparent. The GTX580m has a TDP of 100 watts (that's desktop quad core CPU territory!), while the 6970m has a 45-55 watt TDP for 90% of the performance. Overclock anyone? 6990m anyone? NVidia's offerings in the mobile chipset market are even more laughable than what was offered in the desktop 4XX series. If they want to experiment with NVidia in the iMacs and Mac Pros by offering support for those cards alongside AMD cards, then go ahead; choice is good. IF, however, they want to have a stable, power efficient and powerful offering in their MBP line (something far more static in design), then they should stick with AMD. I just want to be clear; I have no horse in this race, and merely want the best product to be offered. My desktop GPU setup has been (in this order) 9800GTX+, 4870, 5870, 5970, GTX460, 6970 (across multiple systems of course). If someone wants a budget system with a GPU around the $100 mark, you can't pass up a GTX460; it's a brilliant card. But that, unfortunately, is the exception, not the rule.

In summary, Apple's iMac/Mac Pro line should support both for maximum versatility, while their mobile/Mac Mini lines should stick with AMD for the foreseeable future (here's hoping Apple puts a 6990m in an iMac speed bump; that would be a momentous leap forward for that form factor).
 
fpsBeaTt, none of what you said matters.

Why?

Because AMD/ATI's drivers are still the absolute WORST. They are literal JUNK. I have never seen a truly stable AMD/ATI GPU driver. Every AMD/ATI GPU I have ever tried, from the old Rage series in the 90s to the recent 6000 series releases, has had some sort of crippling issue.

What's more important? Using less power, or being able to actually play your games at full quality, full speed, etc. the day they're released and not have to wait for multiple driver fixes from AMD to finally get things right? Like with Rage.

Even AMD's chipset drivers are iffy.

So no matter what advantage AMD/ATI GPUs may have (and that's usually personal opinion, not fact, especially when you look at current benchmarks of current nvidia and AMD GPUs), it doesn't matter, because AMD can't write drivers to save their own lives.

Every time I hear Radeon fans going on about how the drivers have improved, I give Radeon GPUs a shot to see. I mean, the Radeon GPU in my Xbox 360 works fantastically. Clean install of Windows, try some games... it always ends up going back to Fry's within a few days of purchase because of some stupid bug.

My all-time favorite AMD/ATI driver bug was a few years ago. Tried playing GTA3 and Vice City, and neither game had road textures. So you were basically driving/walking on air. Freakin hilarious. That happened across a couple of different ATI GPUs with driver revisions a year apart. When I searched forums, the ATI fans were blaming Rockstar. Funny, that didn't happen on my nvidia GPUs. Or even the integrated Intel GPU I had in my first MacBook.

So when AMD/ATI can write a GPU driver that's even half as good as nvidia drivers, then we can discuss them being on an equal playing field. Until then, nvidia makes the better product regardless.
 
So when AMD/ATI can write a GPU driver that's even half as good as nvidia drivers, then we can discuss them being on an equal playing field. Until then, nvidia makes the better product regardless.

Nvidia Quadros are also significantly more popular than FirePro cards, with most of that coming down to the drivers. Even if CUDA were a complete non-issue, I doubt they'd be eclipsed by AMD in popularity there. If anyone brings up price, all workstation cards are expensive. Their pricing is, in theory, mostly a matter of driver development time, quality control, and sometimes more VRAM.


In the mobile market, the difference is even more apparent. The GTX580m has a TDP of 100 watts (that's desktop quad core CPU territory!), while the 6970m has a 45-55 watt TDP for 90% of the performance. Overclock anyone? 6990m anyone?

Are we talking idle or under load here? Every article and test I can find shows the 6970m around 100W. It's essentially built from underclocked desktop parts. It works fine in the iMac, but Apple wouldn't use that in a laptop.
 
http://www.computerbase.de/news/2011-11/details-zu-anstehenden-28-nm-grafikloesungen-von-amd/

http://www.computerbase.de/news/2011-11/details-zu-anstehenden-28-nm-grafikloesungen-von-nvidia/

Mobile HD 7000 and 600M information has just sprung up. nVidia might be going with optical shrinks of Fermi to 28nm. The HD 7000 list shows true Southern Islands parts in 28nm, at least based on the code names, but I have also seen slides for relabeled HD 6000M parts paired with Llano under the HD 7xxx moniker. It is a mess.
 
fpsBeaTt, none of what you said matters.

Why?

Because AMD/ATI's drivers are still the absolute WORST. They are literal JUNK. I have never seen a truly stable AMD/ATI GPU driver. Every AMD/ATI GPU I have ever tried, from the old Rage series in the 90s to the recent 6000 series releases, has had some sort of crippling issue.

What's more important? Using less power, or being able to actually play your games at full quality, full speed, etc. the day they're released and not have to wait for multiple driver fixes from AMD to finally get things right? Like with Rage.

Even AMD's chipset drivers are iffy.

So no matter what advantage AMD/ATI GPUs may have (and that's usually personal opinion, not fact, especially when you look at current benchmarks of current nvidia and AMD GPUs), it doesn't matter, because AMD can't write drivers to save their own lives.

Every time I hear Radeon fans going on about how the drivers have improved, I give Radeon GPUs a shot to see. I mean, the Radeon GPU in my Xbox 360 works fantastically. Clean install of Windows, try some games... it always ends up going back to Fry's within a few days of purchase because of some stupid bug.

My all-time favorite AMD/ATI driver bug was a few years ago. Tried playing GTA3 and Vice City, and neither game had road textures. So you were basically driving/walking on air. Freakin hilarious. That happened across a couple of different ATI GPUs with driver revisions a year apart. When I searched forums, the ATI fans were blaming Rockstar. Funny, that didn't happen on my nvidia GPUs. Or even the integrated Intel GPU I had in my first MacBook.

So when AMD/ATI can write a GPU driver that's even half as good as nvidia drivers, then we can discuss them being on an equal playing field. Until then, nvidia makes the better product regardless.

Wow, I actually don't think I've seen a post more transparently BS in my life; the only issue I've EVER had with AMD/ATI drivers, and I stress (ever) in my entire experience was with a couple of games not utilising the 5970's internal crossfire (MW2 and WaW). Other than that, no problems whatsoever; they've been completely stable in both Windows and OS X for me, as well as on my friend's iMac with his 6970m and my younger brother's 5850m. If anything, I've had more kernel panics due to NVidia driver conflicts than anything else. In Windows XP alone, 70% of major bugs were due to NVidia driver conflicts. Quite frankly, I know from my own experience, and the experience of numerous others, (and a friend of mine that has two GTX 580s, who's had what you describe as an AMD/ATI experience with them, along with another that had two GTX 260s), that based solely on that post, you're full of crap.

It depends on the system and many other variables that need to be taken into account; if you were one of the unlucky few who's had nothing but trouble with AMD/ATI cards, well it sucks to be you. But from the vast majority of people I've conversed with, read about on forums and know personally, the statistically prevalent problems are NVidia driver based. I myself am right now using a 9800GTX+ as an interim card, and the drivers are complete and utter crap; I've had that many conflicts with them, it isn't funny (and before you spout off some BS about it being a conflict with my AMD chipset, I'm using an X58 board). The only person I can think of with an NVidia setup that hasn't had any noticeable trouble yet is my friend who used to have the two GTX 260s, and is now using two GTX 460s.

So, a word of advice: if you're going to spout completely biased crap on a public forum, be prepared to have those assertions squashed by an expert on the subject.
 
ATI/AMD vs nVidia = My opinion is that both are better than Intel and I wish Intel would stop wasting space for their substandard graphics solutions.

From past experience, I've had more nVidia parts die on me, and I'm not happy with their mobile parts, but I'm just stating this from the perspective that I haven't had any ATI parts die on me while equivalent nVidia parts did. But those weren't chips that blew up; it was the stock cooling solution. The fan ripped itself apart. That probably says more about the OEMs being too cheap.

Here's an observation regarding parts suppliers. Can you name all the hardware products Apple makes and tell me what differentiates them?

iPod, iPhone, iPad = Portable touch computers
MacBook Air, MacBook Pro = Laptops (the Air uses SSDs and no media drive)
iMac = All-in-One desktop
MacMini = Desktop
MacPro = Workstation
Apple TV = Media Extender

With the laptops and the iMac the only significant difference is the screen size. Even then they're still one brand.

Now, off the top of your head, name any of Apple's competitors' equivalents; go ahead, even look at their websites (look at Samsung and Sony in particular). These companies produce too many slight variations of the same thing.

Apple's stuff has model numbers, but they're not propped up in marketing materials; the model numbers mean absolutely nothing. "MacBook Air, Late 2011" has more meaning.
 

On your comment about Intel, this is where I feel AMD has a huge advantage for SoC integration. AMD's Fusion platform is on course for rapid ascension; they have the expertise in both CPU and GPU manufacturing, and are using that unique position to their advantage.
 
Who cares.

I see no benefits or drawbacks from the move if NVIDIA can continue to provide a reliable product. I cringe every time I see a pre-unibody MBP with a GeForce 8600...a ticking time bomb.
It's a drawback for Boot Camp users: NVidia drivers have had issues that cause audio dropouts for years, and NVidia has failed to fix them.
 
I'm fine with AMD but I think they should stick with AMD since NVIDIA caused a lot of ****storm with the 8600M GT chip'd MacBook Pro. It would be nice if they kept both, but I don't think that would be possible, since Steve wanted as few SKUs as possible. AMD does a better job in graphics though.
 
Wow, I actually don't think I've seen a post more transparently BS in my life; the only issue I've EVER had with AMD/ATI drivers, and I stress (ever) in my entire experience was with a couple of games not utilising the 5970's internal crossfire (MW2 and WaW). Other than that, no problems whatsoever; they've been completely stable in both Windows and OS X for me, as well as on my friend's iMac with his 6970m and my younger brother's 5850m.

Earlier in the thread you state your first desktop GPU was a GeForce 9800GTX. So you have very little real world experience and you missed out on ATI's worst driver releases over the last decade.

So your "ever" statement doesn't hold much water when talking to someone like me, who has been in this game since the mid 90s.

In Windows XP alone, 70% of major bugs were due to NVidia driver conflicts.

If that statement is even REMOTELY true, it shouldn't be difficult for you to post a link to a respectable source to back that up.

Now let's talk about some real world stuff. As I've stated many times, I've built hundreds of systems over the last decade and a half. Thanks to nvidia's driver support, every system I built in the late 90s was able to upgrade to Windows XP. And because of the components I chose, all of those systems are still usable today for basic browsing and basic "Office" work. Not a single one has had an issue related to the GPU.

I know from my own experience, and the experience of numerous others, (and a friend of mine that has two GTX 580s, who's had what you describe as an AMD/ATI experience with them, along with another that had two GTX 260s), that based solely on that post, you're full of crap

Again, you're trying to tell someone who has built hundreds of systems over the course of over a decade that something is wrong with a product that they have had a 100% success rate with.

If your friend is really having driver issues like I described with AMD/ATI drivers, then it shouldn't be too difficult for them to register for this forum and start posting screen shots.

It depends on the system and many other variables that need to be taken into account; if you were one of the unlucky few who's had nothing but trouble with AMD/ATI cards, well it sucks to be you.

If you "read on forums" like you say, you'll see what you say is completely opposite of what actually does happen on forums. Head over to some hardware enthusiast forums like Futuremark. Even the diehard Radeon fanboys openly discuss severe driver issues that simply don't happen with nvidia. Of course, they'll blame everyone but AMD, but they still admit reality.

the statistically prevalent problems are NVidia driver based.

Again, 100% success rate here. Dating all the way back to the original RivaTNT 16MB card. First with the manufacturer drivers then nvidia's own very first Detonator driver release.

I myself am right now using a 9800GTX+ as an interim card, and the drivers are complete and utter crap; I've had that many conflicts with them, it isn't funny (and before you spout off some BS about it being a conflict with my AMD chipset, I'm using an X58 board).

I have an AMD chipset, nvidia GPU, and my old trusty Chaintech AV710 soundcard running on Windows 7 x64. The AV710 even has the high-res alternate output with the higher quality DAC running perfectly, just like it did 6 years ago in Windows XP. If you're an "expert" as you say, you'll know what that means. I have absolutely zero driver issues. My system is absolutely rock solid.

If you are legitimately having in-game issues with the 9800GTX+, it shouldn't be difficult at all for you to post screenshots. IF you are having problems, my guess is you didn't do a fresh install of Windows when going from AMD to nvidia.

So, a word of advice: if you're going to spout completely biased crap on a public forum, be prepared to have those assertions squashed by an expert on the subject.

Yeah, an expert who admits their first real GPU experience began with a GPU released barely over 3 years ago. Trying to tell someone who has had a 100% success rate with a certain manufacturer over the course of hundreds of systems since the late 90s.

It's a drawback for Boot Camp users: NVidia drivers have had issues that cause audio dropouts for years, and NVidia has failed to fix them.

How would nvidia drivers cause that? My unibody MacBook has a GeForce 9400M and Realtek soundcard. The two have absolutely nothing to do with each other in regard to drivers, and I have 0 sound issues in either Windows 7 (or XP or Vista since my MacBook has run all of them natively at one point) or Mac OS X. The two other Macs I owned also had Realtek soundcards.

I'm fine with AMD but I think they should stick with AMD since NVIDIA caused a lot of ****storm with the 8600M GT chip'd MacBook Pro.

That was Apple's piss poor cooling design that caused the nvidia chips to fail. Apple's cooling design has caused a lot of problems over the years. Case cracking and discoloration for me on the plastic MacBooks, as well as hinge warping on any of the unibody systems. I've also seen pre-unibody Macs (PowerBook, MacBook Pro) warp due to excessive heat not being properly dealt with, thanks to Apple's cooling design.

AMD does a better job in graphics though.

Got proof?
 

You demand proof, yet you provide none yourself. Good job. Really, I can't be bothered arguing this as, simply put, I have better things to do than prove a point to some random on the internet who doesn't know what he's talking about. I chuckled at every point you made, in the same way I chuckle at creationists (not saying the same disassociation with reality exists, but it's close). Long story short, most of NVidia's offerings are overpriced crap atm, with AMD snagging most of the consumer market; credence was lent to this only today by a friend informing me that a GTX 540 being used for a multimedia machine had a dud MPEG-2 encoder/decoder. Anyway, thread closed.

----------

That was Apple's piss poor cooling design that caused the nvidia chips to fail. Apple's cooling design has caused a lot of problems over the years. Case cracking and discoloration for me on the plastic MacBooks, as well as hinge warping on any of the unibody systems. I've also seen pre-unibody Macs (PowerBook, MacBook Pro) warp due to excessive heat not being properly dealt with, thanks to Apple's cooling design.

I'm sorry, but I can't contain myself. I have to at LEAST respond to this vitriol, which is wrong on so many levels.

1. That "piss poor cooling design" is the same one they used for the ATI card that preceded it, with the same TDP. Hurrr.

2. Case cracking etc. happened on plastic MacBooks, which at that time didn't have a discrete GPU in them.

3. Never had any problem on my 2006-2008 MacBook Pros in terms of heat.

4. The problem wasn't caused by heat, but by a design flaw that every other OEM experienced. This is common knowledge to all but the most moronic NVidia fanboys.

5. Again, Apple's cooling solution worked JUST FINE for every other GPU within that enclosure.
 
This is not so good for the current owners of Macs with AMD GPUs, because it means that AMD driver support would get worse. :confused:
 
Earlier in the thread you state your first desktop GPU was a GeForce 9800GTX. So you have very little real world experience and you missed out on ATI's worst driver releases over the last decade.

So your "ever" statement doesn't hold much water when talking to someone like me, who has been in this game since the mid 90s.

The talk was about the current time.
I agree that several years ago AMD sucked, and my choice was NVidia.
But then NVidia thought they were king of the hill and became more passive,
while AMD used all their resources to overtake them.
Currently, my choice is AMD; today, they provide more power for the buck.


That was Apple's piss poor cooling design that caused the nvidia chips to fail. Apple's cooling design has caused a lot of problems over the years. Case cracking and discoloration for me on the plastic MacBooks, as well as hinge warping on any of the unibody systems. I've also seen pre-unibody Macs (PowerBook, MacBook Pro) warp due to excessive heat not being properly dealt with, thanks to Apple's cooling design.

That is not Apple's fault! That is the fault of NVidia,
who used the wrong proportions of compound in the solder.
So it is impossible to fix the affected machines without replacing the GPU
(and the replacement part must be from a new batch).


I am not a fan of AMD, and not a fan of NVidia.
I just choose whatever is better at the moment.
When the situation changes in NVidia's favor, I will switch again.
 
This is not so good for the current owners of Macs with AMD GPUs, because it means that AMD driver support would get worse. :confused:

Why are you digging up an article from Nov 21, 2011?

Btw, I doubt a return to nVidia, given that there is no relationship between Intel and nVidia. Ivy Bridge is shown to have massive improvements in GPU performance, I doubt we'll see discrete graphics appear in the MacBook Air anytime soon, and AMD has consistently had the performance/price/power edge over nVidia for the last few years. nVidia more or less no longer cares about its traditional markets - it's throwing some bones at the PC vendors, but all its focus these days is on its 64-bit ARM platform and SoC division.
 
BS. Apple is committed solely to OpenCL, just like AMD. CUDA is not part of Apple's future and Nvidia knows it. Nvidia's implementation of OpenCL on the GPGPU and CPU lags seriously behind Apple and AMD.

This report is bunk.

You might be interested to know that OpenCL routines in Apple's Quartz Composer framework only work on Nvidia cards. ATI has yet to get one card on the table, so all of the latest two releases of MBPs, iMacs and Mac Pros with ATI cards can't run OpenCL in QC. It's possible to write kernels outside QC and target them to specific ATI cards, I believe.

----------

I don't understand this move.

AMD has absolutely destroyed nVidia in the graphics segment with everything but the high-end gaming graphics that requires a dedicated gaming laptop.

The 6xxx series trounces the nVidia 5xx series.

The 7xxx series is going to make an even bigger jump than the 5xxx to 6xxx jump. 28nm core, completely new architecture.

Unless nVidia's new architecture is a vast improvement over what they currently have, I don't get this move.

Still don't know anybody who has had a single OpenCL kernel run in the Quartz Composer framework on an ATI card, despite the years they've had to get it together. Nvidia has supported OpenCL in the QC framework for the last couple of generations.

Not saying which cards are 'better' or 'worse' since that requires a more nuanced comparison.
 
If CoreImage is using OpenCL, then the APIs use OpenCL.

We can argue about how much OpenCL is used in CoreImage or about where else it is used, but due to where the technology is implemented I think it is fair to say apps do benefit from OpenCL.

So are you actually asserting that CoreImage leverages OpenCL, or just waving the idea around?

CoreImage leverages the GPU and always has done since Tiger; that has nothing to do with OpenCL. That was the point of it: providing a framework for optimised 2D graphics chains that run on the GPU to speed performance. OpenCL kernels need to be written, and usually the code needs to be tailored per target (GPU) card. Guess what: ATI cards FAIL in Quartz Composer and always have done since OpenCL came about.
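
To illustrate the "tailored per target card" point, here's a hedged sketch in C of how an OpenCL program gets compiled at run time against whichever device is present; the trivial kernel source and the build flag are made-up examples, not anything Quartz Composer actually does:

```c
/* Sketch: OpenCL kernels ship as source and are built per device at run time,
 * which is exactly where per-card tailoring (and per-card breakage) happens. */
#include <stdio.h>
#include <OpenCL/opencl.h>   /* use <CL/cl.h> on other platforms */

static const char *src =
    "__kernel void fill(__global float *out) {"
    "    out[get_global_id(0)] = 1.0f;"
    "}";

cl_program build_for_device(cl_context ctx, cl_device_id dev)
{
    cl_int err;
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    if (err != CL_SUCCESS)
        return NULL;

    /* Build options could be varied per vendor/card; -cl-fast-relaxed-math
     * is a real flag used here purely as an example. */
    if (clBuildProgram(prog, 1, &dev, "-cl-fast-relaxed-math", NULL, NULL)
            != CL_SUCCESS) {
        /* Fetch the device-specific build log to see why this card failed. */
        char log[4096] = "";
        clGetProgramBuildInfo(prog, dev, CL_PROGRAM_BUILD_LOG,
                              sizeof(log), log, NULL);
        fprintf(stderr, "build failed on this device:\n%s\n", log);
        clReleaseProgram(prog);
        return NULL;
    }
    return prog;
}
```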
 
I just stumbled on this article and haven't read all of the thread. I am saving up to purchase a MacBook Pro. PLEASE, I hope by the time I purchase one it won't have an NVIDIA graphics chip. THEY RUN TOO hot. I got a killer i7 PC laptop with an nvidia card, a GTX 360M; it still works pretty well, but after about a year and a half it started having thermal shutdowns. Yes, I have done the usual: re-paste, clean the fans, laptop cooler, etc. The damn thing idles at 80C, and that's light web browsing. I also had another laptop that was completely unusable after 14 months... the nvidia chip got so hot it unseated itself from the board. In fact, when they do warranty repair on these, they just reflow the board, so that's why mine lasted 14 months: 1 repair, and once it's back it's out of warranty. I think some of the Macs had the same issue, am I not correct?
 
re: nVidia chips running too hot?

Honestly, I think your experience simply reflects problems seen with SOME nVidia chips. The 8600M series was the most notorious for having problems (and was used in many laptops, including some MacBook Pros and various Dell, Toshiba, HP, and other systems).

I don't know that anyone has determined that nVidia GPUs run too hot across the board. I own a mid-2010 vintage 17" MacBook Pro with the nVidia GT330M video chipset in it and really haven't experienced any problems with it to date. (I remember reading a couple of reviews where overheating problems were discussed with this particular system configuration, but it was determined to be caused by the Core i7 processor running at 100% for extended periods, not anything with the nVidia video itself. The identical machine with a Core i5 didn't have the issues.)


 
The real question is: which manufacturer currently makes the fastest chips with a TDP of between 15 and 27 watts?

The answer is AMD, by a country mile (and has been for several chip generations).

This basically means that unless nVidia has something very special up its sleeve, a change to nVidia notebook chips would mean lower performance.
 