I have a 2010 baseline MacBook Pro w/ nVidia 320M and I can't complain. Maybe I'm just a lucky man, but I have never experienced *ANY* problem with their graphics cards. However, I can't say the same about ATI...
 
The Nvidia GPUs in the i5 MacBook Pro (330M series) had some heat problems. One of my friends with this kind of MacBook Pro says his MacBook will kernel panic if the GPU gets too hot.

Tell your friend to take it in for repair then... it's still under AppleCare and they will swap out the logic board in no time.
 
Change the "256MB" options to 512MB and the 1GB to 2GB and I'd be much happier.
It is a balance between available PCB space for VRAM and Apple's product tiers. Apple offers 512MB/1GB of VRAM while others are cramming 1.5GB/3GB onto every card possible. Even entry-level models are bloated with VRAM. It is DDR3, though...
 
ATI vs. Nvidia.

Six of one, half dozen of the other.

I'd say five of one (ATI), and a baker's half dozen of the other (Nvidia).

With the singular exception of Vista beta and early release drivers, I've had much better luck with Nvidia drivers than with ATI's.

However, this anecdote is nonsense - because I had simply stopped purchasing systems with ATI graphics. All of my client systems (around 100) had Nvidia graphics - so of course any Vista beta issues would be in Nvidia drivers.

Currently no issues.


Perhaps because it was a "chip on Steve's shoulder" decision and with him gone, we'll begin to see more unbiased decisions.

For the sake of Apple shareholders, one should hope so. (ouch - down $50 in the last month...)


I don't know why Apple doesn't just buy AMD and design their own chips like they do with the ARM chips. It seems to work well with the iPad/iPhone. The market cap for AMD is approx $4bn compared to Intel's market cap of $125bn.

Apple doesn't really design ARM CPUs. Apple buys the CPU design from ARM, and builds custom silicon that incorporates the ARM-designed processors along with Apple's "special sauce".


Then again, Steve could have said no more nVidia after what happened the last time...

Steve who? It's Tim now.


Apple should just buy AMD and be done with it; they'd have the means to design and build ARM processors and ATI graphics. Sadly, if they bought AMD, they would also lose the license from Intel to make desktop/notebook processors :mad:

Apple doesn't make any desktop/notebook processors.
 
I doubt this will ever happen, but if they would ever support the Quadro line of cards I would be extremely happy about this move. However, given that I doubt I would ever see this line of cards in a MBP, my preference is Nvidia, as I have never had issues on a desktop or laptop using Nvidia. I've had a slew of ATI issues, with driver compatibility problems in the 3D apps that I use, etc...
 
Perhaps because it was a "chip on Steve's shoulder" decision and with him gone, we'll begin to see more unbiased decisions.

Right, because Steve wouldn't have known that the company was going to do a massive GPU switch before he passed. This idea just popped into Tim Cook's head yesterday. :rolleyes:
 
Well Core Image certainly links against OpenCL.

Sorry, but OpenCL is not embedded in Apple's lowest-level graphics libraries.
Is CoreImage too high a level for you?

Do an otool -L CoreImage | grep OpenCL
Do the same for QuartzComposer
There are some portions of Apple's graphics APIs that are GPU accelerated (compositing in Quartz, for example) or even accelerated using GPGPU (CoreImage, for example). But none of the APIs use OpenCL.
You contradict yourself here. If CoreImage is using OpenCL, then the APIs use OpenCL. Obviously the APIs abstract away the specifics of what is being done, but the fact is CoreImage uses OpenCL if it is available. Frankly, I can't understand how you can make the statement above. CoreImage is a low-level component of Apple's software stack and thus is used by the higher-level elements in the stack.

We can argue about how much OpenCL is used in CoreImage or about where else it is used, but given where the technology is implemented I think it is fair to say apps do benefit from OpenCL.
Almost no serious applications use OpenCL yet.
I have to disagree here too. Apps do use OpenCL through Apple libraries, as demonstrated above, however minor that may be at the moment. I suspect, though, that what you are getting at is apps linking directly against OpenCL and using the OpenCL API. There, adoption of the technology is only taking place in apps where it has a direct and noticeable impact, that is, apps that can exploit the SIMD-like nature of GPUs.

The problem is I don't think a lot of people understand where OpenCL-type acceleration is valuable. It won't be used in a text editor, spreadsheet, or other apps dealing with non-parallel data. Could OpenCL be used to a greater extent in general apps? Possibly, but it will never be widespread.
A notable counterexample is Final Cut Pro X, which lists OpenCL support among its requirements.

Which supports my point: an app needs to have a data set amenable to processing on a GPU in the first place.

This whole discussion is senseless and makes about as much sense as arguing about the use of an encryption processor on a CPU. Obviously that sub-unit has a specific function at which it excels; you would not expect apps that don't need it to use the encryption processor anyway, would you? So too with GPGPU computing.
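For anyone who wants to see whether a particular app links OpenCL directly rather than just inheriting it through Apple's frameworks, the same otool trick works on the app binary itself. Just a sketch; the Final Cut Pro X path below is from memory and may differ on your install:
Code:
$ otool -L "/Applications/Final Cut Pro.app/Contents/MacOS/Final Cut Pro" | grep OpenCL
If the OpenCL framework shows up in the output, the app itself links it; if nothing comes back, any OpenCL use would be coming indirectly through the system libraries.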
 
I'm not looking forward to this move, based on previous experiences with Macs I've owned:

The first unibody 15" MBP had an issue with the Nvidia card where it would flicker when in performance mode.

The 13" MBP, 2-3 generations ago with the integrated 330m also had an issue when running dual displays. The effect was both screens would turn a colour (normally white) and you'd need to to a hard reboot to fix.

With AMD I've had no issues.
 
I doubt this will ever happen, but if they would ever support the Quadro line of cards I would be extremely happy about this move.

Already supported:


http://store.apple.com/us/product/H3314LL/A

NVIDIA Quadro 4000 for Mac

Excellent graphics performance for complex applications.

The NVIDIA Quadro 4000 for Mac is the professional graphics solution with best-in-class performance across a broad range of design, animation and video applications. With 256 CUDA cores and 2GB of high-speed memory, it's ready to tackle the most demanding visual and compute applications. Fast double precision, a unique Quadro feature, enables high accuracy and fast results critical for many CAE or Scientific applications.

•Supercharge your CUDA based applications

•2GB of high-speed memory

•Enable a robust 3D Stereoscopic viewing experience

•Fast double precision, a unique Quadro feature​


----------

Right, because Steve wouldn't have known that the company was going to do a massive GPU switch before he passed. This idea just popped into Tim Cook's head yesterday. :rolleyes:

Another "Steve was a superhero who worked 16 hour days up until the day he died (when he only worked an 8 hour day because he passed early-afternoon)" post.... :rolleyes:

Enough of the idle¹ worship. This is a rumour about a system probably many months away from shipping. These decisions are based on long term roadmaps, not on ideas that "pop into heads" or a narrow look at current offerings. Apple has probably debated "Nvidia vs. ATI" for every system that they've introduced.

(Note that Apple announced the switch to Intel during the Pentium 4 "Netburst" days - Apple wasn't switching to Netburst, Apple was switching to Intel's roadmap across workstation/desktop/mobile products.)

¹ not a typo
 
If this rumor is true, I hope Apple doesn't remove the optical drive from the 13" MBP just to make space for a dedicated Nvidia GPU.

The optical drive is utterly useless. But an Nvidia GPU is even worse. I've had nothing but bad experiences with them.
 
This is just fun to watch.

AMD is amazing. AMD sucks. NVidia is amazing. No, NVidia is terrible.

It's a fight to the bitter end.
 
Another "Steve was a superhero who worked 16 hour days up until the day he died (when he only worked an 8 hour day because he passed early-afternoon)" post.... :rolleyes:

Enough of the idle¹ worship. This is a rumour about a system probably many months away from shipping. These decisions are based on long term roadmaps, not on ideas that "pop into heads" or a narrow look at current offerings. Apple has probably debated "Nvidia vs. ATI" for every system that they've introduced.

(Note that Apple announced the switch to Intel during the Pentium 4 "Netburst" days - Apple wasn't switching to Netburst, Apple was switching to Intel's roadmap across workstation/desktop/mobile products.)

¹ not a typo
Power Mac G5 developer kits for the Xbox 360? Power Mac G5 cases with Pentium 4 innards and Intel GMA 900?

Human sacrifice! Dogs and cats, living together...


I don't necessarily think nVidia is bad - I remember the 8800GTX days when nVidia completely dominated ATI and ATI had no response for years... but ever since ATI/AMD released the 4xxx series, it has been king, and it seems that nVidia has put most of its focus on processors for tablets and phones nowadays.
G80 and G92 were king until ATI released the HD 4800 line. Even at that point the Radeons were on par with nVidia's massive GT200 chips. Don't even look at the prices an 8800 Ultra could fetch in those days...

The HD 5800 Series brought price increases (x2 in some cases) with some gains over the 4870/4890 lineup. They can hold their own now in the GPU market.
 
This is just fun to watch.

AMD is amazing. AMD sucks. NVidia is amazing. No, NVidia is terrible.

It's a fight to the bitter end.

I don't necessarily think nVidia is bad - I remember the 8800GTX days when nVidia completely dominated ATI and ATI had no response for years... but ever since ATI/AMD released the 4xxx series, it has been king, and it seems that nVidia has put most of its focus on processors for tablets and phones nowadays.
 
AMD graphics= Fast but not good on battery
NVIDIA graphics= OK and OK on battery
Intel graphics= SUCKS but the best on battery :D
 
Is CoreImage too high a level for you?

Do an otool -L CoreImage | grep OpenCL
Do the same for QuartzComposer

For the people that are afraid of the command line, here's the output:
Code:
$ otool -L CoreImage|grep OpenCL
	/System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL (compatibility version 1.0.0, current version 1.0.0)
Code:
$ otool -L QuartzComposer|grep OpenCL
	/System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL (compatibility version 1.0.0, current version 1.0.0)
 
Is CoreImage too high a level for you?

Do an otool -L CoreImage | grep OpenCL
Do the same for QuartzComposer

You contradict yourself here. If CoreImage is using OpenCL, then the APIs use OpenCL. Obviously the APIs abstract away the specifics of what is being done, but the fact is CoreImage uses OpenCL if it is available. Frankly, I can't understand how you can make the statement above. CoreImage is a low-level component of Apple's software stack and thus is used by the higher-level elements in the stack.

We can argue about how much OpenCL is used in CoreImage or about where else it is used, but given where the technology is implemented I think it is fair to say apps do benefit from OpenCL.

I have to disagree here too. Apps do use OpenCL through Apple libraries, as demonstrated above, however minor that may be at the moment. I suspect, though, that what you are getting at is apps linking directly against OpenCL and using the OpenCL API. There, adoption of the technology is only taking place in apps where it has a direct and noticeable impact, that is, apps that can exploit the SIMD-like nature of GPUs.

The problem is I don't think a lot of people understand where OpenCL-type acceleration is valuable. It won't be used in a text editor, spreadsheet, or other apps dealing with non-parallel data. Could OpenCL be used to a greater extent in general apps? Possibly, but it will never be widespread.


Which supports my point: an app needs to have a data set amenable to processing on a GPU in the first place.

This whole discussion is senseless and makes about as much sense as arguing about the use of an encryption processor on a CPU. Obviously that sub-unit has a specific function at which it excels; you would not expect apps that don't need it to use the encryption processor anyway, would you? So too with GPGPU computing.

I don't contradict myself by saying CoreImage uses GPGPU, because I never said it uses OpenCL. CoreImage is built on top of GLSL, not OpenCL. Where are you getting your information that CoreImage uses OpenCL in any substantial way? That it is linked with the library is not evidence.

It's used in Quartz Composer to help you develop OpenCL Kernels, from what I can gather.
 
I don't contradict myself by saying CoreImage uses GPGPU, because I never said it uses OpenCL. CoreImage is built on top of GLSL, not OpenCL. Where are you getting your information that CoreImage uses OpenCL in any substantial way? That it is linked with the library is not evidence.

It's used in Quartz Composer to help you develop OpenCL Kernels, from what I can gather.

I don't think you know why programmers link frameworks to their code... It's generally to use the framework.
 
I don't think you know why programmers link frameworks to their code... It's generally to use the framework.

Do you believe that CoreImage uses OpenCL in iOS 5 as well? Don't you think this is something Apple would have bragged about? After all, they'd be the first company ever to use OpenCL on a mobile device.

Code:
otool -L /Developer/.../iPhoneOS5.0.sdk/.../CoreImage | grep OpenCL
	/System/Library/PrivateFrameworks/OpenCL.framework/OpenCL (compatibility version 1.0.0, current version 1.0.0)

I guess all of iOS must benefit from OpenCL being utilized in the lowest level of the drawing APIs as well! The only trouble is, it's not!

So the fact that these libraries show up when using otool is not solid evidence that they are used in any substantial way.

I'm plenty willing to believe you that CoreImage uses OpenCL if you can provide me a source from Apple that says it does. Otherwise I'm inclined to believe what published sources say -- that it is built on top of GLSL.
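If anyone wants a check that goes a bit further than otool -L, listing the undefined symbols the binary actually imports gets closer to "uses" rather than merely "links". A rough sketch only, run from the same directory as the otool commands above; an empty result still wouldn't rule out indirect or dynamic use:
Code:
$ nm -u CoreImage | grep '_cl[A-Z]'
If entry points like _clCreateContext show up, the binary genuinely calls into the OpenCL API; if not, it is probably just carrying the load command.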
 
Easy. Nvidia likely made Apple an offer they couldn't refuse i.e. cheaper component pricing.

Or putting a dead horse's head on Tim's bed :p

Anyway... I don't like nVidia cards either; one failed on me once on a DESKTOP. Any fanboy telling you nVidia is more respected in the gaming community is talking total BS.

I never had an issue with a Radeon card. It's not the '90s, when the nVidia Riva TNT was a godly GPU compared to the ATI Rage (remember those old days, huh? :D)
 
So what happened to the licensing issue? Intel had strictly forbidden discrete graphics from being used; everyone had to use the HD 3000 built into Sandy Bridge. What changed? How come they are allowing Apple to use Nvidia chips all of a sudden? Did Intel change its license, did Apple persuade them, or are Intel and Nvidia suddenly friends again?

Not sure anything has changed... but I understand the licensing issue was over the DMI bus. OEMs could always attach whatever they wanted to the PCIe bus. With Intel's IO hub taking up so much space, and not being something you can get rid of, finding room and heat dissipation for a GPU on top of the IO hub and CPU was the issue. Plus, some of the i3s don't have many (or any) PCIe lanes to connect to. There was never a blanket ban; you could just only do it as an extra GPU.

What might be the interesting change is that Intel said this year they would be open to custom-fabbing SoCs based on Core. Apple could take a punt on a custom SoC that removes the need for the Intel IO hub and makes room for a dedicated GPU in all machines, or they could work with nVidia to do much the same thing (i.e. commit to buy) and offer an alternative Core iX platform.

But if there is any truth to the rumour, it's probably far less interesting and just a matter of switching out the dedicated GPUs in machines that already have them.

 
I know anti-glare is an option for MacBook Pro (it's my current screen). I really wish it was on more devices. I don't need "bright, vibrant colors" or whatever the glossy screen is supposed to provide. I just want to be able to use my computer in any room without having to close all the drapes first.

Fun fact: it isn't mentioned in the official specs, but MBPs with the anti-glare option are also lighter than their glossy counterparts. (The front of the screen on anti-glare models is plastic, and much lighter than the glass pane used on glossy models.) The difference isn't huge, but it is noticeable.
 
A lot of people don't think of Apple this way, but Apple cares so much about its bottom line that many of its decisions about GPU vendors come down to pricing as much as features.

When Apple is making hundreds of thousands (millions?) of these Macs, affordability scales pretty high here. I've personally been told this by high-level techs.

AMD makes a bid, NVIDIA makes a bid, and Apple picks the one with the most bang for its buck, basically.
 