I have a 2010 baseline MacBook Pro w/ the nVidia 320M and I can't complain. Maybe I'm just a lucky man, but I have never experienced *ANY* problem with their graphics cards. However, I can't say the same about ATI...
The nVidia GPUs in the i5 MacBook Pro (330M series) had some heat problems. One of my friends with this kind of MacBook Pro says his machine will kernel panic if the GPU gets too hot.
It is a balance between available PCB space for VRAM and Apple's product tiers. 512MB/1GB of VRAM while others are cramming 1.5/3GB onto every card possible. Even entry models are bloated with VRAM. It is DDR3, though... Change the "256MB" options to 512MB and the 1GB to 2GB and I'd be much happier.
ATI vs. Nvidia.
Six of one, half dozen of the other.
Perhaps because it was a "chip on Steve's shoulder" decision, and with him gone, we'll begin to see more unbiased decisions.
I don't know why Apple doesn't just buy AMD and design their own chips, like they do with the ARM chips. It seems to work well with the iPad/iPhone. AMD's market cap is approximately $4bn, compared to Intel's $125bn.
Then again, Steve could have said "no more nVidia" after what happened the last time...
Apple should just buy AMD and be done with it; they'd have the means to design and build ARM processors and ATI graphics. Sadly, if they bought AMD, they would also lose the license from Intel to make desktop/notebook processors.
Sorry, but OpenCL is not embedded in Apple's lowest-level graphics libraries.

Is CoreImage too high of a level for you?
There are some portions of Apple's graphics APIs that are GPU accelerated (compositing in Quartz, for example) or even accelerated using GPGPU (CoreImage, for example). But none of the APIs use OpenCL.

You contradict yourself here. If CoreImage is using OpenCL, then the APIs use OpenCL. Obviously the APIs abstract away the specifics of what is being done, but the fact is CoreImage uses OpenCL if it is available. Frankly, I can't understand how you can make the statement above. CoreImage is a low-level component of Apple's software stack and thus is used by the higher-level elements in the stack.
Almost no serious applications use OpenCL yet.

I have to disagree here too. Apps do use OpenCL through Apple libraries, as demonstrated above, however minor that may be at the moment. I suspect, though, that what you are getting at is the direct linking of apps to OpenCL and the use of the OpenCL API. Here the adoption of this technology is only taking place in apps where it has a direct and noticeable impact - that is, apps that can exploit the SIMD-like nature of GPUs.
A notable counterexample is Final Cut Pro X, which lists OpenCL support among its requirements.
I doubt this will ever happen, but if they would ever support the Quadro line of cards I would be extremely happy about this move.
Right, because Steve wouldn't have known that the company was going to do a massive GPU switch before he passed. This idea just popped into Tim Cook's head yesterday.
Another "Steve was a superhero who worked 16 hour days up until the day he died (when he only worked an 8 hour day because he passed early-afternoon)" post....

Power Mac G5 developer kits for the Xbox 360? Power Mac G5 cases with Pentium 4 innards and Intel GMA 900?
Enough of the idle¹ worship. This is a rumour about a system probably many months away from shipping. These decisions are based on long term roadmaps, not on ideas that "pop into heads" or a narrow look at current offerings. Apple has probably debated "Nvidia vs. ATI" for every system that they've introduced.
(Note that Apple announced the switch to Intel during the Pentium 4 "Netburst" days - Apple wasn't switching to Netburst, Apple was switching to Intel's roadmap across workstation/desktop/mobile products.)
¹ not a typo
I don't necessarily think nVidia is bad - I remember the 8800GTX days where nVidia completely dominated ATI and ATI had no response for years... but ever since ATI/AMD released the 4xxx series, it has been king, and it seems nVidia has put most of its focus on processors for tablets and phones nowadays.

G80 and G92 were king until ATI released the HD 4800 line, and even at that point the Radeons were on par with nVidia's massive GT200 chips. Don't even look at the prices an 8800 GTX Ultra could fetch in those days...
This is just fun to watch.
AMD is amazing. AMD sucks. NVidia is amazing. No, NVidia is terrible.
It's a fight to the bitter end.
Is CoreImage too high of a level for you?
Do an otool -L CoreImage | grep OpenCL
Do the same for QuartzComposer
$ otool -L CoreImage|grep OpenCL
/System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL (compatibility version 1.0.0, current version 1.0.0)
$ otool -L QuartzComposer|grep OpenCL
/System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL (compatibility version 1.0.0, current version 1.0.0)
Is CoreImage too high of a level for you?
Do an otool -L CoreImage | grep OpenCL
Do the same for QuartzComposer
You contradict yourself here. If CoreImage is using OpenCL, then the APIs use OpenCL. Obviously the APIs abstract away the specifics of what is being done, but the fact is CoreImage uses OpenCL if it is available. Frankly, I can't understand how you can make the statement above. CoreImage is a low-level component of Apple's software stack and thus is used by the higher-level elements in the stack.
We can argue about how much OpenCL is used in CoreImage, or about where else it is used, but given where the technology is implemented I think it is fair to say apps do benefit from OpenCL.
I have to disagree here too. Apps do use OpenCL through Apple libraries, as demonstrated above, however minor that may be at the moment. I suspect, though, that what you are getting at is the direct linking of apps to OpenCL and the use of the OpenCL API. Here the adoption of this technology is only taking place in apps where it has a direct and noticeable impact - that is, apps that can exploit the SIMD-like nature of GPUs.
The problem is I don't think a lot of people understand where OpenCL-type acceleration is valuable. It won't be used in a text editor, spreadsheet, or other apps dealing with non-parallel data. Could OpenCL be used to a greater extent in general apps? Possibly, but it will never be widespread.
Which supports my point: an app needs to have a data set amenable to processing on a GPU in the first place.
This whole discussion is senseless and makes about as much sense as arguing about the use of an encryption processor on a CPU. Obviously that sub-unit has a specific function at which it excels; you would not expect apps that don't need it to use the encryption processor, would you? So too with GPGPU computing.
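To make the "SIMD-like" point concrete, here is a minimal sketch in C of the kind of data-parallel job OpenCL rewards: one work-item per element, with no element depending on any other. The kernel name, array size, and scale factor are invented for illustration, error checking is omitted for brevity, and it assumes OS X's OpenCL framework (build with -framework OpenCL):

/* Hypothetical example: scale 1024 floats on the GPU.
   Every element is independent - exactly the shape of data GPGPU suits. */
#include <stdio.h>
#include <OpenCL/opencl.h>

static const char *kSource =
    "__kernel void scale(__global float *data, float factor) {\n"
    "    size_t i = get_global_id(0);  /* one work-item per element */\n"
    "    data[i] *= factor;\n"
    "}\n";

int main(void) {
    float data[1024];
    for (int i = 0; i < 1024; i++) data[i] = (float)i;

    /* Grab a GPU device and set up a context and command queue. */
    cl_device_id dev;
    clGetDeviceIDs(NULL, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Compile the kernel source at runtime. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "scale", NULL);

    /* Copy the array to the device, run one work-item per element, read back. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    float factor = 2.0f;
    clSetKernelArg(kern, 0, sizeof(buf), &buf);
    clSetKernelArg(kern, 1, sizeof(factor), &factor);
    size_t global = 1024;
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[3] = %f\n", data[3]); /* expect 6.0 */

    clReleaseMemObject(buf);
    clReleaseKernel(kern);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}

A text editor simply has no loop like that to hand the GPU, which is the point above.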
I don't contradict myself by saying CoreImage uses GPGPU, because I never said it uses OpenCL. CoreImage is built on top of GLSL, not OpenCL. Where are you getting your information that CoreImage uses OpenCL in any substantial way? That it is linked with the library is not evidence.
It's used in Quartz Composer to help you develop OpenCL kernels, from what I can gather.
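For what it's worth, the OpenCL patch in Quartz Composer takes raw kernel source. A trivial, hypothetical kernel of the sort you might paste in (the name and math are made up for illustration):

__kernel void brighten(__global float4 *pixels, float amount)
{
    size_t i = get_global_id(0);                         /* one work-item per pixel */
    pixels[i] += (float4)(amount, amount, amount, 0.0f); /* nudge RGB, leave alpha alone */
}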
I don't think you know why programmers link frameworks to their code... It's generally to use the framework.
otool -L /Developer/.../iPhoneOS5.0.sdk/.../CoreImage | grep OpenCL
/System/Library/PrivateFrameworks/OpenCL.framework/OpenCL (compatibility version 1.0.0, current version 1.0.0)
I have that video card in a Core i7 model. I never had a kernel panic, nor did the GPU ever get above 85°C.
Easy. Nvidia likely made Apple an offer they couldn't refuse i.e. cheaper component pricing.
So what happened to the licensing issue? Intel had strictly forbidden discrete graphics from being used; everyone had to use the HD 3000 built into Sandy Bridge. What changed? How come they are allowing Apple to use Nvidia chips all of a sudden? Did Intel change its license, did Apple persuade them, or are Intel and Nvidia suddenly friends again?
I don't give a rat's ass; as long as it's not an Intel one, I'm happy.

Pretty much.
I know anti-glare is an option for the MacBook Pro (it's my current screen). I really wish it were available on more devices. I don't need "bright, vibrant colors" or whatever the glossy screen is supposed to provide. I just want to be able to use my computer in any room without having to close all the drapes first.