Why oh why is this crap constantly repeated?

And the MacBook Air's OpenCL is amazing, yes?
No support there.
People are being too emotional. Yes, Apple helped found OpenCL, but the reality is they have used it about as much as the rest of the industry. As in, they really haven't done too much with it.
You really don't know what you are talking about. OpenCL is perhaps one of the most successful industry initiatives to come out of Apple lately. It has been widely adopted and is the only strong competitor to CUDA.

As for what they have done with it, it has been embedded into some of Apple's lowest-level graphics libraries. Effectively, OpenCL accelerates every app out there to some extent.
How many apps does anyone here use on a daily basis that actually use OpenCL? I count 0 for me.
Learn to count! Almost every app out there gets some acceleration from OpenCL-supporting hardware via Apple's libraries.
I suspect the recent rumours regarding AMD are much more likely to be true than before. Lots of people lost their jobs at AMD recently, and when a company shows you no loyalty why would those people show any loyalty to them post pink slip?

What does laying off a few office workers, non-productive workers at that, have to do with this discussion?
 
No!! I've had nothing but bad experiences with nVidia on Macs. I had a 2008 MBP with the ill-fated 8600M GT, and there were many graphics glitches; sometimes the computer would not wake up, and when it did, the graphics would be scrambled. I went to the Apple Store many times for the issue, and the last time was because my computer wouldn't turn on.

That computer was 2 years old, so when I got a replacement under AppleCare, I got the 2010 MBP with the GT 330M, and now this: https://discussions.apple.com/message/16786584#16786584, a 150-page thread of problems. Although the title says 'Lion', many users pointed out that the issue, where the screen turns black when using only the discrete graphics, started happening in Snow Leopard. The only solution was to force Intel graphics. After months of having an oversized MacBook Air, Apple finally extended support for those machines to 2 years (http://support.apple.com/kb/TS4088).
 
ATI has Linux drivers? :rolleyes:

No, they have something better: documentation without an NDA. (If you know what that translates to: open source drivers with full KMS/3D support. They do exist; most newer ATI chips are much better supported out of the box than NVidia could ever dream of, all with open source drivers, thanks to the documentation ATI provides.)
 
So what happened to the licensing issue?
Intel had strictly forbidden discrete graphics from being used.
What licensing issue?
Intel has never disallowed the use of discrete graphics with their CPUs as far as I know.
Nothing prevents computer manufacturers from hooking up a discrete GPU via PCIe lanes.

How come they are allowing Apple to use nVidia chips all of a sudden? Did Intel change its license, did Apple persuade them, or are Intel and nVidia suddenly friends again?
"All of a sudden?" Nothing has really changed or would change.

Current (and also previous) generation MacBook Pros do nothing different: they use Intel integrated graphics in addition to having a discrete GPU on the same board.
 
nVidia is great and all, but I'm really bummed about the speculation that the thinner one will not contain dedicated graphics. That is a dealbreaker for me. Sure, integrated graphics have gotten better over the years, but they haven't improved so much that they can handle gaming on high settings very well.
 
As for what they have done with it, it has been embedded into some of Apples lowest level graphics libraries. Effectively OpenCL accelerates every app out there to some extent.

Learn to count! Almost every app out there gets some acceleration from OpenCL supporting hardware via Apple libraries.
So can we benchmark it? It feels like Grand Central all over again. Just use the libraries!
 
List of Intel Macs that have either offered cards from both vendors in the same line or offered a card from a different vendor as an upgrade option:
iMac - Late 2006, Early 2008, Early 2009, Late 2009
Mac Pro - Mid 2006, Early 2008, Early 2009

And that's what Dell (and others) do all the time, right down to the CPU vendor. And Apple does not do it anymore. So why do people claim that Apple is more agile than others?
 
That is a legitimate concern.

I see where you are going but what matters to me is as an end user on Linux.

I just wanted to point out that AMD has drastically changed the ATI mindset with respect to Linux. Unfortunately, good things don't happen overnight, but they have steadily improved their driver support to the point that AMD GPUs are very usable in Linux. Further, people are reporting very positively on what is coming down the line in the alpha and beta drivers.

Now, those improvements are with respect to the closed source stuff; AMD has also been doing a very good job of getting what info they can out to the open source community. nVidia does nothing here. Frankly, I don't see AMD even competing with nVidia, but rather with Intel, on Linux. Intel has likewise stepped up its Linux support and is attempting to erase some of the foul-ups with Sandy Bridge support before Ivy Bridge comes out. Nobody is perfect here, but I'd rather spend my money on AMD or Intel than support nVidia, which has a hostile attitude towards open source.

This message is starting to sound like it belongs in a Linux forum.
 
...they really haven't done too much with it.

How many apps does anyone here use on a daily basis that actually use OpenCL? I count 0 for me.
How about the frameworks all apps use, or the latest pro software Apple released, Final Cut X?
Final Cut X uses OpenCL for complex visual transformations, like the transitions.

And for those who said OpenCL performance on the latest MBAir was poor: you need to look at Intel, whose integrated GPUs don't support OpenCL, so the code runs on the CPU instead (missing the point totally).
OpenCL support for Intel integrated GPUs is supposed to come with Ivy Bridge (which will also significantly improve graphics performance).
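
For what it's worth, the CPU-fallback behaviour described above is visible from code. Below is a minimal sketch (my own illustration, not from any poster) that asks the OpenCL runtime which devices exist; on hardware whose integrated GPU lacks OpenCL support, the GPU query comes back empty and only a CPU device is available:

```c
/* Hypothetical sketch: enumerate OpenCL devices to see whether kernels
   would run on a GPU or fall back to the CPU device. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_uint num_gpus = 0, num_cpus = 0;

    clGetPlatformIDs(1, &platform, NULL);

    /* Count GPU devices; CL_DEVICE_NOT_FOUND here means no OpenCL GPU. */
    cl_int err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, NULL, &num_gpus);
    if (err != CL_SUCCESS || num_gpus == 0)
        printf("No OpenCL-capable GPU: kernels would run on the CPU device.\n");
    else
        printf("%u OpenCL GPU device(s) available.\n", num_gpus);

    clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 0, NULL, &num_cpus);
    printf("%u OpenCL CPU device(s) available.\n", num_cpus);
    return 0;
}
```

On a machine with a proper GPU driver, the GPU query should succeed; on a Sandy Bridge MBAir it would not.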

Logically, the discrete graphics market will be reduced in a couple of years to pro workstations that need 3-4 GB cards or heavy computation (OpenCL or CUDA). Even gamers will end up using integrated GPUs, since the next big shift is going to be an increase in resolution that even the most powerful GPUs (consuming more than 300 W) can't handle. People will keep playing at lower resolutions, where integrated GPUs are catching up.
 
No support there.

You really don't know what you are talking about. OpenCL is perhaps one of the most successful industry initiatives to come out of Apple lately. It has been widely adopted and is the only strong competitor to CUDA.

As for what they have done with it, it has been embedded into some of Apple's lowest-level graphics libraries. Effectively, OpenCL accelerates every app out there to some extent.

Learn to count! Almost every app out there gets some acceleration from OpenCL-supporting hardware via Apple's libraries.


What does laying off a few office workers, non-productive workers at that, have to do with this discussion?

Sorry, but OpenCL is not embedded in Apple's lowest-level graphics libraries. There are some portions of Apple's graphics APIs that are GPU accelerated (compositing in Quartz, for example) or even accelerated using GPGPU (Core Image, for example). But none of those APIs use OpenCL. Almost no serious applications use OpenCL yet. A notable counterexample is Final Cut X, which lists OpenCL support among its requirements.
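
To make the distinction concrete, here's roughly what it takes for an application to "use OpenCL" directly, as opposed to merely drawing through a GPU-accelerated framework like Core Image. This is an illustrative sketch of my own with error checking trimmed, not code from any Apple library:

```c
/* Hypothetical minimal OpenCL program: compile a tiny kernel at run time
   and run it over a 4-element buffer. Error checks omitted for brevity. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *src =
    "__kernel void scale(__global float *buf, float k) {\n"
    "    buf[get_global_id(0)] *= k;\n"
    "}\n";

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* The app ships kernel source and builds it for whatever device it finds. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "scale", NULL);

    float data[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    float k = 2.0f;
    clSetKernelArg(kern, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kern, 1, sizeof(float), &k);

    size_t global = 4;
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("%.1f %.1f %.1f %.1f\n", data[0], data[1], data[2], data[3]);
    return 0;
}
```

An app that does this kind of thing explicitly (as Final Cut X apparently does) is in a different category from one that just happens to draw through an accelerated framework.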
 
And that's what Dell (and others) do all the time, right down to the CPU vendor. And Apple does not do it anymore. So why do people claim that Apple is more agile than others?

Why are you asking me? I was simply correcting your false claim.
 
Apple going back to nvidia would be a great move.

I have the 9400M in my 2008 MacBook and it's perfect. No problems at all related to the GPU.

If you're having screen flickering, that's due to some other problem, not the GPU.

Excessive heat? Blame Apple's poor cooling design. Same with the 8-series GPU "problems": those systems didn't have problems when the GPUs had proper cooling.

I played through GTA4 on my 2008 MacBook back in the day, as well as other PC games. I had no heating issues whatsoever.

I've been using nvidia GPUs since the first RivaTNT, the first GPU to offer playable 1024x768 with 32-bit color. I have never had an issue with any of them.

However, every single AMD/ATI GPU I have tried has had driver issues: everything from games being unstable, to missing textures, to piss-poor image quality, to not having proper v-sync on the Windows desktop.

I've built hundreds of systems for various clients over the last decade, and every single one has used some type of nvidia GPU. To my knowledge, every single system is still functioning. Might have had the odd PSU or HDD failure here and there, but the overall systems are still perfectly functional.

As much as I like AMD's CPUs for "bang for the buck", AMD/ATI GPUs are absolute junk because of piss poor driver support.

For me, nvidia has always performed better, never had driver or hardware issues, and always supported new games properly right out of the gate. AMD/ATI GPUs have been nothing but trouble in one way or another. I'll own a system with an AMD CPU, but I will not own a system with an AMD GPU.
 
I just wanted to point out that AMD has drastically changed the ATI mindset with respect to Linux. Unfortunately, good things don't happen overnight, but they have steadily improved their driver support to the point that AMD GPUs are very usable in Linux. Further, people are reporting very positively on what is coming down the line in the alpha and beta drivers.

Now, those improvements are with respect to the closed source stuff; AMD has also been doing a very good job of getting what info they can out to the open source community. nVidia does nothing here. Frankly, I don't see AMD even competing with nVidia, but rather with Intel, on Linux. Intel has likewise stepped up its Linux support and is attempting to erase some of the foul-ups with Sandy Bridge support before Ivy Bridge comes out. Nobody is perfect here, but I'd rather spend my money on AMD or Intel than support nVidia, which has a hostile attitude towards open source.
I will just wait to see how it turns out.

This message is starting to sound like it belongs in a Linux forum.
It was like that long before.
 
This is just pure ignorance. Apple is one of the most rigid computer manufacturers out there. Just look at Dell's offerings. You'll find that right now they offer computers with Intel and AMD CPUs, integrated (Intel/AMD) and discrete GPUs from NVIDIA and AMD, including dual-card SLI and CrossFire configurations. When you buy an Apple computer, you buy it for the case, not the internals. Apple being "agile" is a good joke, though :D Agile companies do not keep their models unchanged for 2 years (as Apple does with the Mac Pro).

You have absolutely no idea what 'agile' means. No, it is not defined as offering every configuration imaginable. That's not tough to do or something to aspire to; PC manufacturers have been doing it forever, and it takes no special skill or courage. Even ones close to going out of business. If you can't see or understand the agility of Apple in the last 10 years, in terms of transitions, throwing away old conventions, and disrupting market after market, then your ignorance is immense and there's no point even trying to explain it to you.
 
This is good news only in the sense that Apple's NVidia drivers have always seemed to be superior to the ATI/AMD ones, though I'm not sure how true that is anymore with the major driver overhauls in Lion.
 
I like NVIDIA's driver software better than AMD's, but other than that I've never noticed much of a difference in performance. My 8800GT died on me, sure, but so did a Radeon of mine 4 years before.
 
I don't understand this move.

AMD has absolutely destroyed nVidia in every graphics segment except the high-end gaming graphics that require a dedicated gaming laptop.

The 6xxx series trounces the nVidia 5xx series.

The 7xxx series is going to make an even bigger jump than the 5xxx-to-6xxx jump was: 28nm core, completely new architecture.

Unless nVidia's new architecture is a vast improvement over what they currently have, I don't get this move.

It's a rumour that is hard to believe. If it does come true, then I would look for where the money is. Apple has always looked to switch suppliers quickly to get the best prices.
 
What are you trying to say now?

So can we benchmark it? It feels like Grand Central all over again. Just use the libraries!

Grand Central Dispatch is used heavily. Do you honestly expect software developers, or more precisely their marketers, to call out every element of Mac OS used by their products? They won't, because most consumers don't care and wouldn't know what it means anyway.

As to libraries, of course you want to use them. More importantly, you pick the level of abstraction to fit the task at hand; why would a developer do anything differently? Apple supplies the NSOperation class in Foundation for high-level access from Objective-C, and dispatch queues and block objects for lower-level access, so there is no reason not to use the Apple-supplied stuff.
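
For anyone who hasn't seen it, this is roughly what the lower-level GCD usage mentioned above looks like. A sketch of my own (not code from any poster, and not verbatim from Apple's docs): blocks submitted to a global concurrent queue, with a dispatch group used to wait for completion.

```c
/* Illustrative GCD sketch: run four blocks concurrently on a global queue.
   Blocks are a clang extension; on a Mac, build with: clang demo.c -o demo
   (add -fblocks and -ldispatch on non-Apple platforms). */
#include <dispatch/dispatch.h>
#include <stdio.h>

int main(void) {
    dispatch_queue_t queue =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();

    for (int i = 0; i < 4; i++) {
        int task = i; /* captured by value inside the block */
        dispatch_group_async(group, queue, ^{
            printf("task %d running on a GCD worker thread\n", task);
        });
    }

    /* Block until every submitted task has finished. */
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    return 0;
}
```

NSOperation sits on top of the same machinery, which is why "use the libraries" and "use GCD" are mostly the same advice at different abstraction levels.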

In the end, all I hear is a bunch of garbage spouted by people who could get the facts by reading a bit of Apple's documentation. More importantly, if you are not a developer, why would you comment? If you are a developer, your knowledge of the Mac's OS and its APIs must be extremely limited, because GCD is widely used.
 
You have absolutely no idea what 'agile' means. No, it is not defined as offering every configuration imaginable. That's not tough to do or something to aspire to; PC manufacturers have been doing it forever, and it takes no special skill or courage. Even ones close to going out of business. If you can't see or understand the agility of Apple in the last 10 years, in terms of transitions, throwing away old conventions, and disrupting market after market, then your ignorance is immense and there's no point even trying to explain it to you.

I see. So offering fewer models and fewer options and sticking with the same model for years is not called "agile". They are "agile" in ignoring their customers, that is. They transition from supporting two GPU vendors to supporting just one, and we call it "agile". Wow. A new crop of Apple fanboys was born.
 
I have the 9400M & 9600M GT.

It is 100% a problem for many people with the 9400M.
Check the forums; all I have to do is switch to the 9400 to see it flicker almost every second, it feels like.

Luckily I never cared about the lower card and use the higher one... wish they'd just put the best one in there and left the other out.
Totally not into the fact that you can't choose to switch now. After the glitch with the cards in this MacBook, I'm put off from wanting to update to where it will 'decide' when to switch; heaven forbid one card flickers constantly like this one does.

pfft. not happy about this at all. Anyway...

Peace


I have the 9400M in my 2008 MacBook and it's perfect. No problems at all related to the GPU.

If you're having screen flickering, that's due to some other problem, not the GPU.

p.s.
At least a switch back to Nvidia, if their cards work, will let Adobe Video Suite do its job properly. ;)
 
In the end all I hear is a bunch of garbage that is spouted by people who could get to the facts by reading a bit of Apples documentation. More importantly if you are not a developer why would you comment? If you are a developer, your knowledge of the Mac's OS and it's APIs must be extremely limited, because GCD is widely used.
Marketing and showmanship are easy on stage. I absolutely loathe programming, but that does not mean I couldn't be forced to do it in a pinch.

Or...something only a developer would love and a buzzword for everyone else!
 