I might be wrong, but what does the OS X switching have to do with the brand of card? From what I understand, it works like this:

As soon as the system receives a call from an application that says it requires OpenGL or QE or whatever, the Intel GPU is disabled and the discrete card (whatever it may be) gets activated. It has nothing to do with the brand of the discrete card. As long as there are proper drivers enabling it to function, it makes no difference whether it's ATI or nVidia or anything else.

the switching is all handled automatically by Mac OS X without any user intervention (though there is actually a System Preference to deactivate it, if you choose). Apps that use advanced graphics frameworks such as OpenGL, Core Graphics, Quartz Composer or others will cause the OS to trigger the discrete GPU. So, when you are reading or writing Mail, or editing an Excel spreadsheet, Mac OS X will simply use the integrated Intel HD graphics. If you fire up Aperture or Photoshop, Mac OS X kicks on the NVIDIA GeForce GT 330M.

taken from http://arstechnica.com/apple/news/2010/04/inside-apples-automatic-gpu-switching.ars
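
To make that trigger logic concrete, here is a small, purely illustrative Python sketch. The class and method names (GraphicsSwitcher, app_uses_framework) are made up for the example and are not Apple's actual implementation; the framework list is taken from the Ars Technica quote above.

```python
# Toy model of OS X 10.6-style automatic GPU switching as described above.
# Everything here is an assumption for illustration, not real OS code.

DISCRETE_TRIGGER_FRAMEWORKS = {"OpenGL", "Core Graphics", "Quartz Composer"}

class GraphicsSwitcher:
    def __init__(self, automatic_switching=True):
        self.automatic_switching = automatic_switching  # the System Preference toggle
        self.active_gpu = "Intel HD (integrated)"

    def app_uses_framework(self, app, framework):
        """Called whenever a running app touches a graphics framework."""
        if self.automatic_switching and framework in DISCRETE_TRIGGER_FRAMEWORKS:
            self.active_gpu = "NVIDIA GeForce GT 330M (discrete)"
        print(f"{app} ({framework}) -> {self.active_gpu}")

switcher = GraphicsSwitcher()
switcher.app_uses_framework("Mail", "AppKit")        # stays on integrated graphics
switcher.app_uses_framework("Aperture", "OpenGL")    # discrete GPU kicks in
```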
 
What GPUs would you put in that category?

We've already seen multiple pointers to the 5xxx mobile cards that are in the same TDP range as the 330M, and run rings around it.

Neither does Nvidia's last gen tech.

Who cares? The comparison under discussion was comparing with something that required a reboot.

You guys aren't listening: if you want to know how Optimus works, read the AnandTech articles on it. It is *NOT* possible with ATi's current hardware. Again, ATi's current hardware works like Nvidia's last-gen (second-gen) switchable graphics.

And again, I don't care. That's fine by me. The question is whether ATI's cards could do a better job of offering switching without rebooting and performance within the thermal envelope, and the answer is unambiguously yes.

Furthermore, Apple's technology is not Optimus, and we have no information supporting the claim that Apple's technology couldn't do seamless switching with the ATI hardware. Apple isn't using nVidia's switching technology; why would they use ATI's?
 
We've already seen multiple pointers to the 5xxx mobile cards that are in the same TDP range as the 330M, and run rings around it.



Who cares? The comparison under discussion was comparing with something that required a reboot.



And again, I don't care. That's fine by me. The question is whether ATI's cards could do a better job of offering switching without rebooting and performance within the thermal envelope, and the answer is unambiguously yes.

Furthermore, Apple's technology is not Optimus, and we have no information supporting the claim that Apple's technology couldn't do seamless switching with the ATI hardware. Apple isn't using nVidia's switching technology; why would they use ATI's?

You're right, but he's not getting it, he's got his mind set on one thing only. Feels like you're banging your head against a wall, doesn't it? Obviously the majority have passed over this thread because they know it's retarded, so I'm going to let it die a graceful death.
 
We've already seen multiple pointers to the 5xxx mobile cards that are in the same TDP range as the 330M, and run rings around it.

Again, what? That may well be the case but I'd like more info, including actual power draw/heat dissipation, as TDP is measured differently by different companies.

Who cares? The comparison under discussion was comparing with something that required a reboot.

So now you're moving the goal posts? Several of you have explicitly claimed Nvidia's Optimus is the same thing as ATi's current switching, and now you're claiming "oh, it's close enough because you don't need to reboot"?

Furthermore, Apple's technology is not Optimus, and we have no information supporting the claim that Apple's technology couldn't do seamless switching with the ATI hardware. Apple isn't using nVidia's switching technology; why would they use ATI's?

So you keep claiming, yet it's running on Optimus hardware, doing the exact same thing as Optimus, and not available on anything but Nvidia Optimus hardware...why do you suppose that is?

PLEASE READ ANANDTECH'S ARTICLES before continuing to claim this is the same thing.

You're right, but he's not getting it, he's got his mind set on one thing only. Feels like you're banging your head against a wall, doesn't it? Obviously the majority have passed over this thread because they know it's retarded, so I'm going to let it die a graceful death.

Ditto for you. Rather than insulting someone for daring to say something that doesn't fit into your preconceived notions, why not actually just read the articles explaining what this technology is doing? It's an interesting read.
 
I have no information suggesting that the Optimus thing was particularly relevant; if anything, I'd think that it came about because Apple developed the technology.

Optimus is not about hardware, it is about software. Let's say you have half a dozen apps running, and they are all using the integrated graphics right now. There are textures stored on the graphics card, shader programs, contents of windows that are overlapped by other windows, and so on. The challenge isn't to turn off one graphics card and turn on the other one, the challenge is to do that and move everything that is on one card over to the other card, without any hiccups that the user notices or application crashes. The driver for one card has to be switched off, and the driver for the other card switched on, without any interruption. That's the challenge.

Optimus does that for Windows (or is supposed to do it; I haven't tried it). But that is for Windows, it doesn't help one bit for MacOS X, so Optimus has exactly zero relevance for MacOS X. On the other hand, I can't see that the technology would be very dependent on the actual hardware. Ok, you need the hardware so that the same physical connector can take its image first from card A, then from card B. But beyond that, you just need to turn one card off and another card on _and make sure the apps don't notice it_.
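
Here is a toy Python sketch of that hand-off problem. The names (GPU, switch_gpu) are entirely hypothetical and nothing like real driver code; it only illustrates that the hard part is migrating live resources and repointing apps, not flipping a power switch.

```python
# Hypothetical sketch of the state-migration problem described above.
# Assumption-only illustration; no real driver works like this.

class GPU:
    def __init__(self, name):
        self.name = name
        self.resources = {}   # textures, shader programs, window backing stores...

def switch_gpu(old: GPU, new: GPU, apps):
    # 1. Copy every resource the running apps still need over to the new card.
    for res_id, data in old.resources.items():
        new.resources[res_id] = data          # in reality: a PCIe transfer
    # 2. Repoint the apps at the new device before the old driver is torn down,
    #    so no app ever sees a missing texture or a dead context.
    for app in apps:
        app["device"] = new.name
    # 3. Only now is it safe to power down the old card and unload its driver.
    old.resources.clear()
    return new

integrated = GPU("Intel HD")
discrete = GPU("GeForce GT 330M")
integrated.resources = {"tex:desktop": b"...", "shader:ripple": b"...", "win:Mail": b"..."}
apps = [{"name": "Mail", "device": integrated.name},
        {"name": "Safari", "device": integrated.name}]

active = switch_gpu(integrated, discrete, apps)
print(active.name, list(active.resources), [a["device"] for a in apps])
```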
 
Optimus is not about hardware, it is about software. Let's say you have half a dozen apps running, and they are all using the integrated graphics right now. There are textures stored on the graphics card, shader programs, contents of windows that are overlapped by other windows, and so on. The challenge isn't to turn off one graphics card and turn on the other one, the challenge is to do that and move everything that is on one card over to the other card, without any hiccups that the user notices or application crashes. The driver for one card has to be switched off, and the driver for the other card switched on, without any interruption. That's the challenge.

Optimus does that for Windows (or is supposed to do it; I haven't tried it). But that is for Windows, it doesn't help one bit for MacOS X, so Optimus has exactly zero relevance for MacOS X. On the other hand, I can't see that the technology would be very dependent on the actual hardware. Ok, you need the hardware so that the same physical connector can take its image first from card A, then from card B. But beyond that, you just need to turn one card off and another card on _and make sure the apps don't notice it_.

Yes, it does have relevance. Again, it's not just software, it's also hardware. Also, it does not shut down the integrated graphics; those are always used. AMD does not currently have the same technology in any of their GPUs.

EDIT: Regarding the earlier power draw issue: AMD doesn't seem to have a part comparable to the Geforce GT 330. The 54x0 parts are less powerful, the 56x0 parts are more powerful. You'd have to show that a GT 330 draws the same or more power than a 56x0 part. I can't find anything on the GT 330's power usage. Plus the type of RAM used, etc. makes a difference.

At any rate, it's all a moot point as obviously Apple wants the switching technology Nvidia has. Saying "well I'd be perfectly fine using the older type of technology!" is pointless because Apple isn't. Heck, I want a Geforce 280 or Radeon 5870 in there...Apple doesn't.
 
EDIT: Regarding the earlier power draw issue: AMD doesn't seem to have a part comparable to the Geforce GT 330. The 54x0 parts are less powerful, the 56x0 parts are more powerful. You'd have to show that a GT 330 draws the same or more power than a 56x0 part. I can't find anything on the GT 330's power usage. Plus the type of RAM used, etc. makes a difference.

No clue what the rest of the argument is about except for this part.

The 5650 is ATI's mid-range card, and the GT 330M is Nvidia's mid-range card. That itself makes them comparable. Nvidia just sucked; that's why the ATI 5650 beats the GT 330M, but both these cards are mid-range and meant to compete with each other.

Anyway, the power consumption of the 5650 is 15-19 W. The GT 330M's TDP was previously listed as 23 W on notebookcheck, but apparently that has been taken down.

Google searches show that a lot of people said the GT 330M's TDP is 23W, probably based on what notebookcheck said.
http://www.google.com.sg/search?q=gt+330m+23w
 
I have to help Wolfpup out a little here, as it is apparently impossible to read a little on Anandtech.

Optimus is actually very simple to understand.
[Image attachment: gen3b-slide1.jpg (NVIDIA Optimus slide)]

This Copy Engine is in hardware and is what ATI cards are missing. They could do the same at the software level, but that would cost performance and is thus a stupid thing to do. All Optimus does is add another layer to this dual-GPU mess: the OS thinks it has to deal with only one GPU, the integrated one, and the Optimus driver takes care of the rest.
The Copy Engine is in all 2X0 and 3X0 Nvidia cards, but was previously not activated because Nvidia had to finish the software work first.
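
As a rough, assumed illustration of that layering (not NVIDIA's actual driver code), the flow looks something like the Python sketch below: the OS only ever talks to the integrated GPU, whose framebuffer drives the panel, and the Copy Engine blits the discrete GPU's rendered frames into that framebuffer over PCIe. The class and method names are invented for the example.

```python
# Assumed illustration of the Optimus render-and-copy path described above.

class IntegratedGPU:
    def __init__(self):
        self.framebuffer = None      # what the display actually scans out

class DiscreteGPU:
    def render(self, scene):
        return f"frame({scene})"     # the heavy lifting happens here

    def copy_engine_blit(self, frame, target: IntegratedGPU):
        target.framebuffer = frame   # hardware copy over PCIe, no demux needed

igpu, dgpu = IntegratedGPU(), DiscreteGPU()

def present(scene, needs_3d):
    if needs_3d:
        dgpu.copy_engine_blit(dgpu.render(scene), igpu)  # Optimus path
    else:
        igpu.framebuffer = f"frame({scene})"             # integrated-only path
    return igpu.framebuffer                              # display always reads the iGPU

print(present("spreadsheet", needs_3d=False))
print(present("game", needs_3d=True))
```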

Intel's FDI (Flexible Display Interface) is basically the same thing that has been around for a while, and it relies heavily on OS support. Both GPUs have a display output, and the two are merged in a demux chip on the board; there the signal switches from one to the other. I don't have facts about how long the switch takes, but it can basically happen instantly. What makes the old solutions slow, I believe, is that handing over from one driver to the other is painful and slow.
Everything that needs to be in VRAM can be transferred in less than a second: that's 256 or 512 MB of VRAM, and you can move about 8 GB/s over PCIe x16, so all the data is transferred in roughly 0.0625 seconds. Once all the software is ready, I'd guess the demux chip can switch in milliseconds. Shutting down some processes, ramping up others, initialising stuff in the drivers: that's all software work that probably takes most of the time. This is where OS X now works better than Win7, and it will also improve as soon as somebody really tries.
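
Spelled out as a quick back-of-the-envelope check, using the 512 MB and ~8 GB/s figures from the paragraph above:

```python
# Back-of-the-envelope check of the VRAM transfer time quoted above.
vram_mb = 512                      # VRAM contents to move, in MB
pcie_x16_gb_per_s = 8              # rough usable PCIe x16 bandwidth, GB/s

transfer_s = (vram_mb / 1024) / pcie_x16_gb_per_s
print(f"{transfer_s:.4f} s")       # 0.0625 s, i.e. well under a second
```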

Optimus is easy because it works and it needs no wires, no demux, nothing besides an extra GPU and PCIe, which has to be there anyway. It is cheap, but it probably also has the downside that the integrated GPU is still in use (although lightly). If, like many Optimus notebooks, you only use the dedicated GPU when plugged in and never on the road, this doesn't matter, but Apple wanted the best solution and put some work into the old, more complicated and expensive one.
 
So now you're moving the goal posts? Several of you have explicitly claimed Nvidia's Optimus is the same thing as ATi's current switching, and now you're claiming "oh, it's close enough because you don't need to reboot"?

I have only claimed that I don't see any obvious reason that it couldn't be done on ATI's hardware.

So you keep claiming, yet it's running on Optimus hardware, doing the exact same thing as Optimus, and not available on anything but Nvidia Optimus hardware...why do you suppose that is?

Apple developed this before Optimus existed. Optimus is not "hardware". It's something nVidia does in software. On Windows. Furthermore, on the 17" macs, Apple's technology switches between an nVidia GPU and an Intel GPU. So that's not "Optimus hardware", it's not even nVidia hardware.

Again, both Apple and nVidia have said that it's not Optimus, even though it does the same thing.

Do you think that, because OS X and Linux both drive disks, and run on Intel hardware, that means OS X and Linux are the same thing?

They're different things. They are separately engineered, different, ways of accomplishing the same goal.

We currently have no information either way as to whether Apple's switching technology, which both Apple and nVidia say is not the same technology as Optimus, could be used with ATI hardware. Maybe it could, maybe it couldn't. We don't know.

Ditto for you. Rather than insulting someone for daring to say something that doesn't fit into your preconceived notions, why not actually just read the articles explaining what this technology is doing? It's an interesting read.

Because I've read them, and I understand it just fine, and:

1. There is no relationship, other than "same end result", between Optimus and Apple's display switching.
2. I don't care whether ATI's display switching technology is precisely identical to Optimus or not.
3. In fact, I don't care about ATI's display switching technology at all.
4. All I care about is whether Apple's technology, which is not Optimus, as proven above, would allow switching between the Intel (not nVidia) integrated graphics and a discrete GPU, without rebooting.
5. Which ATI can do on their hardware, so presumably it could be done by anyone else who wanted to.

I don't really care about seamless. Honestly, I don't care about switching at all; if Apple had sold a machine which had a low-power 56xx GPU and no switching, I'd have bought it.
 
I have only claimed that I don't see any obvious reason that it couldn't be done on ATI's hardware.

That's because you're refusing to read articles that actually explain how this works.

Apple developed this before Optimus existed. Optimus is not "hardware". It's something nVidia does in software.

Again, it is also HARDWARE.

On Windows. Furthermore, on the 17" macs, Apple's technology switches between an nVidia GPU and an Intel GPU. So that's not "Optimus hardware", it's not even nVidia hardware.

With a statement like that, you're showing you have no idea what Optimus is. Please just read the article and quit making claims about it.

Again, both Apple and nVidia have said that it's not Optimus, even though it does the same thing.

Assuming they actually said that, it's meant as "this doesn't have the brand name 'Optimus' and is using different software to do the same thing".

We currently have no information either way as to whether Apple's switching technology, which both Apple and nVidia say is not the same technology as Optimus, could be used with ATI hardware. Maybe it could, maybe it couldn't. We don't know.

Yes we do know, because ATi does not have the equivalent of Optimus. From the article you refuse to read:

"Incidentally, AMD switchable graphics is essentially equivalent to NVIDIA's generation two implementation."

I don't really care about seamless. Honestly, I don't care about switching at all; if Apple had sold a machine which had a low-power 56xx GPU and no switching, I'd have bought it.

As I keep saying, I don't care that you don't care about it being seamless. I don't care about that either. I don't want the Intel GPU used at all. I'd be thrilled if every laptop used a Geforce 280 or Radeon 5870. That is COMPLETELY IRRELEVANT. Clearly Apple DOES care about switching, and DOES want it to be seamless, which rules out ATi.

There may also be considerations in terms of plans Apple has for running more general purpose code on the GPU, as Nvidia's hardware seems to considerably outperform ATi's, to the point where on a desktop part, a top of the line last gen Nvidia part is outperforming ATi's current top of the line part.
 
That's because you're refusing to read articles that actually explain how this works.

No, it's because nothing in those articles addresses the question.

Have you yet read everything Wikipedia has under "H"? No? Then why are you talking about this? ... oh, because none of that is relevant.

You keep quoting things that make points different from what you're trying to say they show.

Again, it is also HARDWARE.

nVidia's is.

With a statement like that, you're showing you have no idea what Optimus is. Please just read the article and quit making claims about it.

I have read the articles. Also, I know what the words in them mean.

Assuming they actually said that, it's meant as "this doesn't have the brand name 'Optimus' and is using different software to do the same thing".

No.

It's meant as this is not the same technology.

You jumped to a conclusion, you've never supported it, all you've got is that you've found articles which don't instantly and conclusively disprove it...

But you've shown no support.

Apple's solution predates Optimus. It's not Optimus. Optimus is what nVidia did when, having seen Apple's technology be popular and successful, they decided to implement the same thing.

Yes we do know, because ATi does not have the equivalent of Optimus.

So what? The question isn't whether Optimus could run on ATI's hardware, it's whether Apple's software could be used with non-nVidia hardware. And we already know it can.

From the article you refuse to read

No, the article which does not turn out to be relevant to this claim.

As I keep saying, I don't care that you don't care about it being seamless. I don't care about that either. I don't want the Intel GPU used at all. I'd be thrilled if every laptop used a Geforce 280 or Radeon 5870. That is COMPLETELY IRRELEVANT. Clearly Apple DOES care about switching, and DOES want it to be seamless, which rules out ATi.

Except that we don't know that. All we know is that Apple, who have clearly got a deal with nVidia, picked an nVidia chip. We don't know that they require the particular functionality in question, nor do we know that ATI's hardware couldn't do this -- only that ATI's software doesn't.

You're jumping to conclusions.

There may also be considerations in terms of plans Apple has for running more general purpose code on the GPU, as Nvidia's hardware seems to considerably outperform ATi's, to the point where on a desktop part, a top of the line last gen Nvidia part is outperforming ATi's current top of the line part.

Interesting theory, but so far as I can tell, it's not true of the laptop parts, where ATI's lead is huge.
 
EDIT: Man...this is weird, he's partially right...but that leaves the question about why you can't install drivers on Windows for the GPU...

Unfortunately this may mean that the current MacBook Pros just aren't supported, either because Apple won't allow Nvidia to support them, or because they're doing switchable graphics in a non-standard way.
 