Well... and that card was the default graphics card in all iMac G5s and lower-end Power Mac G5s :rolleyes:

Even more reason for Apple never to have used that card.
 
Now that is curious... because Quartz 2D Extreme certainly gave my 2D performance a boost (12" rev. C PowerBook).

Is there a plist I can check to confirm this?
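For what it's worth, here's roughly what such a check could look like (modern Swift, purely for illustration). The com.apple.windowserver domain and the Quartz2DExtremeEnabled key name are my assumptions about where the switch lives, so treat this as a sketch rather than a confirmed answer:

```swift
import Foundation

// Assumed location of the Quartz 2D Extreme switch: the window server's
// system-wide preferences. Domain and key names are guesses, not confirmed.
let domain = "com.apple.windowserver" as CFString
let key = "Quartz2DExtremeEnabled" as CFString

// Read the system-wide (any user, current host) preference value, if any.
if let enabled = CFPreferencesCopyValue(key, domain,
                                        kCFPreferencesAnyUser,
                                        kCFPreferencesCurrentHost) as? Bool {
    print("Quartz 2D Extreme is explicitly set to \(enabled)")
} else {
    print("No explicit setting found; the system default applies")
}
```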

edit: OK, I did the quickest, easiest test I could think of: run Quartz Composer for a while and watch temperatures with ThermographX. With the viewer active, the GPU temperature rose 4 degrees in less than a minute. Close the viewer, do something else... it returns to normal. Reopen the viewer and the behaviour repeats.

So... *shrug*. Looks like it is being used, here at least.
 
The 5200 is Core Image compatible, but the tech note explains that using the 5200's GPU can actually be slower than leaving everything to the CPU.

The ripple effect will work, and Quartz 2D Extreme will be in effect (if enabled by hand).

This is the interesting part (from the linked note):
"Note: By default, Core Image uses the CPU for rendering on systems with a GeForce 5200 series card because, for most benchmarks, the 5200 can be slower than the CPU on currently shipping hardware."
 

Attachments

  • Picture 2.png