An interesting project. Not one that I would suggest for someone to undertake, but fully possible. I have a B&W G3 with Leopard on it, 1 GB of RAM, a G4 upgrade, and a PCI GeForce 5200. CoreImage works very well, or as well as it can on a GeForce 5200. It benches higher than the GeForce 5200 in my 12" PowerBook, iMac G4, iMac G5, and my PowerMac G5, but it is still a poor performer at gaming. The B&W G3 has a rather unique video card slot: a 66 MHz PCI slot that is not shared with any other device. In effect, it's an AGP 1X slot. The hardest part is getting the older kexts to load in the correct order. Last bit of advice: do not try this on a Rev. 1 board.
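For what it's worth, the kext-ordering step can be sketched like this on the Leopard box itself. The bundle names below are the usual PPC Leopard NVIDIA kexts for the FX series, but treat the exact paths and names as assumptions that depend on which unsupported-Leopard package you installed:

```shell
# Hedged sketch: load the NVIDIA stack manually so the shared resman
# driver comes up before the NV30 (FX 5200) hardware layer.
# Kext names are assumptions -- check what your package installed.
for kext in NVDAResman NVDANV30Hal; do
  sudo kextload "/System/Library/Extensions/${kext}.kext" \
    || echo "could not load ${kext} (not on the Leopard machine?)"
done
# Verify what actually loaded:
kextstat 2>/dev/null | grep -i NVDA || echo "no NVDA kexts reported"
```

Boot-time ordering is normally driven by each kext's declared dependencies (OSBundleLibraries in its Info.plist), so that's the first place to look if loading by hand works but booting doesn't.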

My spunky Power Macintosh 8600 does indeed have Leopard on it. It runs about the same as my B&W G3, but with an ATI 9200. It has QuartzExtreme enabled by default, and it runs it on the shared PCI bus along with a USB and ATA card. A pity no CoreImage card is compatible with OldWorld ROMs, or that would be one nice machine. Also, one of the beta versions of Leopard runs on G3s, but it is very unstable on them.
Hi @Intell,

Sorry to revive a 6-year-old thread, but how did you get QE and CI to work for the 5200 on Leopard with the B&W? I have a similarly specced machine (B&W, 500 MHz G4, 384 MB RAM, GF5200) with Leopard on it, following the "Complete Leopard on Unsupported Macs package" and as described on MacTech (https://web.archive.org/web/20130924163750/http://www.mactech.com/2008/09/23/leopard-pre-agp). Is there any gain in using the 66 MHz slot with it?
 
Hmmm, from what I recall there was a script that did something to the kext to allow QE/CI on PCI video cards. By default QE/CI is only allowed on AGP/PCIe cards. Don't forget the 5200 doesn't have full, proper CoreImage support.

I believe there is some gain in using the 66 MHz PCI slot. I remember running tests, but don't remember the outcome of them.
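A quick way to confirm whether the hack actually took is to ask System Profiler. A small sketch; the exact field wording varies between Tiger and Leopard, so the grep is deliberately loose:

```shell
# Hedged sketch: on a working setup the display section reports
# something like "Quartz Extreme: Supported" plus a Core Image line.
system_profiler SPDisplaysDataType 2>/dev/null \
  | grep -Ei 'quartz|core image' \
  || echo "no QE/CI fields found (wrong OS or unaccelerated card?)"
```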
 
I have a flashed 5200 in, I think, a Yikes! system. I don't actually 100% recall, but I'm pretty sure that's where it ended up. A Yikes! really isn't that different from a G4-upgraded B&W: the ADB port isn't there but the pads for it are, the PCB layout is the same, and it even IDs as a PowerMac G3.

The 5200 I used had 256 MB of VRAM and dual VGA outputs. Don't ask me who made it.

Whatever the case, I remember a sort of weird thing where even though QE with something like a 9200 has to be enabled manually, CI, and consequently QE, is enabled automatically when a CI card is installed.

Somewhere or another too, I have a PCI Quadro that's supposed to flash really nicely to a 6200, but I never got around to that.

This is digging way back in my memory since it's been a few years since I played with any of this. Still, though, I seem to remember getting better OpenMark scores with the card in the 66 MHz slot than in one of the 33 MHz slots.
 
Hmmm, from what I recall there was a script that did something to the kext to allow QE/CI on PCI video cards. By default QE/CI is only allowed on AGP/PCIe cards. Don't forget the 5200 doesn't have full, proper CoreImage support.

I believe there is some gain in using the 66 MHz PCI slot. I remember running tests, but don't remember the outcome of them.
QE and CI are enabled by default even on a PCI bus, as long as the card supports Core Image. (This is because on PPC builds of OS X Tiger, PCIe cards show up to the system as PCI rather than AGP, and the PCIe-based PowerMac G5 is a thing.)

But yeah, if you have a QE-only card like a Radeon 9200, then you need hacks to enable QE; for a GeForce FX 5200 or GeForce 6200 it is enabled by default and no hacks are needed :)

I believe there is some gain in using the 66 MHz PCI slot. I remember running tests, but don't remember the outcome of them.
Ditto on this. As you say from the past, the 66 MHz PCI slot has a fairly direct connection to the Grackle north bridge, but the other slots are forced to mingle on a shared common bus with the onboard ATA controller and the Paddington south bridge.


But it is still all PCI, so performance isn't going to be stellar for graphical workloads.
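If anyone wants to check which bus their card actually landed on, the Open Firmware device tree shows which host bridge each slot hangs off. A rough sketch; the node names are from memory and differ between machines:

```shell
# Hedged sketch: list the PCI nodes in the device tree; the 66 MHz
# host bridge and the shared 33 MHz bus appear as separate pci nodes
# with their slots/devices nested beneath them.
ioreg -l -p IODeviceTree -w 0 2>/dev/null \
  | grep -iE 'pci|slot' \
  || echo "ioreg not available (not running on OS X?)"
```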

And I can confirm Rev. A B&W G3s are very cantankerous beasts (which is fun* because all of my B&W G3s are Rev. A's LOL).

I have a flashed 5200 in, I think, a Yikes! system. I don't actually 100% recall, but I'm pretty sure that's where it ended up. A Yikes! really isn't that different from a G4-upgraded B&W: the ADB port isn't there but the pads for it are, the PCB layout is the same, and it even IDs as a PowerMac G3.

The 5200 I used had 256 MB of VRAM and dual VGA outputs. Don't ask me who made it.

Whatever the case, I remember a sort of weird thing where even though QE with something like a 9200 has to be enabled manually, CI, and consequently QE, is enabled automatically when a CI card is installed.

Somewhere or another too, I have a PCI Quadro that's supposed to flash really nicely to a 6200, but I never got around to that.

This is digging way back in my memory since it's been a few years since I played with any of this. Still, though, I seem to remember getting better OpenMark scores with the card in the 66 MHz slot than in one of the 33 MHz slots.

I think the GeForce FX 5200 ended up in your Xserve G5, IIRC.

The PCI Quadro card you have is a Quadro FX 600, which sadly flashes into an FX 5200, not a 6200. But it's an FX 5200 with actual, fully functioning DVI: like the Apple FX 5200, the Quadro FX 600 uses external TMDS transmitters, and it has two of them to boot :)

Which would make it the only dual-DVI card for a PCI Mac, I think?

You do have a flashable PNY GeForce 6200 too, but that's 1x VGA and 1x DVI. Still, that one is the only card that gives full Core Image on a PCI Mac :)

 
Leave it to @LightBulbFun to know more about what I have than I do! :)

I think you're right on it ending up in the G5, but I'm 99.9% sure I had it in a Yikes! at one point. Just thinking about where/when/how I would have flashed it, that would have been the easiest place to test it.
 
QE and CI are enabled by default even on a PCI bus, as long as the card supports Core Image. (This is because on PPC builds of OS X Tiger, PCIe cards show up to the system as PCI rather than AGP, and the PCIe-based PowerMac G5 is a thing.)

But yeah, if you have a QE-only card like a Radeon 9200, then you need hacks to enable QE; for a GeForce FX 5200 or GeForce 6200 it is enabled by default and no hacks are needed :)

Ah yes, that was the thing. It was QE that needs the hacks. I guess Apple never assumed someone would have a PCI Core Image card, and so never added a check to block it.
 
My spunky Power Macintosh 8600 does indeed have Leopard on it. It runs about the same as my B&W G3, but with an ATI 9200. It has QuartzExtreme enabled by default, and it runs it on the shared PCI bus along with a USB and ATA card. A pity no CoreImage card is compatible with OldWorld ROMs, or that would be one nice machine. Also, one of the beta versions of Leopard runs on G3s, but it is very unstable on them.
For getting newer Open Firmware cards (256 MB Nvidia 6200) to work with older Open Firmware Power Macs, see
https://forums.macrumors.com/thread...l-work-in-a-beige-power-macintosh-g3.2303689/
Trying to get 512 MB and/or 7800 cards to work as well.
 