
mrcandy · macrumors regular · Original poster · Calgary, AB Canada
Having successfully flashed a PC 8800GT (EVGA brand), today I decided to try overclocking it. I initially used a Windows program to alter the clocks on the fly until I found the upper limit for my card. I then edited the MP 8800 ROM image to alter the clock frequencies so the changes would remain permanent under both Windows and OSX. The stock Apple image has the clocks at Core: 600, Memory: 900, and Shader: 1500 MHz. I have mine running now at 700:1000:1500.

I can run Crysis under XP in Boot Camp at 740:1050:1500 with no problems, but at those settings I have trouble under OSX. It will boot and run the OS fine, but as soon as I start a game it plays for a few seconds and then locks up. This happens on both the Prey demo and UT 2004. Backing off to 700:1000:1500 runs UT 2004 with no problem; I haven't tried the Prey demo at the new numbers yet.

I've benchmarked a flyby in UT 2004 for the ATI 2600XT, NV 8800GT (stock), and NV 8800GT (OC). Results are in the attached PDF. I'm not sure why/how the ATI managed to score the max it did, because the difference between the ATI and the NV is huge. The average number gives a much more realistic feel for how the cards compare, i.e. it takes twice as long to complete the flyby on the ATI as it does on the NV.

I used Santaduck Toolpak V3 for the benchmark and NiBiTor to edit the ROM image.
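
For a rough sense of how big a bump that is, the percentage increase over the stock clocks works out as below (the figures are just the ones quoted above; nothing card-specific is assumed):

Code:
# Percentage overclock relative to the stock Apple image clocks (MHz).
stock = {"core": 600, "memory": 900, "shader": 1500}
mine  = {"core": 700, "memory": 1000, "shader": 1500}

for domain in stock:
    gain = (mine[domain] / stock[domain] - 1) * 100
    print(f"{domain:>6}: {stock[domain]} -> {mine[domain]} MHz  (+{gain:.1f}%)")
# core: +16.7%, memory: +11.1%, shader: +0.0%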
 

Attachments

  • UT 2004 BM.pdf (33.4 KB)
What were you using to OC in Windows?

I have had good success with CoolBits in the past, but on my own MP I installed Vista . . . so I tried nTune. It seemed to work OK, but if I try to get it to give me system performance information, I get the Blue Screen of Death.

Is there any other CoolBits-esque trick for Vista, other than nTune?

Stuart
 
Nice!

If you edit the firmware with NiBiTor under Windows, the changes stick when you reboot into OS X? Would you mind posting your modified firmware image?
 
I'm sure I'm preaching to the choir here, but remember when overclocking your video cards to make sure enough cooling airflow is passing through them... especially with any dual video card setup where the cards have less space for the heatsinks to do their job! :D
 
Having successfully flashed a PC 8800GT (EVGA brand), today I decided to try overclocking it. I initially used a Windows program to alter the clocks on the fly until I found the upper limit for my card. I then edited the MP 8800 ROM image to alter the clock frequencies so the changes would remain permanent under both Windows and OSX. The stock Apple image has the clocks at Core: 600, Memory: 900, and Shader: 1500 MHz. I have mine running now at 700:1000:1500.

I can run Crysis under XP in Boot Camp at 740:1050:1500 with no problems, but at those settings I have trouble under OSX. It will boot and run the OS fine, but as soon as I start a game it plays for a few seconds and then locks up. This happens on both the Prey demo and UT 2004. Backing off to 700:1000:1500 runs UT 2004 with no problem; I haven't tried the Prey demo at the new numbers yet.

I've benchmarked a flyby in UT 2004 for the ATI 2600XT, NV 8800GT (stock), and NV 8800GT (OC). Results are in the attached PDF. I'm not sure why/how the ATI managed to score the max it did, because the difference between the ATI and the NV is huge. The average number gives a much more realistic feel for how the cards compare, i.e. it takes twice as long to complete the flyby on the ATI as it does on the NV.

I used Santaduck Toolpak V3 for the benchmark and NiBiTor to edit the ROM image.
That is interesting; can NiBiTor see the EFI portion? I am trying to understand how editing the BIOS can affect the EFI portion of the ROM (where the clocks would have to be as well).
 
What were you using to OC in Windows?

I have had good success with CoolBits in the past, but on my own MP I installed Vista . . . so I tried nTune. It seemed to work OK, but if I try to get it to give me system performance information, I get the Blue Screen of Death.

Is there any other CoolBits-esque trick for Vista, other than nTune?

Stuart

For the on-the-fly testing phase I was using nTune. I don't recall any system performance information page in nTune, but since it was the first time I've used it, I may have missed it. I didn't get any blue screens under XP; if I pushed the limits too far I would get a screen freeze, with the only way out being a power cycle.
 
Nice!

If you edit the firmware with NiBiTor under Windows, the changes stick when you reboot into OS X? Would you mind posting your modified firmware image?

Based on the UT 2004 results comparing the stock and OC versions of my ROM image, my conclusion is that yes, the changes stick.

I don't mind posting the image (it will have to wait until this evening when I get home), but each card is different and there's no guarantee my numbers will work with another card. Your best bet would be to do as I did: find the limits of your card using the on-the-fly technique, and then use those numbers in your own custom image to make them permanent.
 
That is interesting; can NiBiTor see the EFI portion? I am trying to understand how editing the BIOS can affect the EFI portion of the ROM (where the clocks would have to be as well).

NiBiTor knows the EFI portion is there, but doesn't know what to do with it and doesn't provide any info or editing capability for that portion. Fortunately, it's smart enough to simply copy/preserve the EFI half when you make changes to the BIOS part. It also knows that the checksum is after the end of the EFI piece and correctly updates that as well.

I don't know for sure, but it would seem from my results that the EFI portion simply refers to the clocks in the BIOS portion, rather than having them duplicated.

After I made my first change to the image, I copied the original and modified images back to OSX and compared them in HexEdit. There were two bytes changed in the middle of the file, which represented the one clock change I had made at that point, plus the checksum at the very end. So NiBiTor is definitely only changing the BIOS half.
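
If anyone wants to repeat that comparison without a hex editor, a few lines of Python will list the byte offsets that differ between the two dumps (the file names below are placeholders for whatever you saved your original and modified images as):

Code:
# Compare the original and modified ROM dumps byte by byte and print
# the offsets that differ. File names are placeholders.
orig = open("8800GT_stock.rom", "rb").read()
mod = open("8800GT_oc.rom", "rb").read()

assert len(orig) == len(mod), "images should be the same size"
for offset, (a, b) in enumerate(zip(orig, mod)):
    if a != b:
        print(f"offset 0x{offset:06X}: {a:02X} -> {b:02X}")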
 
I'm sure I'm preaching to the choir here, but remember when overclocking your video cards to make sure enough cooling airflow is passing through them... especially with any dual video card setup where the cards have less space for the heatsinks to do their job! :D

NiBiTor also allows you to change the parameters for (video card) fan control. The way the Apple image comes set, an on-board IC adjusts the fan speed based on the current measured temperature. I did not change any settings in this area, since it's already driven by measured temperature.
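
Just to illustrate the general idea of temperature-based fan control (this is a purely generic sketch, not what the on-board IC or the Apple image actually uses):

Code:
# Generic temperature -> fan duty-cycle curve, purely illustrative.
# The actual on-board controller's points and behaviour are unknown here.
CURVE = [(40, 30), (60, 50), (80, 80), (90, 100)]  # (deg C, % duty)

def fan_duty(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two nearest points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(70))  # 65.0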
 
For the on-the-fly testing phase I was using nTune. I don't recall any system performance information page in nTune, but since it was the first time I've used it, I may have missed it. I didn't get any blue screens under XP; if I pushed the limits too far I would get a screen freeze, with the only way out being a power cycle.
I was getting the BSoD without trying to overclock; there was just an option in ForceWare for "Performance" information which, when selected, would kill everything. It gave me the BSoD twice, so I got rid of nTune.

I hadn't adjusted the settings on the cards at all. But nTune is an old program relative to Vista, and I'm running 64-bit Vista, so who knows.


Stuart
 
Are both of these programs available for download? I'm especially interested in the OSX-side OC. For the record, every 8800GT ever made will go to 700 MHz.

Also, because you can apparently even edit the fan speed, the AKIMBO cooler from EVGA would fit with no trouble on the Mac 8800GT, and it does a much better job of cooling than its single-slot counterpart. It's like $40, and should let everyone squeeze out a bit more performance while keeping the temps much more respectable.
 
Having successfully flashed a PC 8800GT (EVGA brand), today I decided to try overclocking it. I initially used a Windows program to alter the clocks on the fly until I found the upper limit for my card. I then edited the MP 8800 ROM image to alter the clock frequencies so the changes would remain permanent under both Windows and OSX. The stock Apple image has the clocks at Core: 600, Memory: 900, and Shader: 1500 MHz. I have mine running now at 700:1000:1500.

I can run Crysis under XP in Boot Camp at 740:1050:1500 with no problems, but at those settings I have trouble under OSX. It will boot and run the OS fine, but as soon as I start a game it plays for a few seconds and then locks up. This happens on both the Prey demo and UT 2004. Backing off to 700:1000:1500 runs UT 2004 with no problem; I haven't tried the Prey demo at the new numbers yet.

I've benchmarked a flyby in UT 2004 for the ATI 2600XT, NV 8800GT (stock), and NV 8800GT (OC). Results are in the attached PDF. I'm not sure why/how the ATI managed to score the max it did, because the difference between the ATI and the NV is huge. The average number gives a much more realistic feel for how the cards compare, i.e. it takes twice as long to complete the flyby on the ATI as it does on the NV.

I used Santaduck Toolpak V3 for the benchmark and NiBiTor to edit the ROM image.

Can you post a comparison of the stock nvidia vs OC nvidia in Crysis?

I don't think OC'ing is worth it in games like UT 2004, where you already have a 200+ fps average, though I understand that even with a game like Crysis the improvement may be minimal at best.

Thanks
 
Are both of these programs available for download? I'm especially interested in the OSX-side OC. For the record, every 8800GT ever made will go to 700 MHz.

Also, because you can apparently even edit the fan speed, the AKIMBO cooler from EVGA would fit with no trouble on the Mac 8800GT, and it does a much better job of cooling than its single-slot counterpart. It's like $40, and should let everyone squeeze out a bit more performance while keeping the temps much more respectable.

Yes, you can get nTune here and NiBiTor from here.
 
Can you post a comparison of the stock nvidia vs OC nvidia in Crysis?

I don't think OC'ing is worth it in games like UT 2004, where you already have a 200+ fps average, though I understand that even with a game like Crysis the improvement may be minimal at best.

Thanks

Actually, it's more like 15%. Here are the results:

Code:
Running GPU benchmark 1
Results will depend on current system settings

 
32 bit, DX9, Med Spec, Build 5879, Level=island - Stock clock
==============================================================
!TimeDemo Run 1 Finished.
    Play Time: 51.99s, Average FPS: 38.47
    Min FPS: 31.69 at frame 571, Max FPS: 51.71 at frame 94
    Average Tri/Sec: 35277488, Tri/Frame: 917057
    Recorded/Played Tris ratio: 1.00
==============================================================


32 bit, DX9, Med Spec, Build 5879, Level=island - Overclock
==============================================================
!TimeDemo Run 1 Finished.
    Play Time: 45.04s, Average FPS: 44.41
    Min FPS: 36.51 at frame 561, Max FPS: 59.55 at frame 84
    Average Tri/Sec: 40724904, Tri/Frame: 917071
    Recorded/Played Tris ratio: 1.00
==============================================================
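
For what it's worth, the ~15% figure falls straight out of the averages above:

Code:
# Speed-up computed from the two benchmark runs quoted above.
stock_fps, oc_fps = 38.47, 44.41
stock_time, oc_time = 51.99, 45.04

print(f"FPS gain:   {(oc_fps / stock_fps - 1) * 100:.1f}%")    # ~15.4%
print(f"Time saved: {(1 - oc_time / stock_time) * 100:.1f}%")  # ~13.4%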
 
Actually, it's more like 15%. Here are the results:

Code:
Running GPU benchmark 1
Results will depend on current system settings

 
32 bit, DX9, Med Spec, Build 5879, Level=island - Stock clock
==============================================================
!TimeDemo Run 1 Finished.
    Play Time: 51.99s, Average FPS: 38.47
    Min FPS: 31.69 at frame 571, Max FPS: 51.71 at frame 94
    Average Tri/Sec: 35277488, Tri/Frame: 917057
    Recorded/Played Tris ratio: 1.00
==============================================================


32 bit, DX9, Med Spec, Build 5879, Level=island - Overclock
==============================================================
!TimeDemo Run 1 Finished.
    Play Time: 45.04s, Average FPS: 44.41
    Min FPS: 36.51 at frame 561, Max FPS: 59.55 at frame 84
    Average Tri/Sec: 40724904, Tri/Frame: 917071
    Recorded/Played Tris ratio: 1.00
==============================================================

Oh, pretty nice! I guess I will be OCing then :D
 
Actually, it's more like 15%. Here are the results:

Code:
Running GPU benchmark 1
Results will depend on current system settings

I have my card running at 700/1000. Yay!

What benchmark settings did you use for the results that you posted? It defaults to 800 x 600, which seems very low.
 
I have my card running at 700/1000. Yay!

What benchmark settings did you use for the results that you posted? It defaults to 800 x 600, which seems very low.

The header line shows the settings (Medium for everything), but I left out the screen resolution. I ran both my tests at 1920x1200, which is the native resolution for my monitor.
 
The header line shows the settings (Medium for everything), but I left out the screen resolution. I ran both my tests at 1920x1200, which is the native resolution for my monitor.

Ok -- I guessed the same, and ran a test at exactly those settings:

Code:
2/27/2008 8:20:34 PM - XP
Beginning Run #1 on Map-island, Demo-benchmark_gpu
DX9 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=3, Time Of Day= 9
Global Game Quality: Medium
 ==============================================================
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 50.39s, Average FPS: 39.69
    Min FPS: 38.86 at frame 1944, Max FPS: 55.88 at frame 1017
    Average Tri/Sec: 35689000, Tri/Frame: 899269
    Recorded/Played Tris ratio: 1.02
!TimeDemo Run 1 Finished.
    Play Time: 40.25s, Average FPS: 49.68
    Min FPS: 38.86 at frame 1944, Max FPS: 62.13 at frame 109
    Average Tri/Sec: 45159692, Tri/Frame: 908935
    Recorded/Played Tris ratio: 1.01
!TimeDemo Run 2 Finished.
    Play Time: 40.29s, Average FPS: 49.64
    Min FPS: 38.86 at frame 1944, Max FPS: 62.13 at frame 109
    Average Tri/Sec: 45142264, Tri/Frame: 909357
    Recorded/Played Tris ratio: 1.01
TimeDemo Play Ended, (3 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

2/27/2008 8:20:34 PM - XP

Run #1- DX9 1900x1200 AA=No AA, 32 bit test, Quality: Medium ~~ Overall Average FPS: 49.66

It seems that the first run is always the slowest. I used CrysisBenchmarkTool1.05, which automates the runs.
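
Judging by the summary, it also looks like the tool drops that slow warm-up run from the overall average; a quick check with the numbers from the log above:

Code:
# Does the "Overall Average FPS" skip the warm-up run? Check with the log values.
runs = [39.69, 49.68, 49.64]           # average FPS of runs 0, 1, 2
print(f"{sum(runs) / len(runs):.2f}")  # 46.34 -- all three runs
print(f"{sum(runs[1:]) / 2:.2f}")      # 49.66 -- runs 1 and 2 only, matches the summary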
 
I tried going through with NiBiTor and editing the clocks on the card to 700:1500:1000, but when I booted up into Vista it kept reporting the standard settings. Should I save the BIOS without the checksum, or do anything special?
 
I tried going through with NiBiTor and editing the clocks on the card to 700:1500:1000, but when I booted up into Vista it kept reporting the standard settings. Should I save the BIOS without the checksum, or do anything special?

Has anyone tried upping the shader clock?
 
I have mine currently running at 700:1600:1000. I figured out the issue was the naming of the file. Stupid DOS limitations.
 