
xpcker

macrumors regular
Original poster
Oct 25, 2010
129
0
Hi, has anyone done this? I've overclocked my card in Windows, but never on a Mac. My concern is the clock-speed info Zeus gives me:

Unhandled init script entry with id '0x8c' at 0xda33
Unhandled init script entry with id '0x8c' at 0xda33
-- General info --
Card: GT216 [GeForce GT 330M]
Architecture: GA5 A2
PCI id: 0x0a29
Subvendor id: 0x106b
GPU clock: -8589935.000 MHz
Bustype: PCI-Express

-- Shader info --
Clock: -8589935.000 MHz
Stream units: 48 (00000011b)
ROP units: 8 (00000011b)
-- Memory info --
Amount: 256 MB
Type: 128 bit DDR3
Clock: -8589935.000 MHz

-- PCI-Express info --
Current Rate: 16X
Maximum rate: 16X

-- Smartdimmer info --
Backlight level: 100%

-- Sensor info --
Sensor: G84 GPU Internal Sensor
GPU temperature: 65C

-- VideoBios information --
Version: 70.16.58.0a.00
Signon message: GT216 Apple K18 VGA BIOS
VID mask: 14
Voltage level 0: 0.01V, VID: 0
Voltage level 1: 0.00V, VID: ff
Voltage level 2: 0.15V, VID: 0

---------

What about those clocks? They don't look real to me... What values should I put for the memory clock and the GPU clock?


Thank you all!
 
The voltage readings are also wrong.
I wouldn't trust an app that does such a terrible job of reading out data to know how to properly apply new settings. Why overclock in OS X anyway? Do you play games in OS X?
I'd steer clear of it.
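For what it's worth, a reading like -8589935.000 MHz looks like a failed register read being reinterpreted as a signed number. A purely hypothetical sketch of how that can happen, where the raw value is made up solely to reproduce the number in the log and is not a real GPU register read:

```python
# Hypothetical sketch: a bogus 64-bit "clock register" value (in kHz)
# reinterpreted as a two's-complement signed integer, then scaled to MHz.
raw = 0xFFFFFFFDFFFFFE68            # made-up garbage read, not real hardware data

# Reinterpret the unsigned 64-bit bit pattern as signed.
signed_khz = raw - (1 << 64) if raw >= (1 << 63) else raw

mhz = signed_khz / 1000.0           # kHz -> MHz
print(f"GPU clock: {mhz:.3f} MHz")  # -> GPU clock: -8589935.000 MHz
```

Whatever the actual cause inside Zeus, a negative clock means the readout failed, so any values the tool would write back are equally untrustworthy.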
 
I do play a lot of SC2. Yesterday in a game I got 15 fps with everything on low; I'm going mad... 8 GB of RAM, an SSD, and a 2010 MBP, and it's still not a good deal.

But I won't sell it for anything. I love the Mac, and I'm never going back to anything that ends with -dows :D
 