Can they legally call it a 2600 XT then? Apple underclocked it to Pro speeds. Any legal eagles out there wanna chime in on this? I'm of the opinion they shouldn't be calling it a 2600 XT if it doesn't perform at 2600 XT speeds.

This has me curious. I wonder what my "8800GT" is actually clocked at...

I'm wondering about this too, the legality of it all.

I may just cancel my 8800 and get the 9800GX2. Can't use it in OSX, but at least I'd get the advertised speeds...
 
I'm wondering about this too, the legality of it all.

I may just cancel my 8800 and get the 9800GX2. Can't use it in OSX, but at least I'd get the advertised speeds...

Will the 9800GX2 work on a Mac in windows? Does the card need any hardware/motherboard support like Nvidia MCP's, or North/Southbridge?
Has Nvidia even released details about compatibility yet? I'll get one too. :D
 
Will the 9800GX2 work on a Mac in windows? Does the card need any hardware/motherboard support like Nvidia MCP's, or North/Southbridge?
Has Nvidia even released details about compatibility yet? I'll get one too. :D

It's internal SLI, so it should work with any motherboard (ATi's similar dual GPU solution, the 3870X2, works on any mobo)

Most details about the card are rumors with nothing finalized, but it should debut at the upcoming CeBIT event, and may go on sale by March 11th.

It may be smarter to wait until later this year though when nvidia and ati release their true next gen GPUs...

Anyway, this thread has all the rumors:

http://www.hardforum.com/showthread.php?t=1259450
 
Apple has a history of underclocking cards. I couldn't say why, but I suppose it could be to fit inside a power or thermal envelope (unlikely). They can market the card as whatever they like because it is the Mac version of the card. The PC version of the 2600 and the Mac version of the 2600 don't have much in common firmware-wise. The same is true for the GT. However, I would totally recommend upping the clock and memory speeds. The stock GT is a 600 MHz core clock and 1800 MHz (effective) memory clock.

Here's what you should check out to make sure it gets up to that without issue, since the 8800 GT runs hotter than I'd like anyway:

http://www.evga.com/products/moreinfo.asp?pn=202-F2-EV03-A1

Should keep your GT nice and cool :D
 
Apple has a history of underclocking cards. I couldn't say why, but I suppose it could be to fit inside a power or thermal envelope (unlikely).

That would be my guess, but there was a quote I saw that alluded to ATI basically treating the Mac versions of the graphics cards as an entirely different product, and because there was lower demand for them they offered slower versions.
 
Am I the only one for whom that doesn't make any sense? After all, there's an HD 2400 below the HD 2600 anyway that uses even less power, costs less, and can be passively cooled, so why bother with a more expensive chip and then underclock it considerably?

Take the hint, Apple: just offer that lower model for extra cards/screens and keep the clock rates normal, rather than this.
 
Apple downclocked it to meet their power requirements for 4 slot operation.

Raising it in the slots that can handle it shouldn't be too big a problem -- though those words usually melt silicon.
What? A PCIe slot can only put out 75W (so four slots can do 300W). If you need more than 75W, you hook up a 6- or 8-pin power connector, which is good for up to another 150W per plug. If you still need more power and have godly 12V rails, get a Molex-to-PCIe 6- or 8-pin adaptor. But overall, the standard PCIe spec dictates that a card can pull no more than 225W from one slot and one PCIe power connector, so total power for PCIe could (technically) be 900W.
There is an upgraded spec that allows 150W drawn from the slot plus 150W from the power plug, but very few mobos have that, and I don't know of any cards designed to take advantage of it.
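If anyone wants the arithmetic spelled out, here's a quick sketch of that budget math. The 75W-per-slot and 150W-per-plug figures are taken from this post as given, not quoted from the spec itself:

```python
# Rough PCIe power-budget arithmetic using the figures from the post above
# (75W per slot, 150W per auxiliary plug -- assumptions, not official spec text).

SLOT_WATTS = 75        # max draw from a single PCIe slot
PLUG_WATTS = 150       # max draw per 6/8-pin auxiliary connector (per the post)
SLOTS = 4              # a four-slot Mac Pro-style setup

per_card_cap = SLOT_WATTS + PLUG_WATTS   # 225W per card (one slot + one plug)
total_cap = SLOTS * per_card_cap         # 900W across four slots, "technically"

print(f"Per-card cap: {per_card_cap} W")     # 225 W
print(f"Four-slot total: {total_cap} W")     # 900 W
```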
That would be my guess, but there was a quote I saw that alluded to ATI basically treating the Mac versions of the graphics cards as an entirely different product, and because there was lower demand for them they offered slower versions.
ATI gives Apple what they ask for. In most cases Apple asks for a slower part: one, because it's cheaper and they get a higher profit margin from it; two, because they're worried about noise (and, in the iMacs and notebooks, power/heat). It would be much easier for ATI to just give Apple what they use for PCs. Oh, and AFAIK Apple writes its own graphics drivers. Period. If ATI did write Apple's graphics drivers you would be able to find them on ATI's website; the most recent card they have OS X drivers for is the G5 version of the X1900.
 
Apple has a history of underclocking cards. I couldn't say why, but I suppose it could be to fit inside a power or thermal envelope (unlikely).

That would be my guess, but there was a quote I saw that alluded to ATI basically treating the Mac versions of the graphics cards as an entirely different product, and because there was lower demand for them they offered slower versions.

This is what AnandTech has to say about it...

"Historically, ATI Mac Edition cards have always been clocked lower than their PC counterparts; ATI explained the reasoning behind this disparity as having to do with basic supply and demand. The demand for Mac video cards is lower than their PC counterparts, so ATI runs them at lower clock speeds to maintain their desired profit per card regardless of whether they are selling to Mac or PC markets."
 
This is what AnandTech has to say about it...

"Historically, ATI Mac Edition cards have always been clocked lower than their PC counterparts; ATI explained the reasoning behind this disparity as having to do with basic supply and demand. The demand for Mac video cards is lower than their PC counterparts, so ATI runs them at lower clock speeds to maintain their desired profit per card regardless of whether they are selling to Mac or PC markets."

Ah that's the quote I was thinking of.
 
It's internal SLI, so it should work with any motherboard (ATi's similar dual GPU solution, the 3870X2, works on any mobo)

Most details about the card are rumors with nothing finalized, but it should debut at the upcoming CeBIT event, and may go on sale by March 11th.

It may be smarter to wait until later this year though when nvidia and ati release their true next gen GPUs...

Anyway, this thread has all the rumors:

http://www.hardforum.com/showthread.php?t=1259450



I read that about March 11th. I'm thinking I will have already beaten the game by then. Well, there's always Mass Effect, but I doubt it will have the hyper-realism problems that Crysis has. It still might be worth having the card. If this is the trend games are taking, and I hope it is, Apple will just have to start improving their graphics options.
 
Apple has a history of underclocking cards. I couldn't say why, but I suppose it could be to fit inside a power or thermal envelope (unlikely). They can market the card as whatever they like because it is the Mac version of the card. The PC version of the 2600 and the Mac version of the 2600 don't have much in common firmware-wise. The same is true for the GT. However, I would totally recommend upping the clock and memory speeds. The stock GT is a 600 MHz core clock and 1800 MHz (effective) memory clock.

Here's what you should check out to make sure it gets up to that without issue, since the 8800 GT runs hotter than I'd like anyway:

http://www.evga.com/products/moreinfo.asp?pn=202-F2-EV03-A1

Should keep your GT nice and cool :D


I'm actually at work today... Imagine that.

But could someone look at their 2600 (XT) running under Boot Camp with the ATI tool and see what the clock rate is under Windows? I seem to remember seeing 800/1000, not the lower rates being talked about here.
 
I'm willing to bet that these cards are pass-overs for PC versions of the 2600 XT that failed the clock and RAM speed tests and were thus destined for 2400s or the mac version of the 2600.

there is no "legal" reason that they have to give you the same level of performance with the name 2600xt.

in fact, ATI has always put out carefully-named PC cards that sounded faster but were actually slower...for example, putting out a 9550 Pro 256 and selling it for the same price as a 9600 Pro 128. People bought the 9550 because it had "TWICE THE MEMORY!" but it was so under-clocked that it might as well have been a 9200. Since there are only a smattering of cards for the Mac, they can name any card whatever they want.

They can turn off pipelines, sub in slow RAM, reduce clock speed, anything they like...They could put out a modified version of the 2400 XT and call it a 3870 XTR Mac Edition if they wanted.

look at the specs and you can tell how good the card is:

memory speed
gpu speed
gpu bus width
memory bus width
transistor count
memory amount
memory type


compare those things on any two given graphics cards and you can tell which is better. The formula has held up since the original GeForce, so it works.

GPU speed and bus width are the most important factors, followed by memory speed and bus width, then transistor count and memory amount. Below a certain critical amount, memory amount is most important, but generally cards with too little VRAM are not sold. The exception is those TurboCache-style cards that borrow RAM from the motherboard... but they are barely better than integrated graphics.
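Just to make that rule of thumb concrete, here's a rough sketch of the comparison as code. The weights are arbitrary placeholders for "GPU speed/bus first, then memory speed/bus, then transistor count and memory amount", and the card numbers are only approximate -- as the replies below point out, this kind of comparison really only holds within one architecture family:

```python
# Toy version of the spec-comparison heuristic described above.
# The weights are made up for illustration; this is not a benchmark.

def rough_score(gpu_mhz, gpu_bus_bits, mem_mhz, mem_bus_bits, transistors_m, mem_mb):
    """Weight the specs in the order of importance given in the post."""
    return (2.0 * gpu_mhz * gpu_bus_bits      # GPU speed and bus width first
            + 1.0 * mem_mhz * mem_bus_bits    # then memory speed and bus width
            + 0.5 * transistors_m             # then transistor count
            + 0.1 * mem_mb)                   # and memory amount

# Approximate, illustrative numbers: a stock HD 2600 XT vs. the lower
# Mac clocks someone reports later in this thread (695/792).
stock_2600xt = rough_score(800, 128, 1100, 128, 390, 256)
mac_2600xt   = rough_score(695, 128, 792, 128, 390, 256)
print(stock_2600xt > mac_2600xt)   # True -- the underclocked card scores lower
```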
 
I'm willing to bet that these cards are pass-overs for PC versions of the 2600 XT that failed the clock and RAM speed tests and were thus destined for 2400s or the mac version of the 2600.

there is no "legal" reason that they have to give you the same level of performance with the name 2600xt.

in fact, ATI has always put out carefully-named PC cards that sounded faster but were actually slower...for example, putting out a 9550 Pro 256 and selling it for the same price as a 9600 Pro 128. People bought the 9550 because it had "TWICE THE MEMORY!" but it was so under-clocked that it might as well have been a 9200. Since there are only a smattering of cards for the Mac, they can name any card whatever they want.

They can turn off pipelines, sub in slow RAM, reduce clock speed, anything they like...They could put out a modified version of the 2400 XT and call it a 3870 XTR Mac Edition if they wanted.

look at the specs and you can tell how good the card is:

memory speed
gpu speed
gpu bus width
memory bus width
transistor count
memory amount
memory type


compare those things on any two given graphics cards and you can tell which is better. The formula has held up since the original GeForce, so it works.

GPU speed and bus width are the most important factors, followed by memory speed and bus width, then transistor count and memory amount. Below a certain critical amount, memory amount is most important, but generally cards with too little VRAM are not sold. The exception is those TurboCache-style cards that borrow RAM from the motherboard... but they are barely better than integrated graphics.

Don't forget how many Quads are active (Pixel, Vertex, and Geometry Shaders). Oh and ROPs, those are important as well.
 
I'm willing to bet that these cards are pass-overs for PC versions of the 2600 XT that failed the clock and RAM speed tests and were thus destined for 2400s or the mac version of the 2600.

there is no "legal" reason that they have to give you the same level of performance with the name 2600xt.

in fact, ATI has always put out carefully-named PC cards that sounded faster but were actually slower...for example, putting out a 9550 Pro 256 and selling it for the same price as a 9600 Pro 128. People bought the 9550 because it had "TWICE THE MEMORY!" but it was so under-clocked that it might as well have been a 9200. Since there are only a smattering of cards for the Mac, they can name any card whatever they want.

They can turn off pipelines, sub in slow RAM, reduce clock speed, anything they like...They could put out a modified version of the 2400 XT and call it a 3870 XTR Mac Edition if they wanted.

look at the specs and you can tell how good the card is:

memory speed
gpu speed
gpu bus width
memory bus width
transistor count
memory amount
memory type


compare those things on any two given graphics cards and you can tell which is better. The formula has held up since the original GeForce, so it works.

GPU speed and bus width are the most important factors, followed by memory speed and bus width, then transistor count and memory amount. Below a certain critical amount, memory amount is most important, but generally cards with too little VRAM are not sold. The exception is those TurboCache-style cards that borrow RAM from the motherboard... but they are barely better than integrated graphics.

Ditch that model. Hang it up to dry with the MHz myth.

The unified nature of how graphics cards work these days couldn't care less about most of the things you listed. Sure, they are important in themselves, but you cannot take two graphics cards and compare them if they do not share the same underlying architecture.

What matters is how much work they can do every cycle, much like Instructions Per Cycle in the CPU world.
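To put a rough number on "work per cycle": theoretical shader throughput is roughly shader units × flops per unit per clock × shader clock. A quick sketch -- the unit counts and clocks are approximate, and the flops-per-clock figure is a simplified assumption, which is exactly where cross-architecture comparisons get murky:

```python
# Toy "work per cycle" comparison: theoretical shader throughput in GFLOPS.
# flops_per_unit_per_clock is a simplified per-architecture assumption --
# this is exactly where comparing different architectures breaks down.

def gflops(shader_units, shader_clock_mhz, flops_per_unit_per_clock):
    return shader_units * shader_clock_mhz * flops_per_unit_per_clock / 1000.0

# Approximate, illustrative numbers:
hd2600xt = gflops(120, 800, 2)    # 120 stream processors @ ~800 MHz, MADD = 2 flops
gf8800gt = gflops(112, 1500, 2)   # 112 scalar SPs @ ~1.5 GHz shader clock, MADD = 2 flops

print(f"HD 2600 XT ~{hd2600xt:.0f} GFLOPS, 8800 GT ~{gf8800gt:.0f} GFLOPS")
```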
 
Ditch that model. Hang it up to dry with the MHz myth.

The unified nature of how graphics cards work these days couldn't care less about most of the things you listed. Sure, they are important in themselves, but you cannot take two graphics cards and compare them if they do not share the same underlying architecture.

What matters is how much work they can do every cycle, much like Instructions Per Cycle in the CPU world.

Even then you have to be specific. Are you talking about Vec5 or scalar + Vec4, etc.? It really is more difficult nowadays. General rule of thumb? Compare cards within the same family. Otherwise you are not going to get a good representation of the hardware you are looking at.
 
could someone look at their 2600 (XT) running under Boot Camp with the ATI tool and see what the clock rate is under Windows? I seem to remember seeing 800/1000, not the lower rates being talked about here.

695.25/792
XP Pro SP2 32bit
 
This is what AnandTech has to say about it...

"Historically, ATI Mac Edition cards have always been clocked lower than their PC counterparts; ATI explained the reasoning behind this disparity as having to do with basic supply and demand. The demand for Mac video cards is lower than their PC counterparts, so ATI runs them at lower clock speeds to maintain their desired profit per card regardless of whether they are selling to Mac or PC markets."

So how do we get this sorted out?

Does anyone have the time and resources to write an open letter to Apple, and do some publicity around it?
 
So how do we get this sorted out?

Does anyone have the time and resources to write an open letter to Apple, and do some publicity around it?

I don't know.
It's been going on for years.
The X800 XT Mac Edition is also under-clocked for example.
 
Someone posted this in another thread.

It appears that Apple did NOT underclock the 8800.

I'll assume it was due to NVIDIA, who has a little more class.
 

Attachment: 8800stats.jpeg (46.1 KB)
If Apple were underclocking the card, would it show up in Windows?

I suspect so, since those settings are normally done directly on the card, kind of like the BIOS settings for the CPU, etc. But I have no idea if Apple is doing something behind the scenes to get around that, with tools that adjust it every time it boots.
 
I suspect so, since those settings are normally done directly on the card, kind of like the BIOS settings for the CPU, etc. But I have no idea if Apple is doing something behind the scenes to get around that, with tools that adjust it every time it boots.

I was wondering because in Windows you can overclock and underclock a card without ever touching the card's firmware. I just wasn't sure if the BIOS side allows the standard clocks while the EFI side runs it at a lower clock.
 
It's written into the card; it runs at stock speeds for everything. This is reassuring, though; I'll be ordering one soon. Vista gets worse and worse every day.
 
Yeah, that's not surprising considering the video cards in the MacBook Pros only come with half the RAM... and that's for the higher-end ones! It's Apple, but I guess they'd say it makes the battery last longer, when really it just saves them money.
 