You do realize what this means, don't you?
POWERBOOK G5 NEXT TUESDAY!
-Clive
HAHAHAHAHAHAHA! LOL!
That's hilarious!!! Maybe you have already responded since someone else asked and I haven't read the thread entirely - but how did you do that? So cool!
That thing needs a case redesign anyway, so it can finally have user-replaceable hard drives. I've never understood why they continue to mostly use ATi.
ATI chips produce a better picture than nVidia's. That's why!
Think I responded to this before... no, they don't. It just depends on the quality of the components used.
Even in spite of the ATI/AMD merger? (i.e., AMD and Intel are rivals)
Even if Apple keeps the existing coolers and underclocks the HD 2600 (XT?) so it only uses the same amount of power as the X1600, it is still going to completely decimate the X1600. If Apple can get them, they would be stupid not to use them.
I'm expecting a minor (feature) update and not a major (case) redesign...
Isn't the X1600 in the MBP already underclocked?
A small underclock makes a huge difference in TDP, and since the performance per watt of the HD 2600 is supposed to be far, far superior to that of the X1600 (I've seen a few slides on the net somewhere), Apple should use it, no question.
Personally, I think Apple will switch to nVidia GPUs.
Then tell me why the 15-incher has "always" used ATI, and why the 17-incher *dropped* nVidia as soon as possible: it only had nVidia in the first revision (together with the 12-incher), and the 15" and 17" have used ATI ever since.
Apple's Pro laptops have ATI written all over them.
New GPUs would be great. 10-20% more performance in synthetic benchmarks. Don't expect DX10 support to be decent on anything less than the 8800/HD 2900 series, though.

I know they've used ATI for a long time now and dropped nVidia. I just wanted to share some thoughts on why a switch could still be possible. It's not a big deal to use a different GPU, and a change in layout has to be done anyway.
However, let's hope for the best: that Apple upgrades the GPU in the MacBook Pro to either the HD 2600 or the GeForce 8M series. The MacBook Pro really needs an upgraded GPU.
The Core 2 models clocked up higher from idle when under load.

Yeah, I believe it's only like 10% or something. The point is that even if the HD 2600 at its recommended clocks draws four times as much juice as the X1600, or about 70W (I'm sure the actual power usage is much, much less than this, but this is hypothetical to make the maths easier), then if Apple halves the clock speeds and drops the voltage accordingly, the power requirements should fall to about 17W, the TDP of the X1600. The card should then perform about half as well as it does at full speed. But it is still going to run rings around the X1600.
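To make that arithmetic concrete, here's a back-of-envelope sketch. The 70W figure is the hypothetical from the post, the 0.7x voltage drop is purely an illustrative assumption (not an HD 2600 spec), and the model is just the usual dynamic-power rule of thumb, P ~ C * V^2 * f:

```python
# Back-of-envelope check of the underclocking argument above.
# Assumptions: dynamic power scales roughly as P ~ C * V^2 * f, and a lower
# clock also lets the core voltage come down. The 70 W baseline is the
# hypothetical number from the post, not a real HD 2600 figure.

def scaled_power(base_watts, freq_scale, volt_scale):
    """Estimate power after scaling clock and voltage (P ~ f * V^2)."""
    return base_watts * freq_scale * volt_scale ** 2

base = 70.0  # hypothetical full-speed draw (W)

half_clock_only = scaled_power(base, freq_scale=0.5, volt_scale=1.0)   # ~35 W
half_clock_vdrop = scaled_power(base, freq_scale=0.5, volt_scale=0.7)  # ~17 W

print(f"half clock, same voltage: ~{half_clock_only:.0f} W")
print(f"half clock, 0.7x voltage: ~{half_clock_vdrop:.0f} W (about the X1600's TDP)")
```

So the ~17W figure only works out if the voltage comes down with the clock; halving the clock alone would still leave roughly 35W.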
The HD 2600 should produce much less heat and provide far better performance than the 8600M.
The 8600M GS has 16 stream processors. The 8600 GT has 32, clocked a little slower. How many does the HD 2600 have, you ask? 120. That's right: nearly four times as many as the 8600 GT, and seven and a half times as many as the GS.
If you don't see the difference, does it mean nobody else will see it either?
To my eyes, ATI's picture is better than nVidia's. And to my ears, Genelec's monitors produce more accurate sound than Mackie's. As long as I can see or hear the difference, I'll be making choices based on my own experience about things.
And I will say it a third time: you are WRONG. If you're seeing a difference, it's because the Nvidia parts you've seen are using cheaper components. The only other possibility is that, to your eye, the default color settings, etc. look better on ATi's drivers.
Unfortunately he is right in some ways. Older nVidia cards couldn't hold a candle to the quality of ATi cards in 3D. ATi had the upper hand from the 9700 up through the X800 or so. The nVidia 6800GT had graphics quality almost on par but not as good, and even the 7900 lagged behind with default settings. This was because nVidia lowered graphics quality to increase speed. With the 8800 series, though, nVidia is as good as ATi.
The Core 2 models clocked up higher from idle when under load.
http://barefeats.com/mbcd9.html
That's ~70W on the desktop variant.
Doubtful. The 2900 uses far more power than a (more powerful) 8800GTX. More than likely that applies to the scaled-down parts as well.
You're not comparing equivalent things. The processors on ATi's cards aren't nearly as full-featured or powerful as those on Nvidia's cards. Comparing the number is no more meaningful than comparing clock speeds. The 2900 has 320 (or thereabouts) processors, and it's destroyed by the 8800GTX's 128 processors, all while using less power.
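To illustrate why raw stream-processor counts (or even paper GFLOPS) don't compare across architectures, here's a rough peak-throughput estimate using the commonly quoted desktop specs; the clocks and the 2-FLOPs-per-MADD assumption are approximations, and real-world utilisation of ATi's 5-wide units is much lower than the peak suggests:

```python
# Rough peak-throughput estimate: ALUs * FLOPs per ALU per clock * shader clock.
# Clock speeds and ALU counts are the commonly quoted desktop figures (approximate),
# counting a MADD as 2 FLOPs. Treat the results as paper upper bounds only.

def peak_gflops(alus, flops_per_alu_per_clock, clock_mhz):
    return alus * flops_per_alu_per_clock * clock_mhz / 1000.0

gtx_8800 = peak_gflops(128, 2, 1350)   # 128 scalar ALUs at ~1.35 GHz shader clock
hd_2900 = peak_gflops(320, 2, 742)     # 320 ALUs (64 five-wide units) at ~742 MHz

print(f"8800 GTX paper peak:   ~{gtx_8800:.0f} GFLOPS")
print(f"HD 2900 XT paper peak: ~{hd_2900:.0f} GFLOPS")
# On paper the 2900 looks ahead, yet in games the 8800 GTX usually wins,
# which is exactly why counting ALUs across architectures tells you little.
```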
The 8800GTX is made on a 90nm process; the HD 2900 and the 8600M are made on 80nm. The HD 2600 is made on a 65nm process. Smaller processes give very large improvements to power usage.
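For a rough sense of how much a shrink can buy (purely illustrative scaling factors, not measured RV630 numbers): a full-node shrink typically reduces both switched capacitance and operating voltage, and those savings compound in the same dynamic-power relation used above:

```python
# Illustrative only: how a process shrink feeds into dynamic power (P ~ C * V^2 * f).
# The 0.7x capacitance and 0.9x voltage factors are generic shrink assumptions,
# not measured 65 nm RV630 (HD 2600) numbers.

cap_scale, volt_scale = 0.7, 0.9
power_scale = cap_scale * volt_scale ** 2   # ~0.57x at the same clock

print(f"Same clock, after the assumed shrink: ~{power_scale:.2f}x the dynamic power")
```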
Link for proof of the difference between stream processors?

:lol: I shouldn't have to be the one providing a link here.
Finally, stop using comparisons between the 2900 and 8800 to justify drawing comparisons between the 2600 and 8600. The 2900 and 8800 are designed for the bragging rights of the companies and the people who buy them; the 2600 and 8600 are designed to be used by normal people.
The Mobility HD 2600 is not just a scaled-down version of the HD 2900, but it is based on the same technology. I believe it will have higher performance per watt than the 8600M.
Are you referring to the 9800 series?

That thing needs a case redesign anyway, so it can finally have user-replaceable hard drives.
That said, there's no reason an 8600GT wouldn't work if an x2600 did. Though the x2600 will likely draw more power when it actually launches.
I hope they finally just ditch ATi, as they've trailed Nvidia since the very first notebook GPU was launched (only briefly catching up during one generation). Plus I can't stand ATi's paper launches. (Or terrible drivers, though that's not really relevant on OS X I suppose.) I've never understood why they continue to mostly use ATi.
Are you referring to the 9800 series?