
ISMPlus

macrumors regular
May 25, 2007
121
0
NYC
You do realize what this means, don't you?

POWERBOOK G5 NEXT TUESDAY!

[attached image: D40FA1.jpg]


-Clive

HAHAHAHAHAHAHA! LOL!

That's hilarious!!! Maybe you have already responded since someone else asked and I haven't read the thread entirely - but how did you do that? So cool!:D
 

JFreak

macrumors 68040
Jul 11, 2003
3,151
9
Tampere, Finland
That thing needs a case redesign anyway, so it can finally have user replaceable hard drives. (…) I've never understood why they continue to mostly use ATi.

Yep, the user-replaceable hard drive would be great, as would the magnetic lid closing mechanism. I'm just not sure they're making those changes now; even though the latter would likely be a piece of cake to implement, I bet they're waiting for a major case redesign before changing anything.

ATI chips produce a better picture than nVidia's. That's why ;)
 

Wolfpup

macrumors 68030
Sep 7, 2006
2,926
105
ATI chips produce a better picture than nVidia's. That's why ;)

Think I responded to this before...no they don't. Just depends on the quality of the components used, at least when we were dealing with analog connections. There might have been a bigger variance in Nvidia based boards back when only ATi made ATi boards.

Who's to say this isn't going to be a major case redesign? The time frame seems right to me, and the introduction of a new lighting system makes it seem a bit more likely to me too.

I'm guessing 33% chance. Real decisive there :D

I guess next Tuesday it is...hopefully.
 

JFreak

macrumors 68040
Jul 11, 2003
3,151
9
Tampere, Finland
Think I responded to this before...no they don't. Just depends on the quality of the components used

If you don't see the difference, does it mean nobody else will see it either?

To my eyes, ATI's picture is better than nVidia's. And to my ears, Genelec's monitors produce more accurate sound than Mackie's. As long as I can see or hear the difference, I'll be making choices based on my own experience about things.
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
Even in spite of the ATI/AMD merger? (i.e., AMD and Intel are rivals)

Intel tells Apple to use NVIDIA, Apple tells Intel to shove it.
Intel asks Apple to use NVIDIA, AMD has an extra arrow in their ongoing court case against Intel.
Intel keeps their mouth shut, and accepts that Apple will use whatever is best at the moment, everyone's happy. Especially us.

HD 2600 should produce much less heat, and provide far better performance than 8600M. ATI uses a 65nm manufacturing process; I think NVIDIA is still on 90nm, possibly 80nm, but neither comes close to 65nm.
The 8600M GS has 16 stream processors. The 8600 GT has 32, clocked a little slower. How many does the HD 2600 have, you ask? 120. That's right. Nearly four times as many as the 8600 GT, and seven and a half times as many as the GS. Leaked benchmarks suggest that this really shows in the scores too. Clock speeds are comparable.

Even if Apple keeps the existing coolers and underclocks the HD 2600 (XT?) so it only uses the same amount of power as the X1600, it is still going to completely decimate the X1600. If Apple can get them, they would be stupid not to use them.
 

JFreak

macrumors 68040
Jul 11, 2003
3,151
9
Tampere, Finland
Even if Apple keeps the existing coolers and underclocks the HD 2600 (XT?) so it only uses the same amount of power as the X1600, it is still going to completely decimate the X1600. If Apple can get them, they would be stupid not to use them.

Isn't the X1600 on MBP already underclocked?
 

a456

macrumors 6502a
Oct 5, 2005
882
0
I'm expecting a minor (feature) update and not a major (case) redesign...

If you're right, there's no reason they won't ship next week. If you're wrong, they'll wait until WWDC. Let's see what the next eleven days bring. :cool:
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
Isn't the X1600 on MBP already underclocked?

Yeah, I believe it's only underclocked by 10% or something. The point is this: even if the HD 2600 at its recommended clock drew four times as much juice as the X1600, or about 70W (I'm sure the actual power usage is much, much less than this, but this is hypothetical to make the maths easier), then if Apple halves the clock speeds, and power scales with roughly the square of the clock, the power requirement should drop to about 17W, the TDP of the X1600. The card would then perform about half as well as it does at full speed, but it is still going to run rings around the X1600.
A small underclock makes a huge difference in TDP, and since the performance per watt of the HD 2600 is supposed to be far, far superior to that of the X1600 (I've seen a few slides on the net somewhere), Apple should use it, no question.
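
Here's a rough sketch of that back-of-envelope arithmetic (the 70W and 17W figures are the hypothetical numbers above, not real specs, and the square-law scaling is just an assumption):

```python
# Back-of-envelope sketch of the power scaling above.
# Assumption: dynamic GPU power scales roughly with clock^2 (clock * voltage^2,
# with voltage lowered in step with the clock). Real chips won't follow this exactly.

def scaled_power(base_power_w: float, clock_scale: float, exponent: float = 2.0) -> float:
    """Estimate power draw after scaling the clock by `clock_scale`."""
    return base_power_w * clock_scale ** exponent

hypothetical_full_clock_w = 70.0          # hypothetical HD 2600 figure from above, not a real TDP
halved = scaled_power(hypothetical_full_clock_w, 0.5)
print(f"Halved clock: ~{halved:.1f} W")   # ~17.5 W, close to the 17W X1600 TDP quoted above
```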
 

JFreak

macrumors 68040
Jul 11, 2003
3,151
9
Tampere, Finland
A small underclock makes a huge difference in TDP, and since the performance per watt of the HD 2600 is supposed to be far, far superior to that of the X1600 (I've seen a few slides on the net somewhere), Apple should use it, no question.

It'd be no surprise if they did, but it still makes me wonder how low they'd have to clock it. Well, I guess it remains to be seen.
 

Keiichi

macrumors newbie
Jun 1, 2007
20
0
Personally I think Apple will switch to Nvidia GPUs, for several reasons. First of all, the new GeForce 8M series has hardware support for OpenGL 2.1, whereas the new Mobility Radeon, as far as I remember, only has hardware support for OpenGL 2.0. Apple doesn't care about DX features, but they do care about OpenGL. Furthermore, AMD bought ATI and Apple is bound to Intel, so Intel can put pressure on Apple not to use AMD GPUs, which IMO is probably really happening. I also read that AMD's Unified Video Decoder is still buggy compared to Nvidia's PureVideo. Availability can also be a reason to switch to Nvidia: AMD still has problems shipping their new mobile GPUs, or do you see any HD 2600-powered notebooks out there already?

I myself don't like Nvidia as much as the old ATI, but I think OpenGL 2.1 is a very strong argument for Nvidia, since Mac OS X's graphics features are built on OpenGL and Leopard will make use of the new features OpenGL 2.1 implements (if Tiger doesn't already - I don't know whether Tiger uses the 2.1 libraries).
 

JFreak

macrumors 68040
Jul 11, 2003
3,151
9
Tampere, Finland
Personally I think Apple will switch to Nvidia GPUs

Then tell me why the 15-incher has "always" had ATI in it, and why the 17-incher *dropped* nVidia as soon as possible; it only had nVidia in the first revision (together with the 12-incher), and for the 15/17 it has been ATI ever since.

Apple Pro laptops have ATI written all over them.
 

Keiichi

macrumors newbie
Jun 1, 2007
20
0
Then tell me why the 15-incher has "always" had ATI in it, and why the 17-incher *dropped* nVidia as soon as possible; it only had nVidia in the first revision (together with the 12-incher), and for the 15/17 it has been ATI ever since.

Apple Pro laptops have ATI written all over them.

I know they've used ATI for a long time now and dropped Nvidia. I just wanted to share some thoughts on why a switch could still be possible; it's not a big deal to use a different GPU, and a change in layout has to be done anyway.
However, let's hope for the best and that Apple upgrades the GPU in the MacBook Pro to either the HD 2600 or the GeForce 8M series. The MacBook Pro really needs an upgraded GPU.
 

Eidorian

macrumors Penryn
Mar 23, 2005
29,190
386
Indianapolis
I know they've used ATI for a long time now and dropped Nvidia. I just wanted to share some thoughts on why a switch could still be possible; it's not a big deal to use a different GPU, and a change in layout has to be done anyway.
However, let's hope for the best and that Apple upgrades the GPU in the MacBook Pro to either the HD 2600 or the GeForce 8M series. The MacBook Pro really needs an upgraded GPU.
New GPUs would be great: 10-20% more performance in synthetic benchmarks. Don't expect DX10 support to be decent on anything less than the 8800/HD2900 series, though.

Yeah, I believe it's only underclocked by 10% or something. The point is this: even if the HD 2600 at its recommended clock drew four times as much juice as the X1600, or about 70W (I'm sure the actual power usage is much, much less than this, but this is hypothetical to make the maths easier), then if Apple halves the clock speeds, and power scales with roughly the square of the clock, the power requirement should drop to about 17W, the TDP of the X1600. The card would then perform about half as well as it does at full speed, but it is still going to run rings around the X1600.
A small underclock makes a huge difference in TDP, and since the performance per watt of the HD 2600 is supposed to be far, far superior to that of the X1600 (I've seen a few slides on the net somewhere), Apple should use it, no question.
The Core 2 models clocked up higher from idle when under load.

http://barefeats.com/mbcd9.html

That's ~70W on the desktop variant.
 

Wolfpup

macrumors 68030
Sep 7, 2006
2,926
105
HD 2600 should produce much less heat, and provide far better performance than 8600M.

Doubtful. The 2900 uses far more power than a (more powerful) 8800GTX. More than likely that applies to the scaled-down parts as well.

The 8600M GS has 16 stream processors. The 8600 GT has 32, clocked a little slower. How many does the HD 2600 have, you ask? 120. That's right. Nearly four times as many as the 8600 GT, and seven and a half times as many as the GS.

You're not comparing equivalent things. The processors on ATi's cards aren't nearly as full-featured or powerful as those on Nvidia's cards, so comparing the raw number is no more meaningful than comparing clock speeds. The 2900 has 320 (or thereabouts) processors, and it's destroyed by the 8800GTX's 128 processors, all while using less power.
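
To put rough numbers on why the raw counts mislead (the unit counts and clocks below are commonly quoted public specs, used purely as assumptions for illustration, not figures from this thread):

```python
# Rough sketch: peak theoretical shader throughput for the two flagship parts.
# Figures are commonly quoted public specs, used here only to illustrate the point;
# treat them as assumptions rather than authoritative numbers.

def peak_gflops(num_alus: int, shader_clock_ghz: float, flops_per_alu: int = 2) -> float:
    """Peak throughput, counting a multiply-add as 2 flops per ALU per clock."""
    return num_alus * flops_per_alu * shader_clock_ghz

# GeForce 8800 GTX: 128 scalar ALUs on a fast dedicated shader clock (~1.35 GHz)
print(peak_gflops(128, 1.35))  # ~346 GFLOPS

# Radeon HD 2900 XT: 320 ALUs, but arranged as 64 five-wide VLIW units at ~0.74 GHz
print(peak_gflops(320, 0.74))  # ~474 GFLOPS on paper

# The 2900 XT looks ahead on paper, yet it generally loses in real games: the five-wide
# units are often only partly filled, and texture/ROP throughput matters too, so the raw
# "stream processor" count tells you very little on its own.
```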
 

Wolfpup

macrumors 68030
Sep 7, 2006
2,926
105
If you don't see the difference, does it mean nobody else will see it either?

To my eyes, ATI's picture is better than nVidia's. And to my ears, Genelec's monitors produce more accurate sound than Mackie's. As long as I can see or hear the difference, I'll be making choices based on my own experience about things.

And I will say it a third time. You are WRONG. If you're seeing a difference, it's because the Nvidia parts you've seen are using cheaper components. The only other possibility is that, to your eye, the default color settings, etc. look better on ATi's drivers.
 

TBi

macrumors 68030
Jul 26, 2005
2,583
6
Ireland
And I will say it a third time. You are WRONG. If you're seeing a difference, it's because the Nvidia parts you've seen are using cheaper components. The only other possibility is that, to your eye, the default color settings, etc. look better on ATi's drivers.

Unfortunately he is right in some ways. Older nVidia cards couldn't hold a candle to the quality of ATi cards in 3D. ATi had the upper hand from the 9700 up through the X800 or so. The nVidia 6800GT had graphics quality almost on par but not as good, and even the 7900 lagged behind with default settings. This was because nVidia lowered graphics quality to increase speed. With the 8800 series, though, nVidia is as good as ATi.

If we are talking about just 2D quality, then I think someone is taking a few too many placebos :D
 

Wolfpup

macrumors 68030
Sep 7, 2006
2,926
105
Unfortunately he is right in some ways. Older nVidia cards couldn't hold a candle to the quality of ATi cards in 3D. ATi had the upper hand from the 9700 up through the X800 or so. The nVidia 6800GT had graphics quality almost on par but not as good, and even the 7900 lagged behind with default settings. This was because nVidia lowered graphics quality to increase speed. With the 8800 series, though, nVidia is as good as ATi.

He was talking about 2D, presumably, and BOTH companies have always made optimizations in 3D to improve performance. Nvidia's at least are more transparent and can be disabled. ATi just has generic settings...some name I can't think of right now. Catalyst A.I. or something?

Anyway, I have no real problem with optimizations; if you look at the consoles, they're cutting corners all over the place, a lot worse than the optimizations Nvidia or ATi do on PC games.
 

PCMacUser

macrumors 68000
Jan 13, 2005
1,702
23
Unfortunately he is right in some ways. Older nVidia cards couldn't hold a candle to the quality of ATi cards in 3D. ATi had the upper hand from the 9700 up through the X800 or so. The nVidia 6800GT had graphics quality almost on par but not as good, and even the 7900 lagged behind with default settings. This was because nVidia lowered graphics quality to increase speed. With the 8800 series, though, nVidia is as good as ATi.

In fact, the 8800 series is now better than ATI for 3D quality. I think Apple should go with the best, and right now that is nVidia. Perhaps in another year ATI will catch up - that seems to be the name of the game when it comes to AMD these days.

If Apple do decide on ATI/AMD, then we will know that it is not for performance and energy reasons, but perhaps due to some contractual arrangement they have made... or maybe because of cheaper prices?
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
The Core 2 models clocked up higher from idle when under load.

http://barefeats.com/mbcd9.html

That's ~70W on the desktop variant.

Yup yup, the 70W was just to make the maths easier. I wasn't actually saying that the Mobility HD 2600 would have a TDP of 70W... It will certainly be much, much less than this.
Ah well, doesn't matter.

Doubtful. The 2900 uses far more power than a (more powerful) 8800GTX. More than likely that applies to the scaled-down parts as well.

The HD 2900 is made on a 90nm process, as are the 8800GTX and 8600M. The HD 2600 is made on a 65nm process. Smaller processes give very large improvements in power usage.

You're not comparing equivalent things. The processors on ATi's cards aren't nearly as full-featured or powerful as those on Nvidia's cards, so comparing the raw number is no more meaningful than comparing clock speeds. The 2900 has 320 (or thereabouts) processors, and it's destroyed by the 8800GTX's 128 processors, all while using less power.

Link for proof of the difference between stream processors?

Finally, stop using comparisons between the 2900 and 8800 to justify drawing comparisons between the 2600 and 8600. The 2900 and 8800 are designed for bragging rights, for the companies and the people who buy them. The 2600 and 8600 are designed to be used by normal people. The Mobility HD 2600 is not just a scaled-down version of the HD 2900, but it is based on the same technologies. I believe it will have higher performance per watt than the 8600M.

Also, all benchmarks I've seen between 8600M and HD 2600 have put the 2600 way in front. Although they were pretty crappy benchmarks.
 

Wolfpup

macrumors 68030
Sep 7, 2006
2,926
105
The HD 2900 is made on a 90nm process, as are the 8800GTX and 8600M. The HD 2600 is made on a 65nm process. Smaller processes give very large improvements in power usage.

Usually, but there's not always a big benefit (see Prescott for that...). Also, it's starting from a more power-hungry design to begin with. There are still some AMD chips on a larger process that can beat out equivalent Intel chips on a smaller process, so it's not always true.
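
As a toy illustration of that (the scaling factors are made-up assumptions, not measured figures for any of these chips):

```python
# Toy model of why a die shrink usually, but not always, cuts power.
# Dynamic power scales roughly as C * V^2 * f; the factors below are illustrative
# assumptions only, not measurements for any of the GPUs discussed here.

def dynamic_power(cap_rel: float, voltage_rel: float, clock_rel: float) -> float:
    """Relative dynamic power versus a baseline of 1.0 for each parameter."""
    return cap_rel * voltage_rel ** 2 * clock_rel

baseline = dynamic_power(1.0, 1.0, 1.0)

# Hypothetical 90nm -> 65nm shrink: less switched capacitance, slightly lower voltage
shrunk = dynamic_power(0.75, 0.9, 1.0)
print(f"Shrink alone: {shrunk / baseline:.0%} of baseline dynamic power")  # ~61%

# Leakage isn't in this model: if a shrink raises leakage (Prescott being the classic
# example), total power can fail to improve even though dynamic power drops.
```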

Link for proof of the difference between stream processors?
:lol: I shouldn't have to be the one providing a link here :D Go to Anandtech if you want more info. Though it should be pretty obvious that all processors are not equivalent.

Finally, stop using comparisons between the 2900 and 8800 to justify drawing comparisons between the 2600 and 8600. The 2900 and 8800 are designed for bragging rights, for the companies and the people who buy them. The 2600 and 8600 are designed to be used by normal people


Actually they're designed for people who want or need great 3D performance.

The Mobility HD 2600 is not just a scaled-down version of the HD 2900, but it is based on the same technologies. I believe it will have higher performance per watt than the 8600M.

Yes, they are pretty much just scaled-down versions. The 8600 adds more advanced video processing capabilities over the 8800 (they did this with the 6800 line too), but otherwise these are the same designs, just with a percentage of the processors lopped off, narrower buses, etc.

I didn't realize the 2600 was on a smaller process, so I agree it MAY draw less power, but there's no way it would draw less if they were on the same process. Nvidia's architecture is much more efficient.
 

amd4me

macrumors 6502
Nov 19, 2006
364
0
That thing needs a case redesign anyway, so it can finally have user replaceable hard drives.

That said, there's no reason an 8600GT wouldn't work if an x2600 did. Though the x2600 will likely draw more power, when it actually launches.

I hope they finally just ditch ATi, as they've trailed Nvidia since the very first notebook GPU was launched (only briefly catching up during one generation). Plus I can't stand ATi's paper launches. (Or terrible drivers, though that's not really relevant on OS X I suppose.) I've never understood why they continue to mostly use ATi.
Are you referring to the 9800 series?
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
OK, based on these benchmarks, I will be happy if Apple uses the 8600M GT. Not the GS. Some of those GT scores look damn good.
http://forum.notebookreview.com/showthread.php?t=125246
Considering that some massive 19" desktop replacement with a Mobility 2600 XT, which was probably overclocked, apparently got a 3DMark06 score of a tad over 4000. But of course there's no data on what resolution the test was run at.

So, either the 8600M GT 256/512 or the HD 2600 256/512 for the new MBPs?


Hmmm... This is interesting.
http://forum.notebookreview.com/showthread.php?t=122359&page=2
It claims the TDP of the X1600 is 35 watts, not the 17W that I found before.
But more importantly, it claims the TDP of the 8600M GT is 43W.
So, I suppose we'll see what the MBP gets real soon.
 

PCMacUser

macrumors 68000
Jan 13, 2005
1,702
23
If you don't see the difference, does it mean nobody else will see it either?

To my eyes, ATI's picture is better than nVidia's. And to my ears, Genelec's monitors produce more accurate sound than Mackie's. As long as I can see or hear the difference, I'll be making choices based on my own experience about things.

I know where you're coming from, but does this mean you'll ditch Apple if they go with nVidia?
 