
Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
There have already been a couple of topics and several posts arguing the merits of the new Macs; this thread aims to give people who jump into this forum the facts, as well as some comparisons.

It's intended as a quick reference for those who are considering purchasing the new iMac ("Mid 2007") and need technical details about the GPU to inform their decision.

To start things off, let's get a few details out of the way.
  • iMacs are essentially "desktop laptops": they use mobile processors, mobile RAM, and mobile GPUs. This is not specifically advertised.
  • Because of this, keep in mind that any time the GPUs are mentioned in the specifications, they are actually the "Mobility" (ATI) or "Go/M" (nVidia) series graphics cards.
  • If you want to play games, the best thing to do on an iMac is add more memory (RAM); the stock 1GB tends to be a bottleneck and will prevent the rest of the system from reaching its full potential.
  • The HD 2400 XT is less powerful than the "plain" HD 2400
  • Yes, the HD 2600 Pro (Mobility) in the new iMacs is the best GPU offered in an iMac to date (that's right, better than the X1600 and the 7300GT)
  • The HD 2X00 Series cards have a completely different architecture that few games are taking advantage of right now. It's designed to be shader intensive - in fact, they can easily offer double the shader operations of the nVidia 8 Series cards (8600GT, etc.)! This will pay off more in newer shader-heavy games/engines than in current-gen games.
  • They are also very new cards - and as with any brand new GPU, expect major performance improvements as drivers are optimized. Developers likely haven't had much time with them yet, so treat benchmarks with skepticism for now; performance is almost guaranteed to improve as both drivers and game software mature.

Okay, on to some specs and comparisons.

The first comparison here will not be of the "Mobility" class since I can't find a good site that includes the mobile version of the HD 2X00 series cards (they are too new).

GPUReview.com - ATI HD 2600 Pro VS X1600 Pro
(You can change the cards to view different comparisons, hopefully the site will update with the mobile versions of the new cards soon.)

One thing you should notice immediately from that comparison is that the HD 2600 has a HUGE boost in Shader Operations: 72,000 vs. 6,000. It's worth noting, though, that the X1600 has a separate vertex unit, so those numbers aren't a direct 1:1 ratio.
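
For anyone curious where those figures come from, here's a quick back-of-the-envelope sketch. I'm assuming GPUReview simply multiplies shader units by core clock, and I'm using the desktop parts' reference clocks (600 MHz for the HD 2600 Pro, 500 MHz for the X1600 Pro), so treat the exact clocks as my assumption rather than anything official:

Code:
# Rough sketch: shader ops/sec as shader units x core clock.
# This is my guess at how GPUReview derives its figure, not a documented formula.
cards = {
    "HD 2600 Pro": {"shader_units": 120, "core_mhz": 600},  # unified stream processors
    "X1600 Pro":   {"shader_units": 12,  "core_mhz": 500},  # pixel shaders only
}

for name, c in cards.items():
    mops = c["shader_units"] * c["core_mhz"]  # millions of shader ops per second
    print(f"{name}: {mops:,} MOps/sec")

# HD 2600 Pro: 72,000 MOps/sec
# X1600 Pro: 6,000 MOps/sec
# (The X1600 also has 5 separate vertex units not counted here, which is
# why the two numbers aren't a direct 1:1 comparison.)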

For more details on shader operations, see:

GPUReview.com - Shader Operations

Here's a notable snippet:

GPUReview.com said:
Though it's not a perfect indicator of performance, all else being equal, a card with more pixel shader processing power will outperform a card with lower pixel shader power. As games get more and more shader dependent, cards with more shading power will pull farther and farther ahead of competing cards.

A perfect example of this phenomenon is the X1800 XT and the X1900 XTX. Both cards have nearly identical specifications, with the exception of shader processing power. The X1900 XTX has 3 times the shader processing power of the X1800 XT. This change alone is enough to make the X1900 XTX nearly twice as fast as the X1800 XT in shader-heavy games.

So what games are shader-heavy? I'm not sure about current games, especially since a lot of this is still classed as "DirectX 10" and "Shader Model 4.0", which no released games support yet (to my knowledge).

My best guess would be newer games: Crysis, RAGE, UT3, etc. Specifically, anything that will be running on CryENGINE 2 (Crysis), id Tech 5 (RAGE), or Unreal Engine 3 (UT3, Gears of War).

That being said, the GPUs in iMacs aren't (and never will be) high-end cards. These aren't HD 2900s or 8800 GTXs; don't expect to max out resolution and settings.

----------

Now, here's some technical details on the cards in question (I'll list the Mobility HD 2600 Pro, then only list notable performance number differences for the other cards):

ATI Mobility Radeon HD 2600 - Overview
  • ATI Avivo HD
    • High Definition video playback, including full HD DVD and Blu-ray disc support while on battery
  • "Radically New" and Efficient 3D Architecture
  • Performance-per-Watt (Requires only 45W)

ATI Mobility Radeon HD 2600 - Specifications
  • Features
    • 390 million transistors using 65nm fabrication process (Should run cooler than 90nm)
    • Unified superscalar shader architecture
    • 128-bit 4-channel GDDR3 memory interface
    • 256-bit internal ring bus for memory read/write
  • Unified Superscalar Shader Architecture
    • 120 stream processing units (Dynamically balanced between vertex/geometry/pixel shaders)
    • 128-bit floating point precision for all operations
    • Command processor for reduced CPU overhead
    • Up to 40 texture fetches per clock cycle
    • Up to 128 textures per pixel
    • High-resolution texture support (up to 8192 x 8192)
    • 8 render targets (MRTs) with anti-aliasing support
    • Physics processing support
  • Full support for Microsoft DirectX 10.0
    • Shader Model 4.0
    • Geometry Shaders
    • etc
  • Anti-aliasing features
    • Multi-sample anti-aliasing (up to 8 samples per pixel)
    • Custom Filter Anti-Aliasing (CFAA) for improved quality
    • Adaptive super-sampling and multi-sampling
    • All anti-aliasing features compatible with HDR rendering
  • Texture filtering features
    • 2x/4x/8x/16x high quality adaptive anisotropic filtering modes (Up to 128 taps per pixel)
    • 128-bit floating point HDR texture filtering
    • Shared exponent HDR (RGBE 9:9:9:5) texture format support

ATI Mobility Radeon HD 2400 XT - Specifications
  • Features
    • 180 million transistors using 65nm fabrication process
    • 64-bit single channel GDDR3 memory interface
  • Unified Superscalar Shader Architecture
    • 40 stream processing units
    • Up to 16 texture fetches per clock cycle
  • Anti-aliasing features
    • Multi-sample anti-aliasing (up to 4 samples per pixel)

ATI Mobility Radeon X1600 - Specifications
  • Features
    • 157 million transistors using 90nm fabrication process
    • 128-bit 4-channel GDDR3 memory interface
  • Ultra-Threaded Shader Engine
    • DirectX 9.0
    • Shader Model 3.0
    • 12 pixel shaders
    • 5 vertex shaders

The standout in these specs to me, personally, is that the Mobility HD 2400 XT has such a drastically slower memory interface! A single channel 64-bit interface compared to 4-channel 128-bit interfaces.

On the surface it looks like it might beat the X1600 found in older iMacs in everything else, but it seems highly likely the narrow memory interface will keep the card from ever outperforming it. Time will tell, I suppose.
 

MAW

macrumors regular
Apr 29, 2007
155
0
Los Angeles
great post H.!!!

as someone ready to purchase and a bit jaded by all the negative posts this past 24 hrs, i'd like to thank you for putting the gpu issue into perspective. very helpful!!

i am gpu ILLITERATE and only know how to judge a card by googling fps performance, so i make no claim to wisdom here, but, just my $.02, i see both sides of the argument on this issue.

while some were expecting and hoping (myself included, based on readings) for maybe the 8600, the hd 2600 seems more suited for everyday ilife use (i.e. the video encoding capabilities), which is what the imac itself seems to be intended for.

of course in an ideal world we would get all the new imac has to offer AND the ability to play crysis @ max rez and settings. but it's obvious with this refresh that the imac is intended to further bridge the gap between computer and entertainment center, and that's a big balancing act to pull off and nigh impossible to please us all.


that said, i think these new machines are lovely and can't wait to get mine!!!
 

GFLPraxis

macrumors 604
Mar 17, 2004
7,152
460
Two minor corrections.
iMacs are essentially "desktop laptops", they use mobile processors, mobile ram, 2.5" hard drives, and they also use mobile GPUs. This is not specifically advertised.

Mobile processors, mobile RAM, and mobile GPUs are correct, but iMacs use DESKTOP hard drives, which is why they are suitable for video editing (though obviously a Mac Pro is even better). Mobile processors are usually as good as desktop, except they cost significantly more and cap out lower (2.8 GHz vs 3.33 GHz in the case of the Core 2 Duo IIRC).

Mobile GPUs, on the other hand, are usually worse than their desktop counterparts, AND more expensive.

Yes, the HD 2600 Pro (Mobility) in the new iMacs is the best GPU offered in an iMac to date (that's right, better then the X1600 and the 7300GT)

The question in a lot of people's minds right now, though, is this: the previous iMac offered a BTO option of the 7600GT. Does the HD 2600 Pro beat that?


It's clear that the new iMac's stock cards beat the old iMac's stock cards, but the high end iMac had the option of an even better card, and I think that's the real question.

-----

Thanks for clearing things up though. I think one of the most important things people miss is that these new cards are DirectX 10/OpenGL 2 cards. When the next generation of games comes, these will probably have significant performance advantages just because of the greater feature set.



It's nice to have a thread for clearing through the FUD. I can't figure out what features iMovie has, because alongside people claiming there is no audio rubberbanding and no dual soundtracks are people claiming there are no transitions and titles (it's obvious from watching the keynote that some of these claims are false). The level of FUD is just ridiculous.

I would like to see people just shut up about the iMac complaints, though. We're talking about a $300 price drop, bigger screens, a better CPU, bigger hard drives, better hardware (gigabit ethernet, 802.11n, BT 2.0), glass screens that can be cleaned easily, a new design, and stock GPUs that are slightly better. People are complaining that the GPUs are not much better and rating it negatively. Anyone who can consider the new iMacs a "lackluster update" (I've seen that comment) because the GPU is only slightly better is simply insane.





EDIT:
Wait, this is one of those new cards with HD encoder/decoder chips on board, right? Does OS X utilize that?!?
 

TheSilencer

macrumors regular
May 27, 2007
111
0
OK, let's see the G84M 8600 specs.

284 million transistors.
80nm process
475MHz Core
7.6 billion texel/sec fillrate
32 shader units, 950MHz
12.8/22.4GB/sec bandwidth
128bit memory interface
up to 1400MHz memory clock speed
91.2Gigaflops shader processing power

And some additional features, HDCP, supported quality modes and so on:

http://www.nvidia.com/object/geforce_8600_8500_tech_specs.html

Comparing only the "RAW" processing power:
GF8600M GT - 91.2 Gigaflops
HD2400 XT - 56 Gigaflops
HD2600 PRO - 114 Gigaflops
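
If you want to sanity-check those raw numbers, they line up with stream processors x shader clock x FLOPs per clock (3 for the G8x MADD+MUL design, 2 for the HD 2X00 MADD design). The 950MHz shader clock for the 8600M GT is from the specs above; the ATI clocks below are reverse-derived from the quoted GFLOPS figures, so treat them as assumptions rather than confirmed iMac clock speeds:

Code:
# Sanity check of the quoted GFLOPS figures.
# Assumed formula: stream processors x shader clock (MHz) x FLOPs per clock.
cards = [
    # name,          SPs, shader MHz, FLOPs/clock
    ("GF 8600M GT",   32,       950,  3),  # MADD + MUL per clock
    ("HD 2400 XT",    40,       700,  2),  # MADD per clock (clock assumed)
    ("HD 2600 Pro",  120,       475,  2),  # MADD per clock (clock assumed)
]

for name, sps, clock_mhz, flops in cards:
    gflops = sps * clock_mhz * flops / 1000.0
    print(f"{name}: {gflops:.1f} GFLOPS")

# GF 8600M GT: 91.2 GFLOPS
# HD 2400 XT: 56.0 GFLOPS
# HD 2600 Pro: 114.0 GFLOPS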

The main problem with the new ATi cards is drivers and architecture. Technically, they could gain 30-40% more performance if drivers and games/programs used them fully, but even a 50% gain would only mean that in Command & Conquer 3 (PC) you go from 28 FPS (1024x768, high quality) to 42 FPS.


Comparing the new card to the old card from the 24" iMac, the 7600GT beats the HD2600 Pro.

http://www.anandtech.com/video/showdoc.aspx?i=3023&p=6
 

Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
... but iMacs use DESKTOP hard drives ...

It wasn't extremely easy to track down proof of that (since SATA is also available in 2.5" drives), but you're definitely right, and I've corrected the post!

The question in a lot of people's minds right now, though, is this. The previous iMac offered a BTO option of the 7600GT. Does the HD 2600 Pro beat that?

Possibly...
GPUReview.com - HD 2600 Pro VS 7600GT

HD 2600 Pro:
Memory Bandwidth: 16 GB/sec
Shader Operations: 72000 Operations/sec
Pixel Fill Rate: 2400 MPixels/sec
Texture Fill Rate: 4800 MTexels/sec

7600GT:
Core Clock: 560 MHz
Memory Clock: 700 MHz (1400 DDR)
Memory Bandwidth: 22.4 GB/sec
Shader Operations: 6720 Operations/sec
Pixel Fill Rate: 4480 MPixels/sec
Texture Fill Rate: 6720 MTexels/sec
Vertex Operations: 700 MVertices/sec

The HD 2600 Pro thus ends up with:
- 29% less Memory Bandwidth
- 46% less Pixel Fill Rate
- 29% less Texture Fill Rate
+ 971% more Shader Operations (10.71x)
+ DirectX 10
+ Shader Model 4.0
+ Physics processing support
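
(For anyone checking the math, those deltas are just the ratios of the GPUReview numbers quoted above; the rounding is mine.)

Code:
# How the deltas above were worked out: (HD 2600 Pro / 7600GT) - 1,
# using the GPUReview figures quoted above.
specs = [
    # metric,                  HD 2600 Pro,  7600GT
    ("Memory Bandwidth (GB/s)",       16,     22.4),
    ("Shader Ops (MOps/s)",        72000,     6720),
    ("Pixel Fill (MPixels/s)",      2400,     4480),
    ("Texture Fill (MTexels/s)",    4800,     6720),
]

for metric, hd2600, gf7600 in specs:
    ratio = hd2600 / gf7600
    print(f"{metric}: {ratio - 1:+.0%} ({ratio:.2f}x)")

# Memory Bandwidth (GB/s): -29% (0.71x)
# Shader Ops (MOps/s): +971% (10.71x)
# Pixel Fill (MPixels/s): -46% (0.54x)
# Texture Fill (MTexels/s): -29% (0.71x)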

I would venture to guess these numbers are going to mean the 7600GT is definitely better for current games. But with that massive shader advantage, it could easily make up for the deficiencies in upcoming engines. Will it? We'll have to wait and see I guess.

Wait, this is one of those new cards with HD encoder/decoder chips on board, right? Does OS X utilize that?!?

I believe the answer to both of those is "Yes", but I'm just saying that from memory. If someone has links to back that up or refute it, please post them!
 

harveypooka

macrumors 65816
Feb 24, 2004
1,291
0
Great post, Haoshiro.

I guess I was hoping for something a bit more revolutionary though...
 

Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
Comparing only the "RAW" processing power:
GF8600M GT - 91.2 Gigaflops
HD2400 XT - 56 Gigaflops
HD2600 PRO - 114 Gigaflops

So the HD 2600 Pro even beats the 8600M GT in raw power, interesting.

It's all going to come down to good drivers and engines. Shaders are becoming more and more important, and it looks like that is exactly where the HD 2X00 series really shines.

Well, that and video output. ATI has always won the "visual quality" award when it comes to 2D (images, movies), and with plenty of power to push HD DVD and Blu-ray (even on battery power no less), that's great to have.
 


Chone

macrumors 65816
Aug 11, 2006
1,222
0
So the HD 2600 Pro even beats the 8600M GT in raw power, interesting.

It's all going to come down to good drivers and engines. Shaders are becoming more and more important, and it looks like that is exactly where the HD 2X00 series really shines.

Well, that and video output. ATI has always won the "visual quality" award when it comes to 2D (images, movies), and with plenty of power to push HD DVD and Blu-ray (even on battery power no less), that's great to have.

Just a little point I'd like to make: we already have two massively shader-intensive engines out there, the Oblivion engine and the F.E.A.R. engine. If there is anywhere the HD 2600 should shine, it's those games, and while the HD cards perform quite nicely there, the 8600 cards still outpace them.

Also, there were driver problems initially, but those were corrected a LONG time ago; the current Cat 7.7 drivers are more than adequate and mature enough to exploit the HD cards' full potential.

Also, it doesn't matter if the HD 2600 has the shader power to shine in games; if it's already this limited in Oblivion, then it will certainly be bottlenecked in other games (you see, shaders are not everything).

Just a little thing I wanted to say: don't get your hopes up for the HD 2600. This is the performance you are getting today and the performance you'll get tomorrow. There's no hidden potential, no driver issues, no "shader intensive" engines we haven't seen. We've seen Oblivion, we've seen the Cat 7.7 drivers, and we know just what the HD 2600 is capable of.

Other than that, a really nice post, but still: the HD 2400 in the $1200 iMac is WORSE than the X1600 it had before, the HD 2600 Pro is worse than the 7600GT the 24-incher had before, and the HD 2600 Pro is not a big leap over the 7300GT and X1600.

If you are considering the $1200 iMac, be sure you are not going to use 3D at all, because springing for the HD 2600 would be quite an upgrade.
 

GFLPraxis

macrumors 604
Mar 17, 2004
7,152
460
I'd go with the 7600GT any day.

For current games, yeah, but he's right; the ATi card is WAY better for shader operations, and games like Crysis are very shader-heavy. I'd expect the ATi card to perform better for all DirectX 10-era games like Gears of War or Crysis.
 

Eidorian

macrumors Penryn
Mar 23, 2005
29,190
386
Indianapolis
For current games, yeah, but he's right; the ATi card is WAY better for shader operations, and games like Crysis are very shader-heavy. I'd expect the ATi card to perform better for all DirectX 10-era games like Gears of War or Crysis.
Err...have you seen the DirectX 10 benchmarks? Seriously?

There's a slim chance of slightly greater performance on native DX10 games but otherwise you're going to need an 8800 or HD2900 for DX10.
 

Chone

macrumors 65816
Aug 11, 2006
1,222
0
For current games, yeah, but he's right; the ATi card is WAY better for shader operations, and games like Crysis are very shader-heavy. I'd expect the ATi card to perform better for all DirectX 10-era games like Gears of War or Crysis.

Did you not bother to read my post directly above yours? :confused:
 

harveypooka

macrumors 65816
Feb 24, 2004
1,291
0
At the end of the day the iMac is not going to live up to Gears of War or Battlefield 2142's graphics.

If Rage is released anytime soon you'll need a Mac Pro to get some nice shiny graphics.

I hope, hope, hope Apple release a cut-down Mac Pro in the next few months.
 

fblack

macrumors 6502a
May 16, 2006
528
1
USA
Reviews...

Hey Haoshiro how about we quote some reviews?:)

Most reviews that I've read seem to agree that what are supposed to be midrange cards are pretty weak.

http://www.anandtech.com/video/showdoc.aspx?i=3023&p=12

We had no problems expressing our disappointment with NVIDIA over the lackluster performance of their 8600 series. After AMD's introduction of the 2900 XT, we held some hope that perhaps they would capitalize on the huge gap NVIDIA left between their sub $200 parts and the higher end hardware. Unfortunately, that has not happened.

In fact, AMD went the other way and released hardware that performs consistently worse than NVIDIA's competing offerings. The only game that shows AMD hardware leading NVIDIA is Rainbow Six: Vegas. Beyond that, our 4xAA tests show the mainstream Radeon HD lineup, which already lags in performance, scales even worse than NVIDIA. Not that we really expect most people with this level of hardware to enable 4xAA, but it's still a disappointment.

Usually it's easier to review hardware that is clearly better or worse than it's competitor under the tests we ran, but this case is difficult. We want to paint an accurate picture here, but it has become nearly impossible to speak negatively enough about the AMD Radeon HD 2000 Series without sounding comically absurd.

Even with day-before-launch price adjustments, there is just no question that, in the applications the majority of people will be running, AMD has created a series of products that are even more unimpressive than the already less than stellar 8600 lineup.
...All we can do at this point is lament the sad state of affordable next generation graphics cards and wait until someone at NVIDIA and AMD gets the memo that their customers would actually like to see better performance that at least consistently matches previous generation hardware. For now, midrange DX10 remains MIA.

http://www.extremetech.com/article2/0,1697,2151679,00.asp

Final Thoughts: Not for the Hardcore DX10 Gamer
It's hard to recommend the Radeon HD 2600 Pro. The $100 price seems attractive, but it comes at a cost. To get good performance out of even year-old games, you'll need to start cranking detail levels down, even at the conservative resolution of 1280x1024. Forget about DX10 stuff: So far, it all runs like molasses. To some extent, this is par for the course for $100 graphics cards. They just don't deliver a satisfying game-playing experience. It's unfortunate that this big shift in architecture that accompanied DX10 didn't change that.

http://www.tomshardware.com/2007/07/24/hd_2600_and_geforce_8600/page10.html

In essence, there isn't a real middle ground, only enthusiast high end, value and entry level product...Are the HD 2400, 8500 and 8400 series cards good for gaming? No, but for an HTPC they would be good.

When looking at the gaming results and the video playback figures, the HD 2600 and 8600 series are indeed a value proposition...They can play some existing games and will be able to play simple DX10 games in the future...All games going forward next year will most likely be DX10, so these should be able to play games like the next The Sims or children's educational programs, but in no way will they be able to handle graphically intense titles.

http://www.pcworld.idg.com.au/index.php/taxid;2136212627;pid;3779;pt;1

Although the hardware is not primed for serious gamers, offering fairly mediocre scores in our benchmarks, its passive cooler and less-invasive size will appeal to people looking to build a quiet, and possibly smaller form factor, PC that can handle media applications. In particular, it's good for watching DVDs or high-definition media such as Blu-ray or HD-DVD. According to ATI, all HD 2000-series graphics cards have ATIs UVD (unified video decoder) to handle video decoding, rather than offload it to the CPU, and they also include a DVI to HDMI converter in the sales pack. The HDMI output supports digital video and audio signals, so it's simple to hook up to your TV or amplifier.
In our benchmarks, the Sapphire HD 2600 Pro proved to be a bit of a slow coach. In our DirectX 9 tests, it barely scraped through as playable and in our DirectX 10 tests it really hit rock bottom.

http://www.guru3d.com/article/content/440/19/

Let's analyze, it's fair to say that both the 2400 and 2600 cards do not live up to our initial expectations from a 3D performance point of view, no sir. Also performance wise there's just a huge gap in between the 2900 and 2600 series. AMD has not been able to kick the current performance crown holder in the mid-range segment from its lofty chair. And that's annoying...

We've seen it happen with the HD 2900 XT and just like that 2900, what does AMD do? They lower the price significantly. And I have to agree here; that's the real trick to do it right as prices over the past few years have been shifted upwards by NVIDIA a lot. We're now at a level where people must pay 700 EUR for a high-end card (8800 Ultras) and for the best mid-range product we see prices of 220 EUR which is insane considering the performance you get out of it. Still for some reason we all consistently bend over and take it in the... well you know what I'm trying to say as really it's the monopolized situation we are currently in. This is why we need that hefty competition; to keep the product prices competitive and allow development to thrive in its endless evolutionary path of technology.

This is why I say that both the 2400 and 2600 series are refreshing. Unfortunately they are not the mid-range top performers we all have been hoping for, their 3D rendering capabilities are sufficient; sufficient for the money you have to pay for it. Realistically the HD 2600 XT compared to the GeForce 8600 GT gives NVIDIA a good lead...

http://www.hothardware.com/articles/ATI_Radeon_HD_2600_and_2400_Performance/?page=11

Overall, the new Radeon HD 2600 XT, 2600 Pro, and 2400 XT cards should make for quiet, low-power upgrades from any integrated graphics solution and offer a relatively low-cost of entry into the world of DirectX 10. These cards are obviously not geared to hardcore gamers, but at lower resolutions without high levels of AA and anisotropic filtering enabled they’ll be adequate for casual gaming. These cards are also well suited to HTPC applications where video playback performance and low-noise output are of the utmost importance.

http://www.pcper.com/article.php?aid=426&type=expert&pid=22

The performance of the Radeon HD 2600 XT, 2600 Pro and 2400 XT was more or less a letdown. In AMD documentation they were calling for the direct competition of the 2600 XT to be NVIDIA's 8600 GT card. Based on the estimated pricing from AMD and the pricing of NVIDIA's currently on sale cards though, we found that you could get an 8600 GTS for about the same price. When we added in the 8600 GTS, the performance of the 2600 XT looked less than stellar; but in fact in most cases the 8600 GT was able to outperform it anyway.

The Radeon HD 2600 Pro was in the same sticky situation: it was pitted against the NVIDIA 8500 GT in the AMD documentation though we were able to find 8600 GT cards for about $100 putting them in the same price level as the 2600 Pro. The 2600 Pro didn't really stand a chance against the 8600 GT card either, in much the same way the 2600 XT couldn't fight the 8600 GTS successfully.

The Radeon HD 2400 XT card was a different story - we didn't really compare it to anything of the same price because we didn't have it. At $79 MSRP, the 2400 XT definitely falls into the "budget" category. For most readers here, the gaming performance of an $80 video card isn't going to impress and we didn't find anything surprising in the 2400 XT in that regards.

I could quote more but why flog an already eviscerated corpse? :D

All the reviewers feel that these cards are good for HD playback, and for "simple" or "casual" gaming. Not for "graphically intense titles". These cards don't do so great in DX10 and some reviews suggest sitting out the 1st generation of DX10 cards.

To be fair to Apple, there seem to be slim pickings in the midrange market for newer cards. HD playback, DX10 compatibility, and the low prices ($79 MSRP for the 2400 XT, and cheaper for Apple I'm sure) probably had a part to play in their decision to use these cards.

Still, they could have opted for the 2600 XT as a BTO option, which would have given comparable performance to a 7600GT in older games and an edge in newer titles. :(
 

Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
Sure, but these are still not reviews of Mobile chips versus other Mobile chips. :rolleyes:

Oblivion and FEAR don't count as new shader-heavy games, imo, either.

We'll see as time goes on, like I said.

It's already been pointed out that the Mobility HD 2600 Pro has more raw power than even an 8600M GT, let alone the 7600M GT.

Why isn't this showing in the current crop of benchmarks? I'm sure there are plenty of reasons, and I've seen this happen plenty of times with video cards in the past - especially nVidia cards. My old 6600 GT started out pretty bad, and as time went on games and drivers caused the benchmarks to jump a lot.

I look at current benchmarks and reviews with skepticism; I'll see how things look after six months.

I'll be ordering another iMac soon, with the HD 2600, and will be sure to compare it to my current 256MB X1600, as well as to friends' PCs (with 7600 GS and GT cards, as well as the 8600 GT).
 

Chone

macrumors 65816
Aug 11, 2006
1,222
0
Sure, but these are still not reviews of Mobile chips versus other Mobile chips. :rolleyes:

Oblivion and FEAR don't count as new shader-heavy games, imo, either.

We'll see as time goes on, like I said.

It's already been pointed out that the Mobility HD 2600 Pro has more raw power than even an 8600M GT, let alone the 7600M GT.

Why isn't this showing in the current crop of benchmarks? I'm sure there are plenty of reasons, and I've seen this happen plenty of times with video cards in the past - especially nVidia cards. My old 6600 GT started out pretty bad, and as time went on games and drivers caused the benchmarks to jump a lot.

I look at current benchmarks and reviews with skepticism; I'll see how things look after six months.

I'll be ordering another iMac soon, with the HD 2600, and will be sure to compare it to my current 256MB X1600, as well as to friends' PCs (with 7600 GS and GT cards, as well as the 8600 GT).

Raw power stands for theoretical peak performance, and by those terms the HD 2900XT should be faster than an 8800 Ultra, yet it can barely keep up with the 8800 GTS. Save your theory and "raw power" for another thread.

And why wouldn't Oblivion count as a shader-intensive game? Take shaders out of that game and you end up with a crummy-looking game a 9600 could run. Oblivion still brings 8800 Ultra cards to their knees and is one of the most demanding shader-intensive games out there. The HD 2600 had a pretty nice chance to flex its muscles in Oblivion, the drivers were already more mature because of the HD 2900XT, and what did it achieve? It closed the performance gap, but not enough to come out on top of the 8600GTS, and mind you, I'm talking about the HD 2600XT, not the severely crippled 2600 Pro in iMacs.

This talk about shaders reminds me of the video memory debate. A 6200 can't make use of 512MB of VRAM, because at resolutions where that much RAM would be useful the card is limited by everything else. Same here: the HD 2600 might have shader power, but not nearly enough to compensate. Yeah, it "shines" in shader-intensive games like Oblivion, but not enough. And this brings me to another point: shading power matters most at higher resolutions, resolutions the HD 2600 is simply too darn weak to run. At 1024x768 shaders don't make much difference. Excuse me, that would be 800x600 for the HD 2600 in the iMac and 640x480 for the HD 2400...

Now you make it sound as if Crysis is a game about us looking at a flat wall while the card applies shaders non-stop...

Like I said earlier, what you see is what you get; that shader power (which frankly isn't that much) isn't going to help the 2600 when the rest of it is so lackluster.
 

Haoshiro

macrumors 68000
Original poster
Feb 9, 2006
1,894
6
USA, OR
No, the talk of raw power can stay right here in the thread, no need to save it for another time.

That was the point, in fact: it's obviously not reaching its full potential. Saying that it never will is a fool's game; you don't know that, nor do I. We can argue until we are blue in the face, but we don't know.

My stance is simple: if it has untapped potential, there is a chance it will get tapped some time. That doesn't mean it will, but just because it hasn't yet doesn't mean it won't.

I have yet to see benchmarks of the Mobility HD 2600 Pro, nor have we seen how it does in OS X, Tiger or Leopard. There are plenty of unknowns; it doesn't help for people to come on and post as if they have the final word on something without even knowing all the facts - acting as if they can predict the future.

And, btw, Shader Ops (as far as the HD 2X00 series is concerned) cover not only pixel shaders, but also texture and vertex operations.
 

phillipjfry

macrumors 6502a
Dec 12, 2006
847
1
Peace in Plainfield
Sorry if this sounds out of place and just plain silly, but why are people comparing these cards to DX10 performance? Isn't that Vista only? Shouldn't we really only be concerned about OpenGL 2.0 performance and how Leopard will handle the new EA Games/ID games to be coming to the new kitty soon? :confused:
 

contoursvt

macrumors 6502a
Jul 22, 2005
832
0
I was wondering that too, but then I thought that maybe the more hardcore gamers might have Vista installed on their machines as well and might play that way...

Sorry if this sounds out of place and just plain silly, but why are people comparing these cards to DX10 performance? Isn't that Vista only? Shouldn't we really only be concerned about OpenGL 2.0 performance and how Leopard will handle the new EA Games/ID games to be coming to the new kitty soon? :confused:
 

GFLPraxis

macrumors 604
Mar 17, 2004
7,152
460
Sorry if this sounds out of place and just plain silly, but why are people comparing these cards to DX10 performance? Isn't that Vista only? Shouldn't we really only be concerned about OpenGL 2.0 performance and how Leopard will handle the new EA Games/ID games to be coming to the new kitty soon? :confused:


Actually, since the new EA games are using Cider, they're probably going to be running in DX10, I'd think...
 

fblack

macrumors 6502a
May 16, 2006
528
1
USA
?

Sure, but these are still not reviews of Mobile chips versus other Mobile chips. :rolleyes:

Hey, I'm not trying to bash you here, but I think you are sidestepping the issue. It is about the potential of the cards: the mobile versions are not going to be more powerful than the regular cards, and if the regular cards are choking (both AMD and NVIDIA), do you think the mobile versions are going to do better?

Oblivion and FEAR don't count as new shader-heavy games, imo, either.

So what you are saying is that it's perfectly OK that they run these games poorly? How about older titles like COD2? Should I just stop playing them?


We'll see as time goes on, like I said.

Why isn't this showing in the current crop of benchmarks? I'm sure there are plenty of reasons, and I've seen this happen plenty of times with video cards in the past - especially nVidia cards. My old 6600 GT started out pretty bad, and as time went on games and drivers caused the benchmarks to jump a lot.

I look at current benchmarks and reviews with skepticism; I'll see how things look after six months.

Fair enough. Drivers can make a difference. But in my experience no amount of driver tweaking is going to make a $79 card perform like a $400 card. At 1024x768 with no AA and no AF, the 2400 runs Oblivion at 6.2 FPS; I think I saw the 2600 Pro at 19 FPS and the 2600 XT at 23 FPS. They barely run COD2 better. Do you really expect driver tweaking to add 30-40 FPS to these scores?
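
(Just to put numbers on that: even a generous 40% driver improvement, the upper end of what was mentioned earlier in the thread, gets nowhere near an extra 30-40 FPS from those starting points.)

Code:
# Applying a hypothetical 40% driver gain to the Oblivion figures above.
oblivion_fps = {"HD 2400": 6.2, "HD 2600 Pro": 19.0, "HD 2600 XT": 23.0}

for card, fps in oblivion_fps.items():
    boosted = fps * 1.40
    print(f"{card}: {fps:.1f} -> {boosted:.1f} FPS (+{boosted - fps:.1f})")

# HD 2400: 6.2 -> 8.7 FPS (+2.5)
# HD 2600 Pro: 19.0 -> 26.6 FPS (+7.6)
# HD 2600 XT: 23.0 -> 32.2 FPS (+9.2)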

My stance is simple: if it has untapped potential, there is a chance it will get tapped some time. That doesn't mean it will, but just because it hasn't yet doesn't mean it won't.

I have yet to see benchmarks of the Mobility HD 2600 Pro, nor have we seen how it does in OS X, Tiger or Leopard. There are plenty of unknowns; it doesn't help for people to come on and post as if they have the final word on something without even knowing all the facts - acting as if they can predict the future.

That's fine, but believe it or not, you are coming across as steadfastly defending these cards, which puts you clearly on that side. One has to make decisions based on the best info they have at the time. Right now the info I'm seeing is telling me that these cards are so-so.

I'll be ordering another iMac soon, with the HD 2600, and will be sure to compare it to my current 256MB X1600, as well as to friends' PCs (with 7600 GS and GT cards, as well as the 8600 GT).

Great. I hope you post your impressions, and then 6 months down the road we can see what updates and new drivers do for performance; then we can repeat this pow-wow. ;)
 