iMac Graphics Card Details and Comparisons (HD 2X00 Series)

Discussion in 'Mac and PC Games' started by Haoshiro, Aug 8, 2007.

  1. Haoshiro macrumors 68000

    Haoshiro

    Joined:
    Feb 9, 2006
    Location:
    USA, KS
    #1
    There have already been a couple of topics and several posts arguing the merits of the new Macs; this thread aims to give people who jump into this forum the facts, as well as some comparisons.

    It's intended as a quick reference for those who are considering purchasing the new iMac ("Mid 2007") and need technical details about the GPU to inform their decision.

    To start things off, let's get a few details out of the way.
    • iMacs are essentially "desktop laptops": they use mobile processors, mobile RAM, and mobile GPUs (the hard drives, though, are desktop drives; see the correction below). This is not specifically advertised.
    • Because of this, keep in mind that any time the GPUs are mentioned in the specifications, they are actually the "Mobility" (ATI) or "Go/M" (nVidia) series graphics cards.
    • If you want to play games, the best thing you can do to an iMac is add more memory (RAM); the stock 1GB tends to be a bottleneck and will prevent the rest of the system from reaching its full potential.
    • The (Mobility) HD 2400 XT is less powerful than the desktop "plain" HD 2400
    • Yes, the HD 2600 Pro (Mobility) in the new iMacs is the best GPU offered in an iMac to date (that's right, better than the X1600 and the 7300GT)
    • The HD 2X00 Series cards have a completely different architecture that few games are taking advantage of right now. It's designed to be shader intensive; in fact, they can easily offer double the shader operations of the nVidia 8 Series cards (8600GT, etc.). This will pay off more in newer shader-heavy games/engines than in current-gen games.
    • They are also very new cards, and as with any brand new GPU, expect major performance improvements as the drivers get optimized. Developers likely haven't had much time with them yet, so look at benchmarks distrustingly for now; performance is almost guaranteed to improve as both drivers and game software mature.

    Okay, on to some specs and comparisons.

    The first comparison here will not be of the "Mobility" class, since I can't find a good site that includes the mobile versions of the HD 2X00 series cards (they are too new).

    GPUReview.com - ATI HD 2600 Pro VS X1600 Pro
    (You can change the cards to view different comparisons, hopefully the site will update with the mobile versions of the new cards soon.)

    One thing you should notice immediately from that comparison is that the HD 2600 has a HUGE boost in Shader Operations: 72,000 vs. 6,000. It's worth noting, though, that the X1600 has a separate Vertex unit, so those numbers aren't a direct 1:1 ratio.
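
    If you want to sanity-check those numbers, here's a minimal sketch (Python). It assumes GPUReview's "Shader Operations" metric is simply shader units multiplied by core clock, which is what reproduces the figures on that page; the unit counts and clocks below are the desktop parts' (assumed), not the Mobility versions'.

    Code:
    # Hedged sketch: GPUReview's "Shader Operations" looks like
    # shader units * core clock (MHz), giving MOperations/sec.
    def shader_ops_mops(units, core_mhz):
        return units * core_mhz

    # Desktop HD 2600 Pro: 120 stream processors at an assumed 600 MHz
    print(shader_ops_mops(120, 600))  # -> 72000
    # Desktop X1600 Pro: 12 pixel shaders at an assumed 500 MHz; its
    # separate vertex units aren't counted, so the ratio isn't 1:1.
    print(shader_ops_mops(12, 500))   # -> 6000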

    For more details on shader operations, see:

    GPUReview.com - Shader Operations

    So what games are shader-heavy? I'm not sure about current games, especially as a lot of this is still classed as "DirectX 10" and "Shader Model 4.0", which no released games (to my knowledge) support yet.

    My best guess would be newer games: Crysis, RAGE, UT3, etc. Specifically, anything that will be running on CryENGINE 2 (Crysis), id Tech 5 (RAGE), or Unreal Engine 3 (UT3, Gears of War).

    That being said, the GPUs in iMacs aren't (and will never be) high-end cards. These aren't HD 2900s or 8800 GTXs; don't expect to max out resolution and settings.

    ----------

    Now, here are some technical details on the cards in question (I'll list the Mobility HD 2600 Pro in full, then only list notable performance-number differences for the other cards):

    ATI Mobility Radeon HD 2600 - Overview
    • ATI Avivo HD
      • High Definition video playback, including full HD DVD and Blu-ray disc support while on battery
    • "Radically New" and Efficient 3D Architecture
    • Performance-per-Watt (Requires only 45W)

    ATI Mobility Radeon HD 2600 - Specifications
    • Features
      • 390 million transistors using 65nm fabrication process (should run cooler than 90nm parts)
      • Unified superscalar shader architecture
      • 128-bit 4-channel GDDR3 memory interface
      • 256-bit internal ring bus for memory read/write
    • Unified Superscalar Shader Architecture
      • 120 stream processing units (dynamically balanced between vertex, geometry, and pixel shader work)
      • 128-bit floating point precision for all operations
      • Command processor for reduced CPU overhead
      • Up to 40 texture fetches per clock cycle
      • Up to 128 textures per pixel
      • High-resolution texture support (up to 8192 x 8192)
      • 8 render targets (MRTs) with anti-aliasing support
      • Physics processing support
    • Full support for Microsoft DirectX 10.0
      • Shader Model 4.0
      • Geometry Shaders
      • etc
    • Anti-aliasing features
      • Multi-sample anti-aliasing (up to 8 samples per pixel)
      • Custom Filter Anti-Aliasing (CFAA) for improved quality
      • Adaptive super-sampling and multi-sampling
      • All anti-aliasing features compatible with HDR rendering
    • Texture filtering features
      • 2x/4x/8x/16x high quality adaptive anisotropic filtering modes (Up to 128 taps per pixel)
      • 128-bit floating point HDR texture filtering
      • Shared exponent HDR (RGBE 9:9:9:5) texture format support

    ATI Mobility Radeon HD 2400 XT - Specifications
    • Features
      • 180 million transistors using 65nm fabrication process
      • 64-bit single channel GDDR3 memory interface
    • Unified Superscalar Shader Architecture
      • 40 stream processing units
      • Up to 16 texture fetches per clock cycle
    • Anti-aliasing features
      • Multi-sample anti-aliasing (up to 4 samples per pixel)

    ATI Mobility Radeon X1600 - Specifications
    • Features
      • 157 million transistors using 90nm fabrication process
      • 128-bit 4-channel GDDR3 memory interface
    • Ultra-Threaded Shader Engine
      • DirectX 9.0
      • Shader Model 3.0
      • 12 pixel shaders
      • 5 vertex shaders

    The standout in these specs, to me personally, is that the Mobility HD 2400 XT has such a drastically slower memory interface: a single-channel 64-bit interface compared to the other cards' 4-channel 128-bit interfaces.

    On the surface it looks like everything else might beat out the X1600 found in older iMacs, but it seems highly likely the memory interface will strangle the card and keep it from ever outperforming its predecessor. Time will tell, I suppose.
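
    To get a feel for how much bus width matters, here's a rough sketch (Python) of the usual peak-bandwidth formula: bytes per clock times transfers per clock. The 500 MHz memory clocks are illustrative assumptions; the 7600GT's 700 MHz figure appears later in this thread.

    Code:
    # Hedged sketch: peak bandwidth (GB/s) = bus width in bytes
    # * memory clock (MHz) * 2 transfers/clock (DDR-style) / 1000
    def bandwidth_gb_s(bus_bits, mem_clock_mhz):
        return bus_bits / 8 * mem_clock_mhz * 2 / 1000

    print(bandwidth_gb_s(128, 700))  # 128-bit at 700 MHz: 22.4 (7600GT-class)
    print(bandwidth_gb_s(128, 500))  # 128-bit at an assumed 500 MHz: 16.0
    print(bandwidth_gb_s(64, 500))   # 64-bit at the same clock: 8.0 (half)

    Halving the bus width at the same memory clock halves the peak bandwidth, which is exactly why the 2400 XT's single-channel interface looks so alarming.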
     
  2. MAW macrumors regular

    Joined:
    Apr 29, 2007
    Location:
    Los Angeles
    #2
    great post H.!!!

    as someone ready to purchase and a bit jaded with all the negative posts this past 24 hrs. i'd like to thank you for putting the gpu issue into perspective. very helpful!!

    i am gpu ILLITERATE and only know how to judge a card based on googling fps performance so i make no claim to wisdom here but, just my $.02, i see both sides of the argument on this issue.

    while some were expecting and hoping (myself included, based on readings) for maybe the 8600, the hd 2600 seems to be more suited for everyday ilife applications (i.e. the video encoding capabilities), which is what the imac itself seems to be intended for.

    of course in an ideal world we would get all the new imac has to offer AND the ability to play crysis at max res and settings. but it's obvious with this refresh that the imac is intended to further bridge the gap between computer and entertainment center, and it's a big balancing act to pull off and nigh impossible to please us all.


    that said, i think these new machines are lovely and can't wait to get mine!!!
     
  3. GFLPraxis macrumors 604

    GFLPraxis

    Joined:
    Mar 17, 2004
    #3
    Two minor corrections.
    Mobile processors, mobile RAM, and mobile GPUs are correct, but iMacs use DESKTOP hard drives, which is why they are suitable for video editing (though obviously a Mac Pro is even better). Mobile processors are usually as good as desktop ones, except they cost significantly more and cap out lower (2.8 GHz vs 3.33 GHz in the case of the Core 2 Duo, IIRC).

    Mobile GPUs are the only component that's usually worse than its desktop counterpart, AND more expensive.

    The question in a lot of people's minds right now, though, is this: the previous iMac offered a BTO option of the 7600GT. Does the HD 2600 Pro beat that?


    It's clear that the new iMac's stock cards beat the old iMac's stock cards, but the high end iMac had the option of an even better card, and I think that's the real question.

    -----

    Thanks for clearing things up though. I think one of the most important things people miss is that these new cards are DirectX 10/OpenGL 2 cards. When the next generation of games comes, these will probably have significant performance advantages just because of the greater feature set.



    It's nice to have a thread for clearing through the FUD. I can't figure out what features iMovie has, because alongside people claiming there is no audio rubberbanding and no dual soundtracks are people claiming there are no transitions and titles (it's obvious from watching the keynote that some of these claims are false). The level of FUD is just ridiculous.

    I would like to see people just shut up about iMac complaints, though. We're talking about a $300 price drop, bigger screens, better CPUs, bigger hard drives, better hardware (gigabit ethernet, 802.11n, BT 2.0), glass screens that can be cleaned easily, a new design, and stock GPUs that are slightly better. People are complaining that the GPUs are not much better and rating the update negative. Anyone who can consider the new iMacs a "lackluster update" (I've seen that comment) because the GPU is only slightly better is simply insane.

    EDIT:
    Wait, this is one of those new cards with HD encoder/decoder chips on board, right? Does OS X utilize that?!?
     
  4. TheSilencer macrumors regular

    Joined:
    May 27, 2007
    #4
    OK, let's see the G84M 8600 specs.

    284 million transistors.
    80nm process
    475MHz Core
    7.6 billion texel/sec fillrate
    32 shader units, 950MHz
    12.8/22.4GB/sec bandwidth
    128bit memory interface
    up to 1400MHz memory clock speed
    91.2Gigaflops shader processing power

    And some additional features, HDCP, supported quality modes and so on:

    http://www.nvidia.com/object/geforce_8600_8500_tech_specs.html

    Comparing only the "RAW" processing power (see the sketch after this list for where these figures come from):
    GF8600M GT - 91.2Gigaflops
    HD2400 XT - 56Gigaflops
    HD2600 PRO - 114Gigaflops
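
    Those figures fall out of units x clock x FLOPs-per-clock. A hedged sketch in Python; the per-clock FLOP counts are my assumptions (3 for nVidia's MADD+MUL scalar units, 2 for a plain MADD on the ATi parts), and the ATi clocks are simply whatever reproduces the numbers above.

    Code:
    # Hedged sketch: peak GFLOPS = units * clock (MHz) * FLOPs/clock / 1000
    def gflops(units, clock_mhz, flops_per_clock):
        return units * clock_mhz * flops_per_clock / 1000

    print(gflops(32, 950, 3))   # GF8600M GT: 91.2 (MADD+MUL = 3, assumed)
    print(gflops(120, 475, 2))  # HD2600 Pro at an assumed 475 MHz: 114.0
    print(gflops(40, 700, 2))   # HD2400 XT at an assumed 700 MHz: 56.0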

    The main problem with the new ATi cards is drivers and architecture. Technically, they could gain 30-40% more power if drivers and games or programs could use them fully, but even 50% more would only mean that in Command & Conquer 3 (PC) you go from 28FPS (1024x768, high quality) to 42FPS.
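
    (In other words, a large relative gain on a small base is still a small absolute gain; trivially, in Python:)

    Code:
    def fps_after_gain(fps, gain_pct):
        # hypothetical uplift from driver maturation
        return fps * (1 + gain_pct / 100)

    print(fps_after_gain(28, 50))  # -> 42.0 FPS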


    Comparing the new card to the old BTO card from the 24" iMac: the 7600GT beats the HD2600 Pro.

    http://www.anandtech.com/video/showdoc.aspx?i=3023&p=6
     
  5. Haoshiro thread starter macrumors 68000

    Haoshiro

    Joined:
    Feb 9, 2006
    Location:
    USA, KS
    #5
    It wasn't terribly easy to track down proof of that (as SATA is also available in 2.5"), but you're definitely right, and I've corrected the post!

    Possibly...
    GPUReview.com - HD 2600 Pro VS 7600GT

    HD 2600 Pro:
    Memory Bandwidth: 16 GB/sec
    Shader Operations: 72000 Operations/sec
    Pixel Fill Rate: 2400 MPixels/sec
    Texture Fill Rate: 4800 MTexels/sec

    7600GT:
    Core Clock: 560 MHz
    Memory Clock: 700 MHz (1400 DDR)
    Memory Bandwidth: 22.4 GB/sec
    Shader Operations: 6720 Operations/sec
    Pixel Fill Rate: 4480 MPixels/sec
    Texture Fill Rate: 6720 MTexels/sec
    Vertex Operations: 700 MVertices/sec

    The HD 2600 Pro thus ends up with (math sketched below):
    - 29% less Memory Bandwidth
    - 47% less Pixel Fill Rate
    - 29% less Texture Fill Rate
    + 10.71x the Shader Operations (971% more)
    + DirectX 10
    + Shader Model 4.0
    + Physics processing support
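
    (For anyone checking the math, those deltas are just (new - old) / old applied to the numbers above; a quick Python sketch:)

    Code:
    # HD 2600 Pro vs 7600GT, using the GPUReview figures quoted above.
    def pct_change(new, old):
        return (new - old) / old * 100

    print(round(pct_change(16, 22.4)))    # memory bandwidth: -29
    print(round(pct_change(2400, 4480)))  # pixel fill rate: -46 (~47% less)
    print(round(pct_change(4800, 6720)))  # texture fill rate: -29
    print(round(72000 / 6720, 2))         # shader ops: 10.71x, i.e. +971%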

    I would venture to guess these numbers mean the 7600GT is definitely better for current games. But with that massive shader advantage, the HD 2600 Pro could make up for its deficiencies in upcoming engines. Will it? We'll have to wait and see, I guess.

    I believe the answer to both of those is "yes", but I'm just saying that from memory. If someone has links to back that up or refute it, please post them!
     
  6. harveypooka macrumors 65816

    Joined:
    Feb 24, 2004
    #6
    Great post, Haoshiro.

    I guess I was hoping for something a bit more revolutionary though...
     
  7. Eidorian macrumors Penryn

    Eidorian

    Joined:
    Mar 23, 2005
    Location:
    Indianapolis
  8. Haoshiro thread starter macrumors 68000

    Haoshiro

    Joined:
    Feb 9, 2006
    Location:
    USA, KS
    #8
    So the HD 2600 Pro even beats the 8600M GT in raw power, interesting.

    It's all going to come down to good drivers and engines. Shaders are becoming more and more important, and it looks like that is exactly where the HD 2X00 series really shines.

    Well, that and video output. ATI has always won the "visual quality" award when it comes to 2D (images, movies), and with plenty of power to push HD DVD and Blu-ray (even on battery power no less), that's great to have.
     
  9. MRU macrumors demi-god

    MRU

    Joined:
    Aug 23, 2005
    Location:
    Other
  10. Haoshiro thread starter macrumors 68000

    Haoshiro

    Joined:
    Feb 9, 2006
    Location:
    USA, KS
    #10
    Haha, too true!

    Thanks for all the compliments, much appreciated! I definitely spent far too much time on it, lol...
     
  11. Chone macrumors 65816

    Chone

    Joined:
    Aug 11, 2006
    #11
    Just a little point I'd like to make: we already have two massively shader-intensive engines out there, the Oblivion engine and the F.E.A.R. engine. If there is a place where the HD 2600 should shine, it's those games, and while they perform quite nicely, the 8600 cards still outpace them.

    Also, there were driver problems initially, but those were corrected a LONG time ago; the current Cat 7.7 drivers are more than adequate and developed enough to show the HD cards' full potential.

    Also, it doesn't matter if the HD 2600 has the shader power to shine in games; if it's already this limited in Oblivion, then it will certainly be bottlenecked in other games (you see, shaders are not everything).

    Just a little thing I wanted to say: don't get your hopes up for the HD 2600. This is the performance you are getting today and the performance you'll get tomorrow. There is no hidden potential, no driver issue, no "shader intensive" engine we haven't seen. We've seen Oblivion, we've seen the Cat 7.7 drivers, and we know just what the HD 2600 is capable of.

    Other than that, a really nice post. Still, the HD 2400 in the $1200 iMac is WORSE than the X1600 it had before, the HD 2600 Pro is worse than the 7600GT the 24-incher had before, and the HD 2600 Pro is not a big leap over the 7300GT and X1600.

    If you are considering the $1200 iMac, be sure you are not going to use 3D at all, because springing for the HD 2600 would be quite an upgrade.
     
  12. GFLPraxis macrumors 604

    GFLPraxis

    Joined:
    Mar 17, 2004
    #12
    For current games, yeah, but he's right: the ATi card is WAY better at shader operations, and games like Crysis are very shader-heavy. I'd expect the ATi card to perform better in DirectX 10-era games like Gears of War or Crysis.
     
  13. TheSilencer macrumors regular

    Joined:
    May 27, 2007
    #13
  14. Eidorian macrumors Penryn

    Eidorian

    Joined:
    Mar 23, 2005
    Location:
    Indianapolis
    #14
    Err...have you seen the DirectX 10 benchmarks? Seriously?

    There's a slim chance of slightly greater performance on native DX10 games but otherwise you're going to need an 8800 or HD2900 for DX10.
     
  15. Chone macrumors 65816

    Chone

    Joined:
    Aug 11, 2006
    #15
    Did you not bother to read my post directly above yours? :confused:
     
  16. harveypooka macrumors 65816

    Joined:
    Feb 24, 2004
    #16
    At the end of the day the iMac is not going to live up to Gears of War or Battlefield 2142's graphics.

    If Rage is released anytime soon you'll need a Mac Pro to get some nice shiny graphics.

    I hope, hope, hope Apple release a cut-down Mac Pro in the next few months.
     
  17. fblack macrumors 6502a

    fblack

    Joined:
    May 16, 2006
    Location:
    USA
    #17
    Reviews...

    Hey Haoshiro how about we quote some reviews?:)

    Most reviews that I've read seem to agree that what are supposed to be midrange cards are pretty weak.

    http://www.anandtech.com/video/showdoc.aspx?i=3023&p=12

    http://www.extremetech.com/article2/0,1697,2151679,00.asp

    http://www.tomshardware.com/2007/07/24/hd_2600_and_geforce_8600/page10.html

    http://www.pcworld.idg.com.au/index.php/taxid;2136212627;pid;3779;pt;1

    http://www.guru3d.com/article/content/440/19/

    http://www.hothardware.com/articles/ATI_Radeon_HD_2600_and_2400_Performance/?page=11

    http://www.pcper.com/article.php?aid=426&type=expert&pid=22

    I could quote more but why flog an already eviscerated corpse? :D

    All the reviewers feel that these cards are good for HD playback and for "simple" or "casual" gaming, not for "graphically intense titles". These cards don't do so great in DX10, and some reviews suggest sitting out the first generation of DX10 cards.

    To be fair to Apple, there seem to be slim pickings in the midrange market for newer cards. The HD playback, DX10 compatibility, and the low prices ($79 MSRP for the 2400 XT, and cheaper for Apple I'm sure) probably had a part to play in their decision to use these cards.

    Still, they could have opted for the 2600XT as a BTO, which would have given comparable performance to a 7600GT in older games and an edge in newer titles. :(
     
  18. Haoshiro thread starter macrumors 68000

    Haoshiro

    Joined:
    Feb 9, 2006
    Location:
    USA, KS
    #18
    Sure, but these are still not reviews of Mobile chips versus other Mobile chips. :rolleyes:

    Oblivion and FEAR don't count as new shader-heavy games, imo, either.

    We'll see as time goes on, like I said.

    It's already been pointed out that the Mobility HD 2600 Pro has more raw power than even an 8600M GT, let alone the 7600M GT.

    Why isn't this showing in the current crop of benchmarks? I'm sure there are plenty of reasons, and I've seen this happen plenty of times with video cards in the past, especially nVidia cards. My old 6600 GT started out pretty badly, and as time went on, games and drivers caused the benchmarks to jump a lot.

    I look at current benchmarks and reviews distrustingly; I'll see how things stand after six months.

    I'll be ordering another iMac soon, with the HD 2600, and will be sure to compare it to my current 256MB X1600, as well as to friends' PCs (which have 7600 GS and GT cards, as well as the 8600 GT).
     
  19. Chone macrumors 65816

    Chone

    Joined:
    Aug 11, 2006
    #19
    Raw power stands for theoretical peak performance, and by those terms the HD 2900XT should be faster than an 8800 Ultra, yet it can barely keep up with the 8800 GTS. Save your theory and "raw power" for another thread.

    And why wouldn't Oblivion count as a shader-intensive game? Take shaders out of that game and you end up with a crummy-looking game a 9600 could run. Oblivion still brings 8800 Ultra cards to their knees and is one of the most demanding shader-intensive games out there. The HD 2600 had a pretty nice chance to flex its muscles in Oblivion, the drivers were already more mature because of the HD 2900XT, and what did it achieve? It closed the performance gap, but not enough to come out on top of the 8600GTS, and mind you, I'm talking about the HD 2600XT, not the severely crippled 2600 Pro in iMacs.

    This talk about shaders reminds me of the video memory debate. Just as a 6200 can't make use of 512MB of VRAM, because at resolutions where that RAM would be useful the card is limited by its other functions, the HD 2600 might have shader power but not nearly enough to compensate. Yeah, it "shines" in shader-intensive games like Oblivion, but not enough. And this brings me to another point: shading power matters most at higher resolutions, resolutions the HD 2600 is simply too darn weak to run. At 1024x768 shaders don't make much difference. Excuse me, that would be 800x600 for the HD 2600 in the iMac and 640x480 for the HD 2400...
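
    To put the resolution point in rough numbers, here's a toy sketch (Python); it's purely illustrative and just assumes per-frame pixel-shading work scales with the number of pixels:

    Code:
    # Relative pixel-shading work per frame vs a 1024x768 baseline,
    # assuming cost scales linearly with pixels shaded.
    base = 1024 * 768
    for w, h in [(640, 480), (800, 600), (1024, 768), (1680, 1050), (1920, 1200)]:
        print(f"{w}x{h}: {w * h / base:.2f}x")
    # 640x480: 0.39x ... 1680x1050: 2.24x ... 1920x1200: 2.93x

    The shader advantage only really gets exercised in the top rows of that table, exactly the resolutions I'm arguing this card is too weak to drive.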

    Now you make it sound as if Crysis is a game where we stare at a flat wall while the card applies shaders non-stop...

    Like I said earlier, what you see is what you get; that shader power (which frankly isn't that much) isn't going to help the 2600 when the rest of it is so lackluster.
     
  20. Haoshiro thread starter macrumors 68000

    Haoshiro

    Joined:
    Feb 9, 2006
    Location:
    USA, KS
    #20
    No, the talk of raw power can stay right here in this thread; no need to save it for another time.

    That was the point, in fact: it's obviously not reaching its full potential. Saying that it never will is a fool's game; you don't know that, nor do I. We can argue until we are blue in the face, but we don't know.

    My stance is simple: if it has untapped potential, there is a chance it will get tapped some time. That doesn't mean it will, but just because it hasn't yet doesn't mean it won't.

    I have yet to see benchmarks of the Mobility HD 2600 Pro, nor have we seen how it does in OS X, Tiger or Leopard. There are plenty of unknowns; it doesn't help for people to come in and post as if they have the final word on something without knowing all the facts, acting as if they can predict the future.

    And, btw, Shader Ops, as far as the HD 2X00 series is concerned, covers not only pixel shaders but also vertex and geometry work (see the unified shader specs above).
     
  21. phillipjfry macrumors 6502a

    phillipjfry

    Joined:
    Dec 12, 2006
    Location:
    Peace in Plainfield
    #21
    Sorry if this sounds out of place and just plain silly, but why are people comparing these cards on DX10 performance? Isn't that Vista-only? Shouldn't we really only be concerned with OpenGL 2.0 performance and how Leopard will handle the new EA and id games coming to the new kitty soon? :confused:
     
  22. contoursvt macrumors 6502a

    Joined:
    Jul 22, 2005
    #22
    I was wondering that too, but then I thought that maybe the more hardcore gamers might have Vista installed on their machines as well and might play that way...

     
  23. GFLPraxis macrumors 604

    GFLPraxis

    Joined:
    Mar 17, 2004
    #23

    Actually, since the new EA games are using Cider, they're probably going to be running in DX10, I'd think...
     
  24. Eidorian macrumors Penryn

    Eidorian

    Joined:
    Mar 23, 2005
    Location:
    Indianapolis
    #24
    Cider should translate DirectX APIs to OpenGL. I believe they did mention DX10 on non-Windows platforms would be available. :D
     
  25. fblack macrumors 6502a

    fblack

    Joined:
    May 16, 2006
    Location:
    USA
    #25
    ?

    Hey, I'm not trying to bash you here, but I think you are sidestepping the issue. It is about the potential of the cards: the mobile versions are not going to be more powerful than the regular cards, and if the regular cards are choking (both AMD's and NVIDIA's), do you think the mobile versions are going to do better?

    So what you are saying is that it's perfectly OK that they run these games poorly? How about older titles like COD2; should I just stop playing them?


    Fair enough. Drivers can make a difference. But in my experience no amount of driver tweaking is going to make a $79 card perform like a $400 card. At 1024x768, no AA, no AF, the 2400 runs Oblivion at 6.2 FPS; I think I saw the 2600 Pro at 19 FPS and the 2600XT at 23 FPS. They barely run COD2 better. Do you really expect driver tweaking to add 30-40 FPS to these scores?

    That's fine, but believe it or not you are coming across as steadfastly defending these cards, which puts you clearly on that side. One has to make decisions based on the best info available at the time. Right now the info I'm seeing tells me that these cards are so-so.

    Great. I hope you post your impressions, and then 6 months down the road we can see what updates and new drivers do for performance; then we can repeat this pow-wow. ;)
     
