How much better is HD 2600 than the old X1600

Discussion in 'iMac' started by aranhil, Aug 7, 2007.

  1. aranhil macrumors newbie

    Joined:
    Jul 26, 2007
    #1
    Hi all. I'm looking at the new $1499 20" iMac. It features the HD 2600 (256MB) graphics card. For somebody who doesn't play games a lot, is this new HD 2600 significantly better than the X1600 in the old iMacs? The only game I was going to play requires just a 64MB Radeon 8500. So does this mean the new iMac hasn't really improved over the old one (apart from the slight bump in CPU and hard drive)? Because I'm really leaning toward buying a discounted old iMac instead.

    BTW, I never understood what a better graphics card does other than improve the gaming experience. Do movies look better or something?
     
  2. OTA macrumors newbie

    OTA

    Joined:
    Aug 7, 2007
    #2
    So, get the 20" entry-level model and upgrade the RAM.
    I think it will be better for you.
     
  3. capran macrumors member

    Joined:
    Nov 28, 2003
    #3
    No, graphics cards have advanced to the point where 2D and general video quality are very good even on low-end parts. It's mostly only gamers like me who care about faster and more powerful GPUs. Although GPUs may integrate decoding for newer video codecs like H.264, so video work would be accelerated.

    As for how much better the newer 2x00 ATI cards are than the X1600, I'm not sure, but they're both low-end graphics cards, in the $100 range or so for PCs.
     
  4. BornAgainMac macrumors 603

    BornAgainMac

    Joined:
    Feb 4, 2004
    Location:
    Florida Resident
    #4
    Who knows for sure.

    I wish the numbers meant something. Technically the newer card should be 1000 (units) better. If the cards are about the same, then maybe it should be called the X1601. With the current numbering system, I have no way to judge how the card will perform.

    In the future, ATI will have new cards and these will be the reactions:

    In 2009, ATI HD 3300 XT in the new iMac. - Reaction: Sucks. Only has 512 MB. Yesterday's technology.

    In 2010, ATI 4900 Ultra AX in the new iMac - Reaction: Sucks. Only has 1 GB of video ram.

    In 2013, ATI 7710 Rage 3D Rated R in the new iMac - Reaction: Sucks. I am buying a Mac Pro for my gaming needs.

    At least with the Core 2 Duo, the higher the MHz, the faster it is.
     
  5. ReanimationLP macrumors 68030

    ReanimationLP

    Joined:
    Jan 8, 2005
    Location:
    On the moon.
    #5
    The 2600 is basically the same as the X1600, with OpenGL 2.0 and DirectX 10 support bolted on, a 65nm process instead of 90nm, and slightly higher core and memory clocks.

    Performance will probably be about 20-50% faster, depending on what game you're playing.
     
  6. chewietobbacca macrumors 6502

    Joined:
    Jun 18, 2007
    #6
    Pretty close. The 2600 also boasts UVD, the Unified Video Decoder, which shifts video decoding from the CPU to the GPU. For example, CPU load during H.264 playback can drop from 50-60% all the way down to 10% while the GPU's dedicated hardware does the decoding. Pretty nifty for DVD playback and so on.
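
    If you want to see that difference yourself, a rough way is to sample CPU load while a clip plays, once decoded in software and once with UVD doing the work. This is just a sketch; it assumes a Python install with the third-party psutil package, which has nothing to do with the decoder itself:

    Code:
    # Sample overall CPU load once per second for a minute while a video plays,
    # then print the average and peak so two runs can be compared.
    import psutil

    samples = []
    for _ in range(60):
        samples.append(psutil.cpu_percent(interval=1.0))  # blocks ~1s per call

    print("average CPU load: %.1f%%" % (sum(samples) / len(samples)))
    print("peak CPU load:    %.1f%%" % max(samples))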

    Actually, AMD/ATI and NVIDIA both have pretty consistent numbering schemes. ATI put the X in front of their cards for the previous two generations to signify they were in the ten thousands; that X has simply been replaced with HD (for high definition), I believe.

    For example, the previous-generation ATI card was the X1000 series. Before that, it was the X800 series, and before that, the 9000 series. As you can see, the X simply states that they are in the 10,000s. The newest generation of ATI cards is the HD2000 series.

    For example, there is the HD2400, HD2600, HD2900, etc.

    So the first number is the generation (in this case 2), and the second number designates whether it is low, mid, or high end; the higher the number, the higher-end the card. For example, the HD2900XT is the high-end card, the HD2600 is the mid range, and the HD2400 is the low end. Previously, the X1600 was mid range while the X1950 was the high end.

    Also, the third number often designates whether it's a refresh. For example, the X850 and X1950 were refreshes of the X800 and X1900. This isn't always true, though, as the X1900 itself was a refresh of the X1800; it varies from generation to generation depending on release date, level of competition, etc. The 2900 series has yet to see a refresh, though one is supposedly pending.

    Finally, the letters at the end designate the performance level within that range. For ATI, IIRC it has traditionally been:

    XTX > XT > PRO > XL > GTO > GT

    So, for example, the X1950XTX was the performance king of the X1K series, better than the X1950XT, which in turn was better than the X1950PRO; all of the X1900-series cards were the high-end performance parts.

    NVIDIA follows a similar numbering scheme: in the 8800, the first 8 designates the 8 series and the second 8 means high end (if they had an 8900, the 9 would indicate a refresh of the 8). For the suffixes, NVIDIA follows this lettering: Ultra > GTX > GTS > GT > GS.

    Hope that helps.
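
    To tie those rules of thumb together, here's a tiny sketch that decodes a four-digit ATI name the way I described above. The helper name is made up and the mapping is just the rough convention from this post, not anything official from ATI:

    Code:
    import re

    SUFFIX_RANK = ["GT", "GTO", "XL", "PRO", "XT", "XTX"]  # roughly worst to best

    def decode_ati_name(name):
        # Handles four-digit names like "X1950 XTX" or "HD 2600"; older series
        # (9000, X800) used different conventions and won't fit this pattern.
        m = re.match(r"(?:HD|X)?(\d{4})([A-Z]*)$", name.upper().replace(" ", ""))
        if not m:
            raise ValueError("can't parse %r" % name)
        digits, suffix = m.group(1), m.group(2)
        tier = int(digits[1])
        return {
            "generation": digits[0],         # "2" in HD2600, "1" in X1950
            "tier": "high end" if tier >= 8 else ("mid range" if tier >= 5 else "low end"),
            "refresh": digits[2] != "0",     # e.g. X1950 is a refresh of the X1900
            "suffix_rank": SUFFIX_RANK.index(suffix) if suffix in SUFFIX_RANK else None,
        }

    print(decode_ati_name("HD 2600"))    # mid range, not a refresh, no suffix
    print(decode_ati_name("X1950 XTX"))  # high end, refresh, top suffix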
     
  7. Dont Hurt Me macrumors 603

    Dont Hurt Me

    Joined:
    Dec 21, 2002
    Location:
    Yahooville S.C.
    #7
    Good post, minus your performance increase estimate. I doubt you would see 20-50% in any modern game made in the past 2 years. I would say 2-5%.
     
  8. chewietobbacca macrumors 6502

    Joined:
    Jun 18, 2007
    #8
    2-10% in games is about right, with a significant improvement in video decoding speed and so on.

    The HD2600PRO is best used for light gaming and home entertainment.

    Personally, for my home theater PC, I'm getting an HD2400 PRO (cheap at $50, very good at video decoding, and quiet too) for that task. :D
     
  9. revenuee macrumors 68020

    revenuee

    Joined:
    Sep 13, 2003
    Location:
    A place where i am supreme emporer
    #9
    Does anyone know how the 2600 Pro, in the 24-inch, will perform running Motion and Aperture?
     
  10. ~~Hello~~ macrumors 6502

    Joined:
    Apr 27, 2007
    #11
    Can't be any worse than the Intel integrated graphics though, can it?
     
  11. iW00t macrumors 68040

    iW00t

    Joined:
    Nov 7, 2006
    Location:
    Defenders of Apple Guild
    #12
    Nope. Intel graphics is the worst there is.

    If any company tries to sell a GPU that loses out to Intel graphics, they should just close up shop and shoot themselves, because Intel integrated graphics costs the mainboard manufacturer $1 at worst.
     
