ATI X800 in 3GHz G5

Discussion in 'Games' started by PowermacG5, May 16, 2004.

  1. PowermacG5 macrumors regular

    Joined:
    Jul 24, 2003
    Location:
    Grand Rapids, MI
    #1
    Think of it this way. Apple surprised us all by putting the best graphics card in the world into the G5 a year ago. I think Apple has a deal with ATI to put the latest graphics card in the machine. I can't picture Steve Jobs saying, "oh yeah, and we put last year's graphics card in."

    I will be sad this summer when my computer is officially outdated.
     
  2. Veldek macrumors 68000

    Joined:
    Mar 29, 2003
    Location:
    Germany
    #2
    Unfortunately, that's not true. When we got the Radeon 9800 (with 128 MB VRAM), the PC world already had the Radeon 9800 (with 256 MB VRAM) and, more importantly, the Radeon 9800 XT, which was undoubtedly the best graphics card at the time.

    We don't even have that card yet, and the new generation is already coming. So I wouldn't bet too much on getting the X800, although I hope for it, too.
     
  3. tom.96 macrumors regular

    Joined:
    Jun 13, 2003
    Location:
    UK (southern)
    #3
    Well, I'm looking for a top graphics card for Doom 3. My 8 MB ATI Rage is in need of updating (although it's been a very good card), so a nice new ATI or Nvidia card would suit me fine. I'm looking forward to seeing what's in the next lot of PMs and iMacs.
     
  4. edesignuk Moderator emeritus

    Joined:
    Mar 25, 2002
    Location:
    London, England
    #4
    Even though this is not correct (see Veldek's post), why exactly is it, or should it be, a surprise? This is a pro workstation that costs $$$$$$; it should be expected that the best of the best will be in it.
     
  5. hvfsl macrumors 68000

    Joined:
    Jul 9, 2001
    Location:
    London, UK
    #5
    I expect Apple to use both the GeForce 6800 and the X800 range of cards in their new Macs. I expect Apple might choose Nvidia for the ultra high end because the 6800 has a video processor that will speed up a lot of stuff in Final Cut Pro, etc.

    Using Nvidia cards in Macs would require a case redesign, though, since they take up 2-3 slots because of the massive cooler.
     
  6. Funkatation macrumors regular

    Joined:
    Dec 29, 2001
    #6
    Granted, the Radeon 9800 Pro 128MB is not as good as the XT, but it's not THAT much slower. The X800, on the other hand, will be a huge performance leap from the 9800 series. I really hope they don't go with the 6800 Ultra; it takes up too much space and way too much power. The ATI solution takes up the same space as the 9800 but uses less power.
     
  7. titaniumducky macrumors 6502a

    Joined:
    Nov 22, 2003
    #7
    I highly doubt they will surpass 2.6 GHz considering IBM's recent production problems. Also, it's not like Apple to make such huge leaps.
     
  8. ZildjianKX macrumors 68000

    Joined:
    May 18, 2003
    #8
    Unfortunately what you just said is true... :(
     
  9. JOD8FY macrumors 6502a

    Joined:
    Mar 22, 2004
    Location:
    United States
    #9
    I hope they put a really nice 256MB card into the 3GHz one. (That's right - the 3GHz one... ;) )

    JOD8FY
     
  10. Sun Baked macrumors G5

    Joined:
    May 19, 2002
    #10
    PCI-Express graphics cards and DDR2 memory would be a boost to the PowerMac Platform.
     
  11. Chaszmyr macrumors 601

    Joined:
    Aug 9, 2002
    #11
    I still believe we WILL see a 3GHz PowerMac no later than September.
     
  12. Veldek macrumors 68000

    Joined:
    Mar 29, 2003
    Location:
    Germany
    #12
    That's quite probable, but I still think Steve Jobs will announce them at WWDC with a shipping date in August, which means final shipping in September...
     
  13. PowermacG5 thread starter macrumors regular

    Joined:
    Jul 24, 2003
    Location:
    Grand Rapids, MI
    #13
    I didn't know a 256MB VRAM card was out when I bought my G5. That makes me mad. :mad: Apple needs to be one step ahead of the Microsoft world, not two steps behind!!!! I would be perfectly satisfied with my graphics card if it wasn't for my Studio Display with its bunk refresh rate and ghosting. GRRR AHHHHHH... Why me!!! LCDs ARE TERRIBLE FOR GAMING.
     
  14. PowermacG5 thread starter macrumors regular

    Joined:
    Jul 24, 2003
    Location:
    Grand Rapids, MI
    #14
    Steve Jobs will not be made out to be a liar. He will have a 3.0GHz machine. He made official statements that they would have a 3.0GHz G5 one year after the G5 release.
     
  15. Mav451 macrumors 68000

    Joined:
    Jul 1, 2003
    Location:
    Maryland
    #15
    This is the reason I continue to stay with a high-quality APERTURE GRILLE monitor, the NEC FE791SB, for my gaming PC (well, my only computer actually, haha). Superior color reproduction, no ghosting, and it's < $170.

    (1280 x 1024 @ 85Hz (current setting) and 1600 x 1200 @ 76Hz -- this is on a 17", folks)

    1) The main problem is probably weight (it's in the mid 30 lbs range; if you're a regular guy, come on, this should be easy).

    2) If you stare at it for hours on end, yeah, you'll get the CRT sensation (a lil burning after like 8 hours). Of course, if you stay in front of any computer (LCD or not) that long without getting up, you're asking for trouble.
     
  16. thatwendigo macrumors 6502a

    Joined:
    Nov 17, 2003
    Location:
    Sum, Ergo Sum.
    #16
    Three things:
    1) You didn't research the cards or LCDs before you bought your machine?
    2) You expect Apple as a single company to somehow beat the entire PC industry?
    3) You bought a Mac to game on?
     
  17. ZildjianKX macrumors 68000

    Joined:
    May 18, 2003
    #17
    Don't forget the power consumption... an LCD burns about 1/4 as much electricity as an equivalent size CRT.
     
  18. BrianKonarsMac macrumors 65816

    Joined:
    Apr 28, 2004
    #18
    Why would you use the Nvidia card when the ATI card is identical in price, takes up one slot, requires less power, and outperforms the Nvidia card? Makes ZERO sense to me, please explain.
     
  19. Veldek macrumors 68000

    Joined:
    Mar 29, 2003
    Location:
    Germany
    #19
    Well, the X800 doesn't outperform the 6800 in every test. In OpenGL the 6800 was much better. As this is what Macs use, one could assume that the 6800 is better suited for us. But we still have to wait for a test on actual Mac hardware before drawing a final conclusion.
     
  20. DarkenedFetus macrumors newbie

    Joined:
    May 17, 2004
    #20

    See, I find this response to the X800/6800 very strange, and yet I've seen it all over the internet, in reviews, etc. When I first saw the specs on ATI's new card, all I could think was, "they're making a mistake." Why? Here's a key difference between the new cards, which may not be immediately obvious to someone on the consumer end.
    Say we have a bit of code that's to run on the GPU like this:

    if (x)
    {
        compute a;
        compute b;
        compute c;
        result = a + b + c;
    }
    else
    {
        compute d;
        compute e;
        compute f;
        result = d + e + f;
    }

    On ATI's cards and all NVIDIA cards prior to the 6800, if you wanted to run this code, it would compile to a version where all of a, b, c, d, e, and f were computed, and then the correct values would be put in result. The code would in reality behave more like this:

    compute a;
    compute b;
    compute c;
    compute d;
    compute e;
    compute f;

    if (x)
        result = a + b + c;
    else
        result = d + e + f;

    In the 6800, however, you're allowed to "branch," meaning you only do the first three computations OR the 2nd three. This is part of "Shader Model 3.0." For some strange reason hardware reviewers have gotten it in their heads that SM 3.0 is "marketing hype." I think it should be pretty clear that SM 3.0 is "speed."
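    To make that concrete, here's a rough GLSL-style sketch of what a per-pixel branch in a fragment shader could look like. It's purely illustrative: the mask texture and the two color paths are made-up placeholders standing in for the "compute a, b, c" work above, not code from any real game:

    // Illustrative fragment shader with assumed names, not from any real title.
    // The condition varies per pixel, so this is a true dynamic branch of the
    // kind Shader Model 3.0 hardware can actually take.
    uniform sampler2D maskTex;   // placeholder mask that decides which path runs
    uniform vec3 colorA;
    uniform vec3 colorB;
    varying vec2 uv;

    void main()
    {
        bool x = texture2D(maskTex, uv).r > 0.5;
        vec3 result;
        if (x)
        {
            // SM 3.0 hardware can execute only this side for this pixel
            result = colorA * colorA + colorA;
        }
        else
        {
            // pre-SM 3.0 hardware effectively computes both sides and then
            // selects one, as in the flattened version above
            result = colorB * colorB + colorB;
        }
        gl_FragColor = vec4(result, 1.0);
    }
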
    In fact, part of the reason that the 6800 lags in performance slightly is that branching adds complexity to the general pipeline and slows it down a bit. (I'm actually surprised how competitive NVIDIA is with ATi considering this addition of complexity). But for shader performance in the long run it can't be beat. Not only that, but it makes writing interesting effects a lot easier for developers. ATi will eventually (if not very soon) implement branching as well, and I wouldn't be surprised if it resulted in a minor speed hit.
    On the other hand, I think NVIDIA's "making a mistake" in terms of PCI Express, but that's another story...
     
  21. BrianKonarsMac macrumors 65816

    Joined:
    Apr 28, 2004
    #21
    I've read several reviews discussing how it is actually harder to code for the Nvidia card, but I have no idea about the validity of that personally. Even if the Nvidia card does perform better, is the increase in performance (I think it would be minimal, but you know better) worth the additional requirements (i.e. power supply, two slots, turbine noise, etc.)? Also, is Shader Model 3.0 a DirectX feature, or is it an Nvidia implementation that is supported in both DirectX and OpenGL? And why is it that so few reviews have picked up on this difference or even taken it into account?
     
  22. benpatient macrumors 68000

    Joined:
    Nov 4, 2003
    #22
    Apparently you also didn't know that the 9800 Pro that was released in the G5 was actually a clocked-down version of the 128MB card... it is basically a slightly overclocked 9800 non-Pro...

    which was running at about 220 dollars when the G5 started shipping... yet it was a 350 dollar "upgrade" and still is, with true 9800 Pro prices way, way, way below that in a retail box with full games included!
     
  23. DarkenedFetus macrumors newbie

    Joined:
    May 17, 2004
    #23
    Really? I'd be interested to read the reviews/find out what they're saying about ease of programming.

    The power requirements/size are pretty ridiculous. But I think the features in the NVIDIA cards are moving in the right direction; hopefully newer technology and a revised design will bring the power and size down (as always).

    NVIDIA has something called CineFX 3.0, which is the feature set supported by the 6800. It adheres to the Shader Model 3.0 standard in DirectX, as well as the OpenGL specification. If ATi plans to support these standards (DirectX, OpenGL), then they will most likely include branching, programs with a greater number of instructions, etc.

    A lot of reviews are focused on the gaming aspects of these cards in the present. The argument is that the card doesn't improve shader performance on current games, which is true -- but only because developers haven't had access to the cards and therefore can't optimize for the new features. However, I've heard that games like Doom 3 support 3.0, and I expect reviewers' opinions will change in the near future.

    Another fallacy I've noticed in articles is the argument that "SM 3.0 can't do anything SM 2.0 can't do." That statement is literally true. It's also true that an Apple II (if it had enough disk space) could do the same thing as a Dual G5. It's just a matter of efficiency. For example, something you can do in one pass in 3.0 might take more than one pass in 2.0, meaning more data probably has to be transferred over the AGP bus (which is not fast in both directions). Of course, PCI Express will make multipass techniques less costly, but 3.0 will probably be standard by then.

    Really, in the end it's not about whether SM 3.0 is a Good Thing. Both companies will support it, and it is definitely a Good Thing. It's just a matter of whether it's worth it now, especially since many games don't support it. If you're a developer or use the GPU for general-purpose computing, it's pretty much impossible to pick the ATI card over the 6800 right now. If you're a gamer... I don't know. Though if games start to require SM 3.0 in the future, the X800 will become obsolete a lot quicker than the 6800.
     
  24. x86isslow macrumors 6502a

    Joined:
    Aug 10, 2003
    Location:
    USA
    #24
    All Steve has to have at the keynote is one 3GHz chip and a vague promise to ship by the end of the summer.

    If IBM is seeding 2.6GHz chips in quantity right now, then by the end of June they could have 2.8s in quantity too.
     
  25. Fukui macrumors 68000

    Joined:
    Jul 19, 2002
    #25
    Yes...hehe :D
     
