X1900 vs GT7800

Discussion in 'Mac Pro' started by Demoman, Aug 27, 2006.

  1. Demoman macrumors regular

    Joined:
    Mar 29, 2005
    Location:
    Issaquah, WA
    #1
    Any guess why Apple did not continue with the 7800 card and went with the 1900? Is the 1900 a better card, the extra memory notwithstanding? If I understand correctly, the 7800 is almost identical to the Quadro 4500 except that the professional 3D functionality has been disabled.
     
  2. Chone macrumors 65816

    Chone

    Joined:
    Aug 11, 2006
    #2
    The X1900XT blows the 7800 out of the water and back TWICE.

    Okay, maybe not THAT much, but seriously, the X1900XT is the better card. I don't know what's up with the benchmarks on Apple's site, but the Quadro 4500 should not beat the X1900XT in any game, even Doom 3. I don't know if the results were rigged so the 4500 doesn't appear inferior, but the truth is that, as a prosumer card, the X1900XT is the best the Mac Pro has to offer; leave the Quadro for the workstation-related rendering chores.
     
  3. eXan macrumors 601

    eXan

    Joined:
    Jan 10, 2005
    Location:
    Russia
    #3
    What about GeForce 7900GTX vs Radeon X1900XT?

    I heard that the GeForce is faster, but should they both be in the same performance category?
     
  4. macgeek2005 macrumors 65816

    Joined:
    Jan 31, 2006
    #4
    Are you kidding me? Compare the X1800XT to the GeForce 7800, and THEN you have a semi-fair comparison. The X1900XT blows the 7800GT out of the water.
     
  5. jiggie2g macrumors 6502

    jiggie2g

    Joined:
    Apr 12, 2003
    Location:
    Brooklyn,NY
    #5
  6. solvs macrumors 603

    solvs

    Joined:
    Jun 25, 2002
    Location:
    LaLaLand, CA
    #6
    I'm sure they'll offer something else eventually, like whatever the next GeForce is. For now, that was probably the best they had for the price. A 7800 wouldn't have been worth what it would have cost.
     
  7. Demoman thread starter macrumors regular

    Joined:
    Mar 29, 2005
    Location:
    Issaquah, WA
    #7
    I was asking a question, geek, OK? And it was answered by others with more class than you.
     
  8. sdshannon macrumors newbie

    Joined:
    Aug 11, 2006
    Location:
    Portland
    #8

    hahahaha, that is a classy response ;)
     
  9. Chundles macrumors G4

    Chundles

    Joined:
    Jul 4, 2005
    #9
    Graphics cards are bloody annoying me now: X1900XTX, X1950XT Pro, 7800GTX Turbo, 7950XTX GT Pro Turbo supercharged...

    Can someone fill this in for me? One card from Nvidia and one from ATI:

    Low range:

    Mid Range:

    Top Range:

    Seems to me that whenever one company comes out with a graphics card, the next day the other company comes out with something twice as good.
     
  10. Lollypop macrumors 6502a

    Lollypop

    Joined:
    Sep 13, 2004
    Location:
    Johannesburg, South Africa
    #10

    I used to follow the graphics card market, knew all the chips and everything, but you know what, it changes WAY too fast... so here is my opinion....

    Low range: ATI X13?? / nVidia 73??

    Mid Range: ATI X16?? / nVidia 76??

    Top Range: ATI X19?? / nVidia 79??

    The ?? is mostly just clock speeds, memory speeds and memory sizes; the basics are about the pipe and shader counts, etc.

    Methinks Apple went with ATI just to have both vendors release a graphics card for the new Intel/EFI architecture.
     
  11. Origin macrumors regular

    Origin

    Joined:
    Aug 11, 2006
    Location:
    Nantes, France
    #11
    Actually, performance-wise, the GeForce 7300 GT is closer to the X1600 (in most tests, the 7300 GT is just behind the X1600 Pro/Mobility) than to the X1300 series ;)
     
  12. ergle2 macrumors 6502

    Joined:
    Aug 27, 2006
    #12
    True, tho the new X1300 XT is identical to the X1600 Pro -- but with a 10MHz clock speed bump for the core.

    Confused yet? Don't worry, you will be...
     
  13. Chundles macrumors G4

    Chundles

    Joined:
    Jul 4, 2005
    #13
    I was confused years ago when my Dad bought a Riva TNT card so I could play Quake 2. Haven't played a proper game in nearly ten years.

    And another thing, how come "Pro" is not as good as "XT"? Surely the card makers want "Pro" to be associated with their best? Maybe it's just me, or the fact that the base model Ford Falcon is the "XT", but it just doesn't lend itself to a high-end card.

    Time to rationalise the card industry. The human eye can't tell the difference above about 70fps anyway, so there's not much to be gained from cards that will run some game at 200fps - they might as well start looking at running at 80fps but have it look photo-realistic. Then let the proper pros have their massive cards for all the motion picture and graphical stuff.
     
  14. Lollypop macrumors 6502a

    Lollypop

    Joined:
    Sep 13, 2004
    Location:
    Johannesburg, South Africa
    #14
    This is where 3dFx went wrong: they (before being gobbled up by nVidia) focussed on quality, and nobody bought any of their cards.... :eek:
     
  15. Chone macrumors 65816

    Chone

    Joined:
    Aug 11, 2006
    #15

    Low range: 7300GT, X1300XT (X1600Pro)

    Mid Range: 7600GT, X1800GTO

    Top Range: 7950GX2 (Quad SLI), X1950XTX (Crossfire)

    I think that's about right.
     
  16. ergle2 macrumors 6502

    Joined:
    Aug 27, 2006
    #16
    It's about running recent games in VERY high resolutions with high levels of anisotropic filtering and anti-aliasing with full eye candy -- and about being able to do so for a long enough period to make buying such a card worthwhile.

    Beyond this, the next generation of games is offering the ability to offload physics processing onto a secondary/tertiary GPU via at least one of the upcoming third-party physics libraries - Havok, I think.

    Don't need it? Then the cheaper single card options will do you just fine. :)
     
  17. Dont Hurt Me macrumors 603

    Dont Hurt Me

    Joined:
    Dec 21, 2002
    Location:
    Yahooville S.C.
    #17
    You mean a PPU:D and they have physics cards now, Alienware has 'em as options. Problem is your game has to have software that uses the card, and most games don't.
     
  18. ergle2 macrumors 6502

    Joined:
    Aug 27, 2006
    #18
    3dFX's own mistakes/mismanagement is what killed them.
    • They insisted 16bit (pseudo-22bit) rendering was "good enough" when the competition offered 32bit.
    • Their cards only supported 256*256 textures long after the competition offered 2048*2048 support.
    • They bought STB and started making/selling cards themselves. OEMs didn't like that kind of competition, and where did they run to? nVidia.
    • They were late to the "integrated 2D/3D" party -- and the Banshee was slower in many games than the 3D-only VooDoo 2 due to having just a single texture unit (tho it was around 10% faster in terms of clockspeed...). No multi-texturing in a single pass meant Quake 2 was slower than when run on the competition.
    • They rejected hardware T&L in favor of the "T-Buffer" -- who knew what a T-Buffer was?
    • They were eternally late, constantly missing product cycles/deadlines and releasing contingency products -- the RAMPAGE chipset was originally due around the time VooDoo 2 shipped! This continued right up 'til they died; pretty much every product they actually launched after the original VooDoo was one of those contingency products.
    • All the VSA line (VooDoo 4-6) really had to offer was better AA without a major performance hit. The V4 card was slower than the GeForce2MX, which was nVidia's budget card of the day. Plus, V6 never saw the light of day.
    • They didn't support DirectX acceleration under Windows 2000, only under Win9x (OpenGL and their own GLIDE APIs were accelerated in W2K). The competition did.
    • Their spending on employees was legendary, with reports of ~$30K on employee lunches right up til they closed the doors.

    3dFX were very much the architects of their own downfall, I'm afraid.
     
  19. ergle2 macrumors 6502

    Joined:
    Aug 27, 2006
    #19
    No, I mean physics on the card's GPU. Both ATI and nVidia have demonstrated this.

    Check this out.

    The PPU is the PhysX card, and it's slow. PhysX is a direct competitor to Havok.

    The current PhysX card is also notorious for slowing down games when it is enabled -- arguably due to support for it being added late in the game's lifecycle, but it's not impressive, especially given how much the card costs: more than many graphics cards, and it's PCI only, not PCIe.

    The Havok library currently features software fallback for systems that don't have a "spare" GPU.
     
  20. jiggie2g macrumors 6502

    jiggie2g

    Joined:
    Apr 12, 2003
    Location:
    Brooklyn,NY
    #20

    I think money is what people care about. For the most part this is right -- your chart, that is. All cards have been updated to the newest versions for comparison.

    Ultra High End = $450-550 = 7950GX2 / X1950XTX (DDR4)

    High End = $350-450 = 7900GTX / X1900XTX (DDR3)

    Lower High End = $250-350 = 7950GT / X1900XT (DDR3)

    Mid End = $199-250 = 7900GS (20 pipe) / X1950 PRO (36 shaders)

    Low End = $150-199 = 7600GT (12 pipe/128bit) / X1650XT (12pipe/256bit)

    Budget End = $99-149 = 7600GS / X1650 Pro (8 pipe)

    Basic End aka Crap End = 7300GT / X1300XT

    There you have it.
     
  21. Mr. Mister macrumors 6502

    Joined:
    Feb 15, 2006
    #21
    Or maybe "arguably" because the game detects the PhysX card and enables dozens of particle effects and fluid areas that otherwise wouldn't exist in the game, calculating them with a marginal slowdown while at the same time running the game with a physics detail that would slow the computer down 400% if it didn't have a PhysX card. Saying that adding a PhysX card slows games down is a very flat way of looking at it, it's like saying adding a 7900GTX and playing FEAR is inferior to running Quake II on a 9800 Pro because the latter gets a better framerate than the first, disregarding that FEAR is pushing realtime light and shadow effects while Quake II is barely 3D.
     
  22. ergle2 macrumors 6502

    Joined:
    Aug 27, 2006
    #22
    Indeed. However, the benchmarks I saw suggested that with full effects the game would be borderline playable, and the eye-candy -- and that's all it added in this case -- in the movie wasn't that big an improvement, so it seems rather pointless based on that example. I am, however, willing to accept that support for the card was likely added very late in the dev cycle and thus the engine's support could be rather sub-optimal for the title in question (GR:AW).

    Adding in the card should be pretty much transparent in the sense that it shouldn't drag the rest of the system down, which this obviously does -- especially when the current PhysX card is the only one. It's not a moving target like supporting the huge capability gulf between, say, GMA950 and GeForce MX cards at one end and the latest nVidia 7900GX2 or ATI 1950-range cards at the other.

    It shouldn't be about rendering "more", it should be about improving realism -- because rendering more is still hitting the GPUs hard, and if they don't have power to spare, adding more into the mix isn't helping...

    It's also not really akin to Q2 vs FEAR, which would be several orders of magnitude in a very visible sense. It's more like changing the detail level in FEAR from low to high and comparing things that way.

    Either way, what I've seen thus far suggests that the PhysX card isn't terribly impressive, especially considering the price-tag. Maybe some newer games will make better use of it, or maybe it's the hardware implementation itself that's the problem...
     
  23. Lollypop macrumors 6502a

    Lollypop

    Joined:
    Sep 13, 2004
    Location:
    Johannesburg, South Africa
    #23
    Spending and delayed product launches aside, you actually corroborated my point: 3dFx believed that 30 fps was enough and that people wanted quality. nVidia and ATI, in their own ways, focussed on performance, and in that era people actually wanted their new-generation games to perform well... it's still the same, we buy a new card so that our new game can run decently and our old games can look good... 3dFx had a vision, but it was ultimately a flawed vision.

    Sorta back on topic, does anyone think that the goodness that will come in Leopard will require something more than an entry-level card?
     
