Review: NVIDIA GTX 285 on an 8-core Mac Pro (Arstechnica.com)

Discussion in 'Mac Pro' started by winterspan, Jul 19, 2009.

  1. winterspan macrumors 65816

    Joined:
    Jun 12, 2007
    #1
    http://arstechnica.com/hardware/news/2009/07/review-nvidia-gtx-285-on-the-mac.ars

    Looks pretty good. It definitely beats the 4870 in Windows gaming, but CoreImage-based and other pro apps on OS X can favor the 4870 because of NVIDIA's crappy drivers. Based on prior performance, it seems like NVIDIA really hobbles the OpenGL performance of their non-Quadro cards, but this may not be the case on OS X. (I'm traditionally a Windows guy.)

    Overall, we need a far more comprehensive test with a wide range of real-world video/graphics/3D/media applications to really know. Most importantly, given the huge variance in performance, if you are looking at these cards with a particular application in mind, I'd definitely wait for a review that covers that application before deciding which card is best.
     
  2. netkas macrumors 65816

    Joined:
    Oct 2, 2007
    #2
    They tested glview with default settings; that's pretty stupid.
     
  3. ventro macrumors 6502a

    Joined:
    Sep 23, 2006
    #3
    It appears there are some SERIOUS problems in the way OSX interacts with this card. Hope EVGA/Apple get their **** together and put out an update.
     
  4. seisend macrumors 6502a

    seisend

    Joined:
    Feb 20, 2009
    Location:
    Switzerland, ZG
    #4
    There was an update last week by EVGA/NVIDIA which should make the card 20% faster in OS X.
     
  5. Pressure macrumors 68040

    Pressure

    Joined:
    May 30, 2006
    Location:
    Denmark
    #5
    They already used the new driver.

     
  6. seisend macrumors 6502a

    seisend

    Joined:
    Feb 20, 2009
    Location:
    Switzerland, ZG
    #6
    True! My fault, I'm sorry. Hope EVGA can fix it as soon as possible.
     
  7. elvisizer macrumors 6502

    Joined:
    May 29, 2003
    Location:
    San Jose
    #7
    The reviewer wasn't aware of the CUDA demo trick for keeping the card in 3D mode (see the sketch after this post).
     
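The thread doesn't spell out what the "CUDA demo trick" actually is, so the following is only a minimal sketch under the assumption that it means keeping a CUDA context alive (for example, by leaving one of NVIDIA's CUDA demos running) so the GTX 285 stays at its 3D performance clocks instead of dropping to its low-power 2D clocks during benchmarks. The program below is hypothetical and not from the thread; compile it with nvcc and leave it running in the background while testing.

    // Hypothetical stand-in for the "CUDA demo trick": holding a CUDA context
    // open is assumed to keep the GPU out of its low-power 2D clock state.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <unistd.h>

    int main() {
        void *dummy = nullptr;
        // The first CUDA runtime call creates a context on the default device.
        cudaError_t err = cudaMalloc(&dummy, 1 << 20);  // 1 MB dummy allocation
        if (err != cudaSuccess) {
            fprintf(stderr, "cudaMalloc failed: %s\n", cudaGetErrorString(err));
            return 1;
        }
        printf("CUDA context held; leave this running while benchmarking.\n");
        // Keep the context (and, per the assumption above, the 3D clocks)
        // alive until the user interrupts the program with Ctrl-C.
        for (;;) {
            sleep(60);
        }
    }

Whether this matches what the reviewer missed is a guess; any of the sample programs shipped with the CUDA toolkit would serve the same purpose if the trick is simply keeping a CUDA context open.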
