Whose video cards show truer color - ATi or nVidia?

Discussion in 'Buying Tips and Advice' started by zero2dash, Jun 7, 2007.

  1. zero2dash macrumors 6502a


    Joined:
    Jul 6, 2006
    Location:
    Fenton, MO
    #1
    I've heard people say that nVidia cards show truer colors than ATi's cards and wanted to see what everyone else thinks and/or what your experience has been...so I know what to get. ;)

    TIA :)
     
  2. shipdestroyer macrumors 6502

    Joined:
    Jun 5, 2007
    Location:
    New Hampshire
    #2
    It doesn't matter. Color reproduction would be different among monitors, but video cards don't really differ in this way.
     
  3. zero2dash thread starter macrumors 6502a


    Joined:
    Jul 6, 2006
    Location:
    Fenton, MO
    #3
    Incorrect. :D
    As mentioned in articles here, here, and here, there is a difference that favors ATi's cards when they're tested on the same equipment (system and monitor) with the same benchmark software. Look at the color bar screens and tell me you don't see a difference in favor of ATi. ;)

    As I said in my initial post - I've had people tell me that nVidia was better, but clearly (going by the links I gave) ATi is better. I just wanted to see if anyone had their own input...
     
  4. FleurDuMal macrumors 68000


    Joined:
    May 31, 2006
    Location:
    London Town
    #4
    Ermmm...so you already know the answer? :confused:
     
  5. zero2dash thread starter macrumors 6502a


    Joined:
    Jul 6, 2006
    Location:
    Fenton, MO
    #5
    I know what those sites say, but I wanted to see if anyone could offer their own input...but I guess my Google search sufficiently answered my own question (after posting the thread). :eek:
     
  6. shipdestroyer macrumors 6502

    Joined:
    Jun 5, 2007
    Location:
    New Hampshire
    #6
    Okay, those benchmarks have nothing to do with "truer color" and everything to do with de-interlacing and image quality.

    I assumed you meant, you know, the ability of a video card to match the on-screen display to printed output, etc., not DVD playback benchmarking.
     
  7. G4scott macrumors 68020


    Joined:
    Jan 9, 2002
    Location:
    Austin, TX
    #7
    If you calibrate your monitor correctly, it doesn't really matter.

    One video card may be doing some brightness/contrast adjustment (as seen in some of the images on the pages you linked to), but that's not something a video card should do on its own to try to "improve" the image (unless you tell it to). If a DVD specifies an image to be displayed, the video card sure as hell better output what the DVD is showing, with the only changes being your monitor adjustments (either on the monitor itself or in display preferences). If the card tries to do its own thing with contrast and brightness, or tries to make images look more saturated, you could be losing detail in the highlights and shadows.

    "True color" is by no means a limitation of a video card. It's a limitation on monitors, which is why you have to calibrate your monitor, not your video card, to get correct color.
     
  8. shipdestroyer macrumors 6502

    Joined:
    Jun 5, 2007
    Location:
    New Hampshire
    #8
    I'm pretty sure they're just comparing some kind of proprietary hardware/software DVD decoder, so it's not as simple as "what the DVD is showing."
     
  9. dex22 macrumors regular

    Joined:
    Jun 17, 2003
    Location:
    Round Rock, TX
    #9
    If you're using the DVI output, there should be no difference. The signal is digitally derived and digitally transmitted, so for literal color values there should be no discernible differences, all other things being equal.

    Monitors, however, are VERY variable.

    That said, if you're using hardware decoding of DVD data (do Macs even use the video card's decoder?), there are default smoothing and filtering settings applied to the DVD image that can make a qualitative difference to edges and to the appearance of artifacts.

    At the end of the day, you're using an Apple-supplied driver for both brands of card, so they should, in theory, have similar settings for DVD decoding.
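
    A rough illustration (Python; the frame and the filter are made-up stand-ins, not a real DVD decoder) of the distinction above: over a digital link the decoded frame is just bytes, so two cards that pass it through untouched are indistinguishable, and differences only appear once a driver's default post-processing touches the pixels:

        import numpy as np

        rng = np.random.default_rng(0)
        decoded_frame = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)  # stand-in for one decoded DVD frame

        def pass_through(frame):
            """What an unmodified digital path does to the pixels: nothing."""
            return frame.copy()

        def smooth(frame):
            """A crude 3x3 box blur, standing in for a driver's default 'noise reduction' filter."""
            padded = np.pad(frame.astype(np.float32), 1, mode="edge")
            out = np.zeros(frame.shape, dtype=np.float32)
            for dy in (0, 1, 2):
                for dx in (0, 1, 2):
                    out += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
            return (out / 9).astype(np.uint8)

        print(np.array_equal(pass_through(decoded_frame), decoded_frame))  # True: "truer color" never comes into it
        print(np.array_equal(smooth(decoded_frame), decoded_frame))        # False: post-processing changes the pixels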
     
  10. Mac Heretic macrumors member

    Joined:
    Feb 22, 2006
    #10
    Logically, if an analog monitor is calibrated to its optimum, there could be a difference between two video cards, with one able to give a purer spectrum of colors by generating and separating the RGB signal values from each other more cleanly.

    Digital signal quality is more of an on/off affair: the card is as good as it needs to be if it can feed the signal through without interference, as fast and as smoothly as required. Right?

    The difference in still-image colors, however, seems relatively marginal with a modern video card; the main quality issues these days are in the display panels and cable connections. Once upon a time, when the choice was Matrox or "the other cards", the output quality of the card itself was more of an issue, right? But didn't Matrox have not only an incredibly crisp output but better colors too, by the way?
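
    A toy comparison (Python, with entirely made-up noise figures) of the analog-versus-digital point above: an analog VGA path re-creates the signal through the card's DAC and picks up noise along the cable, while a digital DVI path either delivers the exact framebuffer values or fails outright:

        import numpy as np

        rng = np.random.default_rng(1)
        source = np.linspace(0, 255, 64)                # an ideal grey ramp sitting in the framebuffer

        digital_path = source.copy()                    # DVI: the values arrive untouched
        analog_path = source + rng.normal(0, 1.5, 64)   # VGA: DAC + cable noise, assumed ~1.5 levels RMS

        print("max digital error:", np.abs(digital_path - source).max())           # 0.0
        print("max analog error: ", round(np.abs(analog_path - source).max(), 2))  # small but nonzero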
     
