Video out - is it de-interlaced?

Discussion in 'Mac Pro' started by NRose8989, Sep 2, 2008.

  1. NRose8989 macrumors 6502a

    Feb 6, 2008
    I'm considering options for multiple monitors when using Final Cut. If I were to connect an HDTV via a DVI-to-HDMI cable and put my canvas on the HDTV (full screen, using my primary display for the browser, timeline, and viewer), would the video be de-interlaced? And what if my timeline settings are set to AVCHD 1080i60? This is where it gets tricky: my Canon camcorder shoots in 30p, but when the footage is written to the SSD, it's written as 30p wrapped in 60i. I'm basically trying to get a display where I can view my canvas exactly as it would look on an HDTV (either 1080i or 720p).
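    A quick sketch of what "30p wrapped in 60i" means (this is my own illustration in numpy, not anything specific to the Canon firmware): each progressive frame is split into two fields for the 60i stream, and because both fields come from the same instant in time, weaving them back together recovers the original frame exactly.

    ```python
    import numpy as np

    # Illustration of 30p carried in a 60i wrapper (PsF-style).
    # One progressive frame is split into two fields; weaving them
    # back together is a lossless round trip, since both fields were
    # captured at the same moment (unlike true interlaced footage).

    frame = np.arange(8 * 4).reshape(8, 4)   # one progressive frame (8 lines)

    top_field = frame[0::2]                  # even lines -> field 1
    bottom_field = frame[1::2]               # odd lines  -> field 2

    woven = np.empty_like(frame)             # "weave" de-interlace
    woven[0::2] = top_field
    woven[1::2] = bottom_field

    assert np.array_equal(woven, frame)      # recovered exactly
    ```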
  2. NRose8989 thread starter macrumors 6502a

    Feb 6, 2008
    Okay, so any help here? I found the Blackmagic Intensity, but I'm not sure what the advantage is of running a card like that over just using a DVI-to-HDMI cable out the back of the graphics card. I don't plan to do any HDMI capture, just monitoring. Thanks.
  3. Cromulent macrumors 603


    Oct 2, 2006
    The Land of Hope and Glory
    Your best bet would be a Matrox MXO and a 23" Apple Cinema Display. An HDTV would be pretty bad for monitoring your footage.
  4. Virtuoso macrumors regular

    Feb 21, 2008
    The fundamental problem is that, unlike old CRT TVs, all flat screens (monitors and HDTVs) display a de-interlaced signal, even with 1080i material. Since the de-interlacing is usually done by the set itself, how well it is done varies from model to model: some sets do a very good job, some look terrible, so something like a fast pan can look totally smooth on one brand of TV and jerky and ugly on another.

    One way of making sure you see your footage as your viewers will see it is to work only in progressive formats, but you will sacrifice some smoothness: your frame rate will then be 25/30 fps rather than the perceived 50/60 fps of an interlaced signal.
  5. sirnh macrumors regular

    Aug 16, 2006
    The panel technology in an LCD is progressive, not de-interlaced :) De-interlacing is something you do to interlaced material, such as old SD and 1080i, to better display it on a progressive screen.

    If your LCD supports component or HDMI input, it probably has some sort of de-interlacing support, but you probably won't like it. It will most likely be either a simple filter that blends the two fields together, or a "bob" mode that displays the individual fields on alternating frames.
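    The two cheap de-interlacers mentioned above can be sketched in a few lines of numpy (function names are mine, purely for illustration; real TVs implement these in hardware with more filtering):

    ```python
    import numpy as np

    def blend(frame: np.ndarray) -> np.ndarray:
        """Field blend: average adjacent lines, smearing the two
        fields into one soft frame (loses temporal resolution)."""
        out = frame.astype(float).copy()
        out[:-1] = (out[:-1] + out[1:]) / 2
        return out

    def bob(frame: np.ndarray) -> list[np.ndarray]:
        """Bob: show each field as its own frame, line-doubled back
        to full height (keeps motion, halves vertical resolution)."""
        top, bottom = frame[0::2], frame[1::2]
        return [np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)]

    frame = np.arange(8 * 4).reshape(8, 4)   # tiny interlaced frame
    blended = blend(frame)                   # one soft frame out
    bobbed = bob(frame)                      # two full-height frames out
    ```

    Blend trades motion smoothness for a single clean-looking frame; bob keeps the 50/60 fields-per-second motion but with half the vertical detail, which is why fast pans look so different from set to set.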

    A computer LCD will most likely not sync to a 1080i signal over DVI, so you will have to set the signal out of your card to 1080p or even 1200p.

    What you might want to try is connecting a consumer-grade HDTV, rather than a computer display, to your second video port. The VESA monitor information may then let you select a 1080i mode for your desktop. Under Windows, the ATI and NVIDIA control panels have overrides to force the card to drive HDTVs; I believe the Mac's monitor support is all standard VESA plug-and-play.
