
Mr. Happy

macrumors newbie
Original poster
Jun 21, 2011
I am an editor/shooter currently researching the Panasonic AF100. I do my post in FCP on a Mac Pro.

The camera records to SDHC/XC media with 4:2:0 8-bit, and can output 4:2:2 8-bit via HD-SDI. The main criticism I've run into is that there is no 10-bit option.

My question is: does this really matter for FCP workflow on a Mac Pro?

I know:

- The technical differences between 10-bit and 8-bit color depth
- Why 10-bit is beneficial for compositing/color grading
- No one outputs/delivers in 10-bit, so it's really only for quality preservation during post

I realized that to even see the difference I would need a 10-bit monitor (really expensive) and a video card capable of passing 10-bit.

Is working with 10-bit media on an 8-bit monitor totally pointless? Or would the extra data (though unseen by my eyes) still be of benefit when chroma-keying and color grading?
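The grading benefit is easy to simulate, for what it's worth. The sketch below (plain Python, not any particular NLE's pipeline, and a deliberately crude model of quantization) quantizes a smooth gradient to 8-bit and to 10-bit, applies the same aggressive contrast stretch, and counts how many distinct 8-bit output levels survive. Fewer surviving levels means more visible banding after the grade, even though the delivery format is 8-bit either way.

```python
def grade(signal, bits):
    """Quantize a 0..1 signal at the source bit depth, apply a 4x
    contrast stretch (a heavy grading move), and deliver as 8-bit."""
    levels = 2 ** bits - 1
    out = []
    for v in signal:
        q = round(v * levels) / levels                    # camera quantization
        stretched = min(max((q - 0.4) * 4.0, 0.0), 1.0)   # aggressive grade
        out.append(round(stretched * 255))                # 8-bit delivery
    return out

ramp = [i / 4095 for i in range(4096)]  # a smooth gradient, e.g. a sky
for bits in (8, 10):
    print(f"{bits}-bit source -> {len(set(grade(ramp, bits)))} distinct delivered levels")
```

The 10-bit source survives the stretch with several times as many distinct output levels: the extra precision matters during the grade, not on the final display.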

I do corporate jobs, as well as my own independent projects.

Thanks.
 
My question is: does this really matter for FCP workflow on a Mac Pro?...
You've gotten this far without it, and you seem to have a handle on it, so for you it's good. :)
When I was working at a local broadcast station it was important; however, I saw some workflows that didn't take advantage of it, so I kinda thought it was dumb for them to even implement it.
Where I am now, we just got a RED MX last winter, so we're slowly getting our gear up to snuff.
Ordered a Flanders 24" for color correction in Resolve, so I guess now, after all these years, I'm getting into 10-bit. :)
I survived the last 17 years without it.
 
- No one outputs/delivers in 10-bit, so it's really only for quality preservation during post

That's not strictly true, but it's effectively true unless you're working on national TV or feature films.

I realized that to even see the difference I would need a 10-bit monitor (really expensive) and a video card capable of passing 10-bit.

That's almost but not quite true. Gamma encoding throws a tiny spanner in the works.

But I'm just being pedantic. If what you shoot is 95% of the way to being the final image, don't worry about 10-bit. Just make sure you don't blow things on set.
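To unpack the gamma point: gamma encoding spends code values non-uniformly, piling most of them into the shadows, which is part of why the "you'd need a 10-bit monitor to see any of it" claim isn't airtight. A back-of-the-envelope count in plain Python, assuming a simple 2.2 power-law gamma (real video uses the Rec. 709 transfer curve, which is broadly similar):

```python
# How many 8-bit code values land in the darkest 10% of linear light?
gamma = 2.2

# Linear encoding: code values map straight to light.
linear_count = sum(1 for code in range(256) if code / 255 <= 0.1)

# Gamma encoding: light = (code/255) ** gamma, so shadows get more codes.
gamma_count = sum(1 for code in range(256) if (code / 255) ** gamma <= 0.1)

print(f"linear: {linear_count} codes, gamma-encoded: {gamma_count} codes")
```

With a linear encoding only about 26 of the 256 codes cover the darkest tenth of the range; gamma encoding assigns it several times that, which is why shadow precision behaves differently than the raw bit count suggests.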
 
I guess now, after all these years, I'm getting into 10-bit. :)
I survived the last 17 years without it.

That's exactly why I'm curious how important it is. Some folks are all about the latest and greatest, top of the line, and it often makes no difference in what they actually do on a daily basis.

I know there are real benefits to 10-bit, but I just don't think it's worth the substantial extra costs, unless you're doing actual cinema or national broadcast work.

I just never want to be that guy who spends extra cash for a 10,000,000:1 contrast ratio HDTV. :)
 