Using the interlaced stream as a carrier makes it compatible with existing, interlace-based infrastructures, but it does not degrade the quality.
It can degrade; it depends on the source. There's no single answer.
If the source is already softened to avoid interlace twitter, effective resolution does not degrade.
If the source and master are done only for progressive displays, a higher effective resolution than 1080i allows is possible.
This is of course all very OT and pretty hair-splitting,
but there can be huge differences in effective resolution in material which is all called "1080p".
Starting from the sensor: most "1080" sensors have been made for 1080i. They have heavy optical anti-aliasing (low-pass filtering), and pixels (or rows) are combined for readout. (The first field is made out of rows 1,2,3 & 5,6,7 and the second field from rows 2,3,4 & 7,8,9, etc.) This is why many Sony cameras lose sensitivity when readout is changed to progressive; lines are not read twice.
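A minimal numpy sketch of that kind of row-combined readout (the exact groupings vary by sensor; the offsets here are just for illustration):

```python
import numpy as np

def binned_interlaced_readout(sensor):
    """Toy model of row-combined interlaced readout.

    Each field line averages three adjacent sensor rows, and the
    second field's groups are offset from the first's. Averaging
    rows acts as a built-in vertical low-pass, and because every
    field line collects light from three rows, a progressive
    readout (one row per line) gives up that sensitivity gain.
    """
    n = sensor.shape[0]
    field1 = np.array([sensor[r:r+3].mean(axis=0)   # groups like 1,2,3 then 5,6,7 ...
                       for r in range(0, n - 2, 4)])
    field2 = np.array([sensor[r:r+3].mean(axis=0)   # offset groups two rows down
                       for r in range(2, n - 2, 4)])
    return field1, field2
```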
Then come the recording-format limits: HDCAM & DVCPRO HD have only 1440 horizontal pixels and far less chroma resolution. The pixels are of course resampled to 1920 on playback.
And of course the physical sampling limits: Nyquist & Kell.
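A hedged back-of-envelope of where those limits leave you (the 0.7 interlaced and 0.9 progressive Kell factors are common rules of thumb, not measured values):

```python
def effective_resolution(h_px, v_px, kell):
    # Nyquist already caps resolvable detail at half the sampling
    # frequency; the Kell factor estimates how much of the pixel
    # grid survives as usable picture detail.
    return round(h_px * kell), round(v_px * kell)

print(effective_resolution(1440, 1080, 0.7))  # HDCAM-style 1440x1080i -> (1008, 756)
print(effective_resolution(1920, 1080, 0.9))  # oversampled 1080p      -> (1728, 972)
```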
Also, most screens are "fullHD" but have overscan, so the picture is resampled once again.
You could have a real effective (optical) resolution of something like 1000x700 with "1080p" material and have done nothing wrong.
At the other end, you could have a 2K or 4K scan, or an oversampled sensor, and an oversampled (for fullHD) recording. With the right workflow and screen you could get real optical fullHD resolution (or maybe just have to allow for a 0.9 Kell factor).
That's almost triple the resolution!
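To put rough numbers on that, using the figures above: 1,000 x 700 = 0.70 Mpx for the soft chain, versus (1920 x 0.9) x (1080 x 0.9) = 1728 x 972 ≈ 1.68 Mpx for the oversampled one, a factor of about 2.4 in total pixel count.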
So if a movie vault has an interest in giving out only the highest possible quality, also has a DCI release in mind, and does not want to do the whole restoration process again when the next super-duper distribution format is released (a small upgrade twice a decade, a big overhaul once a decade), they really have to think about what to do.
At the risk of going far off on a tangent, let some of your fellow forum members help you. I'll make the first attempt, though it would help more if I knew exactly what you primarily use your computer for.
You mentioned looking at laptops. The 15" and 17" both offer matte screens as an option. In November, Intel is set to begin supporting PCI Express natively, meaning it should be simple to add USB 3.0 to the next 17" MBPs (likely set to be released sometime in the next 50 days or so), unless Apple upgrades the USB ports to 3.0 itself, in which case the point becomes moot.
Thanks for trying, but it seems there's no easing my pain.
MBPs used to be near perfect. It doesn't look like that anymore.
Maybe the new battery and unibody construction just take too much space. Maybe a Liquidmetal chassis will make things better. Who knows?
Now that 15" has hi-rez option in matte it would be quite perfect size. Handy to use on the road, but still enough real estate in desktop to work as a secondary monitor with bigger display at home.
But since it now lacks BOTH express card AND all improvements computers have had in past 5 years (eSata, usb3, bd), they are too far away from perfect.
I also wouldn't bet that chipset support means anything for the MBP's connectors. For example, chipsets have had native support for eSATA for years.
It would also have been easy to put a USB 3 chip in the last-gen MBPs already, but no. Funny that even Intel's new reference boards now carry an additional USB 3 chip...
My use for the laptop would include everything the average Joe does, plus FCS & CS5. I'm just a poor media worker in a small economic area with too many people like me. Finland screwed up the number of people it educated for the media field over a decade ago (the government thought digitalization would make money out of nowhere), and now we have such a huge surplus that people will work in this field for free. So I have to think very carefully about what I purchase. A 17" with AppleCare costs about the same as 5 minis, so...
Until then, your choices are either to get an external Blu-Ray player for your laptop and rip the film, purchase Blu-Ray films that come with a digital copy (as done by 20th Century Fox), or forgo Blu-Ray and rent/buy 720p digital copies until Blu-Ray becomes so popular that Apple decides the public demands Blu-Ray. As I've stated previously, Blu-Ray accounts for less than 11% of all movies purchased. Should that share rise, I'm sure Apple will begin offering them as an option.
I don't believe anymore that BD's share has anything to do with Apple's decisions. DVD's share was way lower when it was brought into the Mac ecosystem.
Apple could consider BD when Mac sales start to lose share in the overall market, but then again they make bigger profits from iGadgets, so they could just leave the Macs behind.
I have to travel a bit for my work, so I would take an external BD drive with me only when I knew I'd need it for a job. An internal one would be needed for casual leisure time.
I really have to wonder how much difference exists between a 720p and 1080p film on a 17" screen or less unless you are outputting to a large TV. If you are outputting to a TV on a regular basis, you are likely better off picking up a dedicated Blu-Ray player, some of which are below $100.
Visual perception is about the angle of view, not the physical size of the screen.
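A quick sanity check on that (the screen widths and viewing distances below are just typical guesses, not anyone's measurements):

```python
import math

def degrees_subtended(width_in, distance_in):
    """Horizontal angle a screen of the given width covers at a distance."""
    return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

print(degrees_subtended(14.4, 20))    # 17" laptop (~14.4" wide) at arm's length -> ~39.6 deg
print(degrees_subtended(43.6, 120))   # 50" TV (~43.6" wide) at 10 ft            -> ~20.6 deg
```

At arm's length the laptop actually fills almost twice the angle of the TV across the room, so the 720p vs. 1080p difference can matter more on the small screen, not less.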
As for expandability and being future-proof, I believe you are far better off with a Mac Pro.
[...]
Truth be told, however, I would wait for Lightpeak before dropping any major coin. It should begin appearing sometime in 2012 or possibly even 2011 with some luck.
I have the oldest and slowest MP and I'm very happy with it. Next I'm going to triple its CPU power by upgrading the CPUs; they are pretty cheap now. Then maybe more RAM. With 64-bit apps, 8GB isn't always enough, at least when I have Mail and a couple of browsers open alongside FCS & CS5. I could also replace my (flashed PC version of the) X1950 with something more efficient when apps start to use more GPU power.
I wouldn't even think about the new MP models. Way too overpriced, and the worst memory architecture on the market.