The kings of theater (projection) video quality in the U.S. are Dolby Cinema theaters.
That's what Dolby would like you to believe. The quality of projection in theaters is actually pretty bad, and the same can be said for the audio. I agree with what you say. The installations in theaters are nowhere near 5000:1.

The point is, you can do a lot better at home with a proper setup. Certainly not with a $1500 projector though. What you describe is what you get with a $1500 projector at home, a dim and flat-looking image, but in the theater case with a more expensive projector on a much, much bigger screen. That's why the whole package is important and not just a single component. I've already mentioned a few things about black levels and so on. To a certain degree this is a non-issue with intrascene contrast; it's really only visible when you look at a black screen. If you want the highest on/off CR with the lowest black level and still a good, though not DLP-like, ANSI CR, then your best bet is Sony. But again, it depends on your screen size. You can always double stack, which some people do. Sony claims a quad stack is an option, but I've never seen it in the wild, only in Sony demos. Pick something with the highest ANSI CR but still good enough black levels and on/off CR. The whole 300k:1 stuff is marketing nonsense.
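To put the two contrast terms into numbers, here's a rough Python sketch of how on/off and ANSI CR fall out of luminance measurements; the readings below are made-up examples, not any specific projector:

```python
# Rough sketch: how on/off and ANSI contrast ratios are derived from
# luminance measurements. All readings below are made-up example values.

def on_off_cr(full_white_nits, full_black_nits):
    """On/off CR: a full white field divided by a full black field."""
    return full_white_nits / full_black_nits

def ansi_cr(white_patches_nits, black_patches_nits):
    """ANSI CR: average of the white patches of a 4x4 checkerboard divided by
    the average of the black patches -- i.e. intrascene contrast."""
    avg_white = sum(white_patches_nits) / len(white_patches_nits)
    avg_black = sum(black_patches_nits) / len(black_patches_nits)
    return avg_white / avg_black

# A hypothetical projector: very low full-field black, but internal light
# scatter lifts the black patches as soon as bright content shares the screen.
print(on_off_cr(100.0, 0.01))           # 10000:1 on/off
print(ansi_cr([95.0] * 8, [0.30] * 8))  # ~317:1 ANSI
```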


If you want the best possible picture, you purchase a Lumagen video processor, which actually produces an HDR image in an SDR container.
In addition to that, you can do the same with an HTPC as a source. And there's the upcoming madVR box. Don't get me wrong, Jim knows what he's doing with his processors. I've been beta testing their stuff for ages, before it was actually released to the public (I think I still have one of their HDPs in the basement, which I ran with a Barco 9" CRT projector). But it's time for some competition.


Obviously you need a good calibrator to make all this happen. I did this with my Sony 4k projector/130" screen and at 12' the picture is stunning, absolutely stunning. The video processor (Radiance Pro) will run you about $10k and $1k for a decent calibration.
Calibration is something most people forget. They just buy, set it up and are done, when a proper calibration can make a night-and-day difference. Also, unless prices have changed, the $10k is for the biggest 4446; the others are cheaper.
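As a tiny illustration of the kind of thing a calibration verifies, here's a sketch that derives the effective gamma from grayscale measurements; the readings are invented example values, not from any real device:

```python
import math

# Tiny sketch of one thing a calibration checks: the effective gamma of the
# display, derived from grayscale luminance measurements (example values).

def effective_gamma(stimulus, measured_nits, white_nits):
    """Solve measured/white = stimulus ** gamma for gamma."""
    return math.log(measured_nits / white_nits) / math.log(stimulus)

white = 100.0  # measured 100% white, in nits (example value)
readings = {0.10: 0.55, 0.25: 4.1, 0.50: 21.0, 0.75: 52.0}  # stimulus -> nits

for stim, nits in readings.items():
    print(f"{int(stim * 100)}% stimulus -> gamma {effective_gamma(stim, nits, white):.2f}")
```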

It's also well worth considering the use of an anamorphic lens, which means you won't have a brightness drop when watching scope movies. Another option, when projector + processor + lens is being considered, would be something like a cinemascope Barco projector, which comes with a scope-format DMD and does everything a processor does in its firmware. So no Lumagen required in that case. The other solution would be more modular though, so the processor + lens can be kept and the projector swapped out for a new one. This is all a moot point when $1400 is the budget. A TV is the much better option in this price range; it just comes with a really small display in comparison.
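For anyone wondering where the anamorphic-lens brightness argument comes from, here's the rough math, assuming a standard 16:9 chip and 2.39:1 scope content:

```python
# Rough numbers behind the anamorphic-lens argument, assuming a 16:9 (1.78:1)
# imaging chip and 2.39:1 "scope" content.

panel_ar = 16 / 9   # aspect ratio of the chip
scope_ar = 2.39     # cinemascope content

# Letterboxed on the chip, scope content only uses a fraction of the panel
# height, so only that fraction of the light ends up in the picture.
used_fraction = panel_ar / scope_ar
print(f"Panel used without a lens: {used_fraction:.0%}")                # ~74%

# A vertical-stretch anamorphic lens lets the image fill the whole panel and
# optically restores the 2.39:1 geometry, so (minus lens losses) essentially
# all of the light lands in the picture.
print(f"Brightness gained with the lens: {1 / used_fraction - 1:.0%}")  # ~34% more
```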
 
If you want the best possible picture, you purchase a Lumagen video processor, which actually produces an HDR image in an SDR container. This blows away anything that you can buy out of the box.
Jim knows what he's doing with his processors. I've been beta testing their stuff for ages, before it was actually released to the public (I think I still have one of their HDPs in the basement, which I ran with a Barco 9" CRT projector). But it's time for some competition.
So you are both telling me that squeezing a 4000-nit PQ image into 100-nit gamma 2.2 is what produces wonders?
I really do not follow.
I thought the main advantage of an HDR display is its extended color and brightness reproduction capability (increased color volume). So how can this be matched in the old Rec.709 volume?
I admit, I have had no experience with processors like this. What have I been missing?
 
So you are both telling me that squeezing a 4000-nit PQ image into 100-nit gamma 2.2 is what produces wonders?
Who said that? For one, you don't have 4000-nit sources at this point. And you'll have a hard time finding displays in the consumer range that can actually do it. And you're also not squeezing anything into 100 nits. You're squeezing the content into precisely the range that your display is capable of, and that's the advantage. You calibrate the whole system so you get the best possible results, no matter if your display can do 50, 100, 500, 1000, 4000 or 10000 nits. Also, brightness changes over time, so you need to recalibrate; it also depends on the production run, so there's variance from unit to unit leaving the factory. In addition to that, a projector's brightness depends on the screen size, the screen fabric, the lens and the zoom ratio of the lens. What TV manufacturers do is ignore the variance in brightness, just use a single setting for a specific model and they're done. This means some units will be close to what they should be, others will be off. And all of them will be off after a few hundred hours when brightness drops.
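To make the "squeeze into exactly what your display can do" idea concrete, here's a minimal sketch of a static roll-off; real processors use much smarter, often scene-adaptive curves, and the 75% knee is just an assumption for illustration:

```python
# Minimal sketch of "squeeze the content into what the display can do".
# Real processors use much smarter, often scene-adaptive curves; the 75% knee
# is just an assumption for illustration.

def map_to_display(content_nits, display_peak_nits, knee=0.75):
    """Track the content 1:1 up to the knee, then roll off toward the peak."""
    start = knee * display_peak_nits
    if content_nits <= start:
        return content_nits  # within the display's range: pass through untouched
    excess = content_nits - start
    headroom = display_peak_nits - start
    return start + headroom * excess / (excess + headroom)  # never exceeds the peak

# The same 1500-nit highlight on differently capable displays:
for peak in (100, 500, 1000, 4000):
    print(f"{peak:>5}-nit display shows it at ~{map_to_display(1500, peak):.0f} nits")
# The dimmer displays keep some highlight detail near their peak instead of
# clipping everything above their limit to flat white.
```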

Projector manufacturers are even worse. They're all doing it in a similar way. Here's the Sony case, according to their engineers: they project an image the same size as the panel, which will be insanely bright (this can't be done with a production unit). From there they assume a (linear) relationship between image size and the brightness measured at panel size, and come up with a specific brightness mapping for HDR content. It just doesn't work this way, because changing the throw ratio alone will have a major impact on brightness.
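Here's a rough sketch of why image size dominates: average full-white luminance is roughly lumens × gain / (π × screen area). The 1500-lumen figure is a made-up example, not a specific projector:

```python
import math

# Sketch of why image size dominates delivered brightness: average full-white
# luminance is roughly lumens * screen gain / (pi * screen area in m^2).
# The 1500-lumen figure is a made-up example, not a specific projector.

def screen_luminance_nits(lumens, gain, diag_in, aspect_w=16, aspect_h=9):
    """Average full-white luminance for a given screen diagonal in inches."""
    diag_factor = math.hypot(aspect_w, aspect_h)
    width_m = diag_in * aspect_w / diag_factor * 0.0254
    height_m = diag_in * aspect_h / diag_factor * 0.0254
    return lumens * gain / (math.pi * width_m * height_m)

for diag in (100, 130, 180):
    print(f'{diag}" screen: ~{screen_luminance_nits(1500, 1.0, diag):.0f} nits')
# Roughly 173, 102 and 53 nits -- same projector, very different HDR headroom.
```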

Start reading here and here. There's plenty more material, but it's a good starting point, since there's no reason to repeat everything here again (besides, it would probably blow up this thread).


I thought the main advantage of an HDR display is its extended color and brightness reproduction capability (increased color volume). So how can this be matched in the old Rec.709 volume?
You're not matching to Rec709. There's no reason you can't do DCI-P3 or BT2020 without HDR; it just happens that HDR content comes in those gamuts. As I've pointed out before, most displays don't even cover DCI-P3 100%, including the beloved OLEDs, which also fall short of 1000 nits.
Let me nitpick... there's no such thing as an HDR display. You have displays with a specific brightness; what HDR adds is software/firmware to tone map one value to another and provide backward compatibility. You can easily drop the whole HDR thing, master at 1000 nits and be done if your display is capable of 1000 nits of brightness. But then any display that is brighter or less bright would be screwed. What the Lumagen solution does is a mapping that is specifically tailored to your source and display. There's no assumption about screen size, throw ratio, age of the display (again, brightness drops over time) and so on. It's all measured with external equipment, so it covers your use case 100%. And that's why macpro2000 mentioned having your system professionally calibrated, even if it costs $1k. You will have to recalibrate from time to time, again because brightness drops. Of course you can do all this on your own if you have the knowledge and proper equipment to do it. The enthusiasts will use an installer to do everything, and (unfortunately) the majority will keep buying stickers on devices that say HDR, Atmos, Full HD and so on, believing they're getting something when in fact they're being screwed over by the industry. Very similar to manufacturers marketing XPR DMDs as 4k when they're not. That's why it is so important to do some research and not blindly believe everything. Unless of course you don't care, then just shop away and grab anything new.
 
Who said that?
...a Lumagen video processor, which actually produces an HDR image in an SDR container.
I may have misread macpro2000's sentence, but my initial impression was exactly that: produces an HDR image in an SDR container. That does not compute in my head.
For one, you don't have 4000-nit sources at this point. And you'll have a hard time finding displays in the consumer range that can actually do it. And you're also not squeezing anything into 100 nits. You're squeezing the content into precisely the range that your display is capable of, and that's the advantage.
I understand that, and that is precisely where video HDR and photo HDR go their separate ways. In photo, we squeeze the input dynamic range into the limited output range (the exposure bracketing technique). In video, on the other hand, we try to expand the output range. My display is capable of 1800 nits, for example. We will have to wait for Dolby Pulsar-class 4000-nit displays to become mainstream, but one day they will.
I think there are, especially lately, more UHD Blu-rays mastered to 4000 nits. Someone has compiled a table here.
You calibrate the whole system so you get the best possible results, no matter if your display can do 50, 100, 500, 1000, 4000 or 10000 nits.
For me, HDR in video reproduction can only be talked about if the display is capable of surpassing the Rec.709 100 nit mark. Otherwise we are really talking about the photo HDR-style tone compression.
In addition to that, a projector's brightness depends on the screen size, the screen fabric, the lens and the zoom ratio of the lens. What TV manufacturers do is ignore the variance in brightness, just use a single setting for a specific model and they're done. This means some units will be close to what they should be, others will be off. And all of them will be off after a few hundred hours when brightness drops.
I understand that brightness decreases proportionally to the square of the distance (the inverse-square law). I can also live with the fact that the brightness of any light source drops with age. But we need to go over 100 nits to be able to talk about HDR in the video sense.
You're not matching to Rec709. There's no reason you can't do DCI-P3 or BT2020 without HDR; it just happens that HDR content comes in those gamuts. As I've pointed out before, most displays don't even cover DCI-P3 100%, including the beloved OLEDs, which also fall short of 1000 nits.
Color gamut and brightness are two independent characteristics, but both have been combined into the new video reproduction standard called HDR. Just marketing.
Let me nitpick... there's no such thing as an HDR display. You have displays with a specific brightness; what HDR adds is software/firmware to tone map one value to another and provide backward compatibility. You can easily drop the whole HDR thing, master at 1000 nits and be done if your display is capable of 1000 nits of brightness. But then any display that is brighter or less bright would be screwed.
I understand it quite differently. For one, an HDR display is any display that is able to reproduce a wider dynamic range than Rec.709. Second, the important ingredient is the PQ EOTF, which is an absolute scale, a first in history. All previous standards, and HLG, are relative. I am talking about pixel code value and expected display brightness. In PQ, the scale is absolute: a pixel value of 769 needs to be displayed at 1000 nits. That implies that tone mapping is only needed if the display is incapable of reaching the output brightness asked for. If the display is capable of higher brightness, it will stay linear at that pixel value.
So, as I understand it, there are three vital ingredients that make an HDR display:
  1. understanding the ST.2084 PQ EOTF (incidentally, that is also the single input characteristic that forces my display into HDR mode; no other metadata matters, but it can be used as an additional aid).
  2. able to reproduce at least 1000 nits (OK, the requirement is lower for OLED displays)
  3. support for at least the DCI-P3 color gamut. We know BT.2020 is not achievable with current technologies.
I do not claim to be a high-level expert in this matter, but that is how I understand it. And that is how I have tried to do my own HDR10 exports.
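For reference, the ST.2084 (PQ) math behind that absolute-scale point can be sketched in a few lines; the constants come from the standard, and the code-value conversion below assumes full-range 10-bit:

```python
# Sketch of the ST.2084 (PQ) math behind the "absolute scale" point. Constants
# are from the standard; the code-value conversion assumes full-range 10-bit.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal):
    """Normalized PQ signal (0..1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Absolute luminance in nits -> normalized PQ signal (0..1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Full-range 10-bit code 769 sits at roughly 1000 nits, no matter the display:
print(pq_eotf(769 / 1023))                   # ~999 nits
print(round(pq_inverse_eotf(1000) * 1023))   # 769
```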
 
Used with something like the SpectraCal C6 HDR2000 one could get a calibration setup for <$1K if you have a Windows PC.
If you replace the C6 with an X-Rite i1Display Pro, then you will get it for much less than $1k. As far as I understand, one pattern-generator license is also included in the price of Calman Home.
So that is what I am looking forward to.
 
Sorry for the late reply, I'm a little short on time right now, so I might skip a few things.

I may have misread the macpro2000's sentence, but my initial impression was exactly that: produces an HDR image in an SDR container. That does not compute in my head.
I think I can see where the confusion is. SDR container doesn't mean Rec709 and 100 nits. It just means the non-HDR setting. When you check the settings in most menus you will see that you can either set it to HDR or SDR, where "HDR on/off" would probably be a better label. So what you want to do is set it to SDR and BT2020 color. The brightness will be limited by what your display is capable of, be it 50, 100, 500 or 5000 nits, and the colorspace will be whatever you set it to, mostly BT2020, but of course you could do Rec709 as well. We've had displays capable of 1000 nits and DCI-P3 for a long time, those have always been "normal" and that's what SDR means in this context.

Basically you turn HDR off in the display and let the processor do the HDR processing. It also fools your source into sending HDR to the processor even though your display is set to "SDR". The processor is aware of your display's performance and tailors everything 100% to your setup. The output of the processor will then be "non-HDR", "normal" or "SDR" at the maximum brightness your display is capable of and that actually works for your setup. You don't really want 4000 nits in a pitch black room, it would burn your eyes out. You might want 4000 nits though when watching in your garden on a sunny day.
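A much-simplified sketch of that pipeline, just to illustrate the idea: decode the PQ signal to absolute nits, limit it to what the display can do (a hard clip here for brevity, where a real processor uses far smarter roll-offs), then re-encode with a plain power gamma relative to the display's peak. The 600-nit peak and gamma 2.2 are assumptions for the example, not any particular device:

```python
# Much-simplified sketch of "HDR in an SDR container": decode the PQ signal to
# absolute nits, limit it to what the display can do (a hard clip here for
# brevity; a real processor uses far smarter roll-offs), then re-encode with a
# plain power gamma relative to the display's peak. The 600-nit peak and
# gamma 2.2 are assumptions for the example, not any particular device.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    """ST.2084 PQ EOTF: normalized signal (0..1) -> absolute nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def to_sdr_container(pq_signal, display_peak, gamma=2.2):
    """PQ code (0..1) -> gamma-encoded 'SDR' value (0..1) relative to the peak."""
    nits = min(pq_to_nits(pq_signal), display_peak)
    return (nits / display_peak) ** (1 / gamma)

# A ~1000-nit highlight (full-range 10-bit code 769) on a 600-nit setup:
print(to_sdr_container(769 / 1023, display_peak=600))  # 1.0 -> full drive
# ~100-nit content (code ~520) stays comfortably below full drive:
print(to_sdr_container(520 / 1023, display_peak=600))  # ~0.44
```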


I think there are, especially lately, more UHD Blu-rays mastered to 4000 nits. Someone has compiled a table here.
Not sure about every release, but I think those are still mastered to 1000 nits and then "blown up" to 4000. There are even some 10000-nit titles in there.


For me, HDR in video reproduction can only be talked about if the display is capable of surpassing the Rec.709 100 nit mark.
Not sure what you mean. Is 101 nits and 101% Rec709 HDR for you then? Or is surpassing Rec709 and 100 nits actually 1000 nits and DCI-P3?


I understand it quite differently. For one, an HDR display is any display that is able to reproduce a wider dynamic range than Rec.709.
Then we've had HDR long before the term HDR was a thing. I think the problem is really differentiating between the actual technology and the marketing.

That implies that tone mapping is only needed if the display is incapable of reaching the output brightness asked for. If the display is capable of higher brightness, it will stay linear at that pixel value.
That should be the case, yes. Yet some displays fail to do that. Can't tell you what's causing this, though.

able to reproduce at least 1000 nits (OK, the requirement is lower for OLED displays)
Wait, why is the requirement lower for OLED? OLED doesn't change the requirements.


Interestingly enough, Calman is introducing Calman Home in April for $145. Used with something like the SpectraCal C6 HDR2000 one could get a calibration setup for <$1K if you have a Windows PC. But it looks as if it is just for direct displays...
Not sure about the latest Calman Home, but there are alternatives anyway. Be careful about your meter though; I think (better check the spec sheets on this) the C6 doesn't have the spectral bandwidth for laser projectors. You really want a narrow bandwidth here, ~2 nm, so something like a CR300/250 would be better. Of course you can still calibrate with it, but it won't be accurate. Off the top of my head, the i1Pro is at around 10 nm.

And of course you still have to know your magic, so a calibrator might be a better choice for those who don't do this on a regular basis.
 
[attached photo]
Yes, even if you had an HDR-capable projector, a better picture is obtainable by letting the video processor do the HDR in an SDR container. It's beautiful. Here is a photo I took the other day of Cars 3 UHD on my 130" screen.
 
Not sure about every release, but I think those are still mastered to 1000 nits and then "blown up" to 4000. There are even some 10000-nit titles in there.
I don't know what you mean by "blowing up" HDR mastering. Mastering to 1000 nits means that no pixel value exceeds 769 in the file. Mastering to 4000 nits means, correspondingly, that no pixel value exceeds 856. Indeed, to keep the image from "blowing out" at these values, your mastering display must be physically able to produce these luminance values.
Not sure what you mean. Is 101 nits and 101% Rec709 HDR for you then? Or is surpassing Rec709 and 100 nits actually 1000 nits and DCI-P3?

Then we've had HDR long before the term HDR was a thing. I think the problem is really differentiating between the actual technology and the marketing.
Actually, these requirements have been set by the UHD Alliance to distinguish UHD Premium products:
UHD Alliance said:
In order to receive the UHD Alliance Premium Logo, the device must meet or exceed the following specifications:
  • Image Resolution: 3840x2160
  • Color Bit Depth: 10-bit signal
  • Color Palette (Wide Color Gamut)
  • Signal Input: BT.2020 color representation
  • Display Reproduction: More than 90% of P3 colours
  • High Dynamic Range
  • SMPTE ST2084 EOTF
A combination of peak brightness and black level either:
  • More than 1000 nits peak brightness and less than 0.05 nits black level
OR
  • More than 540 nits peak brightness and less than 0.0005 nits black level
So we can also agree that to be "HDR" your display device should meet those requirements.
Source of this information.
The technical distinguishing factor between SDR and HDR media is the ST2084 being used to encode luminance levels (the gamma curve in old speak).
Wait, why is the requirement lower for OLED? OLED doesn't change the requirements.
Today's OLEDs are not capable of outputting 1000 nits, hence the dual requirement set above.
Not sure about the latest Calman Home, but there are alternatives anyway. Be careful about your meter though; I think (better check the spec sheets on this) the C6 doesn't have the spectral bandwidth for laser projectors. You really want a narrow bandwidth here, ~2 nm, so something like a CR300/250 would be better. Of course you can still calibrate with it, but it won't be accurate. Off the top of my head, the i1Pro is at around 10 nm.
The beauty of the expected new Calman Home lies in the AutoCal app integrated into the TV, which I could already install on my set; now the only missing piece in this equation is the calibration app itself.
 
Thanks everyone for the help!

One last question: what are some of the best places to buy a projector screen to make sure it is not damaged during shipping?
 
Thanks everyone for the help!

One last question: what are some of the best places to buy a projector screen to make sure it is not damaged during shipping?
I have a 119" dalite HCCV screen I bought years ago. Stopped using it in favor of a DIY paint recipe I came across on the AVSforums called S-I-L-V-E-R. Cheap and looks much better. It's a pretty big difference. Now I have a 144" screen that was 10x cheaper including the paint sprayer I had to buy.

My projector is 1080p and since I got my LG OLED I've stopped using the projector altogether. I won't purchase another projector until there is a consumer level 4k with Dolby Vision and using LED bulbs. I'm still waiting.
 
I have a 119" dalite HCCV screen I bought years ago. Stopped using it in favor of a DIY paint recipe I came across on the AVSforums called S-I-L-V-E-R. Cheap and looks much better. It's a pretty big difference. Now I have a 144" screen that was 10x cheaper including the paint sprayer I had to buy.

My projector is 1080p and since I got my LG OLED I've stopped using the projector altogether. I won't purchase another projector until there is a consumer level 4k with Dolby Vision and using LED bulbs. I'm still waiting.


You're going to be waiting a looooong time. I've been using a 4k projector at home for over 4 years and couldn't be without one. OLEDs are nice but just too small. Plus, it's just not the same if you don't have the sound coming at you directly through an AT screen. Love it!
 
You're going to be waiting a looooong time. I've been using a 4k projector at home for over 4 years and couldn't be without one. OLEDs are nice but just too small. Plus, it's just not the same if you don't have the sound coming at you directly through an AT screen. Love it!
My projector is still hooked up; my OLED is just in front of the screen, and if I need to use the projector I can just slide it out of the way. The image quality, contrast and HDR (specifically Dolby Vision) of the OLED make me want to take image quality over size. And brightness of course. Watching the projector during the day, even with blackout blinds, isn't pleasant. And bulb replacement costs are quite expensive. That and the longevity of LED bulbs is enough for me to patiently wait. I do also have an older 720p projector sitting in a closet. The 1080p projector is still mounted. I have the 144" painted screen I mentioned and of course the dalite screen I have in storage. I need to get around to selling that screen and the 720p projector but it's just not worth much anymore. I donated my first projector not too long ago and I'm actually on my 3rd.

If I could get wife approval I'd just get a 75-85" TV and ditch the projector altogether. It's looking like the LED projectors I want may be here early next year. And laser projectors are coming down in cost.
 
[attached photo]


Amen on the blacker blacks with OLED but it’s tough to beat this image at 130”
 
Eventually microLED will take over. But there are some technical hurdles to overcome before they can mass-produce them and bring the price down. It will still be a while.
 
What is the difference between a tab tensioned projector screen and a non tab tensioned projector screen?
 
I mainly watch Netflix and Blu Rays.

So mostly for movies and tv shows.

I will be sitting 11 to 13 feet away.


What's your budget? Because the last I heard, true 4K projectors start at about $5 Gs, and go up.
If the source material is only 1080p, can HDR make it look better?

No, not as a general rule.
My budget is $1400 or less.


Um, no. Save and wait. :)
 
I keep going back and forth.

I don't watch a lot of new movies, just some, and most of them are never 4k. Though I do watch some original Netflix shows that are in 4k, like Stranger Things, but are they in true 4k?
 
I keep going back and forth.

I don't watch a lot of new movies, just some, and most of them are never 4k. Though I do watch some original Netflix shows that are in 4k, like Stranger Things, but are they in true 4k?
The best part of 4k isn't 4k. A good tv with a good image processor can make 1080 content look pretty damn close to 4k.

No, the important part is HDR and Dolby Vision. That's the real game changer. Don't get too hung up on the 4k part of it.
 
The best part of 4k isn't 4k. A good tv with a good image processor can make 1080 content look pretty damn close to 4k.

No, the important part is HDR and Dolby Vision. That's the real game changer. Don't get too hung up on the 4k part of it.

Thanks but I am not getting a tv but a projector.
 
Sorry but if your budget is $1400 or less, all you're gonna buy is garbage. Long and short of it. Save up.
 
I don't know what you mean by "blowing up" HDR mastering.
Simple transform, linear or non-linear.

Actually, these requirements have been set by the UHD Alliance to distinguish UHD Premium products:
Well, these are requirements to put a sticker on a box. If that's what you want, that's all you need. Same nonsense they did back in the day with HD and Full HD and THX Select, THX Ultra and all these stickers no one needs. The requirements for proper HDR are higher, it's just that barely any manufacturer can meet them now, especially not in their cheap consumer products. As soon as they can, you will get Premium+, Super Premium, Super Premium+ and so on stickers. I'd like to do things right in the first place, if it's possible. You won't find 100% BT2020 for example; you'd be stuck at around 80% in the best case (with modified DCI filters).


The beauty of the expected new Calman Home lies in the AutoCal app integrated into the TV, which I could already install on my set; now the only missing piece in this equation is the calibration app itself.
Not up to date on these apps/calibrations. I've seen early versions; none worked as expected. But there's always hope. We used these types of calibrations ages ago in the medical field for monitors. Even in the $30k price range for a 22" to 24" monitor, they couldn't get it working properly.

I recently watched "normal" HDR without a processor in the chain... couldn't believe how bad it looked in comparison. I can only highly recommend a Lumagen or any other source that does some additional magic for anyone serious about video.

One last question: what are some of the best places to buy a projector screen to make sure it is not damaged during shipping?
Think about what type of screen you want, your seating arrangement, etc. Any screen with >1.0 gain will have a brightness shift to the side, the cheaper ones a color shift as well. How much brightness you need is a factor as well. If you want to do things right, you need an acoustically transparent screen. There you have to make the choice between woven and perforated. Any serious manufacturer will ship it in a wooden crate.


That and the longevity of LED bulbs is enough for me to patiently wait.
It's not an LED bulb, it's an LED array. You can't change it. And LED projectors have been around for years. While brightness doesn't drop as fast as with regular bulbs, at some point you will only get half the brightness, at which point you'll have to buy a new projector. When that happens depends on the LED manufacturer. I'd say somewhere between 5000 and 10000 hours, depending on how much additional brightness you have to begin with.


If I could get wife approval I'd just get a 75-85" TV and ditch the projector altogether.
That's still tiny though. I'm using that size for "normal" TV viewing (65" is too small). A projector screen is usually much bigger. If 75" to 85" is all you need, then definitely get a TV.


Here's an interesting quote from home theater review about MicroLED:
Sorry, but the article is total nonsense. You can still see the separation of the individual panels, which makes it unusable for the home unless you sit very far away. For theatre applications this is less of an issue. But these are not acoustically transparent, which makes them unusable for theatre applications, since the speakers are behind the screen, which should be the case for home use as well. One of my dealers ordered one for tech demos ($500k), and if I have to decide between spending $500k on a projection setup or on a µLED wall, I know which way I'd go. In 10 or 20 years? Who knows.


What is the difference between a tab tensioned projector screen and a non tab tensioned projector screen?
Tab tension assists gravity. Regular screens are pulled downwards; they're not 100% planar though, and will change as well depending on temperature, climate, etc. A tab tensioned screen also pulls the screen to the sides, so it's 100% planar. Make sure the tensioning is strong enough though. You'll always want a tab tensioned screen if you go motorised.

I don't watch a lot of new movies, just some, and most of them are never 4k. Though I do watch some original Netflix shows that are in 4k, like Stranger Things, but are they in true 4k?
That depends on the show; in the case of Stranger Things, yes. Seasons 1 and 2 were shot in 5k, season 3 in 8k. They used RED cameras. It would not have been my personal choice, but the cams are cheap and content is very easy to distribute with their REDRAY players if that is a factor.
 