Honest question... could be dumb, but whatever.

What is the point of the $43,000 monitor? I understand content creators want to make sure everything looks great... but if a $5,000 monitor cannot even produce quality high enough to compare, then why are they spending so much money on these? Who and where is someone consuming TV/movies that requires details only a $43,000 monitor can produce? Who are they producing this for?

They're not intended for consuming content, but for creating it. You don't watch a movie on one of these; it's used by the people who made the movie.
 
I sold consumer audio back in high school, and really loved the people who came in bragging that they could tell the difference between a 'tube amp' and a 'transistor amp'.

'Yeah, sure, so are you going to buy that McIntosh tube amp or not? And those Klipsch speakers? I thought not.'
 
Snake oil... where can I buy some? I've been watching his reviews for years and never felt like he was selling me anything.

Well, did you hire him to calibrate your TV? :)

I said I was kidding, but (like most of the A/V community) he is obsessing over details to the point that it's more placebo than perception, in my opinion. Everything he says is technically true, but I think he makes a living giving importance to certain very specific aspects of the viewing experience, which is fine. But then he makes hot-take statements like "the Apple monitor is not suitable for pros" or "you can't enjoy a director's vision without calibration", which is, well, a bit pompous.
 
Honest question... could be dumb, but whatever.

What is the point of the $43,000 monitor? I understand content creators want to make sure everything looks great... but if a $5,000 monitor cannot even produce quality high enough to compare, then why are they spending so much money on these? Who and where is someone consuming TV/movies that requires details only a $43,000 monitor can produce? Who are they producing this for?

For producers and directors making $200m tentpole movies and $15m per episode TV shows. There has to be a reference standard for viewing these and that’s why Sony, Panasonic, Flanders Scientific, et al. make these monitors.
 
But Marques Brownlee spuffed himself up the wall over the XDR, and he hasn’t had any backhanders from Apple. 🤔
 
They're not intended for consuming content, but for creating it. You don't watch a movie on one of these; it's used by the people who made the movie.
But who has devices that can display this high a quality? Basically, what I am asking is: if it's so high quality that a $5,000 monitor is vastly inferior, then what is the point of making it in such high quality?
 
Honest question... could be dumb, but whatever.

What is the point of the $43,000 monitor? I understand content creators want to make sure everything looks great... but if a $5,000 monitor cannot even produce quality high enough to compare, then why are they spending so much money on these? Who and where is someone consuming TV/movies that requires details only a $43,000 monitor can produce? Who are they producing this for?

Before Apple saved the IPS panel from going extinct (in the PC market, at least), people thought crappy 8-bit TN panels were good enough. Before Apple showed you the Retina display, people were saying 80 PPI was enough. To cut it short, most consumers have absolutely zero idea about quality; all they care about is pricing.

Those $40K monitors are not made with a budget of price X in mind. They were made to be the absolute best monitors, as accurate as technically possible at all times.

If I swap the subject to cars, what is the point of a $20 million F1 car? If your top supercar (Ferrari or Bugatti) costs $1 million and cannot compare, then why are they spending so much money on it?

The point of an F1 car is to be the fastest; the point of a reference monitor is to be the best, regardless (mostly) of cost.

Although I imagine someday MicroLED will take over and reference-grade monitors could be affordable even to consumers.
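For anyone wondering where figures like 80 PPI or "Retina" come from: pixel density is just the panel's diagonal pixel count divided by its diagonal size in inches. A quick back-of-the-envelope sketch in Python; the two example panels are illustrative picks, not specs quoted from anyone in this thread:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative examples (hypothetical panels):
print(f"17-inch 1024x768 panel: {ppi(1024, 768, 17):.0f} PPI")   # ~75 PPI, the 'good enough' era
print(f"27-inch 5K panel:       {ppi(5120, 2880, 27):.0f} PPI")  # ~218 PPI, Retina-class
```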
 
Here's a screenshot, buddy. Also watch the video from HDTVTest at 10 seconds in.

Stop being so smug; you're only asking to get dunked on.
Hey buddy, the ad homs only diminish your reply. This thread is an echo chamber. I am looking for a reference where Apple said their XDR monitor can replace the Sony for the most demanding colorist work. Comparing the attributes of two monitors, one of which costs $37K more than the other (unless there is only gold in the Sony), means there are some functions and qualities that the Sony has that the XDR doesn't.

It seems the market position of the XDR is that of a fairly good monitor, whereby one doesn't have to spend $43K if one isn't doing the most demanding work. Right tool for the right job, with the XDR providing a less expensive option for less demanding work.

There is nothing wrong with Apple comparing the two if some attributes overlap. That doesn’t mean the XDR isn’t capable for the target market Apple has set out to sell into. Whether it’s successful is another story.
 
What is the relevance of having X number of subscribers to someone’s qualifications, experience, education, and credibility?

I’ll answer that for you. Zero.

Anyone can start a YouTube channel and say whatever they want.

Yes, however, he has a pretty decent following (~30mil views) and is respected in the A/V community. He focuses solely on HDTV testing. A person working at a media company may also do a great job (rtings.com is good for TVs/Monitors) or they may have to review multiple different things and not have a specialty. Not sure why there is all the hate here? What do you expect the guy to have, a PhD in monitor reviewing? What qualifications do you want besides thousands of hours of doing what he has been doing? 🙄

On a side note and with my own degree of ignorance, I do wonder how important "reference" monitors are in the scheme of things though. When 95% of the general public has no idea what the brightness function on their monitor does, let alone the interest or desire to calibrate their display picture, what's the difference? It's not like Michael Bay's movies are any more watchable because he uses expensive equipment to film/edit them.
 
Before Apple saved the IPS panel from going extinct (in the PC market, at least), people thought crappy 8-bit TN panels were good enough. Before Apple showed you the Retina display, people were saying 80 PPI was enough. To cut it short, most consumers have absolutely zero idea about quality; all they care about is pricing.

Those $40K monitors are not made with a budget of price X in mind. They were made to be the absolute best monitors, as accurate as technically possible at all times.

If I swap the subject to cars, what is the point of a $20 million F1 car? If your top supercar (Ferrari or Bugatti) costs $1 million and cannot compare, then why are they spending so much money on it?

The point of an F1 car is to be the fastest; the point of a reference monitor is to be the best, regardless (mostly) of cost.

Although I imagine someday MicroLED will take over and reference-grade monitors could be affordable even to consumers.

I think this is partially missing the point. One could argue that F1 tech, and usually the "halo car" of a brand's lineup, shows off technologies that filter down to the consumer, but I don't think that is the majority of what is being spoken about here. They make it seem like real filmmakers could not use an Apple display to make a movie because it isn't accurate enough. Is that really the case? I think that extra 1-2% of accuracy would be offset by the majority of consumers inaccurately calibrating their displays (which in most cases cost ~$1,000 and are not going to match the reference anyway). Are there any actual experts here who can shed more light on this topic?
 
Hey buddy, the ad homs only diminish your reply. This thread is an echo chamber. I am looking for a reference where Apple said their XDR monitor can replace the Sony for the most demanding colorist work. Comparing the attributes of two monitors, one of which costs $37K more than the other (unless there is only gold in the Sony), means there are some functions and qualities that the Sony has that the XDR doesn't.

It seems the market position of the XDR is that of a fairly good monitor, whereby one doesn't have to spend $43K if one isn't doing the most demanding work. Right tool for the right job, with the XDR providing a less expensive option for less demanding work.

There is nothing wrong with Apple comparing the two if some attributes overlap. That doesn’t mean the XDR isn’t capable for the target market Apple has set out to sell into. Whether it’s successful is another story.

"World's best pro display" I doubt it.
 
well, of course it can't compete with a professional reference monitor that's 7 times more expensive. it doesn't even have SDI inputs, so it's useless in a lot of movie-making and TV scenarios, even when compared to much cheaper video monitors.

i'd rather love to see a comparison the other way round: how does it stack up against cheaper monitors, like a nice EIZO for image editing, or those calibrated OLED TVs that some people who don't have the money or need for the Sony use for color grading?
 
99% of local broadcast content doesn't need to be more than 1080p and in SDR.

Currently, antenna is 720p SDR, digital cable is 720p/1080p SDR, satellite users have some better options, but their signal is still highly compressed. Just try watching “The Relic” during a thunderstorm.

Streaming takes the lead if you have the bandwidth, but most of us are watching 1080p SDR content, and some with faster connections are watching 4K HDR. If this is your medium, then adopting a 4K HDR workflow is where you need to be if you're producing content for Netflix or Amazon.

If you're producing traditional broadcast content, 4K is nice, and you may capture in that, but you may or may not be mastering in it given time and budget constraints. Meaning, if you're mastering a Game of Thrones, 4K and HDR, yes, but maybe not Wheel of Fortune or The Unicorn.

If you’re producing corporate video destined for streaming and multiple uses, then you’re mastering with a much smaller budget and narrower focus, so either 2K or 4K in SDR.

If you’re making marketing videos, then you might be shooting in 8K HDR because the bosses want it that way. Think Ferrari or high end furniture.

And if you're making Fast 9 or Shang-Chi, you're shooting in 4K, 6K, or 8K and mastering for 2K and 4K projectors. Sure, you're mastering in HDR, but you're not guaranteed to be shown in HDR.

The bottom line is that mastering in HDR is still not the norm, and most video production workflows are still adapting to those demands. The XDR fills a valuable niche for a lot of video production where SDR dominates and HDR is still nascent. If your bread and butter is SDR but customers are starting to ask for HDR, they will be wowed by the XDR, and it's good enough for all but the most demanding (GoT, $100M+ movies) customers, who don't necessarily buy those $43K Sony monitors anyway; they just rent them during the production (show), just like they don't buy Panavision lenses and ARRI Alexa 65s, they rent them.

The XDR provides a solution for those caught in the middle. It's not perfect, but it is cost effective and just in reach for those who need to step up to HDR mastering and deliverables. To anyone lamenting the lack of a cheaper option from Apple: I agree, they aren't filling that hole and they should. But if you think the XDR is just some expensive gimmick, you have lost the plot.
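To put rough numbers behind the bandwidth point above, here's a back-of-the-envelope sketch of uncompressed video data rates. Real streams are heavily compressed (HEVC, AV1, etc.), so treat these purely as relative magnitudes; the 24 fps and bit depths are assumptions for illustration:

```python
def raw_gbps(width: int, height: int, fps: float, bits_per_channel: int) -> float:
    """Uncompressed (pre-codec) RGB video data rate in gigabits per second."""
    return width * height * fps * bits_per_channel * 3 / 1e9

# Assumed typical parameters: 8-bit SDR vs 10-bit HDR, both at 24 fps.
print(f"1080p SDR: {raw_gbps(1920, 1080, 24, 8):.2f} Gbps raw")   # ~1.2 Gbps
print(f"4K HDR:    {raw_gbps(3840, 2160, 24, 10):.2f} Gbps raw")  # ~6.0 Gbps
```

Roughly a 5x jump before compression, which is the gap the codecs and those "faster connections" have to absorb.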
 
On a side note and with my own degree of ignorance, I do wonder how important "reference" monitors are in the scheme of things though. When 95% of the general public has no idea what the brightness function on their monitor does, let alone the interest or desire to calibrate their display picture, what's the difference?

It's changing. Apple products have had very good calibration in the scheme of things. For TVs, Dolby Vision and the competing HDR10+ standards require minimum performance and calibration, enforced by the trademark owners. In the computer space, VESA DisplayHDR and Nvidia G-Sync have color gamut and calibration requirements too, at least in the mid-tiers.
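For what those "calibration requirements" mean in practice: a calibrator measures known test patches with a probe and scores the error as a color difference, usually some flavor of delta E. A minimal sketch using the classic CIE76 formula; real calibration suites use the more elaborate delta E 2000, and the Lab values below are made up for illustration:

```python
import math

def delta_e_76(lab1: tuple, lab2: tuple) -> float:
    """CIE76 color difference: plain Euclidean distance in L*a*b* space."""
    return math.dist(lab1, lab2)

reference = (50.0, 10.0, -10.0)  # target L*a*b* patch value (hypothetical)
measured  = (51.2, 9.4, -11.1)   # what the probe read back (hypothetical)

print(f"dE76 = {delta_e_76(reference, measured):.2f}")
# Common rules of thumb: dE < 1 is imperceptible; dE < 2-3 is a typical calibration target.
```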
 
Hey buddy, the ad homs only diminish your reply. This thread is an echo chamber. I am looking for a reference where Apple said their XDR monitor can replace the Sony for the most demanding colorist work. Comparing the attributes of two monitors, one of which costs $37K more than the other (unless there is only gold in the Sony), means there are some functions and qualities that the Sony has that the XDR doesn't.

It seems the market position of the XDR is that of a fairly good monitor, whereby one doesn't have to spend $43K if one isn't doing the most demanding work. Right tool for the right job, with the XDR providing a less expensive option for less demanding work.

There is nothing wrong with Apple comparing the two if some attributes overlap. That doesn’t mean the XDR isn’t capable for the target market Apple has set out to sell into. Whether it’s successful is another story.
This is exactly what Apple said in the keynote:

"There is one class of displays, it's commonly called reference monitors. And these do deliver true HDR along with these features(>27 inch screen, wide 10-bit color, precise calibration, HDR). But they're still missing these features(>4K res, retina pixel density, highly functional design, quiet operation) and they're incredibly expensive. This one(Sony) is $43,000.

"So our goal, it was simple. make a display that expertly delivers every feature that pros have asked for.

[lots of marketing fluff]

...It's the only display in the industry that delivers every feature on a pro's wishlist, and more.

[more marketing fluff]

...making this the world's best pro display.
"


So there you go. It doesn't matter that they didn't specifically claim it can replace the $43K Sony. They very specifically positioned the XDR as a reference-monitor-class display and used the $43K Sony as an example, then immediately claimed the XDR not only compares to monitors in this class but can do even more. In reality, it falls very short of these claims.

And it doesn't matter that the XDR is a really good display. Fact is, it's the most overpriced, worst-value display in history. Deal with it.
 
well, of course it can't compete with a professional reference monitor that's 7 times more expensive. it doesn't even have SDI inputs, so it's useless in a lot of movie-making and TV scenarios, even when compared to much cheaper video monitors.

i'd rather love to see a comparison the other way round: how does it stack up against cheaper monitors, like a nice EIZO for image editing, or those calibrated OLED TVs that some people who don't have the money or need for the Sony use for color grading?

You can add SDI using a Blackmagic Teranex Mini and get some of those capabilities back, but that does come at a cost.

I agree, a better comparison might be EIZO and NEC displays that are closer in cost along with other reference displays from Flanders, Panasonic, Canon, et al.
 
They make it seem like real filmmakers could not use an Apple display to make a movie because it isn't accurate enough. Is that really the case? I think that extra 1-2% of accuracy would be offset by the majority of consumers inaccurately calibrating their displays (which in most cases cost ~$1,000 and are not going to match the reference anyway). Are there any actual experts here who can shed more light on this topic?

first, no one would notice if the colors were slightly off. you could probably do most movies on a calibrated iMac's screen and it would be good enough. more or less everything, from script to sound mix, is more important to a movie than absolute color accuracy. no one cares that the image quality of The Blair Witch Project was less than stellar, or that "The Raid" was filmed with a cheapo AF100, or that you could see through a TIE fighter's wing.

and, as you said, 99.9% of the displays where a movie is watched are off. even those "factory calibrated" displays by Apple all look quite different next to each other. and most people don't even care to switch off all those sh** settings that are switched on by default when you buy a new TV, like "motion smoothing", "dynamic contrast", etc.; those are definitely not the way the moviemakers intended their movies to be watched.

but...
as a colourist, you need to see more than the audience, because you get to work on the granular stuff, like a conductor who sometimes only needs to hear the brass section while practising. plus, when you're using that 43K Sony monitor, you're probably producing for cinema, i.e. projectors that are calibrated (hopefully on a regular basis) in a controlled light environment (because, guess what: you'd need to recalibrate your display every time the light changes)
 
I'm so ignorant on this topic that I have no idea what you said.

In short, he's saying film rules, pixels drool, because pixels cannot possibly have the equivalent resolution and clarity of high-quality film stock.

The fact that a movie shot in 4K still has its CGI created in 2K, up-resed to 4K, and then ends up being screened on a 2K digital projector lends credence to his rather obtuse post.
 
This is exactly what Apple said in the keynote:

"There is one class of displays, it's commonly called reference monitors. And these do deliver true HDR along with these features(>27 inch screen, wide 10-bit color, precise calibration, HDR). But they're still missing these features(>4K res, retina pixel density, highly functional design, quiet operation) and they're incredibly expensive. This one(Sony) is $43,000.

"So our goal, it was simple. make a display that expertly delivers every feature that pros have asked for.

[lots of marketing fluff]

...It's the only display in the industry that delivers every feature on a pro's wishlist, and more.

[more marketing fluff]

...making this the world's best pro display.
"


So there you go. It doesn't matter that they didn't specifically claim it can replace the $43K Sony. They very specifically positioned the XDR as a reference-monitor-class display and used the $43K Sony as an example, then immediately claimed the XDR not only compares to monitors in this class but can do even more. In reality, it falls very short of these claims.

And it doesn't matter that the XDR is a really good display. Fact is, it's the most overpriced, worst-value display in history. Deal with it.
Do reference monitors have a range of performance? If so, there is nothing to deal with. Your opinion of the monitor, as just stated, has nothing to do with its success.
 
I sold consumer audio back in high school, and really loved the people who came in bragging that they could tell the difference between a 'tube amp' and a 'transistor amp'.

'Yeah, sure, so are you going to buy that McIntosh tube amp or not? And those Klipsch speakers? I thought not.'

Tube amps generate harmonic distortion, giving a warmer feel.
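For what it's worth, that "warmth" is measurable rather than mystical: tube stages tend to add low-order, especially even-order, harmonics. A toy Python sketch of the idea; the 5% second-harmonic level is an arbitrary illustration, not a measurement of any real amp:

```python
import numpy as np

fs = 48_000                       # sample rate in Hz
t = np.arange(fs) / fs            # one second of samples
fundamental = np.sin(2 * np.pi * 440 * t)

# Toy 'tube' coloration: add a 2nd harmonic at 5% amplitude (assumed level).
second_harmonic = 0.05 * np.sin(2 * np.pi * 880 * t)
warm = fundamental + second_harmonic  # the signal you'd actually hear

# Total harmonic distortion as the RMS ratio of added harmonics to the fundamental.
thd = np.sqrt(np.mean(second_harmonic**2)) / np.sqrt(np.mean(fundamental**2))
print(f"THD ~= {thd * 100:.1f}%")  # ~5% here; decent solid-state amps measure far lower
```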
 