You can add SDI using a Blackmagic Teranex Mini and get some of these capabilities back, but that does come at a cost.

But wouldn't you need a computer in between, or does the Blackmagic have a USB-C out?

In my experience, that's not very useful: it's another point of failure, takes time to boot up, and takes up space when you just need a monitor to check some SDI output. Plus, there's quite a bit of input lag then.
 
I'd buy something like EIZO ColorEdge or Sony PVM, seriously. These guys have been in business for decades and know what they are doing. Apple is not competent, to say the least.

You just gave zero value back. Apple was in the market, and in development, at the same time Sony was; the Trinitron was developed for the Mac display back in the '80s. Then Sony worked with Steve again on the NeXT color displays, and the OS had the most advanced graphics drawing engine around: Display PostScript.

Sorry, but Apple R&D is world class. Sony isn't a leader anymore, and EIZO is big in medical displays. If you think they would have invented the XDR that Apple did, think again. EIZO has never captured large-scale market appeal in any sector of the monitor space; they make their money mainly on medical displays.
 
Before Apple saved the IPS panel from going extinct (in the PC market at least), people thought crappy 8-bit TN panels were good enough. Before Apple showed you the Retina display, people were saying 80 PPI was enough. To cut it short, most consumers have absolutely zero idea about quality; all they care about is pricing.

Those $40K monitors are not made with a budget of price X in mind. They were made to be the absolute best monitors, as accurate as technically possible at all times.

If I swap the subject to cars, what is the point of a $20 million F1 car? If your top supercar (Ferrari or Bugatti) costs $1 million and still cannot compare, then why are they spending so much money on it?

The point of an F1 car is to be the fastest; the point of a reference monitor is to be the best, (mostly) regardless of cost.

Although I imagine someday MicroLED will take over and reference-grade monitors could become affordable even for consumers.

In Q3/Q4 2020, MicroLED displays become mass-market displays. That is not just talk: Samsung, LG, Japan Display, and Apple [the largest patent holder] have those targets.
 
This is exactly what Apple said in the keynote:

"There is one class of displays, it's commonly called reference monitors. And these do deliver true HDR along with these features(>27 inch screen, wide 10-bit color, precise calibration, HDR). But they're still missing these features(>4K res, retina pixel density, highly functional design, quiet operation) and they're incredibly expensive. This one(Sony) is $43,000.

"So our goal, it was simple. make a display that expertly delivers every feature that pros have asked for.

[lots of marketing fluff]

...It's the only display in the industry that delivers every feature on a pro's wish list, and more.

[more marketing fluff]

...making this the world's best pro display."


So there you go. It doesn't matter that they didn't specifically claim it can replace the $43K Sony. They very specifically positioned the XDR as a reference-monitor-class display and used the $43K Sony as an example, then immediately claimed the XDR not only compares to monitors in this class but can do even more. But in reality it falls well short of these claims.

And it doesn't matter if XDR is a really good display. Fact is it's the most overpriced worst value display in history. Deal with it.
You were making a lot of sense. Then you completely lost all credibility in the last paragraph when you said “Fact is it's the most overpriced worst value display in history.”

Apple oversold its capabilities for a colorist and for DCI or Blu-ray mastering. Deal with it. But it's still an excellent monitor for many other uses. Saying "it's the most overpriced worst value display in history" is just BS and you know it.
 
It has USB-C


The reason I linked to the fcp.co article was so people could read how professionals working with the new system and display interface with other products.

In that article, it's the Blackmagic UltraStudio 4K Extreme.

 
Even with the Sony display, how could you guarantee J.J. Abrams that the lens flare looks great on my crappy displays at home or on my iPad on the plane? So what's the benefit of this reference if nothing really works like this?! It's more of a benchmark in my eyes.

Let's say J.J. Abrams is looking over your shoulder and wants a bit more lens flare in a particular scene. Can you be 100% sure that the VFX you're adding in post will be reproduced accurately when watched on other displays? With the Pro Display XDR, there's no way you can tell.
 
But who has devices that can display quality this high? Basically, what I am asking is: if it's such high quality that a $5,000 monitor is vastly inferior, then what is the point of making it at such high quality?
A lot of people do, and more people will in the near future. I have a 65” OLED TV that offers better quality blacks than the Apple XDR monitor for basically a third of the price. The Apple monitor gets brighter and has less input lag (which are important for creators), but my LG TV plays the scene from Gravity that Vincent tested significantly better than the Apple monitor does.
 
You were making a lot of sense. Then you completely lost all credibility in the last paragraph when you said “Fact is it's the most overpriced worst value display in history.”

Apple oversold its capabilities for a colorist and for DCI or Blu-ray mastering. Deal with it. But it's still an excellent monitor for many other uses. Saying "it's the most overpriced worst value display in history" is just BS and you know it.
If I'm wrong, then there must be a list of monitors similar to or worse than the XDR at a higher or the same price.
 
The reason I linked to the fcp.co article was so people could read how professionals working with the new system and display interface with other products.

In that article, it's the Blackmagic UltraStudio 4K Extreme.


Sorry, I must’ve missed your post. Yes, that looks like a very complete solution. I assume AJA and others will come up with ways to extend the XDR further in professional workflows.
 
It compares more favorably to photographers' displays ($2-4K), also thanks to its 6K resolution (most are 4K). But here, the value proposition is considerably weaker. Apple would also be wise to offer a cheaper non-rotating stand option.

It amazes me to see that people don't even think there is anything else between über monitors in the $40K range and cheap gaming-display crap. There are alternatives in about the same range. As you hint, this becomes even more concrete in the photography space, where local dimming zones screw up pixel-level editing in Photoshop and other editors. Just see how moving the UI panels can change the brightness of pixels. And what about the fact that you can't even calibrate this thing at the moment? Sure, the defaults may be good, but displays do drift over time.
 
Even with the Sony display, how could you guarantee J.J. Abrams that the lens flare looks great on my crappy displays at home or on my iPad on the plane? So what's the benefit of this reference if nothing really works like this?! It's more of a benchmark in my eyes.

Cinema standards are actual, verifiable, and objective targets that all displays (from garbage Best Buy Specials to cinema projectors) aim for: color gamuts, luminance values, white points, etc. The goal of your reference is to at least know that problems further down the pipeline aren't your fault.

If you produce work that you can be sure looks correct, then you aren't compounding the issue further down the signal chain. If I try to adjust an image for an iMac, with a notoriously blue tint to its display, it'll be unviewable on displays that drift in the magenta or green direction. You won't just be guessing at what might work if you have a known reference.

Not to mention, a lot of images are intended to stick around. Displays today may not be able to show full Rec. 2100, but displays 10 years from now may, and your work won't be all messed up if you just do it right the first time.

No different from how we don't master audio by first broadcasting it over AM radio. You can do a sanity check to make sure it isn't impossible to hear, but you aim high, not low.
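
To make the "objective, verifiable targets" point concrete: HDR luminance, for example, is pinned down by the SMPTE ST 2084 (PQ) transfer function, which maps each code value to an absolute luminance in nits. Below is a minimal Python sketch of that mapping; the constants come from the standard, and the example of code value 769 landing near 1000 nits assumes full-range 10-bit values.

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code_value: int, bit_depth: int = 10) -> float:
    """Map a full-range PQ code value to absolute luminance in cd/m^2 (nits)."""
    e = code_value / (2 ** bit_depth - 1)   # normalized signal, 0..1
    e_pow = e ** (1 / M2)
    y = (max(e_pow - C1, 0.0) / (C2 - C3 * e_pow)) ** (1 / M1)
    return 10000.0 * y                      # PQ tops out at 10,000 nits

# Code value 769 should come out near 1000 nits; 1023 is the 10,000-nit ceiling.
for cv in (0, 512, 769, 1023):
    print(cv, round(pq_eotf(cv), 1))
```

The point is that "correct" luminance is a number you can measure with a probe, not a matter of taste; a reference monitor is one that actually hits those numbers.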
 
That is not correct. Getting everything correct "up front" with a reference monitor allows everything downstream to have the best fighting chance of displaying the best quality picture with what's available. Getting it wrong up front makes it that much harder to get good results when it reaches the consumer's eyeballs.
Nope, you're just falling into the same trap. "Correct" is not an objective term here, because if you can't see "correct" as defined by a $43k display on any other display in the world, then it doesn't matter. "Good enough" on the XDR display or even a 5K iMac is "correct" enough for all content that will be seen by all people in digital form.
 
If I'm wrong, then there must be a list of monitors similar to or worse than the XDR at a higher or the same price.
Good point; surely there must be such a list, otherwise you couldn’t claim the XDR is the most overpriced.

Please do share the other monitors you compared to the XDR in order to pronounce it the “most overpriced” and “worst value display in history”.

Should be interesting.
 
If some guy on YouTube said it, then it must be 100% true.

He's a TV reviewer, so I trust his opinion on the quality of the Apple display more than the opinions of 99.9999999999% of the general public, and certainly more than the opinions of representatives from the company selling the product.
 
But who has devices that can display quality this high? Basically, what I am asking is: if it's such high quality that a $5,000 monitor is vastly inferior, then what is the point of making it at such high quality?

Same reason they don’t mix records on your bookshelf boom box speakers.

They need to make it sound/look good on anything it gets played on, from the lowest to the highest quality. It will only translate well if it is referenced at the highest resolution and accuracy possible, so that nothing is missed.
 
Nope, you're just falling into the same trap. "Correct" is not an objective term here, because if you can't see "correct" as defined by a $43k display on any other display in the world, then it doesn't matter. "Good enough" on the XDR display or even a 5K iMac is "correct" enough for all content that will be seen by all people in digital form.

You're sitting at your job with the director looking over your shoulder, and he's telling you to adjust the black levels so you can just barely make out the protagonist in the foreground shadows. You tell him it doesn't matter, because Joe Blow's garbage Vizio TV is going to crush the blacks anyway and this is probably good enough.

10 minutes later, you stub your big toe while being escorted out of the building with your belongings after being unceremoniously fired.

("Correct" is actually an objective term, because post houses need to meet certain criteria to be Dolby Vision Certified. A display either meets the criteria or it doesn't. You might as well say that music should have been mastered at 128 kbps in the '90s, since MP3s were going to crunch it down to noise anyway.)
 