


Apple charges $5,000 for its Pro Display XDR and has described it as a display designed for professionals, even claiming that it can match the performance of some professional reference monitors on the market that sell for much more.

Vincent Teoh, a TV reviewer at HDTVTest, recently tested Apple's Pro Display XDR claims, comparing it to Sony's BVM-HX310 reference monitor, which uses dual-layer LCD technology and costs over $40,000.


Before pitting the Pro Display XDR against the Sony HX310, Teoh ran in-depth tests on Apple's display, measuring brightness, contrast, and color accuracy, and the results revealed some of the Pro Display XDR's faults.

There were problems with contrast and color accuracy at peak brightness along with "so-so" screen uniformity, leading Teoh to call the reference mode of the Pro Display XDR suitable for content consumption rather than content creation.

Teoh then compared the Pro Display XDR to the Sony BVM-HX310 reference display, since that is the monitor Apple referenced when the Pro Display XDR was unveiled. The Pro Display XDR struggled to keep up with the Sony display, and Teoh said it is not a viable cheaper reference monitor for professionals. "I think the Pro Display XDR is a no go for any serious professional colorist," he concluded. "At the end of the day, the Pro Display XDR is just an IPS display with 576 full-array local dimming zones that happens to carry Apple's logo and costs $5,000."

He questioned whether it's fair to judge a $5,000 monitor against a $43,000 reference display, but pointed out that it was Apple that made the comparison first at WWDC. "The Pro Display XDR doesn't deliver anywhere close to the consistency and accuracy demanded of reference monitors."

Teoh's full video on the Pro Display XDR is well worth watching for those who want to see the full testing details prior to making a purchase.

(H/T Matthew Panzarino and The Loop)

Article Link: YouTuber Compares Apple Pro Display XDR to $43K Sony Reference Monitor, Says It's a 'No Go' for Professional Colorists


This is the funniest thing I've read all day!
 
I'm a bit confused. I get that as a creator you want the best you can do. Makes sense: you want that extraordinary lens flare, you add it in, and you'll see it on an expensive reference monitor. But we consumers who are just going to watch what you did will never see that slightly increased flare, because our panels are generally exponentially worse than what you created it on. Didn't that happen with GoT? They could see all the dark details on their reference monitors, but when it hit consumers, we couldn't see squat. So what is it a reference for if the majority aren't even using what's considered a reference?

It's important to be as accurate as possible precisely because people have different viewing conditions. Creating for those conditions when your own reference is just as flawed is not a good base to work from.

In the case of GoT, that was down to the creators' complete oversight of how it was being distributed, and then their arrogance in the face of their own ignorance of what would be done to the footage between them and the audience.

People watched it via streaming services. Those services use heavy compression to lower bandwidth, which normally isn't a problem, but in dark scenes like that the encoding requires special attention to retain detail, which nobody bothered to do. Even on a professional reference monitor those streams would have looked terrible, because the detail just didn't exist in the frames.

It's a lesson the industry needs to learn as viewership moves to streaming: either the professionals grading the footage should be creating a compressed stream copy for distribution, or the streaming services should hire people to do it properly instead of throwing everything through a preset transcoder.
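As a rough, hypothetical sketch of what "doing it properly" could look like (the numbers, file names, and the per-title tweak are my own assumptions, not anything HBO or the streaming services actually use): a small Python wrapper around ffmpeg that gives dark, low-contrast footage a higher-quality encode instead of pushing everything through one generic preset.

import subprocess

def encode_for_streaming(src, dst, dark_content=False):
    """Encode a delivery copy with ffmpeg (assumed to be installed and on PATH).

    Dark, low-contrast footage needs more bits to keep its shadow detail,
    so for it we lower the CRF (higher quality) and raise the bitrate cap
    instead of running everything through the same one-size-fits-all preset.
    The exact values are illustrative guesses, not broadcast-grade settings.
    """
    crf = "16" if dark_content else "21"        # lower CRF = more detail retained
    maxrate = "12M" if dark_content else "8M"   # allow bigger bitrate spikes in dark scenes
    cmd = [
        "ffmpeg", "-y", "-i", src,
        "-c:v", "libx264",
        "-preset", "slow",                      # slower preset = better compression efficiency
        "-crf", crf,
        "-maxrate", maxrate, "-bufsize", "24M", # constrain the CRF encode for streaming
        "-pix_fmt", "yuv420p",
        "-c:a", "copy",                         # leave the audio track untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

# e.g. encode_for_streaming("episode_master.mov", "episode_stream.mp4", dark_content=True)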
 
There sure are a lot of people with hurt feelings here. If you bought the monitor, and you like it, that's all that matters.
The guy tested the monitor, and found that it isn't suitable for some of the professional uses that Apple claimed.
"They compared it to the Sony monitor."

Tell us more about the comparison: which aspects, details, and specs did Apple compare and evaluate between the two displays?
So, you're saying they didn't compare it to the Sony? Apple actually said their monitor was better (comparing specs) in some ways. Watch the video. I already saw it.
 
I watch everything this professional calibrator puts out. I believe he’s right. His analyses use expensive equipment to measure the same objective data he does time and time again. Everything he says makes complete sense.

It’s clear the naysayers didn’t watch the video and are uninitiated in this arena because Vincent clearly explains his analysis.
 
What's odd about the video is the guy's right hand has a weird looking color when he waves it in the general direction of the Apple display. Is that an artifact or does his hand really look that color?
 
Apple did say that their monitor compared to the Sony. They actually compared theirs to the Sony. Not saying the Sony was better. Not saying theirs was almost as good.
I'm not going to play the exact words game. Context has meaning.

But up above you asserted: "Apple said their monitor was just as good as the Sony."

That's just making stuff up.
 
Apple started it when they made the comparison during the Keynote....
 

Attachments: Screen Shot 2020-02-13 at 12.55.34 PM.png, Screen Shot 2020-02-13 at 12.55.29 PM.png
What you guys are missing is that this XDR monitor is in many ways WORSE than a normal prosumer monitor from Dell because of:

A. Inconsistent black level due to the local dimming, which cannot be disabled at this time. This isn't a "lens flare too bright or dark" issue; it's a "can't judge how the effect looks" issue between local dimming zones: parts of the lens flare or any other effect will look different between zones, and the VFX guy won't be able to tell if it's his work or the monitor playing tricks.

B. Off axis viewing is bad enough that the edges are affected even at a normal viewing distance. Again, similar issue.
 
I’m a huge Apple fanboy. I defended the $1,000 stand, the slowing of old iPhones, the bending, the HomePod. But the second Apple started comparing a $5,000 display to a $40,000 display, even I knew it was quite a stretch.
It’s like comparing the HomePod to those huge $100,000 speakers that extremely high-end music studios use. Obviously we know which is better.
 
This is consistent with pros I’ve heard from. Not good enough for a colorist or mastering.

However, there are other roles in the upstream pro workflow that this monitor can fulfill. Other uses outside video/film as well.

That's what I was wondering... This display is way too expensive for the normal consumer, so my assumption was that pro video producers could use it. But now it sounds like you cannot rely on it for accurate reproduction when making a film. So who exactly is the target consumer for this display? What are the other roles in the pro workflow that *can* use this monitor?
 
About the GoT scenes... that was about HDR, not color. They underestimated how many people still have a normal display, or maybe didn't even think about it since all their screens were HDR, which indeed is a stupid mistake.
I watched (tried to, at least) those scenes when they came out on an LG OLED77G7, which has HDR, Dolby Vision and everything any consumer will have, and it forced me to completely darken the room (it was daytime - not something I normally do during the day) just to be able to see what was going on.
The content as delivered was edited to be WAY too dark. Sure, in ideal conditions you could see a bit of it, but anybody without an ideal setup (i.e. every single consumer of said content) would be staring at reflections instead of enjoying the content. Assuming your users don't have the same or better hardware than you used to make the content is not just smart, it's a basic requirement.
 
Everyone saying "of course" to this:
Who is the Apple monitor for? What user needs a monitor that expensive BUT also not a colorist's monitor?

Exactly -- because the Pro Display XDR was certainly not meant for me. I still want an "Apple 5K Thunderbolt 3 Display" with a built-in Thunderbolt 3 dock including 10 gigabit Ethernet, HDMI, USB-A, USB-C, FireWire, a super-fast SD card reader, etc.... A Thunderbolt 3 display with ONLY more USB-C ports is worthless because... the MacBook Pro already has those ports!
 
Now let's compare it to a cheaper IPS display with dimming zones...
 
"Reference Monitor"

Interesting name for it, considering...

If you're telling me that this monitor stands alone...that no low cost display, or Apple's high end XDR display, or anything in between can match it....

...then what is your monitor a "reference" for? Nothing. The entire rest of the world will see the content differently, so your $43k monitor is useless because it is not a reference for anything other than...other $43k monitors of the same brand and calibration.
 
This is consistent with pros I’ve heard from. Not good enough for a colorist or mastering.

However, there are other roles in the upstream pro workflow that this monitor can fulfill. Other uses outside video/film as well.
Sooo...audio? :D
If anyone watched the video, the XDR does great in bright scenes and struggles to match dark scenes with the usual gray blacks found in LCDs.
That's a pretty big deal. Especially since the struggle you mention isn't usual in this price range.
That was pretty bad.
 
"You get what you pay for" rings very true here. I'm wondering why Apple thought it was wise to oversell its capabilities and make the comparison in the first place.
I guess it was harder to justify its price if you compared it with other regular IPS displays with dimming zones.
 
I'm a bit confused. I get that as a creator you want the best you can do. Makes sense: you want that extraordinary lens flare, you add it in, and you'll see it on an expensive reference monitor. But we consumers who are just going to watch what you did will never see that slightly increased flare, because our panels are generally exponentially worse than what you created it on. Didn't that happen with GoT? They could see all the dark details on their reference monitors, but when it hit consumers, we couldn't see squat. So what is it a reference for if the majority aren't even using what's considered a reference?

I haven't worked in high-end film and TV (only small markets), but in sound production every engineer worth their salt mixes on their reference cans but also has some standard $50 computer speakers they check against; I know some people who throw it on a phone these days too. You're never going to make it sound amazing on cheap equipment, but they know to make sure it sounds okay.
 