That's why all retina displays run in scaled mode. In the case of the XDR (assuming the optimal @2x), the usable screen space is equivalent to 3008x1692. Yes, I am aware you can still view a 4k frame at 1:1 mapping within that space, but your image is still completely wrong, and doesn't even include a second 4k screen. Not to worry, I fixed it up for you.

[Attachment 894240: corrected screen-space comparison]

The products you've suggested run at 163 ppi and 138 ppi. That's too high to run them at 100%. Your chart is wrong.
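
For anyone checking the math, those figures are easy to reproduce. A minimal sketch (assuming the 163 ppi and 138 ppi products are 27" and 32" 4K panels, which is what those densities imply):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Pro Display XDR: 6016x3384 native pixels on a 32" panel.
print(round(ppi(6016, 3384, 32), 1))   # 215.7 ppi
# At the optimal @2x scale, usable space in points is half the native pixels:
print(6016 // 2, 3384 // 2)            # 3008 1692

# The 163/138 ppi figures are consistent with 4K panels at 27" and 32":
print(round(ppi(3840, 2160, 27), 1))   # 163.2
print(round(ppi(3840, 2160, 32), 1))   # 137.7
```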
 
The products you've suggested run at 163 ppi and 138 ppi. That's too high to run them at 100%. Your chart is wrong.
My reply refers to 4K on 32", about 20% smaller UI compared to Apple's '110ppi equivalent' standard. Not too small. Chart is not wrong.
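
The 20% figure is simple arithmetic: at 1:1 pixel mapping, UI size shrinks in proportion to pixel density. A quick sketch (using the ~137.7 ppi of a 32" 4K panel):

```python
standard_ppi = 110.0    # Apple's traditional ~110 ppi-equivalent UI size
ppi_4k_32 = 137.7       # 32" 4K panel at 1:1 pixel mapping
shrink = 1 - standard_ppi / ppi_4k_32
print(f"UI about {shrink:.0%} smaller than the 110 ppi standard")  # ~20%
```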
 
Who can educate me about the need for calibrated monitors?

From the producer (film camera and sensor) to the consumer (a viewer whose retinal color sensitivity differs from person to person), things like white balance and color spaces sit on the technical pipeline; these are now determined purely by software and surely don't require human adjustment (whatever the shortcomings of retinal color perception, which changes with age).

In the printing business I can understand that the color master has to be adjusted and checked very precisely in some cases (e.g. banknotes), but even that isn't done with a monitor and a human sitting behind it. So: why such Sony monitors? I have no idea!

I know the difference between standard color spaces (H.264 etc.) and, e.g., HDR on my projector. Given the compromise between stream data volume and quality (frame rate, resolution, detail and color space), this discussion will completely pass by the real consumer world (me and the typical MR member). Apple's monitor will already display much more than our end devices (TVs, home/cinema projectors) can show.
 
Who can educate me about the need for calibrated monitors?

From the producer (film camera and sensor) to the consumer (a viewer whose retinal color sensitivity differs from person to person), things like white balance and color spaces sit on the technical pipeline; these are now determined purely by software and surely don't require human adjustment (whatever the shortcomings of retinal color perception, which changes with age).

In the printing business I can understand that the color master has to be adjusted and checked very precisely in some cases (e.g. banknotes), but even that isn't done with a monitor and a human sitting behind it. So: why such Sony monitors? I have no idea!

I know the difference between standard color spaces (H.264 etc.) and, e.g., HDR on my projector. Given the compromise between stream data volume and quality (frame rate, resolution, detail and color space), this discussion will completely pass by the real consumer world (me and the typical MR member). Apple's monitor will already display much more than our end devices (TVs, home/cinema projectors) can show.

Tbh, in terms of color accuracy and calibration, the Eizo CG series is the best, especially with its built-in calibration tool.
 
I can remember paying around $3000 for my Apple Cinema Display HD (23-Inch) on top of the $5000 for the G5 tower. It was great at the time and I got my money's worth from it.
I can see someone producing 1080p local TV commercials on a budget picking up one of these.

The industry has come a long way since I was creating videos on my Radius VideoVision NuBus card in my Power Macintosh 8100.
 
My reply refers to 4K on 32", about 20% smaller UI compared to Apple's '110ppi equivalent' standard. Not too small. Chart is not wrong.

The standard is 96ppi, not 110, since we're talking Windows here. Therefore, 138ppi is more appropriate at 125%, if not 150%, and 163ppi is almost 175%. Using those screens at 100% for extended periods of time while taking meaningful advantage of the screen real estate wouldn't be very healthy.
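
To make those percentages concrete: Windows expresses its scale factor as a multiple of the 96 ppi baseline. A quick sketch:

```python
WINDOWS_BASE_PPI = 96.0

for panel_ppi in (138.0, 163.0):
    scale = panel_ppi / WINDOWS_BASE_PPI
    print(f"{panel_ppi:.0f} ppi -> {scale:.0%} for true-to-size UI")
# 138 ppi -> 144%, between the 125% and 150% presets
# 163 ppi -> 170%, close to the 175% preset
```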
 
Hilarious thread. I remember watching Jobs introduce the iPod Hi-Fi. He said it sounded as good as high-end audiophile equipment, and that he had replaced all the high-end equipment in his home with the iPod Hi-Fi. I was intrigued, and immediately went to the local Apple store to listen to the new iPod Hi-Fi. Yeah, Altec Lansing and McIntosh had *absolutely nothing* to worry about. Unless Steve had some very, very low-quality gear at home (which I doubt), he blatantly lied. So not much new here re: claiming that a $5k monitor can replace a $43k monitor. However, Steve did not make the mistake of putting what he said during the presentation onto a slide, for God's sake.
 
From the producer (film camera and sensor) to the consumer (a viewer whose retinal color sensitivity differs from person to person), things like white balance and color spaces sit on the technical pipeline; these are now determined purely by software and surely don't require human adjustment (whatever the shortcomings of retinal color perception, which changes with age).

You're talking about color grading. The purpose is artistic intent. It's like filters on Instagram or any other photo app. Check out some tutorials on YouTube; once you see what they do, you'll realize that almost every single professional video has some level of grading done to it, which is why even home editing tools like iMovie have manual grading controls.
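
For anyone curious what a grade actually does mathematically, here is a minimal sketch of an ASC CDL-style primary correction (the slope/offset/power numbers are made up purely for illustration):

```python
def cdl_grade(rgb, slope, offset, power):
    """ASC CDL primary correction: out = clamp(in * slope + offset) ** power,
    applied per channel, with values normalized to the 0..1 range."""
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        x = min(1.0, max(0.0, v * s + o))   # clamp before the power step
        out.append(x ** p)
    return out

# A hypothetical "warm, lifted" look applied to mid-grey:
print(cdl_grade([0.18, 0.18, 0.18],
                slope=[1.10, 1.00, 0.90],   # push red up, pull blue down
                offset=[0.02, 0.02, 0.02],  # lift the shadows
                power=[0.95, 0.95, 0.95]))  # slight overall gamma lift
```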

In the printing business I can understand that the color master has to be adjusted and checked very precisely in some cases (e.g. banknotes), but even that isn't done with a monitor and a human sitting behind it. So: why such Sony monitors? I have no idea!

Who chose the colors to begin with? A designer. So the designer has to have a calibrated monitor, or in the case of press, often a calibrated light booth with a library of special color swatches. This is the physical equivalent of a calibrated high-end monitor.

[Image: Pantone 5-light color evaluation booth]
 
Well, it has to be said, Apple has sometimes lied-- er, distorted reality, on occasion. But the fact remains, if you're an editor, not a color grader, it still is a fantastic monitor. Interesting to hear the different things it lacks for the ultimate grading. But who says this is a grading monitor? Well, it is, you can use it as an editor to get an accurate view-- give or take -- of what the director liked. So correct it from there on a Sony. If I'm a small independent feature? Close enough, maybe. Technical perfection, when it can't be distinguished with great exactness? Or good enough if you need to save the money or time?

You're a hell of a lot better off using this in production, with maybe a few tweaks every night by the grader. Few directors would notice.

Of course, he's right. Apple bends reality. It's not REALLY true, but it's not ENTIRELY false.
 
A tube amp generates harmonic distortion, giving a warmer feel.

So it's less accurate. Interesting.

I did read, a number of years ago, about the 'nuances' of tubes, and tube amps in particular. The sources I read didn't sum it up in your language. I was led to believe that it was 'different', but not 'fudged'. But I guess one could argue that the 'distortion' mimics the natural distortion in a live concert? *shrug*

When the stuff I was reading started getting into the distortion from certain speaker cone materials, and which one was 'better', I lost interest.

Tangent: I did get ripped for listening to an iPod on a transcon flight. "Stupid devices, you get the fidelity of music played through two tin cans and a stretched string." Yeah, but I want more than '6' songs on my iPod. Philistine? Whatever.... If it lets me drown out the screaming spawn of Satan two rows behind me, and drowns out the yakking twit beside me, I'm so grateful...
 
The standard is 96ppi, not 110, since we're talking Windows here. Therefore, 138ppi is more appropriate at 125%, if not 150%, and 163ppi is almost 175%. Using those screens at 100% for extended periods of time while taking meaningful advantage of the screen real estate wouldn't be very healthy.
We're not talking about Windows at all, where did you even get that?? 🤦‍♂️ Apple has always targeted a 110ppi-equivalent size or higher. But if you want to bring up Windows, the way scaling works on Windows is completely different from macOS anyway.

And if you think a 138ppi physical size is bad, then you'd better start writing your complaint email to Apple.
Both the MacBook Air and the 13" MBP default to a 128ppi-equivalent scaling and go up to 149ppi equivalent.
The 15" MBP defaults to 129ppi equiv and goes up to 147ppi equiv.
The 16" MBP defaults to 132ppi equiv and goes up to 151ppi equiv.
 
We're not talking about Windows at all, where did you even get that?? 🤦‍♂️ But if you want to bring up Windows, the way scaling works on Windows is completely different from macOS anyway.

Which is precisely why these displays are for Windows. They’re a poor choice on macOS.

And if you think a 138ppi physical size is bad, then you'd better start writing your complaint email to Apple.
Both the MacBook Air and the 13" MBP default to a 128ppi-equivalent scaling and go up to 149ppi equivalent.
The 15" MBP defaults to 129ppi equiv and goes up to 147ppi equiv.
The 16" MBP defaults to 132ppi equiv and goes up to 151ppi equiv.

Um, yeah. Do you actually believe the typical viewing distance is the same for external displays and laptops, or are you just playing dumb? Laptops have higher ppi, and phones have even higher ppi: our eyes are closer to those displays.

But you knew that already.
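
Viewing distance is exactly why angular density (pixels per degree) is the fairer comparison. A sketch with illustrative distances (the distances are assumptions, not measurements):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Pixels subtended by one degree of vision at a given distance."""
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

print(round(pixels_per_degree(227, 18)))  # 13" MBP (227 ppi) at ~18 in -> ~71
print(round(pixels_per_degree(138, 28)))  # 32" 4K (138 ppi) at ~28 in  -> ~67
print(round(pixels_per_degree(458, 12)))  # iPhone (458 ppi) at ~12 in  -> ~96
```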
 
Well, it has to be said, Apple has sometimes lied-- er, distorted reality, on occasion. But the fact remains, if you're an editor, not a color grader, it still is a fantastic monitor. Interesting to hear the different things it lacks for the ultimate grading. But who says this is a grading monitor? Well, it is, you can use it as an editor to get an accurate view-- give or take -- of what the director liked. So correct it from there on a Sony. If I'm a small independent feature? Close enough, maybe. Technical perfection, when it can't be distinguished with great exactness? Or good enough if you need to save the money or time?

You're a hell of a lot better off using this in production, with maybe a few tweaks every night by the grader. Few directors would notice.

Of course, he's right. Apple bends reality. It's not REALLY true, but it's not ENTIRELY false.
OK, but remember, the reference-monitor comparison only applies at the high end of the price range. At the low end, the LG UltraFine 5K or the equivalent iMac screen does the job just as well for most people and is 1300 bucks new. So if you are just trying to get really close, what does this monitor get you for 5000 bucks?

I think there are two strictly better things about the XDR vs the UltraFine 5K: the 32" size at Retina DPI, and the HDR support, which definitely enables some HDR workflow scenarios.

Frankly, I don’t find the transition from 27” to 32” compelling or even desirable, so that further reduces the value proposition in my eyes, but YMMV.
 
This is a great monitor. Far more monitor than 99% of us need. Is it the absolute best? Could more money buy a better monitor to squeeze out that last percent of performance and quality? You betcha.

But most of us would absolutely love this monitor and it would improve the quality of our work enormously. Once we match the abilities of this monitor, we *may* then be ready to upgrade to a $43,000 screen....

But probably not.
 
Apple knew exactly what they were doing. I’m sure this monitor (and stand) is going to meet or exceed their internal sales targets.

That's not a foregone conclusion, but it's certainly possible. I wasn't actually speculating on sales targets. I'm saying they deserve any criticism they get for their over-inflated claims. That's not to say it's a bad display. They went from making mostly garbage displays (the early LCD 17" ACD, the 23" with extreme unevenness and color-temperature issues, although the 30" was nice) to making decent but expensive displays with the 27", and later to attempting higher-end displays.

They are (or appear to be) making over-inflated claims though. There have been many cases in history where cheaper options displaced really expensive ones by being good enough, but this doesn't require nonsense marketing.
 
Which is precisely why these displays are for Windows. They’re a poor choice on macOS.



Um, yeah. Do you actually believe the typical viewing distance is the same for external displays and laptops, or are you just playing dumb? Laptops have higher ppi, and phones have even higher ppi: our eyes are closer to those displays.

But you knew that already.
Lol seriously, you people. The only way to solve what you claim is a problem is to use a 5k monitor @2x. These didn't even exist until 3 years ago. And the XDR didn't exist until 2020. So you think every pro using a Mac and a 4k monitor prior to 2017 has been doing, and continues to do, all their work on a 'poor choice'? And the only non-poor choice between 2017 and 2020 was a single LG monitor? 🤣 Is there enough oxygen left in your bubble?
 
Lol seriously, you people. The only way to solve what you claim is a problem is to use a 5k monitor @2x. These didn't even exist until 3 years ago. And the XDR didn't exist until 2020. So you think every pro using a Mac and a 4k monitor prior to 2017 has been doing, and continues to do, all their work on a 'poor choice'? And the only non-poor choice between 2017 and 2020 was a single LG monitor? 🤣 Is there enough oxygen left in your bubble?

The products you've suggested run at 163 ppi and 138 ppi. That's too high to run them at 100%. Your chart is wrong.

The truth is in between both of you. Both Windows and macOS can handle displays that aren't an integer multiple of the "native" DPI. They just do it in different ways.

Windows exposes the native DPI to the application, which results in perfect vector graphics but often in scaled bitmaps.

macOS exposes only 1x and 2x scaling to applications. However, it can apply scaling to the entire desktop, which is done by default on any 2016-or-later laptop. Thus 2x scaling is optimal; anything else leaves everything slightly blurred.

Microsoft's support for non-integer scaling resulted in blurry apps for the longest time, but by now most apps have it figured out, whereas Apple still has difficulty supporting the intermediate DPIs.
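
To illustrate the macOS side of this: in a scaled mode the desktop is rendered into a backing store at 2x the chosen "looks like" size and then resampled to the panel's native pixels. A sketch (the non-@2x preset is just an example):

```python
def macos_scaled_mode(native_w, native_h, looks_like_w, looks_like_h):
    """macOS renders a backing store at 2x the 'looks like' size, then
    resamples it to the panel's native pixels."""
    backing_w, backing_h = looks_like_w * 2, looks_like_h * 2
    # Ratio 1.0 means the backing store maps 1:1 onto hardware pixels
    # (a true @2x mode); any other ratio means filtering, hence the blur.
    return (backing_w, backing_h), native_w / backing_w

# XDR at the optimal @2x "looks like 3008x1692": no resampling at all.
print(macos_scaled_mode(6016, 3384, 3008, 1692))  # ((6016, 3384), 1.0)
# A non-integer mode like "looks like 2560x1440" gets filtered instead:
print(macos_scaled_mode(6016, 3384, 2560, 1440))  # ((5120, 2880), 1.175)
```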
 
Lots of sanity. A monitor that costs $43k is better than a monitor that costs $6k. Apple never said the XDR could replace a $43k monitor; Apple used the word "compete".

If you were about to invest in a high-quality monitor, your choices were $43k or... what? Now your choices are $43k or $6k.
You are right, but I would still be pissed if I purchased an XDR after Apple said it competes with the Sony display; it was quite disingenuous to say the least, even if common sense tells us there's no way a $6k monitor will actually compare to a $43k one.

The truth is no real professional was ever going to consider an XDR, as they know better, but I bet a few prosumers did after Apple told them "just how good it was". Turns out it can't even get some of the basics right, like viewing angles. Eek.
 
Lol seriously, you people. The only way to solve what you claim is a problem is to use a 5k monitor @2x. These didn't even exist until 3 years ago. And the XDR didn't exist until 2020. So you think every pro using a Mac and a 4k monitor prior to 2017 has been doing, and continues to do, all their work on a 'poor choice'? And the only non-poor choice between 2017 and 2020 was a single LG monitor? 🤣 Is there enough oxygen left in your bubble?

Given that macOS doesn’t support fractional scaling, yes, that’s correct (and unfortunate). But since your original point was to rail against Apple’s display, and you’re not even trying to argue that any more, my guess is you’ve realized you have no case.
 
Well, did you hire him to calibrate your TV? :)

I said I was kidding, but (like most of the A/V community) he is obsessing over details to the point that it's more placebo than perception, in my opinion. Everything he says is technically true, but I think he makes a living by giving importance to certain very specific aspects of the viewing experience - which is fine. Then he makes a hot-take statement like "the Apple monitor is not suitable for pros" or "you can't enjoy a director's vision without calibration" - which is, well, a bit pompous.
Omg, you are reaching there. It's all just placebo now? If video quality is not really important, as you are alluding to, then why would Apple themselves have compared the XDR to a reference monitor?
 
As far as I can tell, not one single video pro involved with say, mastering UHD or DCI 4K content, or HDR color grading—ever, in their wildest dreams—thought this monitor would replace a $43,000 reference monitor. Not a single one.

Did Apple oversell the XDR’s capability as a reference monitor? Sure. Newsflash: corporations occasionally engage in marketing puffery. Does Apple deserve all the crap they get about that? Sure, have at it lol. Knock yourself out. Have fun! Apple really couldn’t care less 😄

But if you’re not mastering or color grading HDR content where using $43k reference monitors is a no brainer, guess what? This monitor might very well be good enough. There are plenty of pro uses where this display is going to be an upgrade compared to what they were using before.

I think Apple is going to sell a ton of these, and the next version will likely have mini-LED backlighting and be even better. Time will tell.

What I see mostly in this thread is a lot of non-pros, who know little about who would use this monitor and why, spouting off about how this monitor sucks because it can’t be used in a few very specific instances that represent significantly less than 5% of working video/film pros. That’s simply not true, and if you think it is, you don’t know what you’re talking about.

Yeah, Apple oversold the XDR if you’re trying to replace a very expensive reference monitor. But if like most pros the work doesn’t require a reference grade monitor, the XDR is probably an upgrade, and a very well-priced one at that.

Like the Mac Pro, the XDR is mainly for business, corporate and enterprise customers. They have employees they’re paying $8k+ a month and find it trivial to spend $200/month (before taxes) on a monitor to make them more productive.

Scientific and engineering users, software developers, finance people, even project managers... lots of users can benefit from a 32” Retina display. If you think $6-7k is a lot of money for a monitor, then it is. But you’re not the target market, are you? To those who can benefit from it—including one or two person shops where time is money—it’s pretty small change.

Non-video/film pros in this thread: it’s not for mastering or color grading. Deal with it. The pros you might think you’re speaking out on behalf of already have. They know whether they need a $43k reference monitor or not. You really don’t need to be concerned about pros who know their job.
 
Given that macOS doesn’t support fractional scaling, yes, that’s correct (and unfortunate). But since your original point was to rail against Apple’s display, and you’re not even trying to argue that any more, my guess is you’ve realized you have no case.
It does, just not in the same way Windows does. My original point was not to 'rail against Apple's display'; I think you've assumed that because you probably only read my post that immediately preceded your first reply to me and haven't considered the context. And I don't know why you feel I'd need to keep arguing my previous points when it's all laid out with sound reasoning and the other guy gave up responding.
 
It does, just not in the same way Windows does. My original point was not to 'rail against Apple's display'; I think you've assumed that because you probably only read my post that immediately preceded your first reply to me and haven't considered the context. And I don't know why you feel I'd need to keep arguing my previous points when it's all laid out with sound reasoning and the other guy gave up responding.
Lol I’m the other guy. Your original point was nothing but railing against Apple lol. Remember? “Fact is it's the most overpriced worst value display in history.”

Just because I can tell when someone isn’t willing to sack up and admit they’re wrong doesn’t mean you’ve proved your point. It simply means I’m not willing to waste any more time beating my head against a brick wall. You think you proved your point? Fine, good for you. Everybody in this thread can see the facts laid out for themselves and form their own opinion 🤷‍♂️

Post #398 is right in your wheelhouse. If you can form a cogent counterargument, I’ll be glad to respond.
 