If you’re kind to your eyes, you can’t laughably dismiss the core function of a monitor: to see things clearly and sharply.

Speaking as a creative professional and a developer working on visual things: you’re flat-out wrong.

You NEED high pixel density to read and view visual content optimally; that is why high pixel density has been a top priority and a key selling point of mobile phones for well over a decade, despite their much higher cost relative to the computing capability of larger devices!

A meaningful number of people need the higher resolutions (and therefore higher PPI) on larger displays required for sharpness parity with mobile devices.

This is particularly true for creative professionals and other prosumers who value and need sharpness consistency to do their work.

That necessitates resolutions higher than 4K on large display panels sitting directly across from someone’s face (or directly on it, in the case of spatial computing hardware).

Several monitor manufacturers, standards committees, ergonomics experts, and HCI/UX academics vehemently disagree with you.

This is common knowledge in human-computer interaction (HCI), a field of computer science, and in other fields. That is that.

Having used some of the best reference monitors in the world for years: reference monitors such as Sony’s are for QAing/referencing very particular aspects of color accuracy, HDR performance, contrast levels, and other visual qualities, which is different from people’s everyday needs and from the optimal resolution of a source work.

The work is deliberately output at resolutions FAR higher than 4K, as needed for far better sharpness on the panels the work will eventually be viewed on.

The Pro Display XDR and Apple’s core traditional prosumer computing hardware deliberately do this (the Studio Display aside, as far as HDR): by no coincidence, they all offer high-PPI screens in addition to 1000 nits sustained, 1600 nits peak, Dolby Vision HDR, and HLG HDR.

6K minimum is required for that on 32” panels, which the Pro Display XDR and several professional monitors offer by no accident.

5K minimum is required for that on 27” panels, which again several professional monitors offer by no accident.
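For the curious, these density figures are easy to check yourself with the standard diagonal-pixel formula (a quick sketch; panel sizes are nominal diagonals, and the ~218 PPI target is Apple’s desktop “Retina” density):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from pixel dimensions and diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# 27" panels: 4K falls well short of the ~218 PPI density that 5K hits.
print(round(ppi(3840, 2160, 27)))  # 4K @ 27"
print(round(ppi(5120, 2880, 27)))  # 5K @ 27"
# 32" panels: only 6K reaches comparable density.
print(round(ppi(3840, 2160, 32)))  # 4K @ 32"
print(round(ppi(6016, 3384, 32)))  # 6K @ 32"
```

This works out to roughly 163 PPI for 4K at 27” versus 218 PPI for 5K, and roughly 138 PPI for 4K at 32” versus 216 PPI for 6K.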

Modern display standards such as Thunderbolt 5, DisplayPort 2.1+, and so on were explicitly designed to run these resolutions faster and prioritize them, not 4K.
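You can see why the newer links matter with a rough uncompressed-bandwidth estimate (a back-of-the-envelope sketch that ignores blanking intervals and link-encoding overhead, so real requirements are somewhat higher):

```python
def raw_gbps(width_px: int, height_px: int, refresh_hz: int,
             bits_per_pixel: int = 30) -> float:
    """Uncompressed video data rate in Gbit/s (30 bpp = 10-bit RGB).
    Ignores blanking intervals and link-encoding overhead."""
    return width_px * height_px * refresh_hz * bits_per_pixel / 1e9

# 6K (Pro Display XDR class) at 60 Hz, 10-bit: ~37 Gbit/s before overhead,
# beyond DisplayPort 1.4's ~25.9 Gbit/s payload, but comfortable on
# DisplayPort 2.1 UHBR20 / Thunderbolt 5 class links (80 Gbit/s raw).
print(round(raw_gbps(6016, 3384, 60), 1))
```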

4K was never intended to be the be-all, end-all resolution for monitors and TVs, but a stepping stone.

High PPI has been commercially viable and functionally superior on mobile devices for years, and is only now becoming viable on larger displays, as always intended.

This trend isn’t reversing; it’s only changing for the better.
Totally wrong.

2K or even FHD 27-inch monitors are still standard for professional use. Do you even work in photo and video production? I work at e-comm and production studios, and resolution was never the problem, because we care more about quality: color, brightness, contrast ratio, and more. Besides, high resolutions cause more headaches than they add function.

What you are saying is total nonsense from someone who never worked as a professional.
 
I disagree with you. It's not total nonsense: just as you think resolution doesn't play as big a role as color fidelity and contrast, there will undoubtedly be areas where maximum resolution is important!

I myself have liked working at 220 PPI (for 10 years now) and consider anything below that a step backwards in display quality, and yes, I can tell the difference if you put a 27“ 4K and a 27” 5K in front of me at a distance of 1 m.
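Whether that difference is plausibly visible at 1 m can be sanity-checked with angular resolution; 20/20 acuity is commonly taken as about 60 pixels per degree, and sharper-than-average vision resolves more (a hedged back-of-the-envelope estimate, not a vision-science claim):

```python
import math

def pixels_per_degree(ppi: float, distance_m: float) -> float:
    """Pixels subtended per degree of visual angle at a given viewing distance."""
    pitch_m = 0.0254 / ppi                         # one pixel's size in metres
    deg_per_px = math.degrees(math.atan(pitch_m / distance_m))
    return 1.0 / deg_per_px

print(round(pixels_per_degree(163, 1.0)))  # 27" 4K (~163 PPI) at 1 m
print(round(pixels_per_degree(218, 1.0)))  # 27" 5K (~218 PPI) at 1 m
```

At 1 m this gives roughly 112 versus 150 pixels per degree, so a viewer with better-than-20/20 acuity could plausibly tell them apart.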
 
…Don’t attempt weak ad hominem fallacies; it’s a terrible and reckless thing to do with complete strangers on the internet, and purely anecdotal claims mean and prove little:

I’ve done work for the likes of athletes, creative agencies, learning platforms, government entities, and big tech companies for over a decade.

I again literally sync and discuss modern panel matters with VESA members and representatives of monitor manufacturers.

Heck, my university has programs honoring cinema legends like Steven Spielberg, in which, as a TA and alum making hundreds of thousands with my expertise, I helped the next generation of creatives not be as incorrect about HCI computer science as you (and made sure they earn as well).

You’re once again defying common academic knowledge on the importance of PPI over resolution, by which 4K and below are ill-equipped to provide a sharp image optimal for reading and viewing visuals on true 27” and 32” monitors.

As far as allegedly being a content-production peer…

Settling for non-ideal resolutions does not make them ideal or good. At best they are a passable, inconvenient limitation in your content production pipelines to reduce costs.

Settling and making do with 2K instead of 8K+ raw footage is no different than using pure rasterization instead of ray tracing.

The bottleneck in many pipelines that forces a downgrade to 2K and non-high frame rates (non-HFR) is the constraint of VFX teams’ workload and budget.

To make ideal premium content that eases the life of every professional in a content pipeline, in any medium, it has always been best to have raw footage at higher resolutions (and ideally wider aspect ratios) than the consumer’s.

That is one factor in why 8K+ hasn't taken off: many creatives don't have the budget or hardware to work well and efficiently with higher resolutions.

That has particularly bottlenecked domestic 3D and spatial content, which needs much higher baseline resolutions and refresh rates than what traditional content production has been able to get away with.

Like budget gamers making do with suboptimal 1080p and 1440p screens, not every content production has the benefit of 8K+ RED cameras and more ideal working resolutions provisioned by the studios hiring them.

The consumer market allows them to get away with it, similar to gamers willing to play games at abysmal PPI and resolutions that are horrible for everything else you want to do on a computer monitor, such as reading text and viewing images.

Some are content doing that well only on Retina MacBooks and phones.

That said, a meaningful number of people use monitors heavily in a given day and are not trying to downgrade the readability and sharpness of images/video when they sit down at a desk to use a monitor.

It’s inefficient and not ideal from a content production standpoint as well.

4K is, once more, not a high enough resolution for a high-PPI image on true 27” monitors (the common compromise now is 24.5” panels marketed as 27”) and 32” monitors.

4K dramatically loses its high-PPI qualities past 24” at ergonomically recommended monitor viewing distances (not to be mistaken with viewing distances for TVs).

Subpar PPI being common isn’t relevant for those who want a good, great, or ideal experience for the things they most often do, want to do, and that pay the bills.

Especially creatives making premium content, adding longevity, accessibility, and optimal holistic value to their work by outputting it in the best ways modern tech enables rather than the bare minimum.
 
Again, you keep failing to convince; typing a long comment doesn't make you right.

As you keep ignoring, the Pro Display XDR was a failure due to its high resolution plus too few dimming zones, and there are reasons for that. Now you are telling me that more resolution is better without any facts or truth.

This is not a good conversation, especially if you can't support your claim after all.
 
It’s supported by human-computer interaction computer science, standards committees, and mobile/monitor manufacturers explicitly and vehemently disagreeing with you for decades.

I know it's popular for some to defy and ignore academia and science, but ultimately it's your money and your choice, the professional bit aside.

I’m fortunate and grateful that during my peak adult life I did not have to settle for subpar resolution for everyday and professional use, as seems to be your reality.

Maybe you’ll get to see 5K+ 27” and 6K+ 32” monitors more commonly, which is inevitable and will happen before your eyesight deteriorates too far to actually appreciate it (around when adults need reading glasses, in their 40s).
 
To someone who actually works in the photo and video industry? Good luck.
 
…I again also work in the photo and video industry, as well as the tech industry; glad my clients, employers, and other benefactors apparently aren’t as cheap as yours, so I don’t have to deal with the subpar resolutions and PPI you work with.

I’m also glad they have provided me a stable salary, living in a major tech city that necessitates hundreds of thousands be allocated for my services, so I’m not deprived of high-PPI monitors in my home office.

It would suck for any professional to be unable to independently acquire or use high-PPI hardware at home (beyond laptops and phones).

The benefits are priceless, and it's sad they aren't easily accessible to most people before their eyes deteriorate to the point of needing reading glasses (around age 40).

Do you have a modern high-end iPhone, Android device, MacBook, or 14”/15” 4K laptop? Do you notice how sharp and clear they are?

You can only have that level of sharpness on a full 27” or 32” monitor using at minimum 5K and 6K respectively (and roughly so with 24.5” panels using 4K).
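Working backwards, the resolution a 16:9 panel needs to hit the roughly 218 PPI density of those devices’ larger screens can be estimated like this (a sketch; marketed diagonals and native modes are rounded in practice):

```python
import math

def resolution_for_ppi(diagonal_in: float, target_ppi: float,
                       aspect=(16, 9)) -> tuple:
    """Approximate pixel dimensions needed for a target PPI on a panel
    of the given diagonal and aspect ratio."""
    aw, ah = aspect
    diag_units = math.hypot(aw, ah)
    width_in = diagonal_in * aw / diag_units
    height_in = diagonal_in * ah / diag_units
    return round(width_in * target_ppi), round(height_in * target_ppi)

print(resolution_for_ppi(27, 218))  # close to 5K (5120x2880)
print(resolution_for_ppi(32, 218))  # close to 6K (6016x3384)
```

The 27” case lands within about 10 pixels of 5120x2880, and the 32” case within about 65 pixels of 6016x3384, matching the 5K/6K minimums argued above.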

4K suffices for high-PPI use on OLED monitors 24” and below.

Anyway, go to an Apple Store and ask to see the Pro Display XDR; then go back to your 32” 4K monitor.

During your time with the Pro Display XDR, notice how sharp the text and images are, like the sharpest visuals you’re supposed to see during your (hopefully annual) eye exam if you have good insurance…

Compare that experience with your 32” 4K monitor [the one I use for professional work whenever I need 120 Hz with professional color capabilities is the $5,000 (now $3,000) ASUS PA32UCG].

If you don't see a difference, your eyes have deteriorated past the point of no return, or you need to see an eye specialist to improve your eyesight.
 
That is a good recital of the marketing talking points Apple feeds you. Keep practicing them and you might actually believe them.

If that were truly the reason why Apple omits the stand, they would include VESA mount hardware in the box like every other manufacturer on the planet.
To be honest, the GP isn’t wrong, though: every office I’ve ever worked at has a closet full of wasted monitor stands somewhere, which usually just get junked every so often before they accumulate again, and I’ve personally got a nice collection in my garage right now.
 
I use a Thunderbolt 4 KVM with mine to connect to several PCs at once, but Thunderbolt 5 natively has KVM tech that could technically minimize the need for this.

I wouldn’t mind additional Thunderbolt 5 ports specifically.

I don’t think it needs HDMI or DisplayPort 2.2 ports, with Thunderbolt 5 able to replace both and currently technically faster than both.

You can use Thunderbolt to DisplayPort/HDMI cables, which I do to connect my Pro Display XDR to PCs when I don’t use my Thunderbolt 4 KVM.

The main reason I need such cables instead of the KVM is to connect the display directly to Nvidia GPUs that lack a USB-C DisplayPort/HDMI output.
What Thunderbolt 4 KVM do you use? Frankly, I found a lot of information on KVM tech hard to follow. But it shouldn't be necessary to purchase a separate device. Like many people, I have a Mac and a PC laptop, the latter of which is required for work. And it just seems crazy that Apple would design a crazy-expensive monitor with just one video input.
 
I use Sabrent’s, but there are other ones as well.
 