Thanks. Does that mean if I connect a M4 Mini to a 4K display, the HiDPI option will not show up?
HiDPI works fine on the M4, but some of the very high HiDPI resolutions do not show up on the M4.

So if I buy a 6K monitor (be it this one or one from another brand) and run HiDPI, is it better to get an M4 Pro Mini even though, on paper, the M4 should be able to do it?
If you are using very high HiDPI resolutions like 3200x1800 and above then yes, you should get the M4 Pro just to be safe.

I still don't understand why this is the case though. At least on paper, the M4 should support this just fine, but the 3200x1800 option is missing on the M4 at least on the 27" 5K, according to several reports here. It's there with the M4 Pro.

What are the advantages of using a 6K monitor?
It's bigger of course. Plus it's a 218 ppi monitor like the 27" 5K. At 2X scaling, default text sizing and screen elements will be the exact same size as the 27" 5K, but with a bigger screen.

Lol 40-60cm ain’t the normal viewing distance for 32”, 60cm is like the minimum…
OSHA states that the recommended viewing distance for a desktop screen is 50-100 cm.

As for minimum, it really depends on the pixel density. 32" 4K is 138 ppi, which is too low even for 60 cm. Yes, Retina would require at least 63.5 cm for a 32" 4K, but the recently announced 31.5" 5K, for example, is 186 ppi, so it would easily meet OSHA's minimum 50 cm requirement. Retina on a 31.5" 5K monitor is at 47 cm or further, i.e. Acer's 186 ppi 31.5" 5K display is Retina if you follow OSHA's viewing distance recommendations. Thus, I am very much looking forward to this monitor and to other as yet unannounced monitors that would use the same panel. I usually sit at about 55-60 cm from the screen.

Apple's 218 ppi supports a minimum of 40 cm to achieve Retina, but according to OSHA, 40 cm is too close for a desktop screen. I suspect there are a couple of reasons for this. First, the closer the screen, the harder it is to focus for those with presbyopia, causing eye strain. Second, with a large screen that is too close, it means a person may be tilting their body or head in order to view the entire screen, which can lead to neck and back pain.
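
In case anyone wants to check those figures, here's a rough Python sketch of the math (assuming ~1 pixel per arcminute as the "Retina" threshold, which is my own approximation of Apple's claim, not an official formula):

```python
import math

def retina_distance_cm(ppi: float) -> float:
    """Minimum viewing distance (cm) at which a display of the given PPI
    reaches roughly 1 pixel per arcminute for typical 20/20 vision."""
    pixel_pitch_mm = 25.4 / ppi            # physical size of one pixel
    one_arcmin = math.radians(1 / 60)      # 1 arcminute in radians
    return pixel_pitch_mm / math.tan(one_arcmin) / 10

for label, ppi in [('32" 4K (~138 ppi)', 138),
                   ('31.5" 5K (~186 ppi)', 186),
                   ('Apple 27" 5K (218 ppi)', 218)]:
    print(f"{label}: Retina from ~{retina_distance_cm(ppi):.0f} cm")
# -> roughly 63 cm, 47 cm and 40 cm respectively, matching the figures above
```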
 
HiDPI works fine on the M4, but some of the very high HiDPI resolutions do not show up on the M4.


If you are using very high HiDPI resolutions like 3200x1800 and above then yes, you should get the M4 Pro just to be safe.

I still don't understand why this is the case though. At least on paper, the M4 should support this just fine, but the 3200x1800 option is missing on the M4 at least on the 27" 5K, according to several reports here. It's there with the M4 Pro.

Yes, it seems that nobody knows the exact reason and Apple remains silent. Some said it is a hardware problem with the M4 series. Others said Apple disabled those options in software. Some also said the cause is users using cheap cables. It is strange that the M1-M3 series have no such problem.

I just cancelled a 16" MacBook Pro M4 Pro due to this uncertainty. Perhaps I'll just get the cheapest Mini plus an external boot drive in case Apple does not resolve this issue. However, I worry that doing so compromises the Secure Enclave. I read Apple's article about it but I don't fully understand it.
 
Lol 40-60cm ain’t the normal viewing distance for 32”, 60cm is like the minimum…
When doing work and pixel peeping I am definitely in the 40-60 cm range. The desk itself is only 62 cm deep. The Logitech desktop mat is 40 cm deep.

Mind you I usually have two applications open side by side (3840x4320 + 3840x4320) depending on what I am currently doing.

(Don't mind the "cable management" disaster ... working on it).

Computer_desk.jpg
 
Yes, it seems that nobody knows the exact reason and Apple remains silent. Some said it is a hardware problem with the M4 series. Others said Apple disabled those options in software. Some also said the cause is users using cheap cables. It is strange that the M1-M3 series have no such problem.
It's missing on the M1-M3 as well AFAIK, at least with some setups. For example, on both my M1 Mac mini and my M4 Mac mini with my 4K+ 3:2 3840x2560 monitor, I have the HiDPI options of 2304x1536 (which is what I use), 2560x1707, 3008x2005, and 3840x2560. There is nothing in between 3008x2005 and 3840x2560, even though that is a huge jump.

Interestingly though, on my iMac 5K, while I get the default 2X scaled 2560x1440 option, there is no option for 2304x1296.

Cables would not make any sense as being the cause, as the final resolution represents a downscaled / upscaled resolution by the Mac to match the monitor. IOW, when you're using a HiDPI resolution setting, what is sent to the monitor is always the native resolution of the monitor.
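
For anyone unfamiliar with how this works, here's a rough sketch of the HiDPI pipeline as I understand it (simplified, and the resolutions are just example values):

```python
def hidpi_pipeline(looks_like, panel_native):
    """Simplified model of macOS HiDPI scaling: the desktop is rendered at
    2x the 'looks like' resolution, then that framebuffer is resampled to
    the panel's native resolution before being sent over the cable."""
    render = (looks_like[0] * 2, looks_like[1] * 2)   # internal backing store
    output = panel_native                             # what the monitor receives
    return render, output

# Example: "looks like 3200x1800" on a 5K (5120x2880) panel
render, output = hidpi_pipeline((3200, 1800), (5120, 2880))
print(render, "->", output)   # (6400, 3600) -> (5120, 2880)
# Whatever HiDPI setting you pick, the monitor is always fed its native
# resolution, which is why the cable is an unlikely culprit.
```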
 
4K UHD is the lowest accepted video resolution for all major studios and streaming services. They will also not accept anything less than 10-bit 240 Mbps intraframe codecs. It's true that everything is mastered to 4K UHD, but that's not necessarily the resolution of the actual recording.

Most approved cinema cameras shoot much higher resolution and either lightly compressed or uncompressed RAW. A few cameras on the approved lists only shoot at 4K DCI (4096x2160). One of the advantages of shooting at much higher resolution is framing, zoom and crop control or the ability to oversample 8K to 4K for improved color fidelity, "sharpness / crispness" and better shadow quality. VFX plates and motion capture are also much better with higher resolution video.
Which is why I'm speaking about deliverables -- content people actually display on their monitors, since this is a discussion of displays and not video recording. I would also point out that professional cinema studios use equipment far beyond the consumer market which is where Apple lives.

8k RAW footage is generally over 100 GB per minute of recording. We're a very long way off of using anything beyond 4K for deliverables since we don't even use low-compression 4K. 8K or beyond at the same bitrate doesn't really look any better so until the bitrate improves (which there's tons of headroom in 4K still), I don't think you'll see it widely adopted.
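
Just to put rough numbers on that (back-of-the-envelope only, and the frame rate and bit depth below are my own assumptions rather than any specific camera's spec):

```python
def gb_per_minute_from_bitrate(mbps: float) -> float:
    """Storage per minute for a compressed delivery/mezzanine codec."""
    return mbps / 8 / 1000 * 60          # Mbps -> MB/s -> GB/min

def gb_per_minute_uncompressed(w: int, h: int, bit_depth: int, fps: int) -> float:
    """Rough uncompressed upper bound for one minute of footage."""
    bytes_per_frame = w * h * bit_depth / 8
    return bytes_per_frame * fps * 60 / 1e9

print(gb_per_minute_from_bitrate(240))                 # ~1.8 GB/min at 240 Mbps
print(gb_per_minute_uncompressed(7680, 4320, 12, 30))  # ~90 GB/min for 8K 12-bit 30 fps
# Lightly compressed 8K RAW at higher frame rates easily lands north of 100 GB/min.
```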

Working on a high resolution display gives you the ability to show the full resolution video for delivery (4K UHD) while still giving you access to your timeline and controls. It is also awesome for photography and printed media. Even Microsoft Excel warriors will love the added screen real estate etc.
It doesn't give you more "real estate" because there's a minimum sizing that your eyes will tolerate, and 4K already saturates that until you get to massive sizes. If you really need more real estate, there's no substitute for more display size like adding a second monitor. Nobody says "wow I wish I could see more Excel cells on my monitor, I better get one the same size with more pixels". :p That argument only worked when displays were such low resolution that they couldn't clearly display things as small as you wanted them but with 4k, we're beyond that standard (who uses 1:1 UI scaling with 4k or 5k?).

Similarly with video editing on 5K displays: there still isn't enough room for all the editing tools and 4K video at full resolution. And if there were, is it too small to matter? That's why people will use a second display for monitoring output so they can see how it will actually look to viewers. Or just do half-res for all your editing and spot-check at full resolution.

Same with photo editing: 8K is still only 33MP, about half the pixels of my A7RV. But I don't want an 8k or 16k display so I can see more pixels at once -- it doesn't help me one bit to see what's going on if the pixels are all too small. The client then opens up on their 4k display and sees flaws in the image that were too small to be seen, or moire effects and other artifacts that were masked by the extreme pixel density of the 8k/16k. It simply doesn't help you.

It sounds like you haven't seen a 31.5-inch 4K UHD and 8K UHD monitor next to each other. The difference is absurdly clear at normal viewing distances (40-60 cm) and extreme when pixel peeping high resolution media at closer distances.

Before Apple decided to push the world into the "retina" era most PCs would still come with 8-bit TN panels with downright bad viewing angles and low resolution. We are pushing boundaries here.
The difference between 4K->8K->16K is diminishing and the amount of data required to feed it increasing exponentially. Again, for anything in the consumer and prosumer market, it's still a long ways off being a standard.

Apple Studio Display is still 8-bit and 60Hz for $1500+, not exactly "pushing boundaries".

Anyways, the discussion went from 5k and 6k to now 8k and 16k. My point was 5k and 6k are not standards outside of the Mac world and exist primarily for Apple scaling purposes. 4K is the current standard and I suspect will be for some time.
 
When doing work and pixel peeping I am definitely in the 40-60 cm range. The desk itself is only 62 cm deep. The Logitech desktop mat is 40 cm deep.
...
Why not just pixel peep by zooming in when you need to and save your eyesight? Even on 4K you're only just starting to pixel peep by 40cm (15") if your eyes are extremely sharp.
 
Anyways, the discussion went from 5k and 6k to now 8k and 16k. My point was 5k and 6k are not standards outside of the Mac world and exist primarily for Apple scaling purposes. 4K is the current standard and I suspect will be for some time.
I am happy to see 5K coming to the PC gaming world now. I guess CPUs and GPUs are now fast enough at the uber high end to actually support 5K gaming. And for those who can't 5K game, they can use the monitor's 2.5K 1440p mode. And yes, that gaming 5K monitor supports 144 Hz.

Also, given it has multiple inputs, such a 5K gaming monitor could support both a Mac at 5K and a gaming PC at 2.5K without having to unplug anything.
 
It's missing on the M1-M3 as well AFAIK, at least with some setups. For example, on both my M1 Mac mini and my M4 Mac mini with my 4K+ 3:2 3840x2560 monitor, I have the HiDPI options of 2304x1536 (which is what I use), 2560x1707, 3008x2005, and 3840x2560. There is nothing in between 3008x2005 and 3840x2560, even though that is a huge jump.

Interestingly though, on my iMac 5K, while I get the default 2X scaled 2560x1440 option, there is no option for 2304x1296.

Cables would not make any sense as being the cause, as the final resolution represents a downscaled / upscaled resolution by the Mac to match the monitor. IOW, when you're using a HiDPI resolution setting, what is sent to the monitor is always the native resolution of the monitor.
Samsung 55" 4K. From HDMI. From HDMI to Thunderbolt 4. MacOS Sequoia Studio M2 Max
 

Attachments: Screenshot 2025-01-12 at 21.34.11.png · Screenshot 2025-01-12 at 21.30.04.png · Screenshot 2025-01-12 at 21.35.43.png
I am happy to see 5K coming to the PC gaming world now. I guess CPUs and GPUs are now fast enough at the uber high end to actually support 5K gaming. And for those who can't 5K game, they can use the monitor's 2.5K 1440p mode. And yes, that gaming 5K monitor supports 144 Hz.

Also, given it has multiple inputs, such a 5K gaming monitor could support both a Mac at 5K and a gaming PC at 2.5K without having to unplug anything.
Well, 5K is a 20% reduction in pixel size while requiring 77% more pixels.

That's a massive increase in GPU requirements, both VRAM and power, for a very small increase in perceived detail. 4K 120 Hz gaming already requires not only the most powerful cards available, but also tricks like AI upscaling and frame generation, and it still comes at a very big increase in cost. DLSS4 generates 3 artificial frames for every 1 rendered frame, and if I'm not mistaken, that one 4K render is upscaled via AI from 1440p.

So will your game look or feel better with 5k and more of that trickery going on? I'm not so sure. 5K 120Hz is really only just now possible with Thunderbolt 5.
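
A quick sanity check on the pixel math (the DLSS internal render resolution is my assumption based on the performance-style presets, not a measured figure):

```python
def pixels(w: int, h: int) -> int:
    return w * h

uhd_4k = pixels(3840, 2160)   # ~8.3 MP
five_k = pixels(5120, 2880)   # ~14.7 MP

print(f"5K has {five_k / uhd_4k - 1:.0%} more pixels than 4K")      # ~78%

# With AI upscaling from an internal 1440p render plus frame generation
# (1 rendered frame + 3 generated), the GPU rasterizes far fewer pixels
# than the panel's native count per displayed frame:
internal = pixels(2560, 1440)
print(f"internal render is {internal / five_k:.0%} of native 5K")   # ~25%
```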
 
Well, 5K is a 20% reduction in pixel size while requiring 77% more pixels.

That's a massive increase in GPU requirements, both VRAM and power, for a very small increase in perceived detail. 4K 120 Hz gaming already requires not only the most powerful cards available, but also tricks like AI upscaling and frame generation, and it still comes at a very big increase in cost. DLSS4 generates 3 artificial frames for every 1 rendered frame, and if I'm not mistaken, that one 4K render is upscaled via AI from 1440p.

So will your game look or feel better with 5k and more of that trickery going on? I'm not so sure. 5K 120Hz is really only just now possible with Thunderbolt 5.
The Acer monitor I'm talking about doesn't even support Thunderbolt. The 5K 144 Hz is via HDMI or DisplayPort. And like I said, it has a specific 2.5K gaming mode too.

Personally though, I don't game, and I'm not really concerned about 144 Hz either. I'm more interested in the fact that it isn't 218 ppi. I'm not a huge fan of 218 ppi on Mac desktops. I want a lower Retina pixel density. 200 ppi would be perfect, but this 31.5" 5K monitor's 186 ppi should be good too.
 
It doesn't give you more "real estate" because there's a minimum sizing that your eyes will tolerate, and 4K already saturates that until you get to massive sizes. If you really need more real estate, there's no substitute for more display size like adding a second monitor. Nobody says "wow I wish I could see more Excel cells on my monitor, I better get one the same size with more pixels". :p That argument only worked when displays were such low resolution that they couldn't clearly display things as small as you wanted them but with 4k, we're beyond that standard (who uses 1:1 UI scaling with 4k or 5k?).

Similarly with video editing on 5K displays: there still isn't enough room for all the editing tools and 4K video at full resolution. And if there were, is it too small to matter? That's why people will use a second display for monitoring output so they can see how it will actually look to viewers. Or just do half-res for all your editing and spot-check at full resolution.

Same with photo editing: 8K is still only 33MP, about half the pixels of my A7RV. But I don't want an 8k or 16k display so I can see more pixels at once -- it doesn't help me one bit to see what's going on if the pixels are all too small. The client then opens up on their 4k display and sees flaws in the image that were too small to be seen, or moire effects and other artifacts that were masked by the extreme pixel density of the 8k/16k. It simply doesn't help you.


The difference between 4K->8K->16K is diminishing and the amount of data required to feed it increasing exponentially. Again, for anything in the consumer and prosumer market, it's still a long ways off being a standard.

Apple Studio Display is still 8-bit and 60Hz for $1500+, not exactly "pushing boundaries".

Anyways, the discussion went from 5k and 6k to now 8k and 16k. My point was 5k and 6k are not standards outside of the Mac world and exist primarily for Apple scaling purposes. 4K is the current standard and I suspect will be for some time.
You could have just said no. It's clear you haven't seen an 8K and 4K display next to each other and evaluated it yourself.

We are not getting a consensus here besides that we disagree.
 
It's worth noting that all the Apple shills here can't even see anything on these desktop 218 PPI screens, as objects are too small 💀

Apple came up with this PPI number for their laptop screens and proclaimed it to be The One and Only, but they forgot to consider that desktops are meant to be viewed from farther away. Now all Mac desktop users suffer*, though only a few are bold enough to say the quiet part out loud.

* unless they buy 4K 27" displays, of course.
1440p (or HiDPI 5K) on a 27" has been a standard for 15 years now. I have no problem seeing small text and other objects on my monitors even though they are beyond arm's length away from me. 1080p (or HiDPI 4K) on a 27" renders everything way too big.
 
I agree with your estimates, although I'm not optimistic for US$1999 retail. I'm thinking more like US$2299 MSRP or so. However, we also should remember that LG's retail prices are basically suggestions, not fixed pricing like Apple's. For example, the 27" 5K was lucky (goldstar) US$888 on Amazon USA last fall.


Yep, the included stand alone will make things way, way cheaper. If they have the LG 6K + tilt/shift stand on sale next year for under CA$2500 (which is about US$1740), that would be really tempting.
This guy is reporting $1199 for the Asus 6K 32”. 😮

 
This is incorrect. Apple itself uses non-integer scaling by default for some of its Retina displays.

I believe it is 178% for the MacBook Air.
Yes, because Apple hates their customers and saves a few pennies by shipping low-resolution screens in the MacBook Air, and in MacBook Pros until the 2021 models.

But it looks atrocious, and it's the first setting I would disable on such a laptop (it should be 200%, otherwise it looks blurry).
 
Yes, because Apple hates their customers and saves a few pennies by shipping low-resolution screens in the MacBook Air, and in MacBook Pros until the 2021 models.

But it looks atrocious, and it's the first setting I would disable on such a laptop (it should be 200%, otherwise it looks blurry).
It looks totally fine to the vast majority of the population in normal usage. That's the point of Retina after all.

Anyhow, nice backtrack, after it was pointed out that Apple itself doesn't even use 200% scaling on all its Retina displays.
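
For reference, the effective scale factor just falls out of native width divided by the "looks like" width. A quick sketch (the Air panel numbers here are my assumption and approximate, not official figures):

```python
def scale_factor(native_w: int, looks_like_w: int) -> float:
    """Effective UI scale = native pixels per 'looks like' point."""
    return native_w / looks_like_w

# Assumed 13.6" MacBook Air panel: 2560x1664 native, default "looks like" 1470x956
print(f"{scale_factor(2560, 1470):.0%}")   # ~174% -- non-integer scaling by default
print(f"{scale_factor(2560, 1280):.0%}")   # a true 200% would mean "looks like 1280x832"
```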
 
You could have just said no. It's clear you haven't seen an 8K and 4K display next to each other and evaluated it yourself.

We are not getting a consensus here besides that we disagree.
You could have just said you're making up your own standard. Why would I care for 8k or 16k when even going from 4k to 5k or 6k doesn't really matter to me?

If you had a counter-point, you'd have made it.

The market will ultimately speak as its own consensus, as it always has, and it says 4k is the standard now and will be for some time -- for content, for gaming, for televisions, for monitors, and so on. I think 120 Hz or above will be standardized before 8k.
 
You could have just said you're making up your own standard. Why would I care for 8k or 16k when even going from 4k to 5k or 6k doesn't really matter to me?

If you had a counter-point, you'd have made it.

The market will ultimately speak as its own consensus, as it always has, and it says 4k is the standard now and will be for some time -- for content, for gaming, for televisions, for monitors, and so on. I think 120 Hz or above will be standardized before 8k.
Sure.

However I think Apple made the right choice by going Retina as that was a great way to bring display improvements. It is definitely more "wow" than going from 16.7M to 1B colors, although I am pretty sure the newest LM270QQ2-SPA3 panel used in several 27-inch monitors is 10-bit.

I still remember holding the iPhone 4 in 2010 and being in awe.

With that said 4K HiDPI at 32-inches looks way better than 4K at the same size. This scaling improvement holds true at any size, from the handheld iPhone 4 to large monitors like the one I have.

Because text is much more legible with 4 times the pixels, you can easily use smaller fonts and display large media at 1:1 without scaling. High PPI screens do give you more usable real estate.
 
Ugh, such an ugly display. I'd rather pay more for at least some aluminum housing, than having to use this plastic screen.
I personally think the better colour calibration, multiple input selection, built-in KVM, and, of course, the lower price of the ProArt would be far more useful to someone's workflow and budget than an aluminum shell would be. But that's just the opinion of some weird guy who thinks that the front of a monitor is more important than the back of it.
 
Ugh, such an ugly display. I'd rather pay more for at least some aluminum housing, than having to use this plastic screen.
My first thought on seeing that was that Apple's XDR has metal.

This isn't unlike the debates over the Apple Studio Display - a reviewer may praise the build quality, talk about how, when you grasp and reposition it, it doesn't flex or creak, and how, if you opt for the height-adjustable stand, you can raise and lower it with one hand and it's so smooth.

Someone preferring the much lower price of the plastic-cased competitors may then say: I rarely shift it around, slight flexing and creaking when I do is no big deal, I have 2 hands, and I ain't paying an extra $400 so I can shift it up and down a bit even if it is 'so smooth.'

The price point may be key here. At roughly $1,200, especially if you don't need to pay hundreds for a beefed-up stand and an extended warranty (looking at you, AppleCare+), it may be at the upper level of what many home users might justify for personal enjoyment in a display they plan to use for a decade. Somebody shopping $400 - $500 4K 27" monitors, or the $800 ASUS 5K 27" as a self-indulgence buy, might see that $1,200 as an extravagance, but you know... it would be so sweet...

Jacking it up to $1,500 (oh, '$1,499' maybe?) might take it above the range many people are willing to pay.

Different strokes for different folks. Yesterday I drove my Toyota Corolla by somebody in a hot yellow Corvette. Same road, same function. Style was worth more to him.
 
I personally think the better colour calibration, multiple input selection, built-in KVM, and, of course, the lower price of the ProArt would be far more useful to someone's workflow and budget than an aluminum shell would be. But that's just the opinion of some weird guy who thinks that the front of a monitor is more important than the back of it.
I don’t—it’s about equally important to me. That’s why I’ve always appreciated Apple products. As a creative person—composer, musician, writer, etc.—design is really important to me.

When Apple reached a wider audience with the iPhone, there was a large influx of more “normative” consumers into the mix, for all Apple products. This is unfortunate, as Apple was traditionally more of a haven in tech for people who think, well, different.
 
Sure.

However I think Apple made the right choice by going retina as that was a great way to bring display improvements. It is definitely more "wow" than going from 16.7M to 1B colors, although I am pretty sure the newest LM270QQ2-SPA3 used in several 27-inch monitors are 10-bit.

I still remember holding the iPhone 4 in 2010 and being in awe.
First, sorry that I had you and another poster mixed up regarding the standards thing!

I agree it was a great move and I also remember how amazing the iPhone 4 retina display was -- truly remarkable compared to the 3GS. But I haven't had that same experience since. I think we're at the limit where increasing pixel density is hardly noticeable compared with how incredible it used to be. Going from 4k to 5k to 6k to 8k is underwhelming for the cost.

The Studio Display panel is still 8-bit and uses trickery to try to display 10-bit colours. I would argue that there's little point in having a 10-bit display even if you're working on 10-bit video. The extra data from the 10-bit is really useful for storing the data, especially video recorded in a log format, for the calculations the computer does in colour grading and exposure correction. Same with photography which is often 14-bit colour in RAW -- can't and don't care to see it on a monitor but crucial for the calculations done in post-processing the images.
Because text is much more legible with 4 times the pixels, you can easily use smaller fonts and display large media at 1:1 without scaling. High PPI screens do give you more usable real estate.
There's a limit to how small your UI and text can get, though. When Apple halved the pixel size in the 27" iMac from 1440p to 2880p, they simply scaled it 2:1 with "looks like 1440p" so everything looks exactly the same size with no more screen real estate. 4k already saturates that minimum size sharply so I really don't think going 5k, 6k, etc will help you get more content on your screens.

You can make it smaller but is it usable? I don't know anyone who scales smaller than "looks like 1440p" on their 27".
 