I recently picked up a Dell UP3216Q 32-inch 4K monitor. I love it for Affinity Photo, as I can see my images at 100% for an installation I'm working on; it beats zooming in and out with my old Dell U3011.
The problem I've run into is that the UI elements are tiny, and scaling with the different settings easily chews through the 8GB of VRAM on my RX 580. Is there a way to keep 4K resolution and ONLY scale the UI and menu bars? It's ridiculous that the menu bar has newspaper-sized UI at arm's length.

Also, some of my older games and software don't play nicely with 4K. Is there a quick and easy resolution menu option so I can just switch the screen to 720p when I need it?
 
Conclusion: For all Mac users, changing the resolution only affects the size of things, not image quality or text sharpness. So don't be afraid to change the resolution on Macs to get larger text.
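To put rough numbers on that (my own sketch, using typical macOS "looks like" settings on a hypothetical 3840x2160 panel): the output stays at the panel's native resolution no matter which scaled setting you pick; macOS just draws the UI into a 2x backing store of the "looks like" size first, and downsamples it when that store is bigger than the panel.

Code:
// Rough sketch: typical "looks like" settings on a hypothetical 3840x2160 panel.
// macOS draws the UI at 2x the "looks like" size and, when that backing store is
// larger than the panel, downsamples it to native - so the signal stays 4K.
let native = (w: 3840, h: 2160)
let looksLike = [(1920, 1080), (2560, 1440), (3008, 1692)]

for (w, h) in looksLike {
    let backing = (w: w * 2, h: h * 2)   // HiDPI backing store
    let megapixels = Double(backing.w * backing.h) / 1_000_000
    print("looks like \(w)x\(h) -> rendered at \(backing.w)x\(backing.h) " +
          "(\(String(format: "%.1f", megapixels)) MP), output still \(native.w)x\(native.h)")
}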
Yes it does, the same way ZBrush does.

That does not really solve the issue, as I find macOS text either too small (4K native) or too big (looks like 1080p), and everything between those two settings takes a big performance hit.

I wish macOS could scale the UI text the way Blender does. I am not a fan of the way it scales at the moment.

I actually nearly went for the Studio Display, as it looks to have the perfect screen resolution ratio, but decided not to because:
1. The nano-texture glass doesn't seem great in terms of cleanability.
2. Even if I only add the height-adjustable stand, I can get two of those Eizos for the money.
3. Calibrating it: not sure how well that would work, and I don't think it has hardware calibration.
4. The Studio Display is not a true 10-bit panel but 8-bit + FRC.

I might have tested it if it were the same price with the height-adjustable stand. I find it a bit stupid to pay about 400 bucks for basic ergonomics.
You’re right. No hardware-calibrated monitor. Indeed not 10-bit. Between the Studio Display and a ColorEdge Eizo monitor for photography, I would choose the Eizo any time… as you… and I did :).
The Studio Display doesn’t attract me at all.
 
4. The Studio Display is not a true 10-bit panel but 8-bit + FRC.
You’re right. No hardware-calibrated monitor. Indeed not 10-bit.

Interesting. Sifting through Apple-speak and its marketing BS, I suspect some models are indeed not true 10-bit panels. But I think the panels in the iMac 5K and Studio Display are true 10-bit. What evidence has made you think otherwise, other than their price points?
 
Interesting. Sifting through Apple-speak and its marketing BS, I suspect some models are indeed not true 10-bit panels. But I think the panels in the iMac 5K and Studio Display are true 10-bit. What evidence has made you think otherwise, other than their price points?
Not sure if anyone really knows, but it is pretty much the same panel as the LG 5K, and that one is 8-bit + A-FRC.

Things would be easier if Apple were just more straightforward with its spec sheets.

And I have seen things like this in reviews: “Apple also confirmed this is not a true 10-bit panel. There were several mentions of "over 1 billion colors" during the presentation, but it's an 8-bit panel with temporal dithering (AKA Frame Rate Control), just like the LCD panels in the 14- and 16-inch MacBook Pros. The only true 10-bit panel in Apple's lineup is still the Pro Display XDR, and it's safe to assume that it will stay that way.”
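For anyone wondering what that temporal dithering actually means in practice: the panel can only show 256 levels per channel, so it flickers between the two nearest 8-bit levels across frames to fake the in-between 10-bit value. A toy sketch of the general idea (my own illustration, not Apple's or the panel maker's actual algorithm):

Code:
// Toy illustration of FRC (temporal dithering): approximate a 10-bit level
// (0...1023) on an 8-bit panel by alternating the two nearest 8-bit levels.
func frcFrames(for tenBit: Int) -> [Int] {
    let lower = tenBit / 4                 // nearest 8-bit level below
    let upper = min(lower + 1, 255)        // next 8-bit level up
    let fraction = tenBit % 4              // how far between them (0...3)
    // Show `upper` on `fraction` of every 4 frames, `lower` on the rest.
    return (0..<4).map { $0 < fraction ? upper : lower }
}

// 10-bit 514 sits halfway between 8-bit 128 and 129:
print(frcFrames(for: 514))   // [129, 129, 128, 128] - averages out to ~128.5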

 
Indeed, Apple *confirmed* to the dpreview reviewer that the panel (in the Studio Display) is not true 10-bit!

Damn. I have to downgrade the meaning of Apple's lingua franca. Now I think those models that I suspect are not "true 10-bit" are even sh**tier.
 
Basically, there's nothing wrong with the Apple Studio Display. The build quality and overall image quality in particular seem great, and 8-bit + FRC doesn’t have to be a major problem either. But… for specific usage like photo editing/printing, I chose another monitor.
 
Indeed, Apple *confirmed* to the dpreview reviewer that the panel (in the Studio Display) is not true 10-bit!

Damn. I have to downgrade the meaning of Apple's lingua franca. Now I think those models that I suspect are not "true 10-bit" are even sh**tier.
You shouldn’t suspect in terms of “sh**tier”. Of course 10-bit is better than 8-bit + FRC, but 8-bit + FRC can be very good as well. I think even Eizo has 8-bit + FRC monitors in their ColorEdge lineup (not 100% sure). True 10-bit is better, but you have to ask yourself how well you’ve managed with non-10-bit panels so far (if you’ve used them, of course); if there wasn’t any problem, then “no worries” ;)
 
You shouldn’t suspect in terms of “sh**tier”. Of course 10-bit is better than 8-bit + FRC, but 8-bit + FRC can be very good as well. I think even Eizo has 8-bit + FRC monitors in their ColorEdge lineup (not 100% sure). True 10-bit is better, but you have to ask yourself how well you’ve managed with non-10-bit panels so far (if you’ve used them, of course); if there wasn’t any problem, then “no worries” ;)

Taking a little step back, I agree with you. The surprise is more about having overestimated my understanding of Apple's marketing Greek.

I'm more of a coder, only a casual photo/video editor, so it should be good enough for my usage. I just thought technology had advanced so much in the past decade that true 10-bit panels were finally accessible to more people.

Turns out the 10-bit panel game is still in its early days. Even Apple isn't able to bring it to the masses at the Studio Display's price point.
 
Wait 'till a microLED/true HDR version comes out in 6-12 months, then you'll break the other leg :)
If you can justify the cost, there's no problem (or why not get the Pro Display XDR while you're at it) - but I'd reiterate that you can get 2-3 half-decent 4K screens for the price, and enjoy extreme "real estate".
I definitely don't need mini-LED or XDR here. My 14" MBP has an XDR display, and quite frankly it only gets used occasionally to go "oooh" and "ahhh" at some dude walking around Tokyo with an HDR camera on YouTube, but that's about it.

As for lots of 4K screens, I'll pass on that. I'll let the younger folk in on a little secret: one screen and cmd+tab is the optimum amount of screen. You will find this out when you hit about 40-45 years old and realise your neck hurts all the time.

Alas, I just hit a wall of financial difficulty, so I'm probably going to have to shrivel up and go back to the old 4K Iiyama display anyway.
 
Do you notice extra power consumption at idle (in the GPU, the system, etc.) when 5K is downsampled to 4K, e.g. by comparing "4K 1440p" against "4K 1080p"?
I just tried a non-productive scenario with my Intel MacBook Pro (Intel Iris Plus 645 with native resolution 2560x1600).
Short version: No relevant difference in power consumption.

Playing a 1080p H.264 MKV with IINA in fullscreen and overlaying the iStat Menus GPU graph:
  • CPU: in both cases IINA and WindowServer use the same amount.
  • GPU:
    • [1] Using "looks like 1280x800" (Retina 2x): 53% processing, 6% memory
    • [2] Using "looks like 1680x1050" (rendered at 3360x2100, then downsampled to 2560x1600): 54% processing, 8% memory

Edit: I know it's not 4K / 1080p, but it should give an idea of how many resources the additional downscaling step needs (not talking about the negative impact when doing GPU-intensive tasks/rendering). By the way, the base model Mac Pro 2013 with 2GB D300 GPUs is officially capable of driving three 5K displays at once. That means 44 megapixels @ 60 Hz with almost 9-year-old technology.
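To put the two scenarios above into raw pixel counts (just my own back-of-the-envelope numbers, using the resolutions already mentioned):

Code:
// Back-of-the-envelope pixel counts for the scenario above.
// "Looks like 1680x1050" renders a 2x backing store of 3360x2100,
// which then gets downsampled to the panel's native 2560x1600.
let nativePixels  = 2560 * 1600          // ~4.1 MP actually sent to the panel
let backingPixels = 3360 * 2100          // ~7.1 MP rendered first
print(Double(backingPixels) / Double(nativePixels))   // ~1.72x the pixels to render

// The three-5K-displays figure for the 2013 Mac Pro:
let threeFiveK = 3 * 5120 * 2880
print(Double(threeFiveK) / 1_000_000)                 // ~44.2 megapixels at 60 Hz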
 
Taking a little step back, I agree with you. The surprise is more about having overestimated my understanding of Apple's marketing Greek.

I'm more of a coder, only a casual photo/video editor, so it should be good enough for my usage. I just thought technology had advanced so much in the past decade that true 10-bit panels were finally accessible to more people.

Turns out the 10-bit panel game is still in its early days. Even Apple isn't able to bring it to the masses at the Studio Display's price point.
True. I think the problem is that 5K panels are rare; besides the LG and the Apple, I could not find another real 5K (not counting the ultrawide screens).

If the Studio Display were 10-bit and not 8-bit + FRC, I bet it would easily be 1-1.5k more.
 
Also, some of my older games and software don't play nicely with 4K. Is there a quick and easy resolution menu option so I can just switch the screen to 720p when I need it?
In Display Settings, Option-click the "Scaled" option, then check "Show all resolutions" ("Show low resolution modes" on earlier OS versions) and you'll see some modes marked "low resolution" which really do output the stated resolution.

Many full-screen games have their own option to choose the resolution, though.
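If you'd rather see it programmatically, CoreGraphics can list the same modes. A minimal sketch in Swift (assuming the main display; kCGDisplayShowDuplicateLowResolutionModes is the CoreGraphics option that, as far as I know, makes the low-resolution duplicates show up): the "low resolution" entries are the ones whose pixel size equals their point size, while the HiDPI entries render more pixels than they advertise.

Code:
import CoreGraphics

// List the modes of the main display, including the duplicate
// low-resolution (non-HiDPI) variants that the settings panel
// normally hides behind "Show all resolutions".
let display = CGMainDisplayID()
let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue] as CFDictionary

if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
    for mode in modes {
        let isHiDPI = mode.pixelWidth > mode.width   // scaled mode renders extra pixels
        print("\(mode.width)x\(mode.height) points, " +
              "\(mode.pixelWidth)x\(mode.pixelHeight) pixels" +
              (isHiDPI ? " (HiDPI)" : " (low resolution)"))
    }
}

On a 4K panel, the low-resolution 1920x1080 entry really is a 1920x1080 framebuffer, which is why it looks soft but is exactly what older games expect.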
 
Nope.

"When I was a young boy dadabdada...."

2 or more screens can be a pain when you have your windows scattered more or less randomly, but with a little bit of planning it can be perfect.

aka your mileage may vary
How do you use 2 screens? Just curious.
 
A factor that stopped me getting a 4K display years ago was that I prefer matte screens.
And it's less strain on your GPU (depending on what you're doing).
I believe most 4K monitors sold now have matte screens, including this year's UP series from LG. But, for instance, the LG UltraFine 23.7" 24MD4KL-B has a glossy screen; I don't like that on external monitors.

I will share this, as that is what I discovered too, and I think he is right.
I'd say it depends on your workflow and the apps you use. Here's a comprehensive test of some GPU-intensive apps, and the slowdown due to scaling is not problematic until you go to extremes (3 or 4 external 4K monitors). In general, it looks like Apple designed the GPU in its ARM chips with hardware-accelerated scaling, unlike all the older designs with AMD or Intel video chips, which cannot handle scaling as easily (the MBP 16" is included for comparison; it does much worse despite having a discrete GPU):
 
How do you use 2 screens? Just curious.
Mac side:

Two 4K 32" BenQs, so rather cheap.
Main "work" display straight ahead, with a slight offset to the right.
2nd "content" display on the left, angled towards me (30°, I would say).

If I need to keep something up for reference, it might get pushed to the left.


MorphOS side:
20" iSight iMac G5 for the desktop, debug output, etc. on the right
22" LG just for the IDE and maybe running the SW I just compiled (depends on the project)

This is more of a HW limitation, as one 20/22" is just too small, and I don't want to drag out the PowerMac to run the 27".

The first dual-screen setup I had was in 2002, running Amithlon (a Linux-based Amiga emulator) on two 18" LCDs, and I've always felt cramped when restricted to one screen.
 
Mac side:

Two 4K 32" BenQs, so rather cheap.
Main "work" display straight ahead, with a slight offset to the right.
2nd "content" display on the left, angled towards me (30°, I would say).

If I need to keep something up for reference, it might get pushed to the left.


MorphOS side:
20" iSight iMac G5 for the desktop, debug output, etc. on the right
22" LG just for the IDE and maybe running the SW I just compiled (depends on the project)

This is more of a HW limitation, as one 20/22" is just too small, and I don't want to drag out the PowerMac to run the 27".

The first dual-screen setup I had was in 2002, running Amithlon (a Linux-based Amiga emulator) on two 18" LCDs, and I've always felt cramped when restricted to one screen.
Thank you. Sounds good. Should be possible with two 27" screens as well, of course. Do you edit photos as well? If so, how, with two screens?
 
Do you edit photos as well?

Sure, cutting and resizing them in "Preview" so they can be uploaded to forums :p

For real, none of my screens would be good enough for serious photo editing.
I do some coding, but also just as a hobby, in the same way others might solve sudoku puzzles.
 
In Display Settings, Option-click the "Scaled" option, then check "Show all resolutions" ("Show low resolution modes" on earlier OS versions) and you'll see some modes marked "low resolution" which really do output the stated resolution.

Many full-screen games have their own option to choose the resolution, though.
What's the difference between, say, 2560x1440 and 2560x1440 (low resolution)?
 
Sure, cutting and resizing them in "Preview" so they can be uploaded to forums :p

For real, none of my screens would be good enough for serious photo editing.
I do some coding, but also just as a hobby, in the same way others might solve sudoku puzzles.
OK, thanks.
 