> It is fine, I will stick with my Eizo order.

It really depends on which VESA mount you get. There are some really good ones on Amazon for a good price.
Conclusion: for Mac users, changing the resolution only affects the size of things, not image quality or text sharpness. So don't be afraid to change the resolution on Macs to get larger text.
> You're right. No hardware-calibrated monitor. Indeed not 10-bit.

Between the Studio Display and a ColorEdge Eizo monitor for photography, I would choose the Eizo any time… as you… and I did.

Yes, it does, the same way ZBrush does.
Does not really solve the issue, as I find macOS text either too small (4K native) or too big (looks like 1080p), and everything between those two settings takes a big performance hit.
I wish macOS could scale the UI text the way Blender does. I am not a fan of how it scales at the moment.
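For what it's worth, the performance hit of the in-between settings comes from HiDPI rendering: macOS draws the desktop at 2x the "looks like" resolution and then downsamples to the panel. A rough back-of-the-envelope sketch (my own numbers, assuming the usual 2x backing store, not Apple's exact pipeline):

```python
# Rough sketch of why intermediate scaled modes cost GPU time on a 4K panel.
# Assumption: in a HiDPI scaled mode, macOS renders at 2x the "looks like"
# resolution, then downsamples that backing store to the panel's native pixels.

PANEL = (3840, 2160)  # native 4K panel

def rendered_pixels(looks_like):
    """Pixels in the 2x backing store for a given 'looks like' resolution."""
    w, h = looks_like
    return (w * 2) * (h * 2)

native = PANEL[0] * PANEL[1]                 # pixels the panel actually has
looks_1080p = rendered_pixels((1920, 1080))  # backing store == native, cheap
looks_1440p = rendered_pixels((2560, 1440))  # backing store > native

print(looks_1080p / native)  # 1.0   -> "looks like 1080p" maps exactly onto 4K
print(looks_1440p / native)  # ~1.78 -> ~78% more pixels rendered, then scaled down
```

So "looks like 1440p" on a 4K display means rendering roughly 14.7 megapixels per frame and scaling them down, which is where the hit comes from.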
I actually nearly went for the Studio Display, as it looks to have the perfect screen resolution ratio, but decided not to, because:
1. Nano-texture glass doesn't seem great in terms of cleanability.
2. Even counting just the version with the height-adjustable stand, I can get two of those Eizos for the price.
3. Calibration: not sure how well that would work, and I don't think it has hardware calibration.
4. The Studio Display is not a true 10-bit panel but 8-bit + FRC.
I might have tested it if it were the same price with the height-adjustable stand. I find it a bit silly to pay about 400 bucks for basic ergonomics.
> 4. The Studio Display is not a true 10-bit panel but 8-bit + FRC.
You're right. No hardware-calibrated monitor. Indeed not 10-bit.
> Not sure if anyone really knows, but considering it is pretty much the same as the LG 5K, and that one is 8-bit + A-FRC.

Interesting. Sifting through Apple-speak and its marketing BS, I suspect some models are indeed not true 10-bit panels. But I think the panels in the iMac 5K and the Studio Display are true 10-bit. What evidence has made you think otherwise, other than their price points?
> You shouldn't suspect in terms of "sh**tier". Of course 10-bit is better than 8-bit + FRC, but 8-bit + FRC can be very good as well. I think even Eizo has 8-bit + FRC monitors in their ColorEdge lineup (not 100% sure). True 10-bit is better, but you have to ask yourself how well you have managed with non-10-bit panels so far (if you used them, of course); if there wasn't any problem, then "no worries".

Indeed, Apple *confirmed* to the dpreview reviewer that the panel (in the Studio Display) is not true 10-bit!
Damn. I have to downgrade the meaning of Apple's lingua franca. Now I think the models I suspect are not "true 10-bit" are even sh**tier.
You shouldn't suspect in terms of "sh**tier". Of course 10-bit is better than 8-bit + FRC, but 8-bit + FRC can be very good as well. I think even Eizo has 8-bit + FRC monitors in their ColorEdge lineup (not 100% sure). True 10-bit is better, but you have to ask yourself how well you have managed with non-10-bit panels so far (if you used them, of course); if there wasn't any problem, then "no worries".
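For the curious: FRC (frame rate control) fakes the extra two bits by flickering a pixel between adjacent 8-bit levels over successive frames so the temporal average lands on the 10-bit value. A minimal sketch in Python (my own illustration of the idea, not any vendor's actual algorithm):

```python
# Sketch: how 8-bit + FRC approximates a 10-bit level by temporal dithering.
# Illustration only -- real panels use spatial patterns and smarter sequencing.

def frc_frames(level_10bit: int, n_frames: int = 4) -> list:
    """Return n_frames 8-bit values whose average approximates level_10bit / 4."""
    target = level_10bit / 4.0            # ideal 8-bit value (may be fractional)
    base = int(target)                    # lower of the two adjacent 8-bit levels
    frac = target - base                  # fraction of frames to show base + 1
    high_frames = round(frac * n_frames)  # how many frames use the higher level
    return [min(base + 1, 255)] * high_frames + [base] * (n_frames - high_frames)

frames = frc_frames(513)           # 10-bit level 513 -> target 128.25
avg = sum(frames) / len(frames)    # [129, 128, 128, 128] -> average 128.25
```

Done fast enough, your eye integrates the flicker into the in-between shade, which is why good 8-bit + FRC panels can be hard to tell from true 10-bit.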
> Definitely don't need mini-LED or XDR here. My 14" MBP has an XDR display and quite frankly it only gets used occasionally to go "oooh" and "ahhh" at some dude walking around Tokyo with an HDR camera on YouTube, but that's about it.

Wait 'till a microLED/true-HDR version comes out in 6-12 months, then you'll break the other leg.
If you can justify the cost, there's no problem (or why not get the Pro Display XDR while you're at it), but I'd reiterate that you can get 2-3 half-decent 4K screens for the price and enjoy extreme "real estate".
> I just tried a non-productive scenario with my Intel MacBook Pro (Intel Iris Plus 645, native resolution 2560x1600).

Do you notice extra power consumption at idle (in GPU, system, …) when 5K is down-sampled to 4K, e.g. by comparing "4K 1440p" against "4K 1080p"?
> True, I think the problem is that 5K panels are rare; besides the LG and the Apple, I could not find another real 5K (not counting the ultrawide screens).

Taking a little step back, I agree with you. The surprise is more about overestimating my understanding of Apple's marketing Greek.
I'm more of a coder, only a casual photo/video editor, so it should be sufficiently good for my usage. I just thought technology had advanced so much in the past decade that true 10-bit panels were finally accessible to more people.
Turns out the 10-bit panel game is still early. Even Apple isn't able to bring it to the masses at the Studio Display's price point.
> In Display Settings, Option-click on "Scaled", then check "Show all resolutions" ("Show low resolution modes" on earlier OSs) and you'll see some modes marked "low resolution" which really do output the stated resolution.

Also, some of my older games and software don't play nice with 4K. Is there a quick and easy resolution menu option so I can just switch the screen to 720p when I need it?
One screen and cmd+tab is the optimum amount of screen.
You will find this out when you hit about 40-45 years.
> How do you use 2 screens? Just curious.

Nope.
"When I was a young boy dadabdada...."
2 or more screens can be a pain when you have your windows scattered more or less randomly, but with a little bit of planning it can be perfect.
aka your mileage may vary
> A factor that stopped me getting a 4K display years ago was that I prefer matte screens.

I believe most 4K monitors sold now have matte screens, including this year's UP series from LG. But, for instance, the LG UltraFine 23.7" 24MD4KL-B has a glossy screen; I don't like that for external monitors.
And it's less strain on your GPU (depending on what you're doing).
> I'd say it depends on your workflow and the apps used. Here's a comprehensive test of some GPU-intensive apps; slowdown due to scaling is not problematic until you go to extremes (3 or 4 external 4K monitors). In general, it looks like Apple designed its ARM chips' GPU with hardware-accelerated scaling, unlike all the older designs with AMD or Intel video chips, which cannot handle scaling as easily (the MBP 16" is included for comparison; it does much worse despite having a discrete GPU).

I will share this, as that is what I discovered too, and I think he is right.
How do you use 2 screens? Just curious.
In Display Settings, Option-click on "Scaled", then check "Show all resolutions" ("Show low resolution modes" on earlier OSs) and you'll see some modes marked "low resolution" which really do output the stated resolution.
Many full screen games have their own option to choose resolution, though.
> Thank you. Sounds good. Should be possible with 2 27" screens as well, of course. Do you edit photos as well? If so, how with 2 screens?

Mac side:
2 4k 32" BenQs, so rather cheap.
Main "work" display straight ahead with slight offset to the right.
2nd "content" display on the left angled to me (30° I would say).
If I need to keep something up for reference it might get pushed to the left.
MorphOS side:
20" iSight iMac G5 for desktop, debug output, etc. on the right
22" LG just for the IDE and maybe running the SW I just compiled (depends on the project)
This is more of a HW limitation, as one 20/22" is just too small and I don't want to drag out the PowerMac to run the 27".
The first dual-screen setup I had was in 2002, running Amithlon (a Linux-based Amiga emulator) on two 18" LCDs, and I always feel cramped when restricted to one screen.
Do you edit photos as well?
> What's the difference between like 2560x1440 and 2560x1440 (low resolution)?

In Display Settings, Option-click on "Scaled", then check "Show all resolutions" ("Show low resolution modes" on earlier OSs) and you'll see some modes marked "low resolution" which really do output the stated resolution.

Many full-screen games have their own option to choose resolution, though.
ok, thanks.

Sure, cutting and resizing them in "Preview" so they can be uploaded to forums.
For real, none of my screens would be good enough for serious photo editing.
I do some coding but also just as hobby in the same way others may solve sudoku puzzles.
> What's the difference between like 2560x1440 and 2560x1440 (low resolution)?

I think this is referring to Retina/non-Retina, so actually displaying 2560x1440 pixels rather than rendering at double PPI.
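That matches my understanding. A tiny sketch of the difference, assuming a 5120x2880 native panel (the helper name is my own, just for illustration):

```python
# Sketch of "2560x1440" (HiDPI) vs "2560x1440 (low resolution)" on a 5K panel.
# Assumption: HiDPI modes render at 2x the "looks like" size; low-resolution
# modes render 1:1 and let the panel upscale, which is why they look softer.

def backing_store(looks_like, hidpi):
    """Resolution macOS actually renders before any scaling to the panel."""
    w, h = looks_like
    scale = 2 if hidpi else 1
    return (w * scale, h * scale)

retina = backing_store((2560, 1440), hidpi=True)    # (5120, 2880): maps 1:1 onto a 5K panel, crisp
low_res = backing_store((2560, 1440), hidpi=False)  # (2560, 1440): each point stretched, softer
```

So both modes give you the same amount of desktop space; only the number of real pixels used to draw it differs.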