
gusping

macrumors 68020
Original poster
Mar 12, 2012
2,031
2,368
I'm looking to get a second 4K monitor soon, and will inevitably buy a third, so I'm also thinking about getting an eGPU in the near future. I will most likely get the Razer Core X, as it's the cheapest. As for the actual GPU, do you think the RX 580 8GB will be sufficient to drive three 4K monitors with scaling? I run no GPU-intensive workloads; the GPU is just for the monitors.

Thanks
 

Then you don't need an eGPU. Put the money into RAM.
 

Try running two 4K monitors with scaling: it's a lagfest. I already have 16GB of RAM, and memory isn't the problem, because there's still 5GB+ free at all times; it must be the GPU/VRAM that's maxed out.
 
Try running two 4K monitors with scaling, it's a lagfest.

Why would anyone do that? Why would I get a 4K screen just to use a fraction of its resolution? Do you really watch 4K movies at 1080p? What a horrendous waste of pixels and screen real estate!

If you really feel you need to scale, then use the integer scalings that do not affect performance. The Displays pane in System Preferences will tell you which ones those are when you hover over the options.
 

You really haven't been following the Mac mini hi-res monitor fiasco, have you? The only scaling you can use on a 4K monitor without using GPU power is the 'looks like 1080p' setting, which is too large. Native 4K is obviously too small for most uses. 'Looks like 2560 x 1440' is often the sweet spot. Try running two 4K monitors (or even three) with that setting: your machine will be very laggy, even with 64GB of RAM. Your movie analogy is irrelevant to monitor scaling.
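To put rough numbers on why that mode taxes the GPU: a back-of-envelope Python sketch, assuming macOS's usual behaviour of rendering a scaled mode at 2x the chosen logical resolution and then downsampling to the panel (the `backing_pixels` helper is mine, just for illustration):

```python
# Back-of-envelope math for macOS "looks like" (HiDPI) scaling.
# Assumption: a scaled mode is rendered at 2x the logical resolution
# into a backing store, which is then downsampled to the panel.

def backing_pixels(looks_like_w, looks_like_h):
    """Pixels the GPU must render per frame for one scaled display."""
    return (looks_like_w * 2) * (looks_like_h * 2)

native_4k = 3840 * 2160                      # ~8.3 MP per panel
scaled_1440p = backing_pixels(2560, 1440)    # 5120x2880 = ~14.7 MP

print(f"native 4K:           {native_4k / 1e6:.1f} MP")
print(f"'looks like 1440p':  {scaled_1440p / 1e6:.1f} MP")
print(f"three scaled panels: {3 * scaled_1440p / 1e6:.1f} MP per frame")
```

So each 'looks like 2560 x 1440' display costs roughly 1.8x the pixels of its own native 4K mode, before the downsampling pass is even counted.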
 

It's not. The right way to do it is to keep the screen real estate and the resolution, and to allow text and UI elements in custom sizes. Doing fancy scaling is just ludicrous; that's not what GPUs are for. Typical Apple crap.
 
The way to do it is to keep the screen estate and the resolution, and to allow text UI elements in custom sizes.
Great, so we can have readable text while all non-text window chrome is unrecognisable because it's tiny. Fantastic.

You keep pushing this “why would anyone do that” mantra as if everyone has the same goals as you. Some people just want a large display with crisp UI and text. Ultimately, if all your UI and text is scaled up (including, presumably, some kind of algorithm to increase all the fonts in a browser window, word-processing document, etc.), then you have the same real estate available as with a scaled display. But you require every single app developer to opt in to supporting it, and to support it well.

As for the original question: I’ve run two 4K displays at a scaled resolution from my 2018 MBP15, which doesn’t have a particularly high-end dGPU. The lowest GPU Apple officially supports is the Radeon RX 570, and I’d imagine that can run at least two at 4K, since it seems to be somewhat higher spec than the Radeon Pro 555X in the MBP: https://technical.city/en/video/Radeon-RX-570-vs-Radeon-Pro-555

I’m planning to order the OWC bundle soon, which I believe comes with a Radeon RX 580. I haven’t seen any of the eGPU spec pages specifically state a maximum number of 4K displays the way Apple does for its computers, though.
 

Now, tell me, how are you going to have “crisp” text with non-integer scaling? Interpolation will a) blur the result and b) tax your GPU. If you are indeed going to scale, integer scaling is the only reasonable option. So if you really want 2560, then scale to it on a 5K screen (1:2), not on a 4K (2:3), because the latter goes against the physics (and the economics: 3 x 4K plus an expensive eGPU, just to get three tiny screens).

It can never ever be more “crisp” than at native 4K resolution. The pixel density is such that you will never detect individual pixels.
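The integer-versus-fractional distinction can be checked in a few lines of Python. The helper below is hypothetical, not any Apple API; it just compares a 'looks like' width to the panel's native width:

```python
from fractions import Fraction

# Hypothetical helper: classify a "looks like" mode against a panel's
# native width. When the native width is an integer multiple of the
# logical width, every logical pixel maps to a whole block of physical
# pixels; otherwise the downsample has to interpolate.

def scale_ratio(looks_like_w, native_w):
    r = Fraction(looks_like_w, native_w)  # logical : native
    if native_w % looks_like_w == 0:
        return r, "integer (clean pixel mapping)"
    return r, "fractional (interpolated)"

print(scale_ratio(1920, 3840))  # 'looks like 1080p' on 4K: 1:2, integer
print(scale_ratio(2560, 3840))  # 'looks like 1440p' on 4K: 2:3, fractional
print(scale_ratio(2560, 5120))  # 'looks like 1440p' on 5K: 1:2, integer
```

That is the 1:2 versus 2:3 distinction above, stated mechanically.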

As for developers catching up: all that would be required is for Apple to implement universal APIs for text and UI customization. They are going to have to do that sooner or later.
 
how are you going to have ”crisp” text in non-integer scaling?
A 24”/4K display scaled to e.g. “looks like 2560x1440” still looks vastly nicer than a low-DPI screen. Sure, it’s not quite as crisp as straight @2x scaling, but that’s the benefit of this approach: the end user can choose what suits them.
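The density gap being described is easy to quantify; a small Python sketch (the 24" size comes from the post above, and the `ppi` helper is mine):

```python
import math

# Pixel density comparison: a scaled image on a high-DPI panel is
# drawn from roughly 4x as many physical pixels as a low-DPI panel
# showing the same logical resolution.

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a panel of the given resolution and size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24-inch 4K panel:    {ppi(3840, 2160, 24):.0f} PPI')
print(f'24-inch 1080p panel: {ppi(1920, 1080, 24):.0f} PPI')
```

Roughly 184 PPI versus 92 PPI, which is why scaled text on the 4K panel still looks far smoother even when the downsample is fractional.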

integer scaling is the only reasonable

Maybe there's only one reasonable approach for you. Most other people just accept that people use their computers differently, and let them do that. You clearly like using your display at native resolution, and that's an option that's available to you - why are you so bothered by other people having a different preference to you?

So if you really want 2560, then scale it thus on a 5K screen (1:2), not on a 4K (2:3), because that is going against the physics (and economy; 3 x 4K + an expensive eGPU just to have three tiny screens).

So what you're saying is that the only reasonable, economical approach is to have a 27" 5K display shipped internationally for $2,200 USD, rather than to buy two 24" 4K displays domestically for ~$1,000 USD delivered, which gives twice the usable screen real estate plus the other benefits of multiple displays (e.g. better window-management options, in my opinion).


It can never ever be more ”crisp” than in native 4K resolution
I'm only 35 and don't wear glasses, and I would struggle to use these displays at "native" 4K, because everything would be tiny.

all that would be required is for Apple to implement universal API:s for text and UI customization.
Right, just like all they had to do was make Cocoa available and all apps were instantly rewritten to use it; and all they had to do was make x86 libraries and headers available and all apps instantly ran on Intel CPUs; and then all they had to do was make 64-bit headers and libraries available and all apps were instantly rewritten to be 64-bit.

They are going to have to do that sooner or later.
Why? Because you want them to?
 