
retta283 · Original poster · Joined Jun 8, 2018
I'm interested in picking up a 30" Cinema Display or a 30" Dell panel down the road, but there's one question I haven't found a concrete answer to that is keeping me from committing to a purchase. I have Macs that can drive it fine, but those Macs are not good gaming systems. If I buy it, I'd want to use it for games as well. For that I have a fairly old PC that runs the games I want to play, but it cannot output anything higher than 1920x1200. The monitor can still display 1280x800 fine from this PC, though.

However, 1280x800 is obviously not the ideal resolution. What I'm wondering is: since 2560x1600 is exactly pixel-doubled from 1280x800, does the image look decent running at that resolution? I understand that the effective PPI is abysmal and things will appear pixelated, but since it's for gaming that doesn't bother me much. I'm just trying to avoid the muddy/blurry look you get when running games at a non-native resolution. For comparison: running my Mac mini at 1080p on a 4K TV is very pixelated, but the actual image is clear and crisp. My 27" iMac, on the other hand, looks like absolute junk at 720p, even though 720p is exactly half its native 1440p.
 
The thing I am wondering is since 2560x1600 is pixel-doubled from 1280x800, does the image look decent running at that resolution?
I can't speak to this specific hardware, but be careful here. Just because the resolution being sent would allow for integer scaling doesn't mean your monitor will actually use integer scaling. That's probably why your iMac looks crappy at 720p.

My experience has been that monitors always use ugly bilinear scaling for all non-native resolutions. I'm sure there are smarter monitors out there somewhere, but I haven't found them yet!

You may be able to explicitly enable integer scaling on your PC, though; I believe there's an option for it in Nvidia's control panel nowadays, for instance. I don't know whether they made it available for older cards, however.
 
I have an Apple 30" Cinema Display. There are three ways to do 1280x800 in macOS:

1) If I have the 1280x800 timing mode (copied from the EDID of the display and entered as a custom resolution in SwitchResX if necessary), then the output signal is 1280x800 and the display scales the image up to 2560x1600 by using the same color for each block of 4 pixels - no bilinear scaling - nice and pixelated, if you like that.

2) If I have a 1280x800 scaled mode (the framebuffer being drawn to is 1280x800), then the GPU uses bilinear scaling to output a smoothed/blurred 2560x1600 signal. In Windows, this may require a GPU scaling option to be enabled?

3) For a 1280x800 HiDPI mode, macOS draws objects twice as wide and twice as tall into a 2560x1600 framebuffer, which is output directly as a crisp 2560x1600 signal. In Windows, I guess a 200% display scale is as close to a HiDPI mode as Windows can get - it's not as consistent as macOS.
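The difference between options 1 and 2 above can be sketched numerically. This is a minimal NumPy illustration (the tiny 2x2 "framebuffer" and all names are made up for the example, not anything the display actually runs): integer scaling duplicates each pixel into a 2x2 block, so only the original colors appear, while bilinear-style scaling produces in-between values that read as blur on screen.

```python
import numpy as np

# Hypothetical 2x2 grayscale framebuffer (values 0-255); illustrative only.
fb = np.array([[0, 255],
               [255, 0]], dtype=np.uint8)

# Option 1 (integer / nearest-neighbor 2x): each pixel becomes a 2x2 block
# of the identical value, so edges stay perfectly sharp.
integer_2x = np.repeat(np.repeat(fb, 2, axis=0), 2, axis=1)
# integer_2x contains only 0s and 255s - no new colors are invented.

# Option 2 (bilinear-style): the scaler interpolates between neighbors.
# One blended sample between the two top pixels:
blended = (int(fb[0, 0]) + int(fb[0, 1])) // 2
# blended == 127, a gray that exists in neither source pixel - this is
# the "muddy" look of non-integer monitor scaling.
```

Option 3 (HiDPI) sidesteps scaling entirely: the OS renders straight into a full-resolution framebuffer, so nothing is interpolated afterwards.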
 