I use an M1 Mini with a BenQ PD2700U 4K monitor. It's a decent monitor that lends itself to "enthusiast" graphics and photography. If you check the monitor's specs, it's a 60 Hz monitor. However, for whatever reason, the M1 Mini on its own shifts it over to "40-60" Hz, which produces noticeable flicker. I have to reset it several times a day.
 

Attachments

  • Screen Shot 2022-03-15 at 1.46.33 PM.png
I use a 34-inch ultrawide WQHD and a 19-inch 900p monitor on the M1 mini, and both are fine, as they should be, at their maximum native resolutions.

The only problem is if you want to use a WQHD / QHD display at a HiDPI resolution: it won't look sharp, because macOS doesn't support it. You need a minimum of 4K resolution for that.
Some will not tolerate this, as native QHD / WQHD is too small to read on a 27-inch or 34-inch ultrawide monitor. But it's not a problem for me.
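To put rough numbers on the 4K claim, here's a quick back-of-the-envelope sketch (my own illustration, not anything official from Apple): at a straight 2x HiDPI scale, the "looks like" workspace is half the panel resolution in each direction.

Code:
# rough arithmetic only -- macOS also offers scaled modes that render
# larger than 2x and downsample, but the plain 2x case shows the problem
panels = {
    "4K UHD": (3840, 2160),
    "WQHD":   (2560, 1440),
    "UW-QHD": (3440, 1440),
}
for name, (w, h) in panels.items():
    # at 2x HiDPI, each UI point maps to a 2x2 block of physical pixels
    print(f"{name}: 2x HiDPI looks like {w // 2} x {h // 2}")

4K gives a usable 1920x1080 workspace, while WQHD collapses to 1280x720, which is far too little; that's why HiDPI only starts making sense at 4K and up.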
 
My wild guess would be that macOS is designed for Apple hardware to give the best result; with anything from a third party you might end up with antialiasing artifacts or blurry text.

Most websites display images at 72 dpi, and that really hurts my eyes...
 
I still use Apple's old (HD and LED) Cinema displays, a late model 23" @ 1920x1200 and two 27" @ 2560x1440, and they look excellent in Monterey. They're clearly not as sharp as the Retina display in my MacBook Pro that runs them when viewed from a few inches away, but who does that anyway? At my desk I cannot even reach my displays without leaning forward (I have long arms) and at this distance the difference in pixel density and picture quality between 100 ppi and 200+ ppi is hardly noticeable (I have good eyesight).
 
I thought the Apple Cinema displays didn't work anymore with the newer operating systems.

I have an old Cinema display from 2003 which I'm sure won't work with Monterey.
 

Attachments

  • IMG_6798.JPG
I thought the Apple Cinema displays didn't work anymore with the newer operating systems.

I have an old Cinema display from 2003 which I'm sure won't work with Monterey.
Earlier in this thread I posted a picture of Monterey running through a 15-inch Studio display from the same era. It's all just DVI so of course it'll work :)
 
I thought the Apple Cinema displays didn't work anymore with the newer operating systems.

I have an old Cinema display from 2003 which I'm sure won't work with Monterey.
Here's my Cinema HD Display from 2007:

[Image: Screen Shot 2022-04-22 at 12.57.57 PM.png]


Why wouldn't it work.....
As long as you have the correct cable.
For mine, if I try to connect the monitor's USB cable through a USB hub to my Mac, I sometimes get weird behaviour, such that brightness control doesn't always work. This was not a problem in Big Sur for example. However, if I connect the monitor's USB cable directly to the Mac, brightness control always works.

However, mine is an older DL-DVI Cinema HD Display with no camera. I think the slightly newer LED Cinema Displays with the built-in camera have more problems.
 
This is to be expected for a monitor which is ~14 years old... ;)
 
Why wouldn't it work.....
As long as you have the correct cable.
I seem to remember reading somewhere that Apple had disabled the ability to use the original Cinema Displays a few operating systems ago. It never made sense to me that it wouldn't work. Right now I'm still running Mac OS X 10.6.8, but I'm about to get a new M1 Mac Mini.
 
Earlier in this thread I posted a picture of Monterey running through a 15-inch Studio display from the same era. It's all just DVI so of course it'll work :)
I just found the picture you posted. Considering the topic of this thread, how does the text look on the 1024 x 768 display?

I was under the impression it wouldn't work. For the new Mac Mini I think I'll get a new monitor. I've had this one for 18 years now.
 
I just found the picture you posted. I was under the impression it wouldn't work. For the new Mac Mini I think I'll get a new monitor. I've had this one for 18 years now.
Yes, it's about time. :) The contrast on those old monitors is quite poor in comparison to more recent models. Also, those old models are fairly dim.
 
The top portion of my Cinema Display's screen is slightly dimmer than the bottom portion. The change in brightness is gradual and hardly noticeable, but it's there. What I've always liked about the old Cinema Displays is that they have a very wide viewing angle. When viewed from an angle the image doesn't wash out.
 
I just found the picture you posted. Considering the topic of this thread, how does the text look on the 1024 x 768 display?

I was under the impression it wouldn't work. For the new Mac Mini I think I'll get a new monitor. I've had this one for 18 years now.
The text looks exactly the same as on my 1080p monitor, only larger. It just looks pretty bad on a non-Retina display nowadays, sadly.

And yes, as you can probably see in the picture, the image is pretty yellow from age too :confused:
 
I seem to remember reading somewhere that Apple had disabled the ability to use the original Cinema Displays a few operating systems ago.

There was a lot of discussion of this, going back to the 2018 Mini. But I believe the main issue was actually using HDMI to DVI adapters (including Apple's own). In one long thread, somebody claimed this was due to Apple's implementation of the HDMI port on the 2018 Mini, where it did not provide enough power (or wrong voltage or something) for the DVI adapter. They claimed that Apple support confirmed this to them.

And I believe Apple used the same HDMI controller on the M1 Mini too, so the problem persisted in 2020. But using a USB-C to DVI adapter evidently worked properly with the old Cinema Displays. So, this is more of an issue on the M1 Mini, because you are forced to use the HDMI port if you want two screens.

I still have the 23" Apple Cinema Display that I bought with my Power Mac G5 back in 2005 IIRC. Very happy with that screen over the years, was still using it until 2020 on my 2012 Mini (which is now a "headless" server).
 
I was watching this discussion with some interest. My main desktop is being refreshed soon, and I am second-guessing myself on the monitor.

I have a 2017 5K iMac and have always liked it, but I'm moving to M series, with a Mini and an MBA just delivered. A Studio Display is on order for delivery in June, but I needed to set up now. (I don't need the power of the Mac Studio, and it's hard to justify the cost when I wouldn't use it. The Mini and MBA meet my needs, but I like a good display.)

I had two spare monitors on hand. One is an LG 25UD58, a 21:9 2560x1080 panel at around 111 ppi. The other is an Acer Predator X34P I inherited in a swap: 3440x1440, also around 111 ppi, and curved. Both are IPS and decent machines. The Acer has superb stand and positioning options.

I am running them alternately, side by side with the iMac. I can clearly see the sharper text on the iMac. However, I find the display on both 21:9 monitors more than usable. After an hour or so I don't really notice it, as I am busy working on code or text. What I do really notice is when I move back to the iMac and the width is suddenly constrained. I almost find the lack of width more jarring than the lack of HiDPI.

I am considering ordering an LG 24UD58, a 24-inch 3840x2160 panel, for a very nice 185 ppi. It's another IPS panel, which I like. I'd be keen to see how this compares with the 5K iMac before finally committing to the Studio Display. While I would like the extra space of 5K, and the specs of the SD are clearly superior, I wonder (for my eyes and use) just how much better it really is.

So I am split. I always tend to buy Apple, as it just works and holds value. However, the SD is £1200 more than the LG or a similar non-curved 3440x1440, and it comes with a decent stand. Then there is the outside choice: the Dell or LG 5K2K.

Everyone has different needs and thoughts. My experience side by side is that I like both for different reasons. I like HiDPI, but I also like 21:9, and I'm not sure which is worth more. As long as it's >=111 ppi, I'd say a monitor looks perfectly usable to me, especially at the distance I use it from and the code I work on (mainly in Vim).
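For anyone wanting to sanity-check the ppi figures quoted above, the arithmetic is simple. A quick sketch (the ppi helper is my own, and the diagonals are nominal marketing sizes, so results are approximate):

Code:
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [
    ("21:9 2560x1080 @ 25\"",  2560, 1080, 25.0),
    ("Acer X34P 3440x1440",    3440, 1440, 34.0),
    ("LG 24UD58 3840x2160",    3840, 2160, 23.8),
    ("iMac 5K 5120x2880",      5120, 2880, 27.0),
]:
    print(f"{name}: {ppi(w, h, d):.0f} ppi")

That prints roughly 111, 110, 185 and 218 ppi respectively, matching the figures quoted above (the 24UD58's panel is nominally 24 inches but actually ~23.8).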
 
There was a lot of discussion of this, going back to the 2018 Mini. But I believe the main issue was actually using HDMI to DVI adapters (including Apple's own).
I experienced this issue first-hand. The problem appeared to be a firmware update that was applied during an update to Mojave. When booting with a display hooked up to the HDMI port, it would fail to display anything on screen. You had to unplug the monitor cable from the HDMI port, then plug it back in for it to work normally.

I had a Dell 2005FPW monitor, which used the same panel as the 20-inch Cinema Display, that I used with my 2018 Mac mini via an Apple DVI-to-HDMI adapter. There was nothing wrong with the adapter itself. I then switched to a DVI-to-USB-C cable that I got off Amazon, which resolved the issue.

Apple support confirmed the problem and was aware of it, but didn't consider it worth the engineering effort to fix, according to the poster who originally discovered it, after he had spent considerable personal time working with Apple on a potential fix. I can't speak to the issues other models may be having, but this was very specific to the HDMI port on the 2018 Mac mini and a certain Mojave update.

I've since moved to a 21.5-inch UltraFine, due to macOS being optimized for high-ppi displays, but I do recall the trouble this caused, because I spent months dealing with it.
 
Hello,

I'm a newcomer to the Mac world, coming from PC.

I have an MBP with M1 Pro and Monterey 12.3.1, and an external LG 34WN80C display.

I successfully use the display at its native resolution of 3440x1440 @ 60 Hz on my PC, using a USB-C to USB-C cable.

Switching the same cable to the Mac -> fonts are really fuzzy: in the system itself, Chrome, Terminal, IntelliJ IDEA.

I tried all the commands:
Code:
# keep macOS font smoothing enabled (setting this to YES disables it)
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO
# set the smoothing strength for this host (0 = off, 1-3 = light to strong)
defaults -currentHost write -globalDomain AppleFontSmoothing -int 3

with different values for AppleFontSmoothing, restarting the Mac each time -> still fuzzy.

Tried SwitchResX, DummyDisplay -> still fuzzy.

Also tried USB-C <> DisplayPort 1.4 and HDMI <> HDMI -> still fuzzy

What am I doing wrong?

Thanks,
 
However, for whatever reason, the M1 Mini on its own shifts it over to "40-60" Hz, which produces noticeable flicker. I have to reset it several times a day.

This is due to FreeSync or G-Sync, which lets the computer control the display's refresh rate. It's somewhat useful when playing games.

But your monitor doesn't seem to have an option to turn it off; I searched BenQ's user manual.

Drop a line to BenQ support asking how to disable FreeSync.
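In the meantime, if you want to see what refresh rate macOS has actually negotiated at any given moment, system_profiler reports it. A rough sketch (the line filtering is my own guess; the exact output format varies between macOS versions):

Code:
import subprocess

# dump the displays section of the system report and keep the relevant lines
out = subprocess.run(["system_profiler", "SPDisplaysDataType"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    if "Resolution" in line or "Hz" in line:
        print(line.strip())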
 
I have a M1 mini attached to an old (2012?) Apple Thunderbolt 27" display (2560x1440) via a TB2-to-TB3 adapter, and it works very well. I also use this display with my MBP 2019 16". I also have a 2015 27" Retina iMac, the display set to "looks like" 2560x1440. The Retina screen is sharper, but in normal use it's hard to tell the difference. Perhaps the fact that it's Apple to Apple avoids conflicts – I have no idea how I'd get on with a non-Apple monitor – but happily, I don't have to find out until it dies.
 
This seems like a much simpler issue: Retina vs. blocky pixels. I am addicted to Retina, and I am not ashamed.

For my purposes, the only use for low-resolution monitors is high frame rates for games. Since I am not a gamer, there are only Retina monitors in my home.

I don't think Apple removed anything; they probably just hid the switch.
 
Hey. Thinking about your complaint: maybe you're looking for System Preferences -> General, where you'll find font smoothing.
That's not there anymore. Check your Preferences.

Also, you can use the following terminal command instead:

defaults -currentHost write -g AppleFontSmoothing -int 0
And this trick stopped working years ago as well. I know because I kept an old 11" MacBook Air running for quite a while (waiting for the butterfly keyboard to be phased out) and I tried every hack under the sun to re-enable subpixel rendering.

For those who are misunderstanding what's going on here: up until a few years ago, Apple had built in a method of smoothing that used parts of each RGB pixel to smooth the edges of text on lower-resolution displays. You can read more about it here: https://en.wikipedia.org/wiki/Subpixel_rendering. It's a pretty cool trick that leverages the fact that each pixel is actually made up of three parts (red, green, blue), and those can be addressed individually to smooth out the edges of text.
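If the mechanics are hard to picture, here's a toy sketch of the packing idea (my own illustration of the general technique, not Apple's actual implementation, which also filters each channel to suppress color fringing):

Code:
import numpy as np

def pack_subpixels(coverage_3x):
    """Pack a glyph rasterized at 3x horizontal resolution into the R, G, B
    channels of an RGB-striped LCD, tripling the effective horizontal
    resolution. coverage_3x: (H, 3*W) array of ink coverage in [0, 1]."""
    h, w3 = coverage_3x.shape
    rgb = coverage_3x.reshape(h, w3 // 3, 3)  # 3 samples -> one pixel's R,G,B
    return 1.0 - rgb                          # black text on a white screen

# toy stroke whose edges fall between whole-pixel boundaries
cov = np.zeros((1, 12))
cov[:, 4:7] = 1.0  # covers the last 2/3 of pixel 1 and the first 1/3 of pixel 2
print(pack_subpixels(cov)[0])

Pixel 1 dims only its green and blue subpixels and pixel 2 only its red one; a whole-pixel renderer would have to light or dim entire pixels instead.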

Around the time Apple phased out the last of its non-Retina displays (I think the Air or maybe some base-model iMac was the last one?), they removed the last vestiges of subpixel rendering from macOS. At that point any non-Retina display had text that was even chunkier than necessary, because the subpixel rendering that had previously smoothed out the edges was gone. Retina displays didn't need it, so Apple removed it. Very typical.

All this said, I'm not entirely sure what about this problem is specific to the M1 Macs, much less the Mini in particular.
 