
TheRdungeon

macrumors 6502a
Original poster
Jul 21, 2011
546
93
Hi there. From what I understand, the old Minis needed all sorts of workarounds, including dummy plugs like this: https://www.amazon.com/NewerTech-He...mmy/dp/B01ASJCZFK?sa-no-redirect=1&pldnSite=1
to trick the GPU into activating, so that you could choose resolutions and get better performance when running headless. Is this still the case with the M1 Mini? I'm currently accessing mine remotely and I can't choose any resolution except 1920x1080 (I need 1680x1050 to fit natively on my 2012 non-Retina MBP).

Any help greatly appreciated
 
Select "Default for display," then hold the option key down while you click "Scaled". That will give you all of the resolutions, including 1680 x 1050.
 
Unfortunately I've tried that, along with three resolution-switching apps; all of them only offer 1920x1080. (This was the resolution of the TV I used to set it up initially, so maybe it's stuck on that?)
 
You don't need an app to do this. Mine is connected to a 4K TV. Follow carefully:

Step 1 - Select "Default for display"

[Screenshot 2020-11-25 at 8.27.19 PM]


Step 2 - Hold down the option key and select "Scaled"

[Screenshot 2020-11-25 at 8.28.35 PM]
 
Thank you, but I'm not sure you understood: there's no monitor hooked up. I understand that option-clicking exposes more scaled resolutions, but as soon as the monitor is disconnected it reverts to 1920x1080. This computer is controlled over the network via Apple Remote Desktop and has no display attached.
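Not mentioned in this thread, but for anyone else stuck here: one command-line route is the third-party displayplacer tool, which can force a display mode over an SSH session. This is only a sketch; the persistent screen id below is a placeholder, and you'd get the real one from the list command first.

```shell
# Sketch, assuming displayplacer is installed (brew install displayplacer).
# List connected displays and the modes each one advertises:
displayplacer list

# Force 1680x1050 on a given display; "SOME-PERSISTENT-ID" is a placeholder
# for the persistent screen id printed by the list command above:
displayplacer "id:SOME-PERSISTENT-ID res:1680x1050"
```

Note that without a real (or dummy) display attached, the Mini may simply not advertise the mode you want, which is exactly the problem discussed above.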
 
I see now. If my Mini is connected to the TV it shows all of the resolutions even with the TV turned off. But when I disconnect the TV, it reverts back to 1920x1080 just like yours. That is unexpected. Maybe Apple will change that in a future Big Sur update.

For now you might have to give up native resolution and scale the remote 1920x1080 screen to your display. I'm using Screens as my viewer and it's easy to do that.

One other thing you might do...go to craigslist and buy a small TV for $30-$50. Connect that to the Mini but keep it out of sight.
 
I think a good solution is a device like this:

They just emulate the part of a monitor that talks to the computer and negotiates resolution, etc. The computer thinks there’s a real monitor. I’ve used ones like this to good effect (not this exact device).
 
I think a good solution is a device like this:

They just emulate the part of a monitor that talks to the computer and negotiates resolution, etc. The computer thinks there’s a real monitor. I’ve used ones like this to good effect (not this exact device).
Yep, cheers, ended up buying a 1080p version of this after reading about them!
 
I see now. If my Mini is connected to the TV it shows all of the resolutions even with the TV turned off. But when I disconnect the TV, it reverts back to 1920x1080 just like yours. That is unexpected. Maybe Apple will change that in a future Big Sur update.

For now you might have to give up native resolution and scale the remote 1920x1080 screen to your display. I'm using Screens as my viewer and it's easy to do that.

One other thing you might do...go to craigslist and buy a small TV for $30-$50. Connect that to the Mini but keep it out of sight.
Yeah, for sure. I think by that point I may as well go the whole hog and invest in a monitor. Really bizarre that Apple hasn't addressed this in something like ten years; maybe it's a hardware limitation or something.
 
Yeah, for sure. I think by that point I may as well go the whole hog and invest in a monitor. Really bizarre that Apple hasn't addressed this in something like ten years; maybe it's a hardware limitation or something.
Ignoring the resolution being fixed at 1080p: did you see any performance issues when it was headless? As I understand it, the GPU not activating used to make headless Minis quite unresponsive and laggy.

I'm considering an M1 mini to run our IP camera software, but I don't really want a display with it if I can help it. However, if the lack of a display means the GPU doesn't activate, I don't know how well the recording software (https://www.bensoftware.com/securityspy/) will run, as I understand it's heavily optimised to use the GPU for decoding and object detection.
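Not from the thread, but one rough way to probe this would be to run a hardware-accelerated decode over SSH while the Mini is headless and see whether VideoToolbox engages. This is only a sketch, assuming ffmpeg is installed via Homebrew; sample.mp4 is a placeholder clip.

```shell
# Sketch: attempt a VideoToolbox (hardware) decode of a local clip to a null
# sink. If hardware decode is unavailable while headless, ffmpeg will either
# fall back to software decoding or report an error in its log output.
ffmpeg -hwaccel videotoolbox -i sample.mp4 -f null -
```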
 
I find it hard to believe this problem can't be solved in software.
How, and why, would Apple prevent this from working?
What kind of security threat could a virtual display possibly be?
 
Ignoring the resolution being fixed at 1080p: did you see any performance issues when it was headless? As I understand it, the GPU not activating used to make headless Minis quite unresponsive and laggy.

I'm considering an M1 mini to run our IP camera software, but I don't really want a display with it if I can help it. However, if the lack of a display means the GPU doesn't activate, I don't know how well the recording software (https://www.bensoftware.com/securityspy/) will run, as I understand it's heavily optimised to use the GPU for decoding and object detection.
I ended up buying that headless adaptor, so I can't quite recall what the performance was like (I've returned the Mini now). I'd heard of those performance issues on the old one as well. I'm inclined to say just buy the headless adaptor: it also lets you switch the resolution to match whatever monitor you use to access it remotely, it's under $10, and it removes any chance of the GPU not activating. I will say that I'm certain it will demolish any task you throw at it, especially video.
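For anyone verifying one of these dummy plugs: a quick sanity check (runnable over SSH) is to ask macOS what displays it thinks are attached. This isn't from the thread, just a standard macOS command.

```shell
# Sketch: system_profiler lists the GPU plus every display macOS detects.
# With the dummy plug working, the adapter should show up here as a display
# with an advertised resolution.
system_profiler SPDisplaysDataType

# Or narrow it down to just the detected resolutions:
system_profiler SPDisplaysDataType | grep Resolution
```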
 
BetterDummy is what I used; it works perfectly. The updated BetterDisplay seems to lack what I was looking for, but the original tool is still supported on a GitHub fork: https://github.com/Brezel31/BetterDummy
What version is the "updated" one, and what is the "original"?
 
What version is the "updated" one, and what is the "original"?
I know, BetterDisplay is technically supposed to do both, but I have to run both pieces of software to achieve my goal. Unsure if I'm just missing something.
 
So, the answer to my question is...?
You'd probably need to ask the developers. To me, BetterDisplay is the newer software, but I'm unable to figure out how to add dummy displays as easily as I can through BetterDummy. I find no conflicts in running both. I own BetterDisplay and would probably find the option if I spent longer, but what I have is running fine, so I just leave it.

I run two headless units, an M1 Mac Studio and an M1 Mac mini, and either set their resolutions higher and scale down, or set them and my machine to 1920x1080 so everything is crisp and sharp. Without the software, for a little while after a restart, before the software loads, you'll notice the emulated resolution is low. Then the software makes the macOS Screen Sharing tool see the better display settings.

Apple really drops the ball by hampering their headless Screen Sharing experience. Honestly it was messing with my vision until I fixed it. Everything Apple does now is Retina... except screen sharing into a headless unit.
 