Dual monitors help

Discussion in 'Mac mini' started by therzbm, Aug 8, 2013.

  1. therzbm macrumors newbie

    Joined:
    Mar 22, 2011
    #1
    I have a 2012 Mac mini running a 27-inch monitor at 1920x1080 over HDMI, and I just got a Mini DisplayPort-to-HDMI cable to connect an older 19-inch Samsung monitor with a native resolution of 1440x900. When setting up the displays I can't select the Samsung's native resolution, and the picture on it is blurry and scaled. I can set the Samsung to 1920x1080 as well, but the monitor scales it down, and running something like 720p still ends up blurry.

    I was wondering if there's any solution to this, as I'll be using the second monitor for the mixer window in Logic, and it's hard to see the small fonts and boxes on the mixer clearly (it's not terrible, but I imagine it will strain my eyes after a while). I've seen SwitchResX, but it appears to be an old app for Snow Leopard and I don't want to install it if it's just going to create more issues. Cheers.
     
  2. omvs macrumors 6502

    Joined:
    May 15, 2011
    #2
    SwitchResX works great on 10.8 - I highly recommend it. I'm not sure if you'll have to pay the registration fee to keep using it past 30 days, but it's well worth it in my opinion.

    I use it to generate modes with odd refresh rates (3840x2160 at 19 Hz, 2560x1440 at 40 Hz), but it should work just fine for creating intermediate resolutions too. In fact, all you might need is the "Disable HDMI 1.3a (enables more resolutions on HDTV)" option to get your native resolution to show up.
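    If you're curious what defining a custom mode actually involves: tools like SwitchResX build a full display timing (a "modeline") for the resolution you ask for. As a rough illustrative sketch - this is the standard VESA CVT reduced-blanking math, not SwitchResX's actual code - here's the timing calculation for 1440x900 at 60 Hz in Python:

    ```python
    # Rough sketch of VESA CVT reduced-blanking (CVT-RB v1) timing math,
    # the kind of calculation a custom-resolution tool performs.
    # Illustrative only - not taken from SwitchResX itself.

    def cvt_rb_mode(hactive, vactive, refresh, vsync_width=6):
        """Return (pixel_clock_mhz, htotal, vtotal) for a CVT-RB mode.

        vsync_width depends on aspect ratio; 6 is the 16:10 value,
        which fits 1440x900.
        """
        H_BLANK = 160          # fixed horizontal blanking, pixels
        MIN_VBLANK_US = 460.0  # minimum vertical blanking, microseconds
        V_FPORCH = 3           # fixed vertical front porch, lines
        MIN_V_BPORCH = 6       # minimum vertical back porch, lines
        CLOCK_STEP = 0.25      # pixel clock granularity, MHz

        # Estimate the horizontal period (microseconds per scanline).
        h_period = ((1_000_000.0 / refresh) - MIN_VBLANK_US) / vactive

        # Vertical blanking: enough lines to cover the minimum blanking time.
        vblank = int(MIN_VBLANK_US / h_period) + 1
        vblank = max(vblank, V_FPORCH + vsync_width + MIN_V_BPORCH)

        htotal = hactive + H_BLANK
        vtotal = vactive + vblank

        # Pixel clock in MHz (h_period is in microseconds), rounded
        # down to the clock step.
        pclk = htotal / h_period
        pclk = int(pclk / CLOCK_STEP) * CLOCK_STEP
        return pclk, htotal, vtotal

    print(cvt_rb_mode(1440, 900, 60))  # -> (88.75, 1600, 926)
    ```

    Those numbers (88.75 MHz pixel clock, 1600x926 total) match what the standard `cvt -r 1440 900 60` utility reports, so if the Samsung's EDID isn't advertising 1440x900, a custom mode with timings like these is essentially what you'd be adding.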
     
  3. therzbm thread starter macrumors newbie

    Joined:
    Mar 22, 2011
    #3
    Thanks, mate. Can I just download it, create/edit what I need to get the correct resolution showing up, and then not use it again? Or is it something I'll eventually have to pay for and keep open at startup to apply my display profile/settings? Thanks again.
     
  4. omvs macrumors 6502

    Joined:
    May 15, 2011
    #4
    I'm not totally sure. I think you can do this without paying, but I registered a long time ago, so I can't really verify one way or the other.

    Another option: I believe the problem is that the computer sees the mDP->HDMI adapter and assumes the monitor is actually a TV. If you used mDP->DVI, I don't think you'd have that problem -- if your monitor is HDMI-only, you might even be able to chain a mDP->DVI adapter into a DVI->HDMI adapter. But if you don't already have the adapters sitting around, it may cost more than you want to spend.
     