
m11rphy
macrumors 6502a, Original poster
Hi All,

I have ordered an MSI Radeon RX 560 AERO ITX 4G OC (4 GB GDDR5) graphics card so that I can install macOS Mojave. However, I just had a thought: with this card installed in place of my GT 120, how can I run the Apple 30-inch Cinema Display? Will I need a converter?
 
Looks like that card comes with HDMI, DisplayPort and a dual-link DVI port. You should be able to use your 30" display with it as is.
 
So I bought a Radeon RX 580 that had been flashed by MacVidCards and upgraded to Mojave. The result on my 30" Cinema Display is a bunch of vertical lines. The card works fine under High Sierra.

Mike
 
The cards on Apple's supported list are matched by product ID to the specific framebuffer driver that Apple ships. Cards not on the list get matched to a generic driver, which may or may not match their port layout. The DisplayPort output has the best chance of "happening to work" (and of a conversion from there producing a signal).
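
If you want to see what product ID macOS is actually matching on, a quick Swift sketch along these lines (my own, not anything Apple documents about the matching process; the property names are the usual I/O Registry ones) will dump the PCI vendor/device IDs the system sees:

```swift
import Foundation
import IOKit

// Print each PCI device's name plus the vendor-id / device-id data
// that driver matching keys off. The GPU usually appears as
// "display" or "GFX0"; compare its device-id with the IDs of the
// cards on Apple's supported list.
func dataProperty(_ entry: io_registry_entry_t, _ key: String) -> Data? {
    guard let ref = IORegistryEntryCreateCFProperty(entry, key as CFString,
                                                    kCFAllocatorDefault, 0) else { return nil }
    return ref.takeRetainedValue() as? Data
}

var iter: io_iterator_t = 0
guard IOServiceGetMatchingServices(kIOMasterPortDefault,
                                   IOServiceMatching("IOPCIDevice"),
                                   &iter) == KERN_SUCCESS else { exit(1) }

var entry = IOIteratorNext(iter)
while entry != 0 {
    var nameBuf = [CChar](repeating: 0, count: 128)
    IORegistryEntryGetName(entry, &nameBuf)
    if let vendor = dataProperty(entry, "vendor-id"),
       let device = dataProperty(entry, "device-id") {
        // IDs are stored little-endian (e.g. vendor 0x1002 = AMD shows as "02 10 00 00").
        let hex = { (d: Data) in d.map { String(format: "%02x", $0) }.joined(separator: " ") }
        print(String(cString: nameBuf), "vendor-id:", hex(vendor), "device-id:", hex(device))
    }
    IOObjectRelease(entry)
    entry = IOIteratorNext(iter)
}
IOObjectRelease(iter)
```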

MacVidCards' mods only touch the EFI boot environment, before macOS starts. There is still an Apple-signed graphics driver to deal with after the boot phase has finished. With Mojave, Apple is transitioning the graphics stack to a Metal-only future, so it is different from High Sierra. (Working in the latter OS version doesn't guarantee it will work in the former.)
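
One quick way to sanity-check the Metal side (a minimal sketch using the standard Metal framework call, nothing specific to this card) is to ask macOS which GPUs it exposes as Metal devices:

```swift
import Metal

// Mojave's graphics stack is built on Metal, so a card the OS can't
// drive properly may not show up here (or only as a headless device).
for device in MTLCopyAllDevices() {
    print("GPU:", device.name)
    print("  low power:", device.isLowPower)
    print("  headless:", device.isHeadless)
    print("  registry ID:", device.registryID)
}
```

If the RX 580 appears here but the Cinema Display still shows vertical lines, the problem is more likely the framebuffer/port matching above than Metal support itself.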
 
I have the RX580 (not flashed) and I’m also using the Cinema Display 30". No problem so far either on Sierra (I know, apparently it’s not supported but it works) or on Mojave. No converter used, only the normal DVI connector.
I hope you find a solution. That is weird.
I used to have a color problem with a previous graphics card and my Cinema Display, and strangely enough, I solved it simply by going to the Displays preference pane and toggling the scaled resolution (choosing it and then un-choosing it). Nothing else I did before that solved the problem. My problem was not the same, but my colors were all messed up, as if the channels had been mixed.
You could try that; worst case you'll lose ten seconds of your time :)
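
If clicking around the Displays pane doesn't stick, the same toggle can be done programmatically. This is a rough sketch using the Quartz Display Services APIs; it just lists the modes and re-applies the current one on the main display, which is roughly what choosing and un-choosing a scaled resolution amounts to:

```swift
import CoreGraphics

let displayID = CGMainDisplayID()

// List the modes the display advertises, for reference.
if let modes = CGDisplayCopyAllDisplayModes(displayID, nil) as? [CGDisplayMode] {
    for mode in modes {
        print("\(mode.width) x \(mode.height) @ \(mode.refreshRate) Hz")
    }
}

// Re-apply the current mode to force a mode-set.
if let current = CGDisplayCopyDisplayMode(displayID) {
    var config: CGDisplayConfigRef?
    if CGBeginDisplayConfiguration(&config) == .success {
        CGConfigureDisplayWithDisplayMode(config, displayID, current, nil)
        CGCompleteDisplayConfiguration(config, .forSession)
    }
}
```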
 