Like the Touch Bar, it is an embedded version of iOS, with the full resources of an A13 in terms of memory, the ability to run applications (and no real ability, as far as I've seen, to audit what it's doing), and the ability to crash independently of the computer, which, if you research it, happens a LOT.
Firstly, the Touch Bar never ran iOS firmware but bridgeOS, a modified version of watchOS. The Studio Display can't run applications though. The Studio Display is not a smart monitor that can run apps, nor does it have an internet connection like WiFi.
If you are worried about what the A13 is doing in the background, then it's a pointless worry: Hector Martin, who reverse engineers Apple's SoCs, has said Apple's SoCs have no back doors like Intel's ME or AMD's PSP.

Yes, there was crashing at the start of the Studio Display release, mostly to do with updating the firmware, and that was an issue on Apple's server. It has calmed down a lot now (not saying it's completely non-existent). In the Mac Studio forum, many are using the Studio Display without issues.
 
Firstly, the Touch Bar never ran iOS firmware but bridgeOS, a modified version of watchOS.

Which is an iOS derivative, like tvOS.

The Studio Display can't run applications though.

*User-installed* applications. There are plenty of resources on the A13 to do more than manage the display's peripherals.

nor does it have an internet connection like WiFi.

It has access to your Mac's Thunderbolt bus, and therefore access to whatever internet connection your Mac is using.


If you are worried about what the A13 is doing in the background, then it's a pointless worry: Hector Martin, who reverse engineers Apple's SoCs, has said Apple's SoCs have no back doors like Intel's ME or AMD's PSP.

We know for a fact that Apple tried to put its own apps' network traffic out of reach of network analysis and firewall tools like Little Snitch etc. We know for a fact that Apple is being pressured, both externally and internally, to put pervasive surveillance, under the guise of CSAM detection, into their devices.

The Studio Display is effectively a remotely administrable Target Display Mode iMac, over which the owner has no auditable control. Some people would consider that to be an inherent vulnerability, and an unnecessary one, given it's a consumer display, not a professional graphics display, and LG makes a TB display that's just a display, with a similar consumer-grade panel.

Yes, there was crashing at the start of the Studio Display release, mostly to do with updating the firmware, and that was an issue on Apple's server. It has calmed down a lot now (not saying it's completely non-existent).

And again, some would suggest that a monitor should not have automatic firmware updates; rather, the product should be finished before it's released to manufacturing, so that the product you own remains the product you bought.
 
We know for a fact that Apple tried to put its own apps' network traffic out of reach of network analysis and firewall tools like Little Snitch etc. We know for a fact that Apple is being pressured, both externally and internally, to put pervasive surveillance, under the guise of CSAM detection, into their devices.
Apple reverted the private networking exclusion when the community spoke up. I am happy we did. As for CSAM, I will not go into that can of worms. The best thing for now is that Apple halted it; the next best thing would be to scrap it. But with government forces so intent on destroying privacy, who knows.
The Studio Display is effectively a remotely administrable Target Display Mode iMac, over which the owner has no auditable control. Some people would consider that to be an inherent vulnerability, and an unnecessary one, given it's a consumer display, not a professional graphics display, and LG makes a TB display that's just a display, with a similar consumer-grade panel.
Yes, but the chances of someone remotely hacking a Studio Display are stupidly low. We have many displays that are like the Studio Display; TVs and some monitors nowadays have their own SoC with networking and storage capabilities, and also with updatable firmware.
And again, some would suggest that a monitor should not have automatic firmware updates; rather, the product should be finished before it's released to manufacturing, so that the product you own remains the product you bought.
Looks like that is becoming a fantasy, with more and more products getting "smarter".
 
Apple reverted the private networking exclusion when the community spoke up. I am happy we did. As for CSAM, I will not go into that can of worms. The best thing for now is that Apple halted it; the next best thing would be to scrap it. But with government forces so intent on destroying privacy, who knows.

Apple didn't halt the CSAM system; it was deployed a couple of updates ago, afaik - they're just activating only part of it (the iMessage checks for minors) for now.

Yes, but the chances of someone remotely hacking a Studio Display are stupidly low. We have many displays that are like the Studio Display; TVs and some monitors nowadays have their own SoC with networking and storage capabilities, and also with updatable firmware.

My concern isn't some nefarious actor hacking the display; my assumption is that Apple is / has the capability to be a nefarious actor.

Looks like that is becoming a fantasy, with more and more products getting "smarter".

It's interesting, I was looking at Ikea's smart lighting products the other day - none of it requires apps. It's all non-internet-enabled control / power hubs, with dedicated wireless hardware-based controllers. We have solar and ducted zoned aircon, no way in hell we're letting them connect to the router or access the internet. The solar system, we can connect to its wifi, and access it with an app to see what it's doing, which is an acceptable solution, but its metering and billing runs separately on its own infrastructure.

Wait till you start seeing a premium on older vehicles etc. that have no integrated entertainment systems (DIN forever!), that can't have features changed remotely or made into subscription services. Pre-smart vehicles that are not even "collectable classics" are going to be worth more than new cars. ;)
 
my assumption is that Apple is / has the capability to be a nefarious actor.
pretty sure Apple has better things to do than hack a display... lol
It's interesting, I was looking at Ikea's smart lighting products the other day - none of it requires apps. It's all non-internet-enabled control / power hubs, with dedicated wireless hardware-based controllers. We have solar and ducted zoned aircon, no way in hell we're letting them connect to the router or access the internet. The solar system, we can connect to its wifi, and access it with an app to see what it's doing, which is an acceptable solution, but its metering and billing runs separately on its own infrastructure.
ohh gotta check them out. They sound cool.
 
The LG UltraFine 5K will work with the RX580 - but it needs:

- A Titan Ridge Thunderbolt card in slot 4
- Two video cables from your RX580 into the video-in ports of the Titan Ridge
- A Thunderbolt cable connecting the Titan Ridge to the LG screen
- When starting the computer, leave the screen disconnected; once macOS has started, plug in the screen and it should connect properly
- You will not have the camera working or the brightness controls unless you do another restart
- OpenCore (use Martin Lo's package, it's easier - then edit the config to enable the Titan Ridge card)

If you are using two screens, then the better graphics card is the XFX RX580 GTS because it has three DisplayPort outputs.
I have a 2010 Mac Pro 5,1 with a Sapphire Pulse RX580, running Mojave. I wish to use a single 6K 60Hz monitor; the Asus PA32QCV is on the list.
Can the RX580 support this? (Each of its 2x DP outputs supports 4K.)
Will a Mac-flashed Titan Ridge 2.0 with 2x DP feeds from the RX580 support 6K? (General office duties, some light CAD / Adobe PS & Ai.)

The idea is to use an M4 Mini along with the MP5,1 and switch between them depending on requirements, both using a single 6K monitor.
 
I have a 2010 Mac Pro 5,1 with a Sapphire Pulse RX580, running Mojave. I wish to use a single 6K 60Hz monitor; the Asus PA32QCV is on the list.
Can the RX580 support this? (Each of its 2x DP outputs supports 4K.)
Will a Mac-flashed Titan Ridge 2.0 with 2x DP feeds from the RX580 support 6K? (General office duties, some light CAD / Adobe PS & Ai.)

The idea is to use an M4 Mini along with the MP5,1 and switch between them depending on requirements, both using a single 6K monitor.
I would first try using your current monitor at 30Hz to see if you can settle for that, because TB3 doesn't have enough bandwidth to drive 6K at 60Hz.
 
TB3 is perfectly happy to drive 6K/60. Hence the XDRs launching with the TB3 Mac Pro.
If a GPU doesn't support DSC, then the XDR requires two HBR3 x4 connections over Thunderbolt 3 to achieve 6K60 (38 Gbps).

The XDR has 6016x3384 pixels. Newer 6K displays usually have more pixels (6144x3456) and therefore require more bandwidth. Two tiles of 3072x3456 require more than 40 Gbps to achieve 6K60 at 30bpp. You may have to get rid of HDR, use chroma subsampling, or use a lower refresh rate.

I don't think Apple allows dual HBR3 over Thunderbolt 3/4 for other displays?

I don't think any non-Apple 6K displays have a dual tile mode like the XDR?
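
In case it helps to see where those numbers come from, here's a rough back-of-the-envelope sketch (the 5% blanking allowance is my own assumption; real timings differ a little):

```python
# Rough uncompressed bandwidth estimate for 6K at 60 Hz, 30 bpp (10 bits per channel).
# The blanking overhead factor is an assumption; actual timings vary slightly.

def uncompressed_gbps(h, v, hz, bpp=30, blanking=1.05):
    """Approximate video payload needed, in Gbit/s."""
    return h * v * hz * bpp * blanking / 1e9

print(round(uncompressed_gbps(6016, 3384, 60), 1))  # XDR panel       -> ~38.5 Gbit/s
print(round(uncompressed_gbps(6144, 3456, 60), 1))  # 6144x3456 panel -> ~40.1 Gbit/s
```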
 
If a GPU doesn't support DSC, then the XDR requires two HBR3 x4 connections over Thunderbolt 3 to achieve 6K60 (38 Gbps). ...
I was intending to use the Titan Ridge to combine the signals from its two Mini DP inputs and export via the single DisplayPort 1.4a output to the monitor. I will also be changing the GPU to a Vega 64, which supports up to 8K@60 via 2x DP outputs, ignoring TB completely.

I may find full 6K too small on the screen and scale to a more comfortable text size. Does this play into less bandwidth needed to light up the screen, even if at a lesser resolution? Does a GPU need to support a monitor at its max resolution to even light up, even if scaled down?
 
@CarManDSL "I will also be changing the GPU to a Vega 64, which supports up to 8K@60 via 2x DP outputs, ignoring TB completely."

Vega-series GPUs were among the last AMD graphics cards that do not support DSC.
I think DSC support only came with the Navi-based RX 5000 series and later?

As @joevt explained in the post above (#36), 6K monitors like the Asus PA32QCV (6016x3384), when not using DSC, need 38Gbps to run at 6K60.
A single DP1.4a output can only achieve 32.4Gbps maximum, so it can't drive a monitor higher than 5K if the GPU doesn't support DSC.

The solution at the time (2015-) was to use two DP cables from the GPU to two DP inputs on a single monitor.
Only the Dell 2715K (5K) and the Dell 3218K (8K) monitors supported this dual-DP cabling mode, as well as a DIY conversion board for the 5K iMac (R9A18).

(Apple of course did their own thing with a single cable dual-channel TB3 mode for 5K iMacs and the 2019 MP)

With the introduction of DSC, single cable operation using TB4/USB3/4 cables became possible.
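
To put some rough numbers on the single-cable vs. dual-cable point (a sketch, not gospel: the 32.4 Gbps figure usually quoted for DP 1.4 is the raw link rate, and 8b/10b encoding leaves about 80% of that for video):

```python
# Rough DisplayPort link-budget comparison, assuming DP 1.4 HBR3 with 8b/10b encoding.

LANE_RAW = 8.1          # Gbit/s per lane, HBR3
EFFICIENCY = 0.8        # 8b/10b encoding: ~80% of the raw rate carries video

single_dp14 = 4 * LANE_RAW * EFFICIENCY      # one x4 link  -> ~25.9 Gbit/s usable
dual_dp14   = 2 * single_dp14                # two x4 links -> ~51.8 Gbit/s usable

NEED_6K60 = 38.5                             # Gbit/s uncompressed (see the rough estimate a few posts up)
print(single_dp14 >= NEED_6K60)              # False: a single DP 1.4 link needs DSC for 6K60
print(dual_dp14   >= NEED_6K60)              # True: two links (dual-cable / dual-tile) manage it without DSC
```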

"I may find full 6K too small on the screen and scale to a more comfortable text size. Does this play into less bandwith needed to light up the screen, even if at a lesser resolution?"

Apple's 4K iMac, 5K iMac, ASD and 6K Pro Display XDR monitors all default to displaying the menus, interface and text elements at exactly half the pixel resolution of the panel.
Apple calls this Retina HiDPI, where each menu or text pixel is displayed on the screen by 4 panel pixels.

Graphical elements like screen images and video are still displayed at full resolution.
So 4K video is shown at 4K, etc.
This takes as much bandwidth as if the monitor were running at the highest 6K resolution.
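
A tiny illustration of that, assuming the 6016x3384 panel (this is just the arithmetic, not Apple's actual code):

```python
# Default Retina HiDPI: the desktop "looks like" half the panel resolution in points,
# but each point is drawn with a 2x2 block of real pixels, so the GPU still sends the
# full native resolution to the display and the link bandwidth is unchanged.

panel_w, panel_h = 6016, 3384
points_w, points_h = panel_w // 2, panel_h // 2       # 3008 x 1692 "looks like" size

print(f"UI layout:         {points_w} x {points_h} points")
print(f"Signal to display: {panel_w} x {panel_h} pixels")
```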

"Does a GPU need to support a monitor at it's max resolution to even light up, even if scaled down?"

No. The computer reads the monitor's EDID (identity info) and adjusts the GPU's output to match what the monitor is capable of. The monitor will display the highest-resolution screen raster that the GPU and cable bandwidth are capable of supporting.
Or anything smaller...
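
For the curious, here's a minimal sketch of what "reads the monitor's EDID" amounts to: pulling the preferred timing out of the 128-byte EDID base block. The byte layout follows the EDID spec; how you obtain the raw bytes varies (e.g. from ioreg output on Intel Macs), and the function below just assumes you already have them:

```python
# Minimal sketch: decode the preferred (first) Detailed Timing Descriptor from an EDID
# base block. DTDs are 18 bytes each, starting at offset 54.

def preferred_timing(edid: bytes):
    d = edid[54:72]                                    # first DTD = the preferred mode
    pixel_clock_khz = int.from_bytes(d[0:2], "little") * 10
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank  = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank  = d[6] | ((d[7] & 0x0F) << 8)
    refresh_hz = pixel_clock_khz * 1000 / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh_hz, 2)
```

The OS then compares that against what the GPU and the link can actually drive, and picks a mode accordingly, as described above.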
 
PaulD-UK, THANK YOU so much for the comprehensive answer. Much appreciated.
 
The solution at the time (2015-) was to use two DP cables from the GPU to two DP inputs on a single monitor.
Only the Dell 2715K (5K) and the Dell 3218K (8K) monitors supported this dual-DP cabling mode, as well as a DIY conversion board for the 5K iMac (R9A18).

(Apple of course did their own thing with a single cable dual-channel TB3 mode for 5K iMacs and the 2019 MP)
5K iMac and Thunderbolt 3 LG UltraFine 5K displays both use the same dual tile method as the Dell 2715K. The DisplayPort connections are internal. In the case of a Thunderbolt display, the Thunderbolt controller of the Thunderbolt display takes the two Thunderbolt tunnelled DisplayPort connections and converts them back to real DisplayPort inside the display.

With the introduction of DSC, single cable operation using TB4/USB3/4 cables became possible.
Includes DisplayPort cables. TB4/USB3/4 are all carrying DisplayPort. It may be tunnelled through Thunderbolt or it may be real DisplayPort (through USB-C cable or DisplayPort cable).

Apple's 4K iMac, 5K iMac, ASD and 6K Pro Display XDR monitors all default to displaying the menus, interface and text elements at exactly half the pixel resolution of the panel.
Apple calls this Retina HiDPI, where each menu or text pixel is displayed on the screen by 4 panel pixels.

Graphical elements like screen images and video are still displayed at full resolution.
So 4K video is shown at 4K, etc.
This takes as much bandwidth as if the monitor were running at the highest 6K resolution.
To be clear, the resolution in each case is the maximum resolution. The resolution is not halved. The pixels are just really small, so the OS uses 4 times as many pixels to draw an object on the screen. The 4 panel pixels are separate from each other - they do not need to contain the same color as a single pixel.


Does a GPU need to support a monitor at its max resolution to even light up, even if scaled down?
Most modern displays can accept different resolutions. They can scale up lower resolutions to fill the display. Some displays can scale down higher resolutions. Usually scaling is done on the GPU to match the max resolution of the display. Some display modes have scaling and some do not. Some display modes are HiDPI (using 4 times as many pixels to draw an object) and some display modes are low resolution (not HiDPI). Some display modes combine scaling and HiDPI.
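
To illustrate that last combination (scaling + HiDPI) with made-up but typical numbers, assuming a 6016x3384 panel and a "looks like 3360x1890" choice:

```python
# Scaled HiDPI mode (hypothetical example): the user picks "looks like 3360x1890".
# The GPU renders the desktop into a 2x backing buffer and then scales that buffer
# down to the panel's native resolution before sending it out.

panel      = (6016, 3384)
looks_like = (3360, 1890)                              # hypothetical scaled choice
backing    = (looks_like[0] * 2, looks_like[1] * 2)    # 6720 x 3780, rendered by the GPU

print(f"render {backing[0]}x{backing[1]} -> scale to {panel[0]}x{panel[1]} -> send to display")
```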
 