Does the use of an A-series (A13) chip in the current studio display mean it will eventually be unsupported (and unable to be connected to) by a future generation of Macs or MacOS releases down the line?
The A-series chip in the Studio Display doesn't really communicate with the Mac in any meaningful way. It just runs the stuff inside the display (and all displays have some kind of control chip in them; this one just happens to use a somewhat over-powered A13).
If the connection depended on an Apple-chip-to-Apple-chip handshake, the display wouldn't work when connected to a PC (which it does, as long as the PC supports Thunderbolt video-out).
So there's no reason to think that at some point Apple will just turn off all attached Studio Displays. The display might stop receiving firmware updates eventually, but that's no different from any other display out there.
Yep. It seemed strange to me that more or less as soon as the ASD hit the market, the rumor mill was projecting that a mini-LED, ProMotion, HDR version was in the works. Since then, crickets.
Maybe it's just me, but I don't really understand the broad appeal of HDR on a computer monitor. I get that some folks need it for the work they do, but that has to be a fairly small audience. What little video I watch on the computer doesn't benefit all that much from HDR. The picture on the ASD is really great. I have the Dell 6K and rarely avail myself of its HDR capability.