Apple today released an updated version of the 15.5 firmware for the Studio Display, with the update coming more than two months after the Studio Display firmware was last updated. The prior version of the 15.5 firmware had a build number of 19F77, while the new version is 19F80.

[Image: Apple Studio Display]


Apple's release notes for the update confirm that it addresses an issue with the Studio Display speakers. Since the launch of the Studio Display, there have been complaints about the speaker quality. Apple last week sent out a memo to authorized service providers, acknowledging that some customers have had issues with the Studio Display speakers cutting out or offering distorted playback.

Apple said in the memo that a future update would fix the issue, hence today's firmware update.

The Studio Display firmware can be updated by connecting it to a Mac. Studio Display owners can go to System Preferences > Software Update to install the firmware.
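For those who prefer a script, the same check can be run from Terminal. This is a minimal sketch using macOS's built-in softwareupdate tool; whether attached-display firmware actually shows up in this list rather than only in the System Preferences UI is an assumption on my part, so treat it as a convenience check, not the official route.

```python
import subprocess

# Minimal sketch: list pending macOS updates from a script using the built-in
# `softwareupdate` CLI. Assumption: attached-display firmware may or may not
# appear here; System Preferences > Software Update remains the documented path.
result = subprocess.run(
    ["softwareupdate", "--list"],
    capture_output=True,
    text=True,
)
print(result.stdout or result.stderr)
```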

Article Link: Apple Releases Studio Display Firmware Update to Fix Speaker Issue
My update failed initially. I did the unplug from the computer, no help. I then unplugged peripherals from the display and the update was successful.
 
My understanding is that the OS is closer to iOS than to iPadOS, but the differences between the two are tiny anyway.

I presume you mean the speaker issue. And no, I'd say it would have been less likely to occur: what they did was take iOS's entire audio stack and use it on a display. That stack needs to handle things like intermixing multiple applications, system sounds vs. app sounds, AirPlay (including syncing the audio delay), Bluetooth, etc. A display's audio output would be a way, way smaller surface for potential bugs. And frankly, tons of displays for a tenth the price can handle outputting audio. At the end of the day, this is a case of overengineering.
I also believe they overengineered the display. But not for the reasons you think. I think the Studio Display was a case of a failed attempt at making an iMac, so the chassis and internals were meant for an M1 Max/Ultra. Since the thing was probably engineered from the beginning to be run from macOS inside an all-in-one, that didn’t leave them with many choices. When they failed at making an iMac, they created the Mac Studio instead, but were faced with a dilemma of how to get a monitor out to go with the new headless computer. Given they had already done all the work on the iMac, they used its carcass to make an overengineered monitor that wouldn’t have been overengineered had it been an iMac. No monitor needs that kind of cooling system.

I don't believe they stripped out anything, no.

(Does someone have a URL to an ipsw for it?)
No idea if they did or not, but that’s what I would have done. The libraries are designed to be added and removed at will.

Yes, of course. But there's a reason most embedded devices tend to use simpler OSes.
Most devices start their lives as strictly embedded and have no desktop OS implications. There is no choice but to develop from scratch.

I understand the choice of "hey, we already have Center Stage up and running on iOS, so let's just use that". I don't think it was the right tradeoff.
Not just Center Stage, but Siri and Spatial Audio, the big three of the Studio Display. It’s a non-trivial task to port that code to another processor since you can’t just copy and paste. Every processor has different attributes you must take into account. I would contend it’s riskier to port it than it is to just use what’s there that’s been tested.

Since we cannot predict the future, we do not know if any other significant bugs pop up. If not, I’d say Apple made the right decision. If they pop up at a consistent rate, then no, they didn’t make the right choice. Personally, I think it was the right choice. As a developer myself, I know what’s involved and it’s a lot riskier to do it the way you wanted.
 

No monitor needs that kind of cooling system.

It might, because of the relatively high brightness.

Not just Center Stage, but Siri and Spatial Audio, the big three of the Studio Display. It’s a non-trivial task to port that code to another processor since you can’t just copy and paste. Every processor has different attributes you must take into account. I would contend it’s riskier to port it than it is to just use what’s there that’s been tested.

Sure.

(I really don't think Siri and Spatial Audio are big selling points, though. Center Stage slightly more so, but that didn't exactly get positive press.)

Since we cannot predict the future, we do not know if any other significant bugs pop up. If not, I’d say Apple made the right decision. If they pop up at a consistent rate, then no, they didn’t make the right choice.

I'm not sure I fully agree with that. The problem is the display keeps getting bad press. Sometimes because of mistakes (which, again, wouldn't have happened with a separate OS), sometimes because of minor issues, sometimes because their camera tradeoff choice (choosing an ultrawide to make Center Stage more feasible, but also making any picture where all you want is your own face worse) may have been wrong, sometimes for other reasons.

Some of that is par for the course with Apple products, but the general sense I'm getting is that, as far as recent Mac-related hardware products go, this is the most controversial one. The reality is that many consumers won't be comforted by "sure, but those were teething troubles; the firmware is solid now". They'll look at the headlines, and get the impression that it overall isn't that great a product. Then they look at the price, and wonder if it's worth it.

Yes, from an engineering point of view, it's great that they have a solid basis now. Much easier to keep iterating, and they get any future improvements from iOS for free. But I'm not sure how often that will be a practical benefit (for example, they could've added Bluetooth and Wi-Fi chips to eventually make it an AirPlay receiver in a software update — but they chose not to).
 
You avoided addressing my question:
No, I haven't.

You claimed that lossless codecs can create output files larger than input files, yet haven't provided any evidence that this actually happens. With all due respect, and this is not personal since I don't know you, right now you're just some random guy on the internet making what appears to be a surprising claim without providing any evidence to support it. And I hope you can understand that it simply wouldn't make sense for me to accept it without evidence.
I'm baffled that I'm being asked by some random guy on the internet to "prove" that 1+1=2.

This is completely basic, essential math regarding compression: No finite-complexity algorithm can ever guarantee better than 1:1 compression for all content, let alone any ratio better than 1:1.

For absolutely every actually implementable compression algorithm there is content which cannot be compressed better than 1:1.

Believing the contrary is equivalent to believing in the physical existence of Santa Claus and of perpetual motion machines.

Most of the time even lossless compression will yield smaller files than the uncompressed originals because most content has portions which can be compressed well enough to make up for the parts that can't be compressed at all. But that cannot be guaranteed, which makes it unsuitable for real-time streaming with bandwidth limited to below the uncompressed data rate. That is why lossy compression must be used, at least in those cases. (Compression can combine lossless and lossy compression depending on the ratios required, as DSC is doing, too.)
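To see the pigeonhole point concretely, here is a minimal sketch using Python's zlib as a stand-in lossless codec (not DSC itself, just an illustration): hand it data that is already incompressible, such as random bytes, and the lossless output comes back slightly larger than the input because of header and block overhead.

```python
import os
import zlib

# Random bytes are, with overwhelming probability, incompressible.
payload = os.urandom(1 << 20)             # 1 MiB of noise
compressed = zlib.compress(payload, 9)    # lossless DEFLATE at maximum effort

print(len(payload), len(compressed))      # compressed comes out slightly larger
assert zlib.decompress(compressed) == payload   # still perfectly lossless
```

The expansion here is only a few dozen bytes, but it is exactly why a lossless codec cannot promise to fit a fixed, lower bandwidth for every possible frame, and why a rate-constrained real-time link has to be able to go lossy.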

I'm well aware of the fact that MP3's are audibly lossy. That doesn't mean that DSC is visually lossy.
Yes, it does, because it is a real-time signal compression protocol where that is inherent.

VESA is taking great pains to blow smoke about it by using the term visually lossless, meaning exactly the same as faithful reproduction does in MP3 terms, namely "we just agree to ignore the artefacts for the purpose of this advertisement".

If it were actually lossless, they would be able to say so; but that is of course inherently impossible, it is not the case, and so they don't.

What you and others are falling for is simple advertising using weasel-words to blow enough smoke so most people will be sufficiently confused.
 
No speakers? The integrated speakers are really handy for those who don’t want to deal with bulky external speakers.
And are annoying for those of us who already have speakers, or want multiple monitors, and don't want to pay extra for speakers we will never use.

Sure, make a monitor with speakers and all the bells and whistles for those who want them. Oh wait, they already did.

Now also make one that is just an Apple quality dumb screen, wrapped in Apple design language, they will sell like hot cakes.

Hey, you know what, also make some Apple external speakers. Then you can put them in decent-sized cans so that the bass sounds good, rather than the weird, forced bass that is pumped out of these tiny speakers and causes me to crank the bass down to minimum with my EQ so that spoken voice doesn't sound so weird and hollow and muddy.
 
OK, but there's… not that many configurations? You could have the Studio Display, or the Studio Display with more capable stand, or the Studio Display with less gloss. This isn't like on a Mac, where you could have all kinds of peripherals, software running in the background, etc.



Yeah, but those have to handle a massive amount of configurations. Different permutations of mainboards, CPUs, chipsets, other installed cards, external devices, software running in the background, malware scanners, malware itself, … Getting that into a well-defined state where nothing but the firmware updater will run is hard both from a software and hardware side (which is why some firmware updaters don't even bother trying and just have you reboot into a simpler OS environment).
My update failed initially. I did the unplug from the computer, no help. I then unplugged peripherals from the display and the update was successful.
This is what I'm talking about. In my experience, it's quite often the thing that you don't think would cause the problem that actually IS the cause of the problem. The thousands of peripherals that could potentially be connected to the Mac or to the display can affect anything in the update process. In my case, I have about ten different peripherals attached to my Studio Display through a hub or directly and the update went easily and painlessly. I'm sorry it sucked for some people, but it's the law of large numbers.
 
No, I haven't.


I'm baffled that I'm being asked by some random guy on the internet to "prove" that 1+1=2.

This is completely basic, essential math regarding compression: No finite-complexity algorithm can ever guarantee better than 1:1 compression for all content, let alone any ratio better than 1:1.

For absolutely every actually implementable compression algorithm there is content which cannot be compressed better than 1:1.

Believing the contrary is equivalent to believing in the physical existence of Santa Claus and of perpetual motion machines.

Most of the time even lossless compression will yield smaller files than the uncompressed originals because most content has portions which can be compressed well enough to make up for the parts that can't be compressed at all. But that cannot be guaranteed, which makes it unsuitable for real-time streaming with bandwidth limited to below the uncompressed data rate. That is why lossy compression must be used, at least in those cases. (Compression can combine lossless and lossy compression depending on the ratios required, as DSC is doing, too.)

Yes, it does, because it is a real-time signal compression protocol where that is inherent.
Classic strawman. The claim I've consistently challenged is that you said the output can be bigger than the input after lossless compression:
Yes, but those cannot guarantee that the data isn't actually getting bigger for some signals.
But since you can't provide an example of such a case, you instead dishonestly pretend I'm arguing that the output can never be equal to the input, which I'm not, and never did. Like I said, classic strawman.

And no, I'm not asking for proof that 1+1=2. I feel like I'm arguing with a climate denier who keeps throwing up pseudo-technical arguments. To paraphrase Neil Tyson, you seem to be someone who knows enough about a subject to think they're right, but not enough to realize they're wrong.
VESA is taking great pains to blow smoke about it by using the term visually lossless, meaning exactly the same as faithful reproduction does in MP3 terms, namely "we just agree to ignore the artefacts for the purpose of this advertisement".

If it were actually lossless, they would be able to say so; but that is of course inherently impossible, it is not the case, and so they don't.

What you and others are falling for is simple advertising using weasel-words to blow enough smoke so most people will be sufficiently confused.
There's nothing weaselly here. VESA has been up-front from the start that their algorithm is an attempt to make a codec that IS NOT informationally lossless, but IS visually lossless. Indeed, that's the entire point of the codec. Everyone understands that it is not informationally lossless, so it's laughable to claim that they're somehow hiding something, or confusing people, by using that term. So stop making stuff up. The irony is that you're the one blowing smoke here.
 
Classic strawman. The claim I've consistently challenged is that you said the output can be bigger than the input after lossless compression:
As I said, that is mathematically inherent when you have a finite compression algorithm and completely arbitrary content (both are the case here).

At best the protocol can then switch to the original uncompressed content (with some overhead added on top) but that option does not exist when a fixed compression ratio is mandated by limited channel bandwidth.

Then lossy compression is the only option, as is the case with GPUs which can't drive the full resolution uncompressed to Thunderbolt because they don't support the full data rate.
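For a rough sense of the numbers (a back-of-the-envelope sketch with assumed simplifications: active pixels only, no blanking, a single DisplayPort HBR3 link, and ignoring how Thunderbolt actually tunnels the stream): a 5K panel at 60 Hz with 10 bits per channel already slightly exceeds what one HBR3 link can carry, which is the kind of shortfall DSC exists to cover.

```python
# Back-of-the-envelope only: active pixels, no blanking, one DP HBR3 link assumed.
width, height, refresh_hz = 5120, 2880, 60     # 5K panel timing basics
bits_per_pixel = 3 * 10                        # RGB at 10 bits per channel

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
hbr3_payload_gbps = 8.1 * 4 * 0.8              # 4 lanes x 8.1 Gbit/s, 8b/10b coding

print(f"raw 5K60 10-bit stream:   {raw_gbps:.1f} Gbit/s")          # ~26.5
print(f"single HBR3 link payload: {hbr3_payload_gbps:.1f} Gbit/s")  # ~25.9
```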

The original contention was that:

a) DSC was completely lossless – it isn't (actual lossless compression is used as far as possible, but it isn't always possible, in particular not with high-contrast, all-channel-full-spectrum-noisy content)

b) Apple mandated that GPUs always need to support DSC for the XDR (and by extension for the ASD) – they don't (because in both cases they very much support the full, uncompressed image signal if the GPU can actually provide that through Thunderbolt, which some smaller / older GPUs just can't, so only those need DSC to make up for the shortfall, but with the risk of compression artefacts – the more, the higher the compression ratio)

Both points have now indeed been conceded exactly as I had been saying from the start, so agreement at long last.

There's nothing weaselly here. VESA has been up-front from the start that their algorithm is an attempt to make a codec that IS NOT informationally lossless, but IS visually lossless. Indeed, that's the entire point of the codec. Everyone understands that it is not informationally lossless, so it's laughable to claim that they're somehow hiding something, or confusing people, by using that term. So stop making stuff up. The irony is that you're the one blowing smoke here.
Nowhere clearly communicating that DSC does indeed come with artefacts under certain circumstances, while the PR copy incorrectly suggests perfect quality, is simply dishonest.

This kind of misdirection is quite common, unfortunately, and in this discussion it is exactly what caused the confusion about the Apple displays.

Even some technical information must be read very carefully, the way a lawyer would, because it is not written for clarity but for obfuscation.

Unfortunately even Apple isn't adding that much to explain the issue.
 
The original contention was that:

a) DSC was completely lossless – it isn't (actual lossless compression is used as far as possible, but it isn't always possible, in particular not with high-contrast, all-channel-full-spectrum-noisy content)
...

Both points have now indeed been conceded exactly as I had been saying from the start, so agreement at long last.
Nope, that was not the original contention. That's a complete misrepresentation--a total strawman. Here was the original contention, in black-and-white. I don't know how it could be any clearer. Again, stop making stuff up.
But true lossless compression isn't what's being discussed here. We're talking about DSC, which is designed to be informationally lossy but visually lossless.
 
This comparison chart has a lot of bias in it.

It is OK to find a different monitor that's better for you, or a better deal, but one should at least be fair with the comparisons.
I agree.
Having TRUE 10-bit is already a discriminating factor, price-wise.
I haven't found anything below $1,200 with true 10-bit colour (not 8-bit + FRC).
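For anyone wondering what "8-bit + FRC" means in practice, here is a toy sketch (my own illustration, not any particular panel's algorithm): the panel fakes an in-between 10-bit level by rapidly alternating the two nearest 8-bit levels so that the time-average lands on the target, whereas a true 10-bit panel can show that level natively in every frame.

```python
# Toy illustration of temporal dithering (FRC); real panels use more elaborate
# spatio-temporal patterns, but the averaging idea is the same.
target_10bit = 514                       # sits between 8-bit levels 128 and 129
low, high = target_10bit // 4, target_10bit // 4 + 1
high_frames = target_10bit % 4           # how many of 4 frames show the higher level

frames = [high] * high_frames + [low] * (4 - high_frames)
print(frames, sum(frames) / 4)           # [129, 129, 128, 128] averages to 128.5
```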
 
Nope, that was not the original contention. That's a complete misrepresentation--a total strawman. Here was the original contention, in black-and-white. I don't know how it could be any clearer. Again, stop making stuff up.
That is only MP3-level reproduction, which makes the same claims, and it is very much reduced quality, which was my point all along.

And it is not the standard mode for the Apple displays. Full, uncompressed video is.
 
Not salty, it’s the truth. Apple nerds would defend eugenics if it came with an Apple sticker and a bloated price tag.
You’re obviously in denial then. It’s far from the truth, and yet again, you’re acting salty… 🧂
Man oh man..
This topic got down deep in the weeds
Like most other threads on this forum? 🤣
And are annoying for those of us who already have speakers, or want multiple monitors, and don't want to pay extra for speakers we will never use.

Sure, make a monitor with speakers and all the bells and whistles for those who want them. Oh wait, they already did.

Now also make one that is just an Apple quality dumb screen, wrapped in Apple design language, they will sell like hot cakes.

Hey, you know what, also make some Apple external speakers. Then you can put them in decent-sized cans so that the bass sounds good, rather than the weird, forced bass that is pumped out of these tiny speakers and causes me to crank the bass down to minimum with my EQ so that spoken voice doesn't sound so weird and hollow and muddy.
I’ve said this before, but Apple is never going to try to make a “dumb” product on purpose. That’s not how they roll nowadays.

Thankfully, you represent a minority of users who want to plug in bulky external speakers. Apple wouldn’t want to deal with that.

Audiophiles are unbelievably selfish.
Wow amazing. Congratulations. Incredible that someone came on to say their display was fine. Because we ALL just assumed that EVERY single display was affected - so your information is priceless.
Another prime example of selfishness… 🤦‍♂️
 
You’re obviously in denial then. It’s far from the truth, and yet again, you’re acting salty… 🧂

Like most other threads on this forum? 🤣

I’ve said this before, but Apple is never going to try to make a “dumb” product on purpose. That’s not how they roll nowadays.

Thankfully, you represent a minority of users who want to plug in bulky external speakers. Apple wouldn’t want to deal with that.

Audiophiles are unbelievably selfish.

Another prime example of selfishness… 🤦‍♂️

Far from the truth yet here you are…
 
Using a new MacBook running the latest non-beta OS to initiate a software update on a new Studio Display isn't exactly an edge case.
It seems that other devices being connected to the display during the update might be the trigger for such issues (and those are of course legion and their behaviour is effectively unpredictable).

I have both my ASDs connected directly to the Mac Studio with none of the USB ports on the ASDs in use, and the update went through quickly and without a hitch (macOS is up to date).

But I can't say for sure if that's the reason; users who did have issues could possibly shed more light on this.
 
I’ve said this before, but Apple is never going to try to make a “dumb” product on purpose. That’s not how they roll nowadays.
Indeed.

Thankfully, you represent a minority of users who want to plug in bulky external speakers. Apple wouldn’t want to deal with that.

Audiophiles are unbelievably selfish.
Maybe we can get that down a notch: I personally happen to use a separate amplifier and big-box speakers so I normally don't need the built-in speakers of the ASDs. I just ignore them apart from a few quick tests.

But I understand that they make the ASD a nice, complete package with any Mac, making normal desktop speakers redundant.

And given the total cost I wouldn't see the point of whining about the speakers being there – there are clearly many more people who are using them than people who ignore them.

We in the minority who are using bigger, better external speakers can just suck it up and let others be happy with them being there. Apple clearly doesn't want to offer different variants of the ASD with or without speakers – the extra supply chain overhead would probably not be worth the savings.
 
Far from the truth yet here you are…
You can’t be bothered to admit it, can you? It is the truth.
Using a new MacBook running the latest non-beta OS to initiate a software update on a new Studio Display isn't exactly an edge case.
Do you have any numbers to back that up?
Indeed.


Maybe we can get that down a notch: I personally happen to use a separate amplifier and big-box speakers so I normally don't need the built-in speakers of the ASDs. I just ignore them apart from a few quick tests.

But I understand that they make the ASD a nice, complete package with any Mac, making normal desktop speakers redundant.

And given the total cost I wouldn't see the point of whining about the speakers being there – there are clearly many more people who are using them than people who ignore them.

We in the minority who are using bigger, better external speakers can just suck it up and let others be happy with them being there. Apple clearly doesn't want to offer different variants of the ASD with or without speakers – the extra supply chain overhead would probably not be worth the savings.
Well said! :)
What a strange generalization to lean into
It’s not a “strange” generalization. It’s a fact. Just take a brief look at this thread!
Many of us do not want to fritter away days/weeks as an alpha/beta hardware or software tester.

For that much money, the thing should be rock solid and sing like a bloody choir in my opinion.
Sorry, you should never expect perfection, especially for a 1st generation product. In the world of hardware and software, there’s no such thing as absolute perfection. 🤷‍♂️
 
You can’t be bothered to admit it, can you? It is the truth.

Do you have any numbers to back that up?

Well said! :)

It’s not a “strange” generalization. It’s a fact. Just take a brief look at this thread!

Sorry, you should never expect perfection, especially for a 1st generation product. In the world of hardware and software, there’s no such thing as absolute perfection. 🤷‍♂️
Historically, Apple did a much better job of hitting the mark on the first iteration. All the hyped marketing BS has gone too far imo

I'm happy to step aside and let the lemmings go off the cliff or stand around hand wringing in a tizzy for months until Apple sorts out their too-soon-to-market blunders :)

The equipment is far too expensive to accept anything less (and so is my time... doing productive things LOL).
 
You can’t be bothered to admit it, can you? It is the truth.

Do you have any numbers to back that up?

Well said! :)

It’s not a “strange” generalization. It’s a fact. Just take a brief look at this thread!

Sorry, you should never expect perfection, especially for a 1st generation product. In the world of hardware and software, there’s no such thing as absolute perfection. 🤷‍♂️

Your comments just prove my point for me and you are clearly too dense to recognize that.

Nobody expects perfection, yet people like you defend the products as if they were perfect. What we do expect is competence and, in the case of this monitor, not a 2015 product in 2022 at 2026 prices.

Imagine what Apple could be if we celebrated its wins and called out its losses equally. No? OK, pathetic fanboyism it is, because we somehow get our identities from our products.

Lol move on kid.
 
Using a new MacBook running the latest non-beta OS to initiate a software update on a new Studio Display isn't exactly an edge case.
That's not how testing works. Companies have to anticipate every possible peripheral that can be connected to either device to figure out whether some hardware interaction can cause problems. The simple case of a Studio Display connected to a Mac with no other peripherals involved probably doesn't ever fail. Add some permutation of hundreds of different peripherals and things do start to fail: power draw, power surges, third-party driver interactions, or some change in the system caused by a peripheral can all cause things to go wrong.

Apple and every other company test their installs, but they can only test to a certain limit. They don't have every peripheral in existence, and even if they did, it would take them years to test everything for something as simple as an install. As someone in the engineering field, I'm saying it's impossible to fully test every possible combination when some random third-party attachment you've never heard of could screw things up. This is why installs work for the vast majority, but sometimes one fails because someone had a random hard drive attached.
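To put an illustrative number on that combinatorial blow-up (figures assumed for the sake of the example, not anything from Apple's actual QA): even a modest catalogue of peripheral models becomes tens of millions of possible test setups once a few can be attached at once.

```python
import math

models, max_attached = 200, 4   # assumed: 200 peripheral models, up to 4 attached
configs = sum(math.comb(models, k) for k in range(max_attached + 1))
print(f"{configs:,} distinct combinations")   # about 66 million, before counting
                                              # hubs, firmware revisions, or drivers
```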

I pointed to one person on this thread who said they unhooked all their peripherals and it worked. If someone has an install that isn't working right, that's what I'd recommend. Make your system the simplest configuration possible and that'll probably work for you.
 
So much blind hate for this product. I’ve had mine since day 1 and it’s been exactly what I’ve wanted for years.

It looks amazing, I’ve had no audio issues (though glad there’s new firmware), and the webcam now looks as good as any other built in webcam Apple ships in Macs, only with Center Stage (usefulness of that feature is debatable).

The only other 5K display on the market is ugly to look at, is 6 years old, and still costs $1,299. 4K is not the same as 5K. Hi-PPI/Retina is glorious. $1,599 is expensive but any $1,000+ display is a luxury item and I’m happy to have it.
I'm loving the display. I had the speaker glitch once, which a reboot fixed. Having an integrated webcam is exactly what I want. The firmware has made some improvement, but the iMac 24 camera definitely performs better and I would be happier with that. I don't know why they had to go all Center Stage with it, and I think that's where they made a mistake, though I'm not sure how they allowed it to ship like this. It is certainly a case where Apple's marketing doesn't meet the reality, and they should really be pulled up on that. It's not a deal breaker and I won't be changing my monitor; it's a great monitor, but the webcam still has potato-cam qualities. It was a daft move by Apple to use that sensor with Center Stage.
 