I see a lot of people criticizing the Studio Display, but it really is something else. I have yet to find any other display on the market that matches it on DPI and color quality. I think the simple truth is that Apple charges what it does because it can. There's simply nothing else comparable.

Even the LG UltraFine 5K display doesn't measure up. I've compared them side by side. The LG's colors look duller, and it has color banding in greyscale sweeps.

Yeah, it sucks that Apple's less expensive display option is still expensive, but I blame the lack of competition. Even without HDR and the like, it destroys everything else. Luckily, the few issues that have come up have been fixable with software updates.
It is nevertheless still open to criticism. From dpreview.com:

"I'd much rather Apple had budgeted the $1,600 differently by swapping the webcam, speakers, and microphones for a miniLED backlight and a true 10-bit panel that covers 98%+ of both DCI-P3 and AdobeRGB. That's what I consider a 'Studio' quality display...The Studio Display is an excellent monitor with some really nice features, but it's expensive, and a lot of that money is paying for features that many creatives don't need from a 'studio' display for professional work."


It seems what Apple did was to package a consumer-grade display (not much different in performance from that on the 27" iMac, which cost nearly the same and came with a whole computer), add some features most studio professionals don't need, and then charge professional-grade pricing.
 
Making a Retina-quality monitor at the size of an iPad and doing so at just below the size of a small TV are completely different manufacturing processes. I want a 120Hz mini-LED 5K screen too, but clearly it's not possible at the moment, at least not at any reasonable price, or plenty of people would be showing Apple up right now.

Yet the 24-inch iMac has a 4.5K Retina display and starts at $1,299, and the 27-inch iMac with a 5K Retina display started at $1,799. Those include an entire computer, which presumably makes up about half the cost, yet the ASD starts at $1,599. You can't tell me the computer parts of an iMac are only $200 and the display parts the rest.
 
@theorist9

A superb post and point

I'd argue the ASD is perhaps named incorrectly given what tradeoffs they chose and what they optimized for.

The name would make one think it's meant to pair with the Mac Studio, which are pretty middle- to high-end small desktops, perhaps ideally suited to "studio" (and other professional) usage.

Those folks are the most likely to have their own higher-end external peripherals (and not really be interested in, or benefit from, the ASD's built-in components).

Honestly, the very best display panel with perhaps just some USB ports would probably have been the better choice.

Instead, the ASD makes choices and tradeoffs that make one think it's meant to be a MacBook dock
 
Yes, but those [true lossless codecs] cannot guarantee that the data isn't actually getting bigger for some signals.
Of course they can. Just add an instruction to the codec to revert to the original file anytime the output of the codec is larger than the original file. Then, with this small addition to the algorithm, you're guaranteed never to get a bigger output file. But do you have any evidence such pathological cases (where a true lossless codec produces an output larger than the input) actually occur?
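To make that concrete, here's a rough Python sketch of what I mean, with zlib standing in for a generic lossless codec (the one-byte flag is purely illustrative, not how any real codec actually signals this):

```python
import zlib

RAW, COMPRESSED = b"\x00", b"\x01"  # 1-byte header flag (illustrative only)

def pack(data: bytes) -> bytes:
    """Losslessly compress, but fall back to storing the raw bytes
    whenever compression would make the payload larger."""
    compressed = zlib.compress(data, 9)
    if len(compressed) < len(data):
        return COMPRESSED + compressed
    return RAW + data  # worst case: original size + 1 byte of overhead

def unpack(blob: bytes) -> bytes:
    flag, payload = blob[:1], blob[1:]
    return zlib.decompress(payload) if flag == COMPRESSED else payload

if __name__ == "__main__":
    import os
    text = b"compressible " * 1000   # highly redundant -> shrinks
    noise = os.urandom(10_000)       # incompressible -> stored raw
    for sample in (text, noise):
        packed = pack(sample)
        assert unpack(packed) == sample
        print(len(sample), "->", len(packed))
```

Worst case you pay one byte of overhead on incompressible input; you never come out larger than that.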

Such lossy encodings are very much detectable given certain kinds of signals where the compression is overwhelmed by the complexity of the raw signal. You only need to know what to look or listen for.

And as I said: They make the output unreliable, which makes them completely useless to a professional workflow.
Again, you're making this claim, but do you have any evidence to support this? Can you give me any references?

I provided all the references I was able to find: one from VESA, which I acknowledged is not by itself sufficient because it was funded by an organization with a financial interest in the outcome, and one that is independent. Both found DSC was visually lossless for 2D images. I acknowledged that while this supports the position that DSC is visually lossless, it's not enough to establish it definitively.

Meanwhile, you've provided no references to support the position that DSC is visually lossy, yet you claim with certainty that it is. Do you see the problem here?
 
I'm just so happy that in my many decades of computing, I've never once had to think about "updating my monitor"
I have. I had a Samsung G9 49" ultrawide monitor that had serious problems when it came out. It was so bad they had to recall it within the first two months and stopped shipping it for several months. When they finally re-released it, it required a ton of firmware updates to get it to work with Nvidia cards because DSC was broken. Even though it was finally stable, I eventually ditched the monitor because my eyes got bad enough that I got rid of my Windows PC in favor of my MBP. It was my gaming monitor. Fortunately, I didn't have to read much when gaming, but I did learn that 240Hz looked no different to me than 60Hz.

Every monitor has firmware updates. If you don't look for them, you don't know they exist, but they all have bugs that they fix. Humans make these products and humans make mistakes. Even the dumbest monitor has firmware updates. As the IT person in my household, I routinely update monitors in my house for my family along with everything else that requires updating.
 
Every monitor has firmware updates. If you don't look for them, you don't know they exist, but they all have bugs that they fix.

Not ones that are "necessary" like the ASD ones have been

I've literally never done a monitor update -- ever

And trust me, I'd seek one out and do something if I were having an issue

I guess I'm just lucky -- all my monitors have been great right out of the box
 
The onus to drive the price down is on companies like Apple. Instead, they've decided the opposite: they've made Retina a luxury. (And in the meantime, they've made text rendering on non-Retina displays worse, by killing subpixel antialiasing support.)
Exactly. This is a point I've made repeatedly on these forums. Apple changed MacOS in a way that requires consumers to use it on a Retina monitor to get optimal sharpness, yet they don't provide any consumer-priced Retina external monitors.

A very common consumer usage pattern is to buy a laptop and then pair it with an external monitor for home use. Up to High Sierra (the last OS to implement native subpixel text rendering), you could get a MacBook Air and hook it up to a sub-$500 4K 27" monitor (163 ppi) and have a great visual experience. No longer.
 
I don't think so. Maybe in the past?
Lenovo maybe?
Can they change out the hardware via a software update? Because that's what it'll take. Blind fanboyism is exactly why Apple does this stuff and gets away with it.
No need to act so salty… 🧂
ProMotion
Ease of use
A monitor that has the same design language as my Macbook

Just take the M1 iMac and take the computer out. That's all anyone wants
Not everybody though, right?
That monitor put together out of various old parts is quite the dud
Old parts? Which “old parts” are you even referring to?
And "...no speakers. Just a screen, nothing else."
No speakers? The integrated speakers are really handy for those who don’t want to deal with bulky external speakers.
Does anyone know where I can get the wallpaper shown in this article?
 
I’ll admit when Apple makes mistakes and the update process is where they definitely made one
As the IT person in my household who updates all the equipment in the house for my family, here is the update process for just about every non-Apple monitor:

1) Go on the web to see if the manufacturer has a firmware update. I've never seen a notification that one exists, unlike with Apple. To find the update, you have to know the exact model number. If I don't remember it off the top of my head, I have to pull out my lighted magnifying glass to read the label.
2) Find the model and download the firmware. Extract the update.
3) Stick in a USB flash drive and copy it over after reading the directions for exactly how to update. Some require running an installer that only works on Windows. Others have the process built into the monitor.
4) Stick it in the monitor's update port, hoping you got the right one. They only ever work with one port, even if your monitor has several. After sticking it in and failing to find the update, you finally figure out it's because your drive is formatted as exFAT instead of FAT32, or vice versa. I've had monitors that required the old FAT.
5) Go back and reformat the drive because every stupid monitor uses a different format and can't recognize the other formats.
6) Stick it back into the monitor and find out you put it in the wrong port because the monitor couldn't find the installer.
7) Remove it and put it in each port until you find the right one.
8) Finally the monitor finds the installer and runs it. Go get a drink and come back to see if it worked. Or stare mesmerized by the slow-moving progress bar if there is one.
9) Success!

Apple's monitor update process:
1) Go to System Preferences or Settings and run update (or have it run automatically). It'll pop up a window saying you have an update. If you have it set to run automatically, just logging in tells you you have an update.
2) Click yes to run it and authenticate using Touch ID or type your password.
3) Installer runs while you go get a drink or stare at the three dots or the progress bar on the laptop's monitor.
4) Done when you come back, or if not, you wait until the process finishes.
5) Success!

Where was Apple's mistake? Are people seriously complaining that Apple reboots your computer? First world problems that aren't problems. Who really gives a crap that a computer gets rebooted?

If you're complaining that something might get bricked, that's a danger anytime you update the firmware of anything. I've bricked all sorts of things, like DVD players, over the decades, but it happens very rarely. Ever try flashing a BIOS on a Windows computer? Screw up and you need a new motherboard.
 
Not ones that are "necessary" like the ASD ones have been

I've literally never done a monitor update -- ever

And trust me, I'd seek one out and do something if I were having an issue

I guess I'm just lucky -- all my monitors have been great right out of the box
I've found most updates are designed to enhance the visual quality of the monitor. Others fix bugs that you personally may not have encountered. For instance, I haven't seen this Studio Display bug, and I've had mine for quite a while. Take a look yourself. If you've had your monitors for a while, you'll probably find that a few revisions have been posted. It's up to you whether to install them, but they're probably there. If you don't look for the updates, you'll never find them, because nobody notifies you that a firmware update exists except Apple. They will never show up in Windows Update.

Samsung took about three tries before they finally fixed that DSC bug that caused Nvidia cards to top out at 120Hz instead of the 240Hz the monitor was capable of. It was quite high profile because it affected the main selling point of the monitor (240Hz), but if you didn't follow the tech news, you probably didn't hear about it. I bought one after the recall and re-release because they dropped the monitor price by over a third afterwards. Lucky for me, the frame wasn't misaligned or cracked with light bleeding through, which was the main reason for the recall.
 
Where was Apple's mistake? Are people seriously complaining that Apple reboots your computer? First world problems that aren't problems. Who really gives a crap that a computer gets rebooted?

For sure on that.

It's grasping at straws to have something, anything, to whine about regarding a display many people will likely never need (or that may not be within their budget). Having a non-detachable power cord is another forum crowd-pleaser.

Again... it's for people who create and are picky about color and resolution (and, in some cases, brightness). Anybody who regularly uses Lightroom or Photoshop, for example. I'm sure there are more situations. Or anyone with vision issues, for whom a superb display makes the difference between enjoying what's presented and struggling to read it.

Not for gamers.
 
Putting a mobile phone's operating system in a monitor, to avoid having to port Center Stage to a much simpler, more foolproof OS.

If they had followed your choice to make a completely separate OS, that issue would have still been there, since they would have simply ported the code from iPadOS. There's no evidence that the issue was caused by it being a full iOS or iPadOS (I lean towards the latter, because iOS 15 didn't support Center Stage). Is there a risk of more bugs because it's full iOS/iPadOS? Perhaps, but only if it affects the portion of the code that is being used. If Apple finds a bug somewhere else, they don't need to fix it. I wouldn't be surprised if Apple stripped out all the libraries that aren't being used, so it probably isn't a full iPadOS.

As a software developer, I would say you're right in general that less complexity leads to fewer bugs, but that is offset somewhat by tried and true code that was already in production versus brand new code due to a port. The code involved had already gone through half a year's worth of testing and use in other devices by real customers. If I were in the product manager's position, I'd probably have made the same choice (I'm a software developer who has worked on the firmware of many proprietary hardware projects).
 
If they had followed your choice to make a completely separate OS, that issue would have still been there, since they would have simply ported the code from iPadOS. There's no evidence that the issue was caused by it being a full iOS or iPadOS (I lean towards the latter, because iOS 15 didn't support Center Stage). Is there a risk of more bugs because it's full iOS/iPadOS? Perhaps, but only if it affects the portion of the code that is being used. If Apple finds a bug somewhere else, they don't need to fix it.

This.

If you have Center Stage and other features on the display, then it needs firmware and a processor to support those features. The simplest way to do that is to just do what Apple did.

Otherwise all the processing has to be done by macOS.

This way, all the processing is offloaded to the display's processor, which frees up resources on the Mac.
 
Apple's monitor update process:
1) Go to System Preferences or Settings and run update (or have it run automatically). It'll pop up a window saying you have an update. If you have it set to run automatically, just logging in tells you you have an update.
2) Click yes to run it and authenticate using Touch ID or type your password.
3) Installer runs while you go get a drink or stare at the three dots or the progress bar on the laptop's monitor.
4) Done when you come back, or if not, you wait until the process finishes.
5) Success!

Except it's easy to find instances of many people getting stuck in restore mode when they do this. Hell, after calling Apple and working with them they told me to bring it to an Apple Store. It was only because I was stubborn and kept trying after the call that I eventually got it to upgrade. This last update was 20 minutes of a reboot loop followed by me power cycling it twice before it upgraded. I love my ASD but this upgrade process did not get enough QA.
 
Of course they can. Just add an instruction to the codec to revert to the original file anytime the output of the codec is larger than the original file.
That's an escape from the compression getting larger than the full data stream, but that escape only spits you out at 100% plus escaping overhead. It doesn't get you to roughly halving your data rate, as you need to do when you're squeezing a 6K/10-bit/60Hz signal into just half the TB bandwidth.

Only actually throwing away information will get you there in such cases.
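Rough numbers, just to show the scale of the problem (the 6016×3384 resolution and the ~20 Gbit/s for "half the TB bandwidth" are my own back-of-the-envelope assumptions, ignoring blanking and protocol overhead):

```python
# Back-of-the-envelope only; real figures depend on timing/blanking and on
# how much of the Thunderbolt link is reserved for the DisplayPort tunnel.
width, height = 6016, 3384        # assumed 6K panel resolution
bits_per_pixel = 3 * 10           # RGB at 10 bits per channel
refresh_hz = 60

raw_gbps = width * height * bits_per_pixel * refresh_hz / 1e9
link_budget_gbps = 20             # assumed: roughly half of a 40 Gbit/s TB3 link

print(f"raw video stream : {raw_gbps:.1f} Gbit/s")   # ~36.6 Gbit/s
print(f"assumed budget   : {link_budget_gbps} Gbit/s")
print(f"needed ratio     : {raw_gbps / link_budget_gbps:.2f}:1")
```

Even with generous rounding you need close to a 2:1 reduction on every single frame, and a lossless codec simply cannot promise that.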


Then, with this small addition to the algorithm, you're guaranteed never to get a bigger output file. But do you have any evidence such pathological cases (where a true lossless codec produces an output larger than the input) actually occur?
That is a really basic fact of life with lossy compression, and with all lossy codecs you can generate the difference signal between the full original and the compressed signal. The more you need to compress, the faster the compression needs to work in real time, and the older the protocol, the worse the outcome and the bigger that discrepancy will be.

Listen to an old 128kb/s MP3 encoding of a challenging musical piece with multiple instruments and singing voices interplaying and you'll have no problem noticing the discrepancies and artefacts. Increase the bitrate and use a better codec to lessen those artefacts, but they only go away once you have full lossless encoding. And that can never have a guaranteed compression rate, inherently.

The same principle applies to video signals in related ways.
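If you want to see the difference-signal idea for yourself, here's a toy Python example (it's not MP3 or DSC, just crude quantization standing in for a lossy codec, with zlib as the lossless comparison):

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
original = rng.normal(size=48_000).astype(np.float32)   # one second of a "signal"

# Lossless round trip: compress the raw bytes and decompress them again.
lossless = np.frombuffer(zlib.decompress(zlib.compress(original.tobytes())),
                         dtype=np.float32)

# Crude stand-in for a lossy codec: quantize to a coarse step size.
step = 0.25
lossy = np.round(original / step) * step

print("lossless residual max:", np.abs(original - lossless).max())  # exactly 0.0
print("lossy residual max   :", np.abs(original - lossy).max())     # up to step/2
```

The lossless round trip reconstructs the signal bit for bit; the lossy one always leaves a nonzero residual, and that residual is exactly the information that was thrown away.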

Again, you're making this claim, but do you have any evidence to support this? Can you give me any references?

I provided all the references I was able to find: one from VESA, which I acknowledged is not by itself sufficient because it was funded by an organization with a financial interest in the outcome, and one that is independent. Both found DSC was visually lossless for 2D images. I acknowledged that while this supports the position that DSC is visually lossless, it's not enough to establish it definitively.

Meanwhile, you've provided no references to support the position that DSC is visually lossy, yet you claim with certainty that it is. Do you see the problem here?
Are you paying me for a research project here? No, you're not.

These are perfectly standard facts about any lossy compression not just in current existence, but inherent to compression by principle: When lossless compression is not enough to achieve your target compression rate (and inherently it can't be enough in all cases), you must throw away some of the actual data. It is completely unavoidable, no way around it. There is no free lunch.
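The "can't be enough in all cases" part isn't even controversial; it's a simple counting argument, which a few lines of Python can sanity-check (purely illustrative, nothing DSC-specific):

```python
# Counting argument: there are 2**n distinct inputs of n bits, but only
# 2**n - 1 distinct outputs shorter than n bits, so no lossless scheme
# can shrink *every* n-bit input.
for n in range(1, 9):
    inputs = 2 ** n
    shorter_outputs = sum(2 ** k for k in range(n))   # lengths 0 .. n-1
    assert shorter_outputs < inputs
    print(f"n={n}: {inputs} inputs vs {shorter_outputs} shorter outputs")
```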

But this presentation here goes into a lot more detail on DSC in particular, which works exactly along those principles as described above:

And the fact that Apple only offers DSC as a fallback option, and avoids it whenever possible, should tell you something, too.
 
Any complaints about what the Apple Studio Display actually is would gain a lot of credibility if accompanied by links to actually available, better products which can match or exceed it in its crucial capabilities.

Bugs are never welcome, and the ASD is no exception there. But it actually delivers what it says on the tin, it is practically alone in the marketplace on this to this day, and Apple does support it properly when they need to.

Fantasizing about purely fictional, imaginary products which just do not exist is of no help to anyone. The ASDs before me on my desk are, though.
 
If they had followed your choice to make a completely separate OS, that issue would have still been there, since they would have simply ported the code from iPadOS. There's no evidence that the issue was caused by it being a full iOS or iPadOS (I lean towards the latter, because iOS 15 didn't support Center Stage).

My understanding is the OS is closer to iOS than to iPadOS, but the differences between the two are tiny anyway.

I presume you mean the speaker issue. And no, I'd say it would have been less likely to occur: what they did was take iOS's entire audio stack and use it on a display. That stack needs to handle things like intermixing multiple applications, system sounds vs. app sounds, AirPlay (including syncing the audio delay), Bluetooth, etc. A display's audio output would be a way, way smaller surface for potential bugs. And frankly, tons of displays for a tenth the price can handle outputting audio. At the end of the day, this is a case of overengineering.



Is there a risk of more bugs because it's full iOS/iPadOS? Perhaps, but only if it affects the portion of the code that is being used. If Apple finds a bug somewhere else, they don't need to fix it. I wouldn't be surprised if Apple stripped out all the libraries that aren't being used, so it probably isn't a full iPadOS.

I don't believe they stripped out anything, no.

(Does someone have a URL to an ipsw for it?)

As a software developer, I would say you're right in general that less complexity leads to fewer bugs, but that is offset somewhat by tried and true code that was already in production versus brand new code due to a port.

Yes, of course. But there's a reason most embedded devices tend to use simpler OSes.

The code involved had already gone through half a year's worth of testing and use in other devices by real customers. If I were in the product manager's position, I'd probably have made the same choice (I'm a software developer who has worked on the firmware of many proprietary hardware projects).

I understand the choice of "hey, we already have Center Stage up and running on iOS, so let's just use that". I don't think it was the right tradeoff.
 
Lenovo maybe?

No need to act so salty… 🧂

Not everybody though, right?

Old parts? Which “old parts” are you even referring to?

No speakers? The integrated speakers are really handy for those who don’t want to deal with bulky external speakers.


Not salty, it’s the truth. Apple nerds would defend eugenics if it came with an Apple sticker and a bloated price tag.
 
Except it's easy to find instances of many people getting stuck in restore mode when they do this. Hell, after calling Apple and working with them they told me to bring it to an Apple Store. It was only because I was stubborn and kept trying after the call that I eventually got it to upgrade. This last update was 20 minutes of a reboot loop followed by me power cycling it twice before it upgraded. I love my ASD but this upgrade process did not get enough QA.
It’s easy to find people who have had trouble installing anything. Apple is no different. No company can predict every possible configuration. The things that affect these types of situations the most are external peripherals. You wouldn’t believe how many problems I’ve had updating dozens of different things. Try updating a BIOS or Nvidia graphics card firmware on occasion. If you want risky, try those. Once, it took me six months to update my BIOS because the thing simply would not run without erroring out. I had to skip that version and wait for the next one, which rather ticked me off because I couldn’t run Windows 11 without it. One Nvidia firmware update for an EVGA card corrupted my system so badly I had to reinstall Windows.
 
It’s easy to find people who have had trouble installing anything. Apple is no different. No company can predict every possible configuration.

OK, but there's… not that many configurations? You could have the Studio Display, or the Studio Display with more capable stand, or the Studio Display with less gloss. This isn't like on a Mac, where you could have all kinds of peripherals, software running in the background, etc.

The things that affect these types of situations the most are external peripherals. You wouldn’t believe how many problems I’ve had updating dozens of different things. Try updating a BIOS or Nvidia graphics card firmware on occasion. If you want risky, try those.

Yeah, but those have to handle a massive amount of configurations. Different permutations of mainboards, CPUs, chipsets, other installed cards, external devices, software running in the background, malware scanners, malware itself, … Getting that into a well-defined state where nothing but the firmware updater will run is hard both from a software and hardware side (which is why some firmware updaters don't even bother trying and just have you reboot into a simpler OS environment).
 
That's an escape from the compression getting larger than the full data stream, but that escape only spits you out at 100% plus escaping overhead. It doesn't get you to roughly halving your data rate, as you need to do when you're squeezing a 6K/10-bit/60Hz signal into just half the TB bandwidth.

Only actually throwing away information will get you there in such cases.
You avoided addressing my question: you claimed that lossless codecs can create output files larger than input files, yet you haven't provided any evidence that this actually happens. With all due respect, and this is not personal since I don't know you, right now you're just some random guy on the internet making what appears to be a surprising claim without providing any evidence to support it. And I hope you can understand that it simply wouldn't make sense for me to accept it without evidence.
Listen to an old 128kb/s MP3 encoding of a challenging musical piece with multiple instruments and singing voices interplaying and you'll have no problem noticing the discrepancies and artefacts. Increase the bitrate and use a better codec to lessen those artefacts, but they only go away once you have full lossless encoding. And that can never have a guaranteed compression rate, inherently.
I'm well aware of the fact that MP3s are audibly lossy. That doesn't mean that DSC is visually lossy.

Are you paying me for a research project here? No, you're not.
Not asking you to do a research project. Just asking whether your claims are based on something you've already read and, if so, what it is. I'm not asking you to find anything new, I'm just asking what info you already have in hand that supports your claim. If you can't provide it, with all due respect, it says you don't have an independent basis for your claim, in which case it simply doesn't make sense for me to give it weight. The presentation you linked doesn't say whether DSC is visually lossy or not.

These are perfectly standard facts about any lossy compression not just in current existence, but inherent to compression by principle: When lossless compression is not enough to achieve your target compression rate (and inherently it can't be enough in all cases), you must throw away some of the actual data. It is completely unavoidable, no way around it. There is no free lunch.
Yes, lossy compression always loses something. That's never been in question. The question from the start has been whether DSC is capable of lossy compression that can't be visually perceived. This is about human perception and thus, contrary to your claim, is not something that can be demonstrated mathematically or based on first principles.

And the fact that Apple only offers DSC as a fallback option, and avoids it whenever possible, should tell you something, too.
Apple may be doing that because their position is the same as mine on this: It may be visually lossless, or it may not be, and there haven't been sufficient studies to demonstrate this either way.

I'm not arguing that DSC is visually lossless. Indeed, I've also argued against those who claim DSC is visually lossless—my position is the same: we don't know definitively.

Fundamentally, what bothers me about both your certainty that it is visually lossy, and others' certainty that it's not, is that both sides are claiming absolute certainty about something that has yet to be definitively established.
 