Clearly Apple chose 90Hz instead of the typical 120Hz panels because there are probably warehouses full of unwanted 90Hz screens in China and Apple got a great deal on them. Every decision they make that doesn't make sense for consumers is made for their profit.
Yeah, it probably raises margins and loosens QC so they can toss fewer screens. Similar to how AMD chips ship with somewhat overkill voltage at stock so even the less-than-golden samples hit the advertised clock speeds.
 
OK, kinda embarrassed to admit I didn't know this, but macOS has a fixed DPI and Windows doesn't? Is that just a macOS thing to try to give the same experience everywhere, while with Windows you get to go your own way, for better or for worse?
In broad terms, yes.
Windows is technically superior in this respect, but it depends on developers following the rules and not taking shortcuts, and it means that bitmap icons etc. supplied with applications have to be re-sampled. The Apple way is probably more consistent and less prone to non-compliant software. I also found Windows had problems if you had a "mixed economy" of displays with different PPIs, which the Mac seems to cope with better.

I just wrote a long post in another thread on this: https://forums.macrumors.com/threads/monitor-recommendations.2441691/post-33544899
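To make the macOS side of this concrete, here's a rough sketch of how its "fixed" HiDPI scaling is commonly described: the UI is always rendered at an integer 2x backing scale for the chosen "looks like" resolution, and the whole buffer is then resampled to the panel's native pixels. This is an illustrative model, not Apple's actual implementation:

```python
# Sketch of macOS-style "fixed" HiDPI scaling: render the desktop at a
# fixed 2x backing scale of the "looks like" size, then resample the
# whole framebuffer to the panel. (Illustrative model only.)

def macos_scaled_mode(looks_like, native):
    """looks_like and native are (width, height) tuples."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)   # fixed 2x render
    downsample = native[0] / backing[0]                # GPU resample ratio
    return backing, downsample

# A 27" 4K panel set to "looks like 2560x1440":
backing, ratio = macos_scaled_mode((2560, 1440), (3840, 2160))
print(backing)           # (5120, 2880) -- the rendered buffer
print(round(ratio, 2))   # 0.75 -- scaled down to the panel's pixels
```

Because every app draws into the same fixed-scale buffer, non-compliant software can't opt out, which is the consistency advantage described above; Windows-style per-monitor DPI avoids the resample but only if every app handles it.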
 
A FIXED 90Hz display sounds like it will use more power than the current fixed 60Hz displays. That's not significant for the Studio Display or the iMac, but it would result in faster battery drain in a battery-powered device like an iPad.
I would guess that any Apple 90Hz display would be a Variable Refresh Rate (VRR) one from 1Hz to 90.
 
According to the anonymous source, the 90Hz display technology will debut first in the next-generation M3 iPad Air, expected in early 2025, before expanding to other products. The 24-inch iMac, which just received an M4 update, likely wouldn't see this display improvement until late 2025 at the earliest.

I'm not sure I buy the idea of a new M3 device. N3B is supposedly higher cost and lower yield compared to the N3E used for M4.

At the same time, using M4 for iPad Air would upset the market segmentation in the iPad line.
 
The way I finally pulled the trigger on a Studio Display YESTERDAY. 😭 should I return it? Idk what to do now.
Based on a rumor from a rando calling in to a podcast?

This is not a product, this hasn't been announced, this hasn't even been floated as a rumor by the usual supply chain suspects we see quoted on MR. Enjoy your Studio Display, it's a great monitor for the Mac. And it exists today.
 
Based on a rumor from a rando calling in to a podcast?

This is not a product, this hasn't been announced, this hasn't even been floated as a rumor by the usual supply chain suspects we see quoted on MR. Enjoy your Studio Display, it's a great monitor for the Mac. And it exists today.
This.

When you make a purchase, buy it for what the product is today, not for any promised future features.

Save yourself some unnecessary stress and the rest of us forum members from future buyer's remorse pain.
 
I would guess that any Apple 90Hz display would be a Variable Refresh Rate (VRR) one from 1Hz to 90.
That is NOT what was reported in the article:
According to the source, Apple is working on "a higher refresh rate LCD display with a new liquid motion panel fixed at around 90Hz," with plans to implement the technology across multiple products.
 
As mentioned several times, only a small fraction of people will notice much difference between 90Hz and 120Hz—diminishing returns set in after 90Hz. Most Apple customers (those not debating in the comments) aren’t even aware of the 60-120Hz refresh rate discussion for iPhones.

While it may seem stingy of Apple, it’s a pragmatic way to distinguish Pro models from the rest. Right now, refresh rate is a key reason enthusiasts (like us here in the comments) choose Pro models.
Also, capping ‘entry-level’ devices at 90Hz will be useful for some gaming initiatives; Apple can even claim that their devices can max out Cyberpunk 2077.

In the end, it’ll likely follow the usual pattern: the tech world will complain, but people will buy it and enjoy the upgrade.
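The diminishing-returns point is really just frame-time arithmetic: each refresh-rate bump recovers fewer milliseconds per frame than the last, which is why 60→90 is far more noticeable than 90→120. A quick sketch:

```python
# Frame-time savings shrink as refresh rate climbs: the 60->90 Hz jump
# recovers about twice as many milliseconds per frame as 90->120 Hz.

def frame_time_ms(hz):
    return 1000.0 / hz

for lo, hi in [(60, 90), (90, 120)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo}->{hi} Hz saves {saved:.1f} ms per frame")

# 60->90 Hz saves 5.6 ms per frame
# 90->120 Hz saves 2.8 ms per frame
```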
 
As mentioned several times, only a small fraction of people will notice much difference between 90Hz and 120Hz—diminishing returns set in after 90Hz. Most Apple customers (those not debating in the comments) aren’t even aware of the 60-120Hz refresh rate discussion for iPhones.

While it may seem stingy of Apple, it’s a pragmatic way to distinguish Pro models from the rest. Right now, refresh rate is a key reason enthusiasts (like us here in the comments) choose Pro models.
Also, capping ‘entry-level’ devices at 90Hz will be useful for some gaming initiatives; Apple can even claim that their devices can max out Cyberpunk 2077.

In the end, it’ll likely follow the usual pattern: the tech world will complain, but people will buy it and enjoy the upgrade.
Most people around me just plug in whatever HDMI lead they find and are perfectly happy with whatever automatically comes up ;)
 
So wait, they're still only using 60Hz, already years behind the competition, and they plan on upgrading to something that's already outdated and will still be behind the competition?

Apple should be using 120Hz for all their entry-level device displays, and 240Hz for their Pro models. That's a clear 2:1 ratio between the two, and a 4:1 ratio to their older 60Hz devices, if that helps with the software side.
 
So wait, they're still only using 60Hz, already years behind the competition, and they plan on upgrading to something that's already outdated and will still be behind the competition?

Apple should be using 120Hz for all their entry-level device displays, and 240Hz for their Pro models. That's a clear 2:1 ratio between the two, and a 4:1 ratio to their older 60Hz devices, if that helps with the software side.

Why? Refresh rate is mostly important for gamers, and refresh rate for high resolution displays is much more difficult and expensive.

You might as well say Apple should just drop back to 1080P like all the competition and forget about retina.
 
“And today, on the 10 year anniversary of the 99% adoption rate of 120 Hz by all other display and phone manufacturers, we have the exciting announcement that all new Apple displays will henceforth do 90 Hz refresh rate.”

And let's just ignore the massive difference in bandwidth required for 120 Hz at 5K instead of 4K, and pretend that it's trivial to send nearly double the pixels over a little cable. Maybe there's a reason no one in the world is selling 5K monitors with 120Hz refresh rates?
 
I'm in the market to replace my ancient Cinema Display, but Apple's current offering is overpriced and lacking feature-wise. I recently read about the BenQ DesignVue 31.5", which is also 60Hz but is $999; I may have to go outside the Apple ecosystem for the first time.
 
Why? Refresh rate is mostly important for gamers, and refresh rate for high resolution displays is much more difficult and expensive.

You might as well say Apple should just drop back to 1080P like all the competition and forget about retina.

I was talking about it from a competitive and advertising point of view; it looks bad when a supposedly premium product has inferior specifications compared to cheaper products from other companies. But that's only looking at one "marketing bullet point".

It's true that higher refresh rates are much harder to do on higher-resolution displays and require faster GPUs, and I didn't think to do a 1:1 comparison of refresh rate plus resolution with their competitors. There's also the increased power requirement that goes along with that, which for portable devices would mean a bigger battery, resulting in a thicker and heavier product.

However, in the same price range as Apple, there are devices that have high resolution displays running at 120Hz and higher, so I don't see why Apple couldn't do the same. (edit: most competitors seem to have 2K displays)

Not that I care about refresh rates anyway, I'm over 50 and I've lived with both 30Hz and 60Hz all my life, I probably can't tell the difference between 60 and 90Hz, let alone 120 and 240Hz.

Which makes me question how they arrived at their choice, because 90Hz seems weird. Did Apple do polls internally and externally to test average Apple users? Did they test to see if most people could tell the difference between 60, 90, 120, 144 and 240Hz? If they did that and 90%+ of the people couldn't tell the difference between 90 and 120 Hz then of course it would make sense to go with 90Hz.

edit: I didn't even think about the bandwidth required for the cable. A lot of the comments in this thread, my initial post included, shouldn't consider refresh rate by itself, since that omits the other requirements.
 
I would guess that any Apple 90Hz display would be a Variable Refresh Rate (VRR) one from 1Hz to 90.
The article states that it is "fixed", which would technically mean more load on the battery, as it's not a variable refresh rate screen: e.g. 90Hz even when reading static text.

Not great, but better than nothing I suppose.
 
Good observation. Thunderbolt 4 is 40Gbps and can support a 5K or 6K monitor at 60Hz. A 90Hz 5K stream needs about 31.85Gbps raw (8 bits per channel, before blanking overhead), and a 120Hz 5K stream needs about 42.47Gbps. So uncompressed 120Hz 5K is just out of range for Thunderbolt 4.
With Display Stream Compression it's no problem.
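The numbers in that exchange are easy to recompute: raw pixel rate times bits per pixel, ignoring blanking and link encoding overhead. The ~3:1 DSC ratio below is an assumption (its commonly cited visually lossless compression), not a spec for any particular monitor:

```python
# Raw (uncompressed, no-blanking) video bandwidth for a 5K (5120x2880)
# panel at 8 bits per channel, plus the effect of an assumed ~3:1 DSC
# compression ratio. Real link rates are somewhat higher due to
# blanking intervals and encoding overhead.

def raw_gbps(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

for hz in (60, 90, 120):
    raw = raw_gbps(5120, 2880, hz)
    print(f"5K @ {hz} Hz: {raw:.2f} Gbps raw, "
          f"~{raw / 3:.2f} Gbps with 3:1 DSC")

# 5K @ 60 Hz: 21.23 Gbps raw, ~7.08 Gbps with 3:1 DSC
# 5K @ 90 Hz: 31.85 Gbps raw, ~10.62 Gbps with 3:1 DSC
# 5K @ 120 Hz: 42.47 Gbps raw, ~14.16 Gbps with 3:1 DSC
```

With DSC even 120Hz 5K fits comfortably inside a 40Gbps link, which is why compression settles the "out of range" concern above.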
 