With the new report posted on MacRumors today that the M4 MacBook Pro models feature quantum dot display technology, I really have to think the M4 is the better option for my use case, and I would be saving an additional $220.

Where is the downside, besides graphics and 2 GB less RAM, of the M4 vs. the M3 Pro?

I purchased a 14" MacBook Pro with M3 Pro (18 GB / 512 GB, the base M3 Pro model) from Best Buy for $1,699 new, and am still in the return window.

I'm trying to figure out whether buying an M4 MacBook Pro with the base M4 processor (16 GB / 512 GB) for $150 off is a better fit for me, now that the ports are expanded, than my current M3 Pro model. Everything I've seen suggests they are essentially comparable from a processor perspective, but what about the ports, fans, and screen quality? It does look like the M4 will drive the additional monitors and has some faster connections.
 
I would argue the base configurations are keeping their value quite well, whereas the upgraded ones drop a bit more. Paradigm shifts change that picture somewhat, but sellers seem to pretend that isn't the reality. I would argue there will be a split between the iPhone 15 and the iPhone 15 Pro/iPhone 16. For Mac minis, there are two notable splits: the Intels, M1/M2, and M4.

A base M2 Pro mini should be worth a bit less than the base M4: about the same performance (the M2 Pro is better on graphics, but has roughly two years less of software updates ahead). A base M1 Max has an even shorter update window, with somewhat better performance than the M4 mini, which is competitive on CPU, Neural Engine, and probably media engine (I haven't checked).

In my opinion, they are quite a bit overpriced on the second-hand market. As much as I like both the M2 Pro and M1 Max, both would have to be cheaper than the new M4 mini to trigger any sort of interest from yours truly. The Intels would be interesting for running Linux (not for me, though: they require USB-A cables, which I want to lose completely).

This is mostly what I meant about the time to pick. There are certain base models that are just plain good. And there are certain base models that, if one is familiar with the industry and looks closely, are clearly placeholder years. Like the M3. And the third-generation iPad.

I think you're right about iPhone 16 and M4 being the new delineation.
 
Almost everyone is affected; the only difference is how long your eyes tolerate it. PWM screens tire the eyes faster. He's an extreme case whose eyes tire the fastest with PWM, but I recommend an experiment: if you have access to an Air (no PWM), work one 8-hour day on the Pro, then a whole 8-hour day on the Air.

That's the thing: I work on my MacBook Pro or an external LCD monitor. I don't really try to get work done on my iPhone or iPad. They're just for quick messaging and on-the-go use, and the iPad really is just a TV at this point.

To me, high refresh rate is more important. I also don't typically work in a dark room. My eyes are very sensitive to lots of things, but PWM doesn't seem to be one.
 
Have to say...
This year they tempted me with the macbook pro.

But I'm resisting!
Still living the desktop life!
 
Is quantum dot the reason there is a purple coating on the screen of the MacBook Pro 16" this year? I can see the purple reflection in the area above the function keys of my silver computer.
 
They need to move to OLED.

Part of the issue is cost and capability. Large OLED displays with RGB-stripe or even PenTile subpixel matrices get very expensive very quickly; this is why we don't see PenTile or stripe arrangements used in televisions, save for the odd Sony model. Even then, these OLED implementations don't get as bright as the mini-LED displays in the MBPs.

Quantum dots can be seen as a solution to that: they can use a single-color (blue) OLED backlight with red and green quantum dots in front of it. This helps with color accuracy, reduces long-term degradation, and above all else massively boosts brightness. Even better, QD means they can better control costs.

As for what that has to do with this quiet change: it's easy to assume the OLED backlight wasn't ready to go, so they engineered a mini-LED stopgap. It's still an upgrade from the previous model, but apparently not significant enough to mention.

A good question is what they'll call their QD-OLED display when it's ready. Quantum Retina Display?
 
The screen still has a dark spot under the mouse pointer on dark backgrounds, and dimming around all the edges of the display. I think the biggest thing that sucks is no Wi-Fi 7, but judging by the lack of 320 MHz channel support on the iPhone 16 Pro, maybe Apple isn't ready for Wi-Fi 7. Apple is always behind the times.
 
I really would love to buy one, but I'm simply unable to due to the issues caused by Temporal Light Modulation, PWM and Pulse Amplitude Modulation - to me, all MacBook Pro displays since at least the M series have led to bad eyestrain within minutes of use, and an inability to focus properly, especially on text. It looks like everything is under water, text is blurry and seems to float around, and the display seems too dark to me even at full blast.

And no, I don't have any of these issues with older MacBook Pro displays, nor do I experience this with many Windows notebooks (which generally have pretty bad displays, so they come with other problems).

Would the new nano-texture screens help reduce the eye strain you experience on MacBook Pro screens?
 
Who cares, when OLED is just around the corner?
They'll introduce it when (and if) they get it right and avoid burning the screen, among other things. Given the longevity of MacBook Pros, they don't want a pile of returns and warranty claims. Personally, I'm in no hurry to get a 32" OLED until I'm 100% sure it will cope without burn-in for years and years.
 
I wonder whether this is why the hit you take with the nano texture, with color vibrancy in particular, isn't as bad on the M4 MBPs as it is on the iPad Pro and ASD? I mean, it's still there plain as day, but not as bad.
 
That's right, I'd argue. Have a look at the Samsung 65" "The Frame" and, e.g., an LG 65" 85T. A lot of the difference arises from the Samsung's anti-glare solution. I believe the tech is otherwise pretty similar.

It SHOULD make nano texture far more attractive in terms of vibrancy of colors.
 
PWM on OLED is even worse. If people complain about the PWM of the current screens, it's going to be fun watching them complain when those OLED displays arrive.


Yes and No.

It depends on the OLED. There are various technological nuances: some OLEDs flicker more than LCDs, while other OLEDs flicker less than LCDs. There are many causes of flicker (the backlight itself, the local dimming itself, pixel-transition logic, voltage-inversion algorithm logic, VRR gamma-compensation logic, GtG pixel decay between refresh-cycle scanouts, etc.).

The 240Hz+ OLEDs have much less PWM than FALD LCDs.

The 27" 480Hz OLED monitor has less flickerdepth than most FALD LCDs (including M1-M4), since PWM is the method of local dimming as it's more linear response than DC dimming due to the sheer number of local dimming zones.

The desktop-based OLED (not iPhone OLED) pixels use DC dimming on a per-pixel basis, with only one tiny <5% brightness dip per refresh-cycle pixel reset pass, which is less than the flicker depth of an incandescent light bulb during AC zero-crossing events.

[Attached image: lighting-flicker.jpg]


Different OLEDs use PWM-driven brightness dimming, including certain iPhones and Androids at different brightness settings. However, not all OLEDs use PWM for brightness dimming. Some of them have ginormous bit depth (>12bit) to the point where they can use DC dimming instead of PWM dimming.
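The "flicker depth" being compared above can be made concrete with the standard modulation-depth formula, percent flicker = 100 × (Lmax − Lmin) / (Lmax + Lmin), as used in flicker metrics like IEEE 1789. A minimal sketch; the waveforms below are made-up illustrations, not measurements of any actual display:

```python
def percent_flicker(luminance_samples):
    """Modulation depth of a luminance waveform, in percent:
    100 * (Lmax - Lmin) / (Lmax + Lmin)."""
    lo, hi = min(luminance_samples), max(luminance_samples)
    return 100.0 * (hi - lo) / (hi + lo)

# PWM dimming at 50% duty cycle: light is fully on or fully off,
# so modulation depth is the maximum possible.
pwm_waveform = [1.0, 1.0, 0.0, 0.0] * 100

# Hypothetical DC-dimmed OLED with a small 4% dip during the
# per-refresh pixel reset pass described above.
dc_waveform = [1.0] * 99 + [0.96]

print(percent_flicker(pwm_waveform))  # 100.0
print(percent_flicker(dc_waveform))   # ~2.04
```

This is why a tiny per-refresh dip registers as only a couple of percent flicker depth, while on/off PWM registers as 100%, regardless of frequency.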

While some people get increased eyestrain on certain OLEDs and certain LCDs, it varies quite a lot. For many people, OLEDs actually decrease eyestrain, as long as you fix the text-rendering issue (e.g. greyscale antialiasing instead of ClearType) and have enough PPI.

Unfortunately, the genuine research is clouded by a lot of anecdotal noise that overwhelms the science, due to the wide gamut of different flicker behaviors across different types of OLEDs: AMOLEDs, QD-OLEDs, WOLEDs, PHOLEDs, etc.

TL;DR: A modern 480 Hz OLED has less flicker depth than an incandescent light bulb. Most eyestrain on OLED has been traced to other causes. While PWM is a definite cause of eyestrain, it's also a common accidental wild-goose chase to a red herring, since there are currently over 100 different sources of eyestrain on a modern display.

In some laboratory experiments, fixing the flicker didn't solve the OLED problems for 100% of test subjects (in fact, it only helped <10%, because modern 480 Hz OLEDs flicker so little now). Some subjects turned out to be affected by color-gamut issues (e.g. blue light), glossy reflections (e.g. no anti-glare), or other causes that are not always easy to diagnose without doing dozens of blind tests across the 100+ different causes of eyestrain, nausea, and headaches that certain displays can create for different people. It's harder to diagnose than "Grandma says she gets motion sick at the movies," because sometimes it traces to unexpected causes other than the ones formerly assumed, with specific groups parroting only a tiny subset of the 100+ causes.

In a world where 90–99% of the population do not get eyestrain from a specific screen, concurrently diagnosing 100 different sub-1-to-10% causes of eyestrain is ginormously tough, given that screens are imperfect windows onto real life. When PWM frequencies started to go beyond 1–2 kHz, other causes of eyestrain became dominant. There's a lighting-industry research paper showing that humans could perceive 1–5 kHz PWM (page 6), so this is all genuine. However, it is often misdiagnosed too.


(Blind test in paper at lrc.rpi.edu)

It varies from "I can't see it" to "I see it but I don't get bothered" to "It bothers me only slightly" to "I seem to be getting nausea or headache" to "I get health issues".

Although the flicker-fusion threshold doesn't go very high (e.g. 70–85 Hz), the stroboscopic effect goes well beyond it. Stroboscopic effects are a known trigger and can appear simply as a phantom-array effect, and phantom arrays can trigger people (e.g. a finite refresh rate triggers motion sickness in a small fraction of people). Any finite frequency (a PWM rate, a frame rate, a refresh rate, etc.) can create stroboscopics, such as:

[Attached image: project480-mousearrow.jpg]

(Although this is a refresh-rate-based stroboscopic effect on a 480 Hz display test, other finite frequencies produce stroboscopics too, including things like backlight PWM. The effect can be similar, e.g. a stroboscopic effect that creates motion sickness, nausea, headaches, or eyestrain, depending on the human.)
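The spacing of those phantom-array duplicates is simple arithmetic: during an eye movement tracking a moving object, discrete images land speed ÷ frequency apart. A small illustrative sketch (the cursor speed and frequencies are hypothetical examples, not tied to any particular display):

```python
def phantom_gap_px(speed_px_per_s, frequency_hz):
    """Spacing between duplicate images in a phantom array, in pixels:
    distance traveled per update cycle = speed / frequency."""
    return speed_px_per_s / frequency_hz

# A cursor moving at 3840 px/s leaves duplicates spaced farther apart
# at lower update frequencies, making the stroboscopic effect more visible.
for hz in (120, 480, 2000):
    print(f"{hz} Hz -> {phantom_gap_px(3840, hz):.2f} px gaps")
# 120 Hz -> 32.00 px gaps
# 480 Hz -> 8.00 px gaps
# 2000 Hz -> 1.92 px gaps
```

This is why both refresh rate and PWM frequency matter: any finite update rate produces gaps, and higher frequencies shrink them below what most people notice.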

Display manufacturers try to add ergonomic features, like Low Blue Light modes and PWM-frequency adjustments (in cases where PWM cannot be removed for technological-limitation reasons within a screen's budget). Doing 1,000-zone local dimming in a $500 screen while keeping a linear brightness response is quite expensive component-wise, and DC dimming requires more driving electronics per local dimming zone than PWM dimming does. DC-dimmed FALD zones cost more than 5× as much to manufacture as PWM-dimmed zones, sadly.

However, a lot of the longtime tech-geek population keeps assuming PWM is the cause, despite it now being the cause less than 10% of the time. There are even accidental cross-pollinated misdiagnoses, where a person gets genuine eyestrain from an iPhone (because of PWM) and genuine eyestrain from a desktop 240 Hz+ DC-dimmed WOLED (but NOT because of PWM), causing the person to stubbornly assume it's all PWM.

With various adjustments of the variables, PWM (while it sometimes genuinely is the culprit) is no longer the dominant cause of eyestrain on most modern FALD/OLED screens. It can be, but it's now more often an accidental assumption.

Even the color primaries of a specific human vary by single nanometers, and CIE 1931 is just a population-averaged "one size fits all" boilerplate. That's in a world where ~12% of humans (stats vary) are colorblind, i.e. their color primaries aberrate significantly from the norm, or they're missing color channels entirely. Tough decisions have to be made about display performance, quality, cost, etc., weighing their pros and cons.

It's super tough to address when certain manufacturers choose to reduce a $10,000 supercomputer display (the IBM T221, the first 4K LCD, in 2001) to a mere $200 Walmart special (your cheap 4K TV, etc.), sometimes...

Source: I work for display manufacturers and am cited in more than 30 peer-reviewed papers.
 
They might have multiple panel suppliers, or want that option for the future. Maybe not all Macs get this now, but might going forward?
Honestly, I believe they didn't tout it because, if they had, then when they roll out non-QD OLED people might perceive it as a step back (regardless of performance).
 
The last part isn't true: it's not up to 1,000 nits from 600 nits. It's only 1,000 nits with auto brightness on and outdoors in bright light; indoors, or with auto brightness off, it's the same 600-nit max brightness, and there's no new tech, just a software unlock. Apps like Vivid let my M1 Max use the full 1,000 nits all the time, even indoors and without auto brightness on. All they have done is remove some software locks; it's not new hardware at all. The old M1 MacBooks can go full brightness, so why call it a new feature? Just let the M1 MacBooks and newer go full 1,000 nits all the time without third-party apps, or each year they will "improve" it and call it another new feature when it's not.
You have no idea if this is true; it's complete speculation on your part, and very likely wrong. Plus, you're posting it on an article about new MacBooks using a different screen, and there's more to a display than the panel anyway.
 