Question:
In practical terms, when should people opt for the extra graphics performance of the M2 Max versus the M2 Pro in the MacBooks?
No doubt it's useful for things like video editing etc., so I'm not asking about the usual well-known use cases. What about other edge cases, though?
Is it at all useful for everyday tasks? For driving multiple monitors?
What about for using programs like Photoshop, Lightroom?
Do any games on the Mac really stretch the hardware enough to make use of the Max?
Will running multiple programs at once in the background, during a Zoom call for instance, result in fewer glitches?
Thanks!
 
The reason Apple adopts screen tech like OLED so much later than someone like Samsung is simple math and production capacity: Samsung produces a few million units of a product at most, while Apple produces hundreds of millions per year.


This will sound really dumb…but I’d love someone to explain to me what benefit ray-tracing would have for us in practice?
When you choose the red “heart balloon” effect in iMessage, you can see the reflections of word bubbles or nearby photos you’ve texted in the balloon. I think this is an example of ray tracing. 😆

Seriously, though, ray tracing really puts the 3D in 3D.
 
Clearly you have no CLUE how the memory controllers are implemented on M1/M2 vs Pro vs Max.
Or the extent to which 100GB/s is an extraordinarily high memory bandwidth for this class of SoC (which is basically the i3 of Apple's line).

You can read my PDFs (volume 3) to find out the engineering details.
https://github.com/name99-org/AArch64-Explore

Honestly this endless claiming (by people who know nothing of SoC design) that "Apple did it because they suck" rather than understanding the engineering is so damn tiresome. Be Better!
btw, why the hell are you writing in C++ instead of Swift?! Or at least Objective-C?!
 
  • Haha
Reactions: SFjohn
I would not be that certain for all users.
Many people can get away with 8 GB currently. Not that I would advise getting that, but it shows how efficient memory usage is on these machines.
It depends on what you do.
It’s not relevant to the comparison you’re making, but it’s worth noting that the plain, non-Pro M2 Mac mini has a memory bandwidth of just 100GB/s.

I doubt this was a cost-saving limitation — I’d be more inclined to believe Apple deliberately handicapped the base model for Marketing purposes.
Any smart vendor needs to create an organized range of devices increasing in value and performance. The comment seems to imply that "handicapped the base model for Marketing purposes" is something nefarious; it is not.
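
Just to put numbers to the "100GB/s" being discussed above: the round figures fall straight out of the memory interface arithmetic. A back-of-the-envelope sketch (my own illustration, assuming the commonly reported LPDDR5-6400 memory and 128-/256-/512-bit interface widths for the M2, M2 Pro and M2 Max; not taken from an Apple spec sheet):

```swift
// Peak theoretical memory bandwidth = transfers per second x bus width in bytes.
// Interface widths below are assumptions based on commonly reported specs.
let transfersPerSecond = 6_400_000_000.0   // LPDDR5-6400: 6.4 billion transfers/s per pin

let assumedBusWidthsInBits: [(chip: String, bits: Double)] = [
    ("M2",     128),   // ~100 GB/s class
    ("M2 Pro", 256),   // ~200 GB/s class
    ("M2 Max", 512),   // ~400 GB/s class
]

for entry in assumedBusWidthsInBits {
    let gigabytesPerSecond = transfersPerSecond * entry.bits / 8 / 1_000_000_000
    print("\(entry.chip): \(gigabytesPerSecond) GB/s peak")   // 102.4, 204.8, 409.6
}
```

Which is why the marketing numbers come out as the tidy 100/200/400 GB/s tiers.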
 
Question:
In practical terms, when should people opt for the extra graphics performance of the M2 Max versus the M2 Pro in the MacBooks?
No doubt it's useful for things like video editing etc., so I'm not asking about the usual well-known use cases. What about other edge cases, though?
Is it at all useful for everyday tasks? For driving multiple monitors?
What about for using programs like Photoshop, Lightroom?
Do any games on the Mac really stretch the hardware enough to make use of the Max?
Will running multiple programs at once in the background, during a Zoom call for instance, result in fewer glitches?
Thanks!
In the really old days the GPU was entirely separate, but more than a decade ago OS and app designers realized that GPU power was available and started offloading CPU work to the GPU. Today Apple has CPU/GPU/RAM all architecturally close to or on the SoC using Apple's Unified Memory Architecture ("UMA"). Even prior to UMA there was no simple answer to your question, and now with UMA, IMO, all apps and the OS will increasingly be taking advantage of the capabilities UMA presents.

So IMO it is pretty simple: in 2024/2025 (the most relevant time frame for a box not yet purchased), each pricier box up the price chain will generally do most real-app work better (real apps being those that require substantive computer work, as opposed to simple word processing or email, for instance). Specific performance of any given app will depend on how thoroughly the app designers integrate with the UMA over time.

RAM is part of the UMA and the speed improvements under UMA are very, very significant. The downside is that RAM is now baked into the initial box; no adding RAM later, cheaper, from OWC. The upside is huge performance benefits. Almost all folks will want more RAM in 2024 than they wanted in 2022 (the exception being simple fixed usages like granny doing fine with her email under 8 GB RAM), and Apple is making more RAM available on the latest boxes (up to 32 GB in a Mini or 96 GB in a MBP; my 2016 MBP's maximum RAM was an expensive 16 GB).

The one straightforward thing that I think is, and will remain, significantly RAM-sensitive is "...running multiple programs at once in the background, during a Zoom call for instance." In my personal case, running multiple apps at once now is seriously RAM-limited, even though the exact same apps (2017 versions and 2017 OS) and workflow were not RAM-limited in 2017.
 
Last edited:
You’re confusing a bunch of different technologies. The Quantum Dot OLED screens used in the Sony A95K and Samsung S95 last year (as well as the Alienware monitor) have nothing in common with any of their phone or laptop screens. Their phone and laptop screens have more in common with the WRGB OLEDs that LG produces. They may produce QD OLED screens for notebooks one day, but that isn’t the case now.
I'm not confusing anything (and OLED screens are OLED screens, so they have common points no matter the device they're used in; saying they have nothing in common is plain wrong). My example was to put things into perspective as to how an OLED screen can provide a more impactful image than an LED screen with twice or more the peak brightness of the OLED. Brightness isn't everything.
This link from Asus shows just what I mean. At one point they show an image at 150 nits on an LED screen vs 150 nits on an OLED. The difference is obvious.

I have two OLED TVs in my house and a laptop with an OLED screen. The laptop has a peak brightness of 600 nits, but I generally keep it below 40% brightness because it feels too bright above that threshold; it's uncomfortable for my eyes with regular static content (so excluding videos or images).
 
Last edited:
  • Love
Reactions: SFjohn
In the really old days the GPU was entirely separate, but more than a decade ago OS and app designers realized that GPU power was available and started offloading CPU work to the GPU. Today Apple has CPU/GPU/RAM all architecturally close to or on the SoC using Apple's Unified Memory Architecture ("UMA"). Even prior to UMA there was no simple answer to your question, and now with UMA, IMO, all apps and the OS will increasingly be taking advantage of the capabilities UMA presents.

So IMO it is pretty simple: in 2024/2025 (the most relevant time frame for a box not yet purchased), each pricier box up the price chain will generally do most real-app work better (real apps being those that require substantive computer work, as opposed to simple word processing or email, for instance). Specific performance of any given app will depend on how thoroughly the app designers integrate with the UMA over time.

RAM is part of the UMA and the speed improvements under UMA are very, very significant. The downside is that RAM is now baked into the initial box; no adding RAM later, cheaper, from OWC. The upside is huge performance benefits. Almost all folks will want more RAM in 2024 than they wanted in 2022 (the exception being simple fixed usages like granny doing fine with her email under 8 GB RAM), and Apple is making more RAM available on the latest boxes (up to 32 GB in a Mini or 96 GB in a MBP; my 2016 MBP's maximum RAM was an expensive 16 GB).

The one straightforward thing that I think is, and will remain, significantly RAM-sensitive is "...running multiple programs at once in the background, during a Zoom call for instance." In my personal case, running multiple apps at once now is seriously RAM-limited, even though the exact same apps (2017 versions and 2017 OS) and workflow were not RAM-limited in 2017.
If Macs very often use virtual memory — especially with buyers who skimp on RAM — why can’t Apple allow RAM expansion in, say, a Mac Pro, that is of course slower than UMA RAM but way faster than SSDs? When Macs run out of RAM, they temporarily park data on the SSD, no? Using the SSD virtually as a sort of “slow RAM.”

Intermediary added RAM would be slower than UMA RAM yet faster than SSD I/O.

macOS would need a serious revamp of virtual memory management if it has to intelligently organize and manage I/O across UMA RAM, then modular RAM, then SSD storage/virtual RAM, but still…
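
For what it's worth, macOS already does the last part of this today: when RAM runs short it compresses pages and then swaps them to the SSD, and you can watch it happening. A small sketch (my own, not from this thread) that shells out to the built-in vm_stat tool and prints the pageout/swap counters, which is where that "SSD as slow RAM" traffic shows up:

```swift
import Foundation

// Run the stock macOS vm_stat tool and pull out the counters that indicate
// memory is being paged/swapped out to disk (i.e., the SSD acting as "slow RAM").
let vmStat = Process()
vmStat.executableURL = URL(fileURLWithPath: "/usr/bin/vm_stat")
let pipe = Pipe()
vmStat.standardOutput = pipe
try! vmStat.run()          // try! keeps the sketch short; handle errors properly in real code
vmStat.waitUntilExit()

let data = pipe.fileHandleForReading.readDataToEndOfFile()
let output = String(data: data, encoding: .utf8) ?? ""

for line in output.split(separator: "\n") {
    // Non-zero, growing "Pageouts"/"Swapouts" under load is the sign of RAM pressure.
    if line.hasPrefix("Pageouts") || line.hasPrefix("Swapins") || line.hasPrefix("Swapouts") {
        print(line)
    }
}
```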
 
The reason Apple adopts screen tech like OLED so much later than someone like Samsung is simple math and production capacity: Samsung produces a few million units of a product at most, while Apple produces hundreds of millions per year.
Samsung had been producing hundreds of millions of OLED screens long before Apple started buying from them. Production capacity was never really a concern, not to mention that Apple didn't use OLED screens on every single phone they sold right from the start.
Apple simply waited as long as they could because LED screens were really cheap, so they took advantage of that for an additional 1-2 generations.

This will sound really dumb…but I’d love someone to explain to me what benefit ray-tracing would have for us in practice?
It's the next step in graphical realism (and I'm not talking only about games). There are tons of articles and videos online explaining what it does and what the benefits are.
One thing is clear: ray-tracing is here to stay, and Apple can't afford to ignore it for very long.
 
Last edited:
Honestly not concerned about discrete GPUs. The issue is with the score discrepancies between M1 variants. The GPU cores across all variants of the M1 run at the same speed, so performance should scale close to linearly, with only a minor loss for each additional core. (Apple has increased the memory bandwidth of each variant to make sure those cores can be "fed" without any lag. On the Max, Anandtech tested the bandwidth and was able to push data to all GPU cores at a sustained rate of 240GB/s. So we know it's not a memory bandwidth issue.)

Geekbench Metal scores...
M1 8-core, ~10W TDP, 68GB/s: 20440 (baseline)
M1 Pro 16-core, ~30W TDP, 200GB/s: 39758 (1.95× the base M1)
M1 Max 32-core, ~60W TDP, 400GB/s: 64708 (3.17×)
M1 Ultra 64-core, ~120W TDP, 800GB/s: 94583 (4.63×)

As expected, the first tier does in fact scale essentially linearly; double the cores, double the performance. But as we go higher, the scores fall further and further short of linear scaling: the Max delivers about 3.2× the base score where 4× would be expected, and the Ultra only about 4.6× where 8× would be expected, a shortfall of more than 3× the base score, which is an insane drop in performance.
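
(To make that arithmetic explicit, here is a tiny sketch, purely my own illustration using the Geekbench figures quoted above, that computes measured speedup versus ideal linear scaling:)

```swift
// Compare measured Geekbench Metal speedups (scores as quoted above) against
// ideal linear scaling by GPU core count, using the 8-core M1 as the baseline.
let baselineScore = 20_440.0   // M1, 8-core GPU

let variants: [(name: String, cores: Int, score: Double)] = [
    ("M1 Pro, 16-core GPU",   16, 39_758),
    ("M1 Max, 32-core GPU",   32, 64_708),
    ("M1 Ultra, 64-core GPU", 64, 94_583),
]

for v in variants {
    let measured = v.score / baselineScore     // measured speedup over the base M1
    let ideal = Double(v.cores) / 8.0          // perfect linear scaling by core count
    let efficiency = Int((measured / ideal * 100).rounded())
    let measuredRounded = (measured * 100).rounded() / 100
    print("\(v.name): \(measuredRounded)x measured vs \(ideal)x ideal (\(efficiency)% scaling efficiency)")
}
// Roughly: Pro 1.95x vs 2x (97%), Max 3.17x vs 4x (79%), Ultra 4.63x vs 8x (58%)
```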

As a side note: Unfortunately until we get applications that are actually optimized for Apple's GPUs (Metal), we won't see the performance they're actually capable of.
Most content creation apps have been using Metal for several years now. Adobe used to give you a choice between Metal and OpenCL, but it is all Metal now. Metal does not have anything to do with Apple GPUs other than the possibility of Apple designing the GPUs to run Metal very well. Metal is simply a graphics API like DirectX or OpenGL. *edit* Metal also covers GPU compute.
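
To illustrate the compute side, here is a minimal sketch (my own example, not from this thread; the kernel and buffer names are made up for illustration) that dispatches a trivial compute kernel through Metal from Swift:

```swift
import Metal

// A trivial Metal compute pass: double every float in a buffer on the GPU.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void double_values(device float *data [[buffer(0)]],
                          uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "double_values")!)

let input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!   // shared = CPU and GPU see the same memory (UMA)

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

let results = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print((0..<input.count).map { results[$0] })   // [2.0, 4.0, 6.0, 8.0]
```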
 
Last edited:
  • Like
Reactions: SFjohn
Do any games on the Mac really stretch the hardware enough to make use of the Max?

Sure, especially games in Rosetta. You just have to increase the graphics quality and the resolution. Try to run Shadow of the Tomb Raider, Borderlands 3, Metro Exodus or Resident Evil Village at ultra settings in 4K without MetalFX.
 
If Macs very often use virtual memory — especially with buyers who skimp on RAM — why can’t Apple allow RAM expansion in, say, a Mac Pro, that is of course slower than UMA RAM but way faster than SSDs? When Macs run out of RAM, they temporarily park data on the SSD, no? Using the SSD virtually as a sort of “slow RAM.”

Intermediary added RAM would be slower than UMA RAM yet faster than SSD I/O.

macOS would need a serious revamp of virtual memory management if it has to intelligently organize and manage I/O across UMA RAM, then modular RAM, then SSD storage/virtual RAM, but still…
Apple's SoC with UMA, despite being hella fast, clearly does create some constraints at the highest-end usages. So I wonder about some of the same things you do. In particular, it seems to me that some kind of hybrid memory architecture (different RAM tiers, maybe) for the Mac Pro that would allow the hundreds of GB of RAM that some usages want would be very cool.

My guess is Apple designers have been considering all the things you mention.
 
Last edited:
  • Like
Reactions: SFjohn and R2DHue
Any smart vendor needs to create an organized range of devices increasing in value and performance. The comment seems to imply that "handicapped the base model for Marketing purposes" is something nefarious; it is not.
I don’t know if this disagrees with your point or completely agrees, but I really believe — for coherence in product differentiation — Apple needs to cull the iPhone line!

There’s no reason Apple should still be making and selling the iPhone 12 from 2020 or even the iPhone 13 (just because some bean counter says, “there’s still a market for it”).

Just like with other Apple product lines — the Mac(s), the iPad, Apple TV — the new model should replace the old model and the old model should no longer be manufactured or sold.

I know you’re never supposed to ask, “What would Steve do?,” but in this case we know what “Steve did.” For most products, a new release would entirely replace the previous generation, and the previous generation would no longer be available as an option.

There is more than one reason for this:

1. Consumers get confused about exactly what differentiates one Apple product from other models from previous generations if there are too many options to sift through, and they can’t come to a purchase decision at all.

(1b. Customers buy an iPhone 12 when they’d actually be perfectly willing to spend the price of a 14 if the 12 were not available, dangling itself in front of their eyes. IOW, if they had no choice…)

It certainly isn’t as bad as in the 90s when Apple had dozens of confusing preconfigured permutations of the Mac: Mac SE, Macintosh LC, LC II, LC III, LC III+, LC 475, LC 630, LC 520, LC 550, LC 575, LC 580, a bunch of models in the Performa line (which are too monotonous to list) — same with the extensive Macintosh Quadra line.

This beneficent “democratization of choice” proved to be a disaster for Apple and customers.

Sometimes, for the sake of the consumer, you need to limit choice. Sometimes, it is the best thing you can do for them. There are entire books written about this including, “The Tyranny of Choice,” and “The Paradox of Choice.”

In 1998, Steve Jobs consolidated the Mac consumer line into one product, the iMac. He kept the Power Mac and laptops, but phased out all other lines. If the iMac had the identical specs, but was sold as another beige box or tower, do you think it would’ve succeeded just as much as the iMac?!

(Also, Steve refused to charge just $999 for it, which had become the standard for consumer PCs of any brand.)

The 1998 iMac became the #1 selling personal computer 4 months after its release.

People paid $1,299 for the iMac instead of $999 for a PC because they found the iMac so intriguing and appealing.

2. You have to bear the risk that if you take away models for sale (like the iPhone 12), buyers will leave you for an Android brand that costs as much as an iPhone 12 or less.

The available evidence suggests that consumers will remain loyal when you do eliminate older, cheaper “choices,” and they’ll stick with the Apple brand/iPhone platform and pony up the extra cost of the newest iPhone model. I don’t know about right now, but the iPhone 14 Pro Max was Apple’s highest volume iPhone 14 model by unit sales!

3. Expectations constantly change and aging product models no longer provide the best, most modern User Experience, which does not reflect well on the iPhone OR the Apple brand.

I’d wager iPhone sales by unit would substantially increase if Apple prudently culled the iPhone line.
 
Last edited:
btw, why the hell are you writing in C++ instead of Swift?! Or at least Objective-C?!
Because
- C++ is what I know and
- I want to have a more or less obvious mapping from the code to the assembly. If you look through the entire series you will see that I usually write in assembly, but when it's convenient I write in C++ then look at the assembly to ensure that it matches what I expect without any weirdness or overhead.
 
  • Like
Reactions: R2DHue
Because
- C++ is what I know and
- I want to have a more or less obvious mapping from the code to the assembly. If you look through the entire series you will see that I usually write in assembly, but when it's convenient I write in C++ then look at the assembly to ensure that it matches what I expect without any weirdness or overhead.
(I was teasing.)
 
Okay, so performance- and longevity-wise, which of these two same-priced systems am I better off with:
  • Mac Studio - Apple M1 Max with 10-core CPU, 24-core GPU, 16-core Neural Engine, 32GB, 1TB SSD; or
  • Mac Mini - Apple M2 Pro with 12‑core CPU, 19-core GPU, 16‑core Neural Engine, 32GB, 1TB SSD
It totally depends on what you want to do with it. Personally I ordered a maxed-out M2 Mini (except with just 4 TB of SSD). My last one we used for 9 years, and it’s moving to another room in the house. Mostly using both as media servers, but investing in the new one for the longevity… 😉
 
I made a comment about this when the "expected roadmap" was shared earlier in Jan. Their roadmap makes zero sense; right now, the Studio as a product is dead in the water.

Not because the Studio is bad. Great machine for the people who already bought it, but there is very little rationale for buying it right now.
The Studio is long in the tooth, but doesn’t it have the most expansion options of any Mac in the current lineup? I’m sure that will appeal to a few people who need it now… 🤷
 
Last edited:
I will never buy an OLED computer monitor. Way too many static screens I leave up for literally days at a time without changing them, depending on what I need to be doing.
I will happily purchase the new two-layer OLED devices Apple introduces; it will be a very long wait for Micro LED panels (they are currently available, but blindingly expensive and not ready for mass production). 👍🏻
 
I'm not confusing anything (and OLED screens are OLED screens, so they have common points no matter the device they're used in; saying they have nothing in common is plain wrong). My example was to put things into perspective as to how an OLED screen can provide a more impactful image than an LED screen with twice or more the peak brightness of the OLED. Brightness isn't everything.
This link from Asus shows just what I mean. At one point they show an image at 150 nits on an LED screen vs 150 nits on an OLED. The difference is obvious.

I have two OLED TVs in my house and a laptop with an OLED screen. The laptop has a peak brightness of 600 nits, but I generally keep it below 40% brightness because it feels too bright above that threshold; it's uncomfortable for my eyes with regular static content (so excluding videos or images).

I didn't mean for that to sound as condescending as it did, so apologies for that. What I was trying to say is that you were talking about how impactful an OLED can be, using the bleeding-edge, best-case QD OLED as an example. These are not the screens that are going into laptops and phones. Those still use regular old WRGB OLED screens, with all the traditional limitations that brings. Then you show the Asus link, which refers to a regular old non-dimming LED backlight. Of course OLED looks better; that's a best-case scenario.

I also have two OLED TVs (Sony A80J), a MiniLED TV (TCL 635) and the MacBook Pro. While I appreciate the perfect contrast and blacks, there's something to be said for HDR popping a lot more on a 1600-nit MiniLED screen, especially in a bright room.
 
  • Like
Reactions: FriendlyMackle
I didn't mean for that to sound as condescending as it did, so apologies for that. What I was trying to say is that you were talking about how impactful an OLED can be, using the bleeding-edge, best-case QD OLED as an example. These are not the screens that are going into laptops and phones. Those still use regular old WRGB OLED screens, with all the traditional limitations that brings. Then you show the Asus link, which refers to a regular old non-dimming LED backlight. Of course OLED looks better; that's a best-case scenario.

I also have two OLED TVs (Sony A80J), a MiniLED TV (TCL 635) and the MacBook Pro. While I appreciate the perfect contrast and blacks, there's something to be said for HDR popping a lot more on a 1600-nit MiniLED screen, especially in a bright room.
Apple isn’t going to release OLED screens on their iPads & laptops without completely addressing brightness & burn-in. I’m not aware of any two-layer OLED screens on the market yet, and if there are, I believe Apple will do them better if & when Apple releases them. 🍏👀
 
I don’t know if this disagrees with your point or completely agrees, but I really believe — for coherence in product differentiation — Apple needs to cull the iPhone line!

There’s no reason Apple should still be making and selling the iPhone 12 from 2020 or even the iPhone 13 (just because some bean counter says, “there’s still a market for it”).

Just like with other Apple product lines — the Mac(s), the iPad, Apple TV — the new model should replace the old model and the old model should no longer be manufactured or sold.

I know you’re never supposed to ask, “What would Steve do?,” but in this case we know what “Steve did.” For most products, a new release would entirely replace the previous generation, and the previous generation would no longer be available as an option.

There is more than one reason for this:

1. Consumers get confused about exactly what differentiates one Apple product from other models from previous generations if there are too many options to sift through, and they can’t come to a purchase decision at all.

(1b. Customers buy an iPhone 12 when they’d actually be perfectly willing to spend the price of a 14 if the 12 were not available, dangling itself in front of their eyes. IOW, if they had no choice…)

It certainly isn’t as bad as in the 90s when Apple had dozens of confusing preconfigured permutations of the Mac: Mac SE, Macintosh LC, LC II, LC III, LC III+, LC 475, LC 630, LC 520, LC 550, LC 575, LC 580, a bunch of models in the Performa line (which are too monotonous to list) — same with the extensive Macintosh Quadra line.

This beneficent “democratization of choice” proved to be a disaster for Apple and customers.

Sometimes, for the sake of the consumer, you need to limit choice. Sometimes, it is the best thing you can do for them. There are entire books written about this including, “The Tyranny of Choice,” and “The Paradox of Choice.”

In 1998, Steve Jobs consolidated the Mac consumer line into one product, the iMac. He kept the Power Mac and laptops, but phased out all other lines. If the iMac had the identical specs, but was sold as another beige box or tower, do you think it would’ve succeeded just as much as the iMac?!

(Also, Steve refused to charge just $999 for it, which had become the standard for consumer PCs of any brand.)

The 1998 iMac became the #1 selling personal computer 4 months after its release.

People paid $1,299 for the iMac instead of $999 for a PC because they found the iMac so intriguing and appealing.

2. You have to bear the risk that if you take away models for sale (like the iPhone 12), buyers will leave you for an Android brand that costs as much as an iPhone 12 or less.

The available evidence suggests that consumers will remain loyal when you do eliminate older, cheaper “choices,” and they’ll stick with the Apple brand/iPhone platform and pony up the extra cost of the newest iPhone model. I don’t know about right now, but the iPhone 14 Pro Max was Apple’s highest volume iPhone 14 model by unit sales!

3. Expectations constantly change and aging product models no longer provide the best, most modern User Experience, which does not reflect well on the iPhone OR the Apple brand.

I’d wager iPhone sales by unit would substantially increase if Apple prudently culled the iPhone line.

We don't have super-recent data, but look at

The point is to look at the ASP (average selling price) compared to the range of prices. It's always slap bang in the middle of the range. For every iPhone selling ABOVE this price (i.e. the various Pro models) there's an iPhone (and nowadays probably more than one) selling below this price, i.e. last year's model...

At some point you could possibly argue that the SE fulfills this role, but right now the iPhone SE comes with Touch ID (fingerprint), while the older models (13 and 12) come with Face ID.
So if you're in the market for a cheap iPhone, the choice is not as complex as it appears: if you're willing to put up with Touch ID, then the SE; if you want Face ID, choose a 12 or a 13, depending on your budget and how much storage you want.
 