Thanks for this informative video, especially for dealing with the SSD speed issue between the different models, so customers can buy the right one and be happy with it! Glad you were using the 1TB model, which does not exhibit the slower-SSD issue that the 512GB MacBook Pro M2 Pro and 512GB MacBook Pro M2 Max models seem to have.

This video may help explain the SSD speed issue for many MacBook Pro M2 Pro models. Very informative. He expresses the frustration that pros and creators feel with this NAND SSD speed issue in a lighthearted video:

The MacBook M2 Pro SSD Downgrade/Issue? - I'm Frustrated:

1. Why would Apple do this? I know this is just a spec bump release and all, but a supposed spec bump upgrade where storage speeds are LOWER than the previous generation model that it is supposed to be “bumped up” from?! I can’t summon any acceptable excuse.

2. Does this only affect the 512GB models? Does the disparity disappear at 1TB and above?

I don’t think any review site or YT channel is going to go to the trouble of testing all the R/W speeds of a 512GB SSD MacBook Pro, then compare a 1TB config, then a 2TB, 4TB and 8TB model! All that sounds like an expensive proposition.
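It wouldn't take a review site, though; anyone with the machines in hand could get a rough number. Here's a quick-and-dirty sequential-write sketch in Python (the file name is arbitrary, and this is no substitute for a purpose-built disk benchmark):

```python
import os
import time

def measure_write_speed(path, size_mb=32, block_mb=4):
    """Crude sequential-write benchmark: write size_mb of random data
    in block_mb chunks, force it to disk, and return MB/s."""
    block = os.urandom(block_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb // block_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually hit the SSD
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

print(round(measure_write_speed("bench.tmp")), "MB/s")
```

Run the same script on a 512GB and a 1TB config and the disparity (if any) should show up immediately.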

3. Is this an argument for configuring your MBP purchase with higher RAM? So that it has to resort to using virtual memory less often? (Extending the life of the SSD in the process; although I’m told SSDs are always busy doing something — even between discrete reads/writes.)

So, 32GB instead of 16? 64 GB instead of 32? It’s a pretty crude approach to mitigating slower SSD speeds to have to buy more RAM to circumvent virtual memory SSD use, you have to admit.
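For what it's worth, you can see how hard macOS is already leaning on swap with `sysctl vm.swapusage`. A small parsing sketch (the sample output string below is illustrative, not from a real machine):

```python
import re

def parse_swapusage(line):
    """Parse macOS `sysctl vm.swapusage` output, e.g.
    'vm.swapusage: total = 2048.00M  used = 1333.25M  free = 714.75M'
    and return (total_mb, used_mb, free_mb)."""
    vals = re.findall(r"=\s*([\d.]+)M", line)
    total, used, free = (float(v) for v in vals[:3])
    return total, used, free

# Illustrative sample; on a real Mac you'd feed in the actual sysctl output.
sample = "vm.swapusage: total = 2048.00M  used = 1333.25M  free = 714.75M (encrypted)"
print(parse_swapusage(sample))  # (2048.0, 1333.25, 714.75)
```

If "used" stays near zero under your normal workload, extra RAM mostly buys headroom, not reduced SSD wear.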

4. Does anyone know what Apple got out of its purchase of Anobit? In 2011, Apple paid $500 million for Anobit, at the time its largest acquisition since it bought NeXT.

Anobit was a fabless chip design company that made proprietary Memory Signal Processors that were supposed to increase the reliability of SSDs through a type of ECC that prevented electron leak/electrical interference (aka “noise”) from corrupting charged multilevel flash cells, which would result in corrupted data stored on these ever-smaller (nanometer) and denser SSDs.

5. Anobit’s proprietary IP was supposed to increase the number of (at the time) 3,000 lifetime read/write cycles typical for SSDs to 50,000 read/write cycles, thus extending the longevity and reliability of data stored on SSDs.

Degradation over time or “entropy” is a pernicious issue that plagues SSD technology to this day: it’s been reported that more than 20% of SSDs develop uncorrectable errors over a 4 year period. 30% to 80% of SSDs develop bad blocks. These errors can affect data retention and lead to effective failure.¹
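As a rough back-of-the-envelope, the total host writes a drive can absorb scale with capacity times rated P/E cycles, divided by write amplification. A sketch using the cycle figures above and an assumed write-amplification factor of 2 (both figures illustrative):

```python
def lifetime_writes_tb(capacity_gb, pe_cycles, write_amplification=2.0):
    """Rough endurance ceiling: total host writes (in TB) a drive can
    absorb, given capacity, rated P/E cycles, and an assumed
    write-amplification factor."""
    return capacity_gb * pe_cycles / write_amplification / 1000

# The 3,000 vs. 50,000 cycle figures quoted above, for a 512GB drive:
print(lifetime_writes_tb(512, 3_000))   # 768.0 TB
print(lifetime_writes_tb(512, 50_000))  # 12800.0 TB
```

On those assumptions, the Anobit-style cycle count would raise the write ceiling by more than an order of magnitude, which is why the acquisition looked so significant at the time.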

And if anyone wants to store data in a way that its integrity and reliability will last an especially long time, they’d best store it on Winchester hard drives, magnetic tape drives or Millennial optical discs (M-Discs). So, does Anobit IP solve this problem, or at least increase the number of years before an SSD starts developing errors and failing? Does Anobit’s Memory Signal Processing and ECC technology set Apple SSDs apart from the rest of the industry? (Another option is to copy important data on an SSD to a brand-new SSD every few years. That’s a cumbersome, expensive and labor-intensive method of maintaining data longevity.)

6. Anobit’s IP was also supposed to substantially increase the R/W speeds of SSDs over what’s typical for the industry.

7. The acquisition of Anobit was supposed to result in significantly cheaper SSD prices.

Apple did get 160 engineers out of the purchase of Anobit, but for $500 million, did Apple get proprietary technology that makes Macs and Apple devices appreciably outperform the industry in SSD speeds and prices? (It doesn’t seem that way.)

Has the SSD industry since “neutralized” any “Anobit advantage” Apple once had? (There goes half a billion dollars!)

Are the NAND chips on MacBook logic boards “off-the-shelf” Samsung (or other) chips, or is Apple having TSMC or someone else fab home-grown “Apple Silicon” NAND flash chips that incorporate the Anobit IP/technology Apple paid $500 million for?

Ultimately, did Apple get its money’s worth?

Anyone?

Anyone?
 
I wonder if Dan hits the nail on the head for a lot of MBP users.

IE. never hearing the fans.

For me this means he has rarely, if ever, pushed the computer along and challenged it at its optimum levels. I hear the fans all the time in my M1 14" MBP Max when utilising 100% CPU or GPU.

If you never hear the fans, are you wasting money on the computer? Could you get a lower-spec one instead? All it suggests to me is that you don't challenge the hardware.
 
I wonder if Dan hits the nail on the head for a lot of MBP users.

IE. never hearing the fans.

For me this means he has rarely, if ever, pushed the computer along and challenged it at its optimum levels. I hear the fans all the time in my M1 14" MBP Max when utilising 100% CPU or GPU.

I extremely rarely hear the fans on my 14-inch Pro. I rarely push the GPU, though, really only the CPU.

If you never hear the fans, are you wasting money on the computer? Could you get a lower-spec one instead? All it suggests to me is that you don't challenge the hardware.

I challenge it plenty.
 
It looks like an amazing laptop. I wonder if it's best to buy a refurbished M1 Pro/Max?
Last year's older M1 tech was improved by many tens of thousands of engineering hours to get to M2. Whether year-plus-old tech works for you depends on personal finances and on how big the discounting is. The savings need to justify using always-a-year-older M1 tech for the next six years.
 
So, do multiple smaller NAND flash memory chips work similarly to a RAID-0 (stripe) volume?

I.e. Better to have four 512GB flash memory chips than one 2TB chip — or two 1TB chips?
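Conceptually yes: the controller interleaves accesses across NAND dies much the way RAID-0 stripes across disks, so more chips can mean more parallelism. A toy illustration of the interleaving (the stripe unit and chip count are made up; real controllers stripe by NAND page):

```python
def stripe(data, n_chips, unit=4):
    """Toy RAID-0-style interleave: deal out `data` across n_chips in
    round-robin stripe units (real controllers stripe by NAND page)."""
    chips = [bytearray() for _ in range(n_chips)]
    for i in range(0, len(data), unit):
        chips[(i // unit) % n_chips].extend(data[i:i + unit])
    return chips

# 16 bytes across four "chips": each gets one unit, readable in parallel.
print(stripe(b"ABCDEFGHIJKLMNOP", 4))
```

In the idealized case, a sequential read can hit all four chips at once, which is one plausible explanation for why single-chip 512GB configs benchmark slower.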
 
I know there's no change to design or form factor, just a change in processor?
We need a new form factor and a new design every year.
😂 Does Apple have enough money to foot the bill for a new Industrial Design every year? 🤣
That would be nice, but Apple likes to use outdated designs to save costs. Unless people complain or sales are low, Apple doesn't bother changing up the hardware design.
 
I'm looking to upgrade my Mid '15 MBP with i7 which was a Refurbished unit. I got a laugh out of the 32GB RAM comment for photography because mine has 16GB and I'm a freelance photographer running Capture One and Photoshop and have had zero issues with performance even on my old MBP.
Why do so many folks fail to realize that a new box is for 2023–2028, not for whatever worked in 2022? The OS and apps have been demanding more RAM every year for 40 years, and unified memory makes adding more RAM even more useful. Instead of laughing, think.

E.g. my 2016 MBP had plenty of free RAM and worked fast and smooth. Today, with the same apps (newer versions) and the same workflow, the 16GB of RAM overfills constantly, despite the fact that the intensity of my workflow has decreased a bit.
 
That would be nice, but Apple likes to use outdated designs to save costs. Unless people complain or sales are low, Apple doesn't bother changing up the hardware design.
Oh, I know. The cost of an overhauled ID (including R&D and lots of retooling and manufacturing/production realignments) would be recouped over time in the price of the machine — or, Apple would offset the costs of a yearly ID overhaul by using cheaper, less-advanced internals.
 
Wow, that's so much RAM, do you mind telling me why do you need all that RAM? Max Tech on his YouTube channel says 16GB is enough for almost everybody! 😀
YouTubers are largely (not all) morons, albeit sometimes entertaining. Midlife of any new box is ~2025. Today's RAM is not what is needed for 2025. Forty years of observation tells any of us (no YouTube bimbos needed; RAM was 128k 40 years ago) that apps/OS will be taking advantage of more RAM in 2025.
 
The original M1 computers were game-changing, disruptively innovative, mostly because of architectural innovations like unified memory with on-chip graphics, cores, neural engines and ASIC encoders and decoders. Then there was the implementation of low-power, ARM-based processor cores coupled with high-speed SSD and TB peripheral connectivity. The Pro, Max and Ultra processors were certainly significant updates to a phenomenal baseline but were not particularly innovative except, possibly, in a product-design sense.

The current M2 chips are not disruptively innovative like the M1 processor series. They are certainly improvements to the M1 designs, but the performance improvements are relatively incremental. So far, rumors of the pending M3 design suggest more incremental improvement based on die-process size reduction and, likely, more up-scaling. Right now, disruptive innovation seems to be entirely in the rear-view mirror.
 
My camera has the CF Express slot but I don't use the card in the camera because I don't have anything to read the card yet. My MacBook Pro came with an SD card slot.
You probably fail to fully use the capabilities of your camera. Otherwise you would prefer XQD/CF Express and would own a good CF Express reader with fast cards, like me with my Nikon D500/D850.
 
You are saying in the video that Wi-Fi 6E isn't noticeable. What I can speak of is my experience. I don't have Wi-Fi 6E, but with all my other devices I get around 500 Mbps when connected to my Wi-Fi network (including my M1 Pro 14" MacBook Pro). With the M2 Max MacBook Pro I got a sustained 900 Mbps. This is a big improvement for me, and you can call me impressed.

Yeah, with 802.11ac I got ~45MB/s copying files from my NAS. With Wi-Fi 6 I get ~80-90MB/s. I'd really like to see if that could be improved further with Wi-Fi 6E, as I use my NAS a lot. It's all dependent on the individual setup, of course, so reviews are not always useful.
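Since the thread is mixing megabits (link rate) and megabytes (file copies), here's a quick conversion sketch, assuming ~10% protocol overhead (an illustrative figure; real overhead varies by setup):

```python
def mbps_to_mb_per_s(mbps, overhead=0.10):
    """Convert a link rate in megabits/s to an approximate real-world
    file-transfer rate in megabytes/s, minus assumed protocol overhead."""
    return mbps / 8 * (1 - overhead)

# A 900 Mbps Wi-Fi link tops out around 101 MB/s of actual file transfer.
print(mbps_to_mb_per_s(900))
```

So a 900 Mbps iperf-style reading and an ~80-90MB/s NAS copy are describing roughly the same class of link.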

File copies can run in the background, but the less time they take, the less I have to work around them.

The reviewer didn't really address 6E performance at all, as the use case is not really "how does it feel with average Internet usage", unless we're talking about mitigating a highly congested area in the 5GHz band.
 
The original M1 computers were game-changing, disruptively innovative, mostly because of architectural innovations like unified memory with on-chip graphics, cores, neural engines and ASIC encoders and decoders. Then there was the implementation of low-power, ARM-based processor cores coupled with high-speed SSD and TB peripheral connectivity. The Pro, Max and Ultra processors were certainly significant updates to a phenomenal baseline but were not particularly innovative except, possibly, in a product-design sense.

The current M2 chips are not disruptively innovative like the M1 processor series. They are certainly improvements to the M1 designs, but the performance improvements are relatively incremental. So far, rumors of the pending M3 design suggest more incremental improvement based on die-process size reduction and, likely, more up-scaling. Right now, disruptive innovation seems to be entirely in the rear-view mirror.
It sounds like you think that disruptive innovation is something that should be an annual routine. It should not. M2 should be an incremental innovation.
 
SD was never a card for professional video workflows, it was always used by consumer products. But SD is still the only general purpose card that consumers use (Pocket 4K, 6K, Ursa Mini, 3D printers, audio recorders, etc...). Whereas for an arguably professional video workflow you will use CFast (not CF express and not SD) on cameras like the Alexa Mini and RAW recording on the Ursas and Canon Cinema line or XQD on the Sony FX6/FX9. But in that case you will have a dedicated reader that comes with the camera. So SD is the only one that makes sense to have built in, since you're kind of expected to always have access to it. Everything else is so fragmented and changing all the time that no matter which one you get in the laptop, it will only be useful 10% of the time.

I'd say if you're going to have one card reader in a laptop, make it the one that most people will use for most things most of the time: the SD card. Everything else is a very specific use-case and will remain that way long after the laptop is obsolete.
First a correction: XQD was the predecessor to CF Express and can be read by CF Express Readers. Both run circles around SD.

Second, most SD is slow as hell. Please keep it off my Macs. Instead give us back a useful port (ideally Thunderbolt) that we can plug a reader into that suits whatever past or future cards we may need: CF, CF Express, SD, SD UHS-III, whatever. Taking up space on a high end Mac for the next 6 years with a low end slow SD slot is just wrong.
 
1. Why would Apple do this? I know this is just a spec bump release and all, but a supposed spec bump upgrade where storage speeds are LOWER than the previous generation model that it is supposed to be “bumped up” from?! […]

3. Is this an argument for configuring your MBP purchase with higher RAM? So that it has to resort to using virtual memory less often? […]

Ultimately, did Apple get its money’s worth?

Anyone?
Hard to understand everything you are asking, but yes, buyers wanting to optimize new computer purchases should:

- To the extent feasible, buy more RAM than seems appropriate for today. This has been sound procedure for 40 years, because app and OS designers constantly build to improve performance using more and more RAM, a good thing.

- Buy oversized SSDs to facilitate smooth operation, roughly double one's expected need. An SSD or HDD crash is a huge PITA, and spending a little extra to improve longevity is money well spent.
 
It looks like an amazing laptop. I wonder if it's best to buy a refurbished M1 Pro/Max?
I just did exactly that. I saved £900 compared to an M2 MacBook Pro with equivalent memory/storage by getting a refurb M1 MacBook Pro with 16GB/1TB.

Arrived yesterday. You would never know it’s a refurb either. Absolutely like a brand new machine.

Way more computing power than I will ever need no doubt.

The M2 pricing killed it for me in the UK.
 