I agree that it is all marketing, but I still feel like there should be consistency to keep things simple. People already act like the iPad made some huge leapfrog jump in performance because Apple put the M1 chip inside it, when all that is is a marketing name; the M1 is literally just an A14X chip, it's just a name. It's what the iPad was always gonna get no matter what, and people just don't get that.
Not at all. You're right, if you limit yourself to just counting CPU and GPU cores, the M1 seems like what an A14X would have been. But the M1 is a lot more than that. Most obviously, it's got a ton more I/O (though still a bit less than I'd like). It's *not* what the iPad Pro would have gotten if Apple weren't making Macs.

Consider the cost of shipping ten million iPad Pros and Airs with M1s instead of a hypothetical A14X with equivalent core counts. You're paying for extra area - perhaps 10-15% in total. On the other hand, you have the cost of designing, producing, and verifying a whole new chip. That's probably enough right there to just go with the M1, but on top of that you have to consider the opportunity cost of having a chip team with finite resources working on an A14X instead of the M2/M3/etc. And then you get the marketing benefit of using a chip with a fearsomely good reputation in your iPad ("It's got too much power" is a *good* problem to have). It's really a no-brainer.
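
To make that tradeoff concrete, here is a minimal back-of-envelope sketch in Python. Every number in it (unit volume, die cost, area penalty, NRE) is an invented placeholder for illustration, not an Apple or TSMC figure.

```python
# Back-of-envelope sketch with purely illustrative numbers (none of these
# figures come from Apple or TSMC): reusing the M1 vs. designing a bespoke "A14X".

UNITS = 10_000_000        # hypothetical iPad Pro/Air unit volume
M1_DIE_COST = 50.0        # assumed cost per M1 die, USD
AREA_PENALTY = 0.125      # ~10-15% extra silicon area for the M1's extra I/O

# Ongoing cost of shipping the slightly bigger M1 in every iPad:
extra_silicon_cost = UNITS * M1_DIE_COST * AREA_PENALTY

# One-time cost of designing, producing and verifying a separate A14X
# (mask sets alone run well into the tens of millions on leading-edge nodes):
A14X_NRE = 75_000_000     # assumed non-recurring engineering cost, USD

print(f"Extra silicon from reusing the M1: ${extra_silicon_cost/1e6:.0f}M")
print(f"One-off cost of a bespoke A14X:    ${A14X_NRE/1e6:.0f}M")
```

With these made-up numbers the extra silicon and the one-off engineering cost land in the same ballpark, and the opportunity cost of tying up the chip team is what tips the balance.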
 
So much comment on whether 5nm is really 6nm, 4nm really 5nm, etc., but for me, and I suspect many users, especially after what I consider a disappointing M2, my interest is in what it will actually do, what benefit it will give, and whether it will save me money by making me more productive.

Of course it's amazing, the number of transistors on a chip; not so long ago the first moon landing's AGC, with its 17,000 transistors, weighed in at a rather heavy-for-a-laptop 70 lb (tic) and was capable of 40,000 instructions per second.

Sadly I suspect many commenting on the wonders of 3nm don't actually have a real need for the improvements, other than just to have them, and there's no problem with that. Takes all sorts.

There's probably a combination of the above in most of us, even if we can justify it on productivity gains.

Although Moore's Law is still relevant, we are up against the laws of physics, and increasingly quantum physics, with the latter becoming more prominent by the day.

For many, a 2012 iMac may still be doing the job it was intended for admirably, but we can't really knock the frenzy over new chips, new models, or competitors, as it is the fuel for further innovation.
 
For many, a 2012 iMac may still be doing the job it was intended for admirably

I assure you the M2 or even M1 is a noticeable upgrade over a 2012 chip.

You’re not gonna see the kinds of leaps the M1 brought every year. There are occasional bumps like the A6, A7, A9, A11 that bring performance per thread up more than 50%, but most of the time, 10-20% is more realistic.
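
As a rough illustration of how those year-over-year figures compound, here is a small Python sketch; the cadence of big and small bumps is invented, not Apple's actual roadmap.

```python
# Hypothetical year-over-year single-thread gains: an occasional ~50% jump
# (A6/A7/A9/A11-style) mixed with more routine 10-15% bumps. Illustrative only.
yearly_gains = [0.50, 0.15, 0.10, 0.15, 0.50, 0.10, 0.15]

perf = 1.0
for year, gain in enumerate(yearly_gains, start=1):
    perf *= 1.0 + gain
    print(f"Year {year}: {perf:.2f}x the starting single-thread performance")
```

Even the routine 10-15% years keep compounding, which is why the absence of an M1-sized jump in any single year doesn't say much on its own.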
 
They were talking about the M1 Ultra, which also cannot be used in a laptop.

M1 Ultra is also not used in a laptop...That's the chip they were comparing.

M1 Ultra can be used in a laptop as the max power consumption is around 160 watts. It would just be thicker, louder and weaker when used on battery, which is the case for almost all gaming/creator laptops coming with Windows.

Besides, they were comparing the chip to RTX 3090 at the same power consumption, not at their absolute max power. Given M1 Ultra GPU is around 100 watts, that is easily comparable to an RTX 3090 at 100 watts. I agree that it is a little deceptive marketing though.
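
A tiny sketch of why "at the same power" matters: the perf-vs-watts curve below is a made-up diminishing-returns model (perf proportional to the square root of watts), and the 100 W / 350 W figures are just the rough power classes being discussed, not measured data.

```python
# Illustrative only: a made-up diminishing-returns curve, perf ~ sqrt(watts),
# to show why "matches it at 100 W" and "matches it at max power" are different claims.
def relative_perf(watts: float) -> float:
    return watts ** 0.5

m1_ultra_class = relative_perf(100)   # hypothetical GPU near its ~100 W ceiling
rtx_class_100w = relative_perf(100)   # hypothetical 350 W-class GPU capped at 100 W
rtx_class_max  = relative_perf(350)   # the same GPU with its full power budget

print(f"Matched at 100 W: {m1_ultra_class / rtx_class_100w:.2f}x")  # looks like parity
print(f"Against its max:  {m1_ultra_class / rtx_class_max:.2f}x")   # parity disappears
```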
 
The new fab plant that is being built in the US will always be years behind. It's nice that they will build a plant here in the US. Many don't understand how far behind it will always be.
TSM (the ADR ticker for TSMC) has TWO Arizona (US) plants under construction. One, nearing completion, will be 4nm (requested by AAPL, NVDA, and AMD). The one just starting is slated for 3nm.

There is also IBM, which is committed to shifting from Samsung to a new US plant (presumably for 2nm), and is helping Japanese chip makers climb toward more modern designs.

Intel fell behind because it was poorly managed by a former bean counter, who thought they could advance without spending on new plants and equipment. Intel recently announced layoffs, along with many computer companies. It has been said by many that Intel needs to recruit folks from TSMC or Samsung to catch up.

Meanwhile, work on sub-1nm is in the planning phases.
https://www.tomshardware.com/news/i...ntil-2036-from-nanometers-to-the-angstrom-era

Disclosure: I own stock in AMD, NVDA, AAPL, TSM, TXN, IBM... and unfortunately, in INTC, which has been less than stellar for some time (sold in the $50s/share, bought back in low $30s to mid $20s, may add more if it tanks further, maybe between $21-18 or lower).
 
3nm could make a dead-silent 2023 Mac Mini (M2/M3 Pro) a possibility, one that could be almost as fast as the 2022 entry-level Mac Studio (M1 Max).
 
3nm could make a dead-silent 2023 Mac Mini (M2/M3 Pro) a possibility, one that could be almost as fast as the 2022 entry-level Mac Studio (M1 Max).
Apple faces a challenge around that reality: heat. Heat management is not just some theoretical issue of SoC metrics. In the real world, the Studio manages heat much better than the Mini does. IMO heat removal is why the (superb) Studio exists.

Many of the folks wanting Pro-chip-level power in a Mini want that power to drive heat-producing workflows that should be running under the Studio's superior heat management. Such usage could overdrive the Mini's heat management capability, leading to overheating and to internet wags dissing Apple for it. That is probably why no M1 Pro Mini was released.

So expect Apple to constrain users' ability to overdrive Minis: either by hamstringing the chips used, by reducing the RAM allowed, by reducing display-driving capability, or by some combination thereof. Folks who intend heat-producing computing need to be directed into the Studios designed to manage such heat.

Edit: Another solution is that Apple could build fans into the Minis and just allow high heat-producing Mini workflows to kick in the by-definition-noisy fan operation. That way low budget users could still get access to Pro-chip-level power/RAM/etc. but the fans would keep their Minis alive. The fan solution is great for users who only rarely run heavy heat production workflows; it has worked for generations of Mac laptops.
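
For the heat argument, a rough steady-state model makes the point: sustained power times thermal resistance sets the temperature rise. All figures below are guesses for illustration, not measurements of any real Mac mini or Studio.

```python
# Rough steady-state thermal sketch (all numbers are illustrative guesses):
# T_chip ≈ T_ambient + P * R_thermal.

T_AMBIENT = 25.0        # room temperature, °C
T_THROTTLE = 100.0      # assumed junction temperature where throttling kicks in, °C

# Assumed effective chassis-plus-cooler thermal resistance, °C per watt:
R_MINI_LIKE   = 1.2     # small enclosure, modest fan
R_STUDIO_LIKE = 0.5     # larger enclosure, bigger heatsink and fans

def max_sustained_watts(r_thermal: float) -> float:
    """Highest sustained power before the assumed throttle temperature is reached."""
    return (T_THROTTLE - T_AMBIENT) / r_thermal

print(f"Mini-like chassis:   ~{max_sustained_watts(R_MINI_LIKE):.0f} W sustained")
print(f"Studio-like chassis: ~{max_sustained_watts(R_STUDIO_LIKE):.0f} W sustained")
```

Under these assumed numbers the bigger enclosure can sustain roughly twice the power before hitting the throttle point, which is the whole argument for steering heavy workflows toward the Studio (or adding louder fans to a Mini).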
 
The M2/A16 were such underwhelming updates that I no longer believe Apple is the leader in the performance/watt arena. I expect Qualcomm to eventually take the lead. Qualcomm is even more efficient than Apple: their performance cores need 4W while Apple's need 5W.
 
The M2/A16 are such bad chips that I no longer believe Apple is the leader anymore. I expect Qualcomm to eventually take the lead. Qualcomm is even more efficient than Apple: their performance cores need 4W while Apple's need 5W.
Nonsense. M2 is not bad. Even last year's M1 is plenty for most graphics workflows like mine. M2 does not need spectacular performance jumps; all it needs is the evolved code and the architectural changes under the hood that version 2 of M1 silicon will benefit from.

Edit: Above comment applies to all Macs except the Mac Pro, which does need a spectacular performance jump.
 
The consumer never wins if we give these corporations a free pass. Reputable reports have shown that Apple has lost many of their best designers, and in-depth reviews of the M2 show that it's less efficient than the M1.

Regarding the A16, it's based on a 5nm+++ process. Apple tried to be slick by saying 4nm but that's just marketing.
 
The M2/A16 were such underwhelming updates that I no longer believe Apple is the leader in the performance/watt arena. I expect Qualcomm to eventually take the lead. Qualcomm is even more efficient than Apple: their performance cores need 4W while Apple's need 5W.

But their upcoming 8 Gen 2 only performs about as well as Apple's A14 from 2020's iPhone 12. So they're still more than two generations behind.
 
The consumer never wins if we give these corporations a free pass. Reputable reports have shown that Apple has lost many of their best designers, and in-depth reviews of the M2 show that it's less efficient than the M1.

Regarding the A16, it's based on a 5nm+++ process. Apple tried to be slick by saying 4nm but that's just marketing.
What the heck are you talking about when you say "The consumer never wins if we give these corporations a free pass."?

Apple wants to make great SoCs and I want Apple to make great SoCs; our goals coincide. Plus Apple proved their competence with the M1 SoC. And FYI the 4nm nomenclature is pretty typical for the industry; certainly hella preferable to pushing "5nm+++" into the consumer marketplace.
 
That's not what I'm saying, I'm still suggesting that Apple updates the MacBook Pros in a few months, but if the chips are indeed 3nm and not based on the M2 then Apple should just call the chips M3 Pro/Max. Because if they call the chips M2 Pro/Max then that's gonna imply that they are, well, the M2 chip but more powerful and that wouldn't be the case if they are built with a different process. I definitely don't think Apple's gonna hold off on updating the MacBook Pros for another year, not at all. I'm only talking about what they name the chip, that's all.

Ah, gotcha, sorry for the misunderstanding/misinterpretation there. 👍
 
The consumer never wins if we give these corporations a free pass. Reputable reports have shown that Apple has lost many of their best designers, and in-depth reviews of the M2 show that it's less efficient than the M1.

Regarding the A16, it's based on a 5nm+++ process. Apple tried to be slick by saying 4nm but that's just marketing.
"In-depth" reviews by the clueless are not useful. That efficiency claim is false, produced by people who don't understand what they're measuring. Among other flagrant failures, you can't make conclusions about relative efficiency when running chips at two different frequencies, since small increases in clocks at the high end cause large changes in energy consumption.

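To illustrate that last point, here is a toy model of dynamic power (P ≈ C·V²·f) in which the voltage needed to hold a clock rises with frequency; all the constants are invented, so only the shape of the result matters.

```python
# Toy model, invented constants: dynamic power P ≈ C · V² · f, with supply
# voltage rising roughly linearly with the target clock near the top of the curve.

C = 1.0  # arbitrary switched-capacitance constant

def voltage_for(freq_ghz: float) -> float:
    return 0.5 + 0.25 * freq_ghz      # assumed V/f relationship, illustrative only

def power(freq_ghz: float) -> float:
    v = voltage_for(freq_ghz)
    return C * v * v * freq_ghz

base = 3.0
for f in (3.0, 3.2, 3.5):
    print(f"{f} GHz: {f/base:.0%} clocks, {power(f)/power(base):.0%} power, "
          f"{(power(f)/f)/(power(base)/base):.0%} energy per op")
```

In this toy model a chip pushed about 17% higher on the clock curve burns roughly 40% more power, so comparing two different chips at two different operating points says more about where each sits on its curve than about the efficiency of the architectures.
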
As for the 5nm vs 4nm stuff, it's beyond ridiculous to pretend that one of those numbers is in any way meaningful or real while the other isn't. They're both marketing nonsense. N4 is a die shrink of N5, so it's smaller and better, even if not lots better. The basic metrics that matter are power, area, and clocks, and if you don't want to publish those numbers (which are meaningless to most people) all you have left is names and marketing terms.

In fact, those numbers aren't as simple as people seem to think anyway. For example "area" depends heavily on what you're making - logic and memory scale very differently. 3nm SRAM is barely smaller than 5nm SRAM, which is a real problem for cache-heavy designs.
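
A back-of-envelope sketch of that area point, using assumed scaling factors (roughly in line with public reporting that 3nm-class SRAM barely shrinks versus 5nm, but not official TSMC figures):

```python
# Why per-node "area scaling" depends on what's on the die.
# Assumed, illustrative scaling factors: logic shrinks to 0.6x, SRAM only to 0.95x.

LOGIC_SCALE = 0.60
SRAM_SCALE = 0.95

def shrunk_area(total_mm2: float, sram_fraction: float) -> float:
    sram = total_mm2 * sram_fraction
    logic = total_mm2 - sram
    return logic * LOGIC_SCALE + sram * SRAM_SCALE

for sram_frac in (0.1, 0.3, 0.5):
    new = shrunk_area(100.0, sram_frac)
    print(f"{int(sram_frac * 100)}% SRAM: 100 mm² -> {new:.0f} mm² ({100 - new:.0f}% smaller)")
```

The more of the die that is cache, the less of the headline shrink actually shows up, which is exactly the problem for cache-heavy designs.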
 
- AMD and Intel have reached 2200 in Geekbench single-core (while Apple's newest chip just hit 1880).
- Qualcomm has reached 1550 in single-core, and Snapdragon Gen 2 has surpassed the A16 in graphics by a big margin.
- Apple lost many of their best chip designers.
[Citation required] for this claim that "Apple lost many of their best chip designers".
A few high profile people have left Apple over the years. Jim Keller left many years ago and life went on. GWIII and Manu left about three years ago. The history of Silicon Valley is of people moving between companies; I mean FFS that's like the thing EVERY BOOK says in the first chapter, that Silicon Valley succeeded because of cross-pollination between Shockley, then Fairchild, then Intel, then ...
Jim Keller is another example who has basically been through every CPU company/ISA that matters.

So yes, we have some evidence that some CPU people have left Apple. What we do not have is any evidence that
- the turnover now is any higher than in the past OR
- the turnover has had any significant effects.
 
The consumer never wins if we give these corporations a free pass. Reputable reports have shown that Apple has lost many of their best designers, and in-depth reviews of the M2 show that it's less efficient than the M1.

Regarding the A16, it's based on a 5nm+++ process. Apple tried to be slick by saying 4nm but that's just marketing.
You realize that it is TSMC that names its nodes, not Apple?

You've totally shot your credibility with this sequence of ever more ludicrous and uninformed claims.
 
The big season for new MBPs is late summer, when university starts. Hopefully the 3nm Apple silicon MBP will be ready by then. We should hear about this by early summer or so (pure speculation).

Concerning games, I think Apple still underestimates how important games have become for many users, especially younger ones. Given how good and early Apple was with music, this is an area they could improve, in my view. eGPU capability might help a lot here. I'm not sure how many game companies would want to write code for AAA Apple games that can fully use the hardware's capabilities. Wouldn't future Apple VR or augmented reality glasses need many Apple games first, to base this technology on?
 
Wrong, there is no way in the world the spring Macs will be based upon the 3nm process that is just entering production. There is too much work needed to prove in the new production line to ever allow it to be used in such a high-volume, high-profile product that will be offered in a few months - it takes 8+ weeks to complete a wafer run.

There could be a multitude of issues with chips on a new line: power consumption, data integrity, fab process issues, backside issues, etc., so no way would Apple place a bet that this will go without a hitch, and even if it did go well at TSMC, Apple would need to conduct its own quality and reliability studies on the new process line.
What do you think it *means* when, for every new TSMC process, Apple is the first to use that process...?
You think Apple aren't aware of the risks? That's exactly WHY the A16 is basically a slightly goosed A15, because the risk for N3 was too high.
But it's not too high for Macs, which have a flexible schedule anyway. Yes, it does mean that Apple doesn't hit its goal of a complete transition to AS by the end of 2022, but that's no big deal, and it's the only real risk. Apart from that, delaying the next round of Macs by a month or two is just fine.

You seem to misunderstand the terms here. What has started with N3 is VOLUME production; not ANY production. Small runs have been made throughout the last year (and more) which is how TSMC was able to categorize performance and yield (and so conclude, for example, that it made sense to switch to the slightly altered parameters of N3E). And what do you think they were building as the test wafers when they made those earlier runs throughout the past year...? Once again I ask you, what do you think it MEANS for Apple to be the first on each new TSMC process?

We have precedent for this sort of thing. Once again I would remind you of the A10 vs the A10X...
 
Not at all. You're right, if you limit yourself to just counting CPU and GPU cores, the M1 seems like what an A14X would have been. But the M1 is a lot more than that. Most obviously, it's got a ton more I/O (though still a bit less than I'd like). It's *not* what the iPad Pro would have gotten if Apple weren't making Macs.

Consider the cost of shipping ten million iPad Pros and Airs with M1s instead of a hypothetical A14X with equivalent core counts. You're paying for extra area - perhaps 10-15% in total. On the other hand, you have the cost of designing, producing, and verifying a whole new chip. That's probably enough right there to just go with the M1, but on top of that you have to consider the opportunity cost of having a chip team with finite resources working on an A14X instead of the M2/M3/etc. And then you get the marketing benefit of using a chip with a fearsomely good reputation in your iPad ("It's got too much power" is a *good* problem to have). It's really a no-brainer.
Oh no I definitely agree that putting an M1 in the iPad was the right call, it makes everything easier for all the reasons you mentioned. I'm saying that a theoretical A14X chip is the same as an M1 chip in terms of power specifically, I know the A-series chips didn't have Thunderbolt controllers etc. I'm more pointing out the fact that people think the iPad is crazy powerful because it "got the M1 chip", whereas if it "only got an A14X chip", people would not think it's this crazy powerful thing even though it would've been the same amount of power. So many people imply that Apple went all out by going from an A12Z chip to the M1 when in reality it was the natural progression. I'm speaking purely on how the marketing of the name affects people's perception.
 
Oh no I definitely agree that putting an M1 in the iPad was the right call, it makes everything easier for all the reasons you mentioned. I'm saying that a theoretical A14X chip is the same as an M1 chip in terms of power specifically, I know the A-series chips didn't have Thunderbolt controllers etc. I'm more pointing out the fact that people think the iPad is crazy powerful because it "got the M1 chip", whereas if it "only got an A14X chip", people would not think it's this crazy powerful thing even though it would've been the same amount of power. So many people imply that Apple went all out by going from an A12Z chip to the M1 when in reality it was the natural progression. I'm speaking purely on how the marketing of the name affects people's perception.

Well, another big difference was RAM. The A series has its RAM stacked on the SoC package; the M series has it on the package, to the side of the SoC die. This allows the M1 in the iPad to offer 8 and 16 GiB RAM configs.
 