> Remind me, can the 3090 be used in a laptop? Is it energy efficient? Didn't think so. Different products for different uses.

The M1 Ultra is also not used in a laptop... that's the chip they were comparing.
> I agree that it is all marketing, but I still feel like there should be consistency to keep things simple. People already act like the iPad made some huge leapfrog jump in performance because Apple put the M1 chip inside it, when all that is is a marketing name. The M1 is literally just an A14X chip; it's just a name. It is what the iPad was always gonna get no matter what, and people just don't get that.

Not at all. You're right, if you limit yourself to just counting CPU and GPU cores, the M1 seems like what an A14X would have been. But the M1 is a lot more than that. Most obviously, it's got a ton more I/O (though still a bit less than I'd like). It's *not* what the iPad Pro would have gotten if Apple weren't making Macs.
For many, a 2012 iMac may still be doing the job it was intended to do admirably.
They were talking about the M1 Ultra, which also cannot be used in a laptop.
> The new fab plant that is being built in the US will always be years behind. It's nice that they will build a plant here in the US. Many don't understand how far behind it will always be.

TSM (the ADR ticker for TSMC) has TWO Arizona (US) plants under construction. One, nearing completion, will be 4nm (requested by AAPL, NVDA, and AMD). The one just starting is slated for 3nm.
The 2011 17" MBP rocked. If Apple again makes a 17" or larger MBP, I will preorder.

"The all-new M2 Ultra in the all-new 18" MacBook Ultra, we think you'll love it...!"
> 3nm could make a dead-silent 2023 Mac mini (M2/M3 Pro) a possibility, which could be almost as fast as the 2022 entry-level Mac Studio (M1 Max).

Apple faces a challenge around that reality: heat. Heat management is not just some theoretical issue of SoC metrics. In the real world, the Studio manages heat much better than the Mini does. IMO, heat removal is why the (superb) Studio exists.
> The M2/A16 are such bad chips that I no longer believe Apple is the leader anymore. I expect Qualcomm eventually to take the lead. Qualcomm is even more efficient than Apple. Their performance cores need 4W while Apple needs 5W.

Nonsense. M2 is not bad. Even last year's M1 is plenty for most graphics workflows like mine. M2 does not need spectacular performance jumps; all it needs is the evolved code and the architectural changes under the hood that version 2 of M1 silicon will benefit from.
The M2/A16 were such underwhelming updates that I no longer believe Apple is the leader in the performance-per-watt arena. I expect Qualcomm eventually to take the lead. Qualcomm is even more efficient than Apple: their performance cores need 4W while Apple's need 5W.
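Raw watts alone don't settle an efficiency argument; performance per watt does. A quick back-of-envelope using the rough figures floated in this thread (Qualcomm ~1550 single-core at ~4W, Apple ~1880 at ~5W; these are claims from the comments, not measured data):

```python
# Back-of-envelope perf-per-watt using the rough single-core scores
# and per-core power draws quoted in this thread. These figures are
# forum claims, not measurements.

def perf_per_watt(score, watts):
    """Benchmark points delivered per watt of core power."""
    return score / watts

qualcomm = perf_per_watt(1550, 4.0)   # ~387.5 points/W
apple = perf_per_watt(1880, 5.0)      # ~376.0 points/W

print(f"Qualcomm: {qualcomm:.1f} pts/W, Apple: {apple:.1f} pts/W")
```

Under these (unverified) numbers, the gap in efficiency is only about 3%, much closer than "4W vs. 5W" makes it sound.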
> The consumer never wins if we give these corporations a free pass. Reputable reports have shown that Apple has lost many of its best designers, and in-depth reviews of the M2 show that it's less efficient than the M1.
>
> Regarding the A16, it's based on a 5nm+++ process. Apple tried to be slick by saying 4nm, but that's just marketing.

What the heck are you talking about when you say "The consumer never wins if we give these corporations a free pass"?
That's not what I'm saying. I'm still suggesting that Apple updates the MacBook Pros in a few months, but if the chips are indeed 3nm and not based on the M2, then Apple should just call them M3 Pro/Max. If they call them M2 Pro/Max, that's gonna imply that they are, well, the M2 chip but more powerful, and that wouldn't be the case if they're built on a different process. I definitely don't think Apple's gonna hold off on updating the MacBook Pros for another year, not at all. I'm only talking about what they name the chip, that's all.
"In-depth" reviews by the clueless are not useful. That efficiency claim is false, produced by people who don't understand what they're measuring. Among other flagrant failures, you can't make conclusions about relative efficiency when running chips at two different frequencies, since small increases in clocks at the high end cause large changes in energy consumption.The consumer never wins if we give these corporations a free pass. Reputable reports have showed that Apple has lost many of their best designers and in-depth reviews of the M2 show that it’s less efficient than the M1.
Regarding the A16, it’s based on a 5nm+++ process . Apple tried to be slick by saying 4nm but that’s just marketing.
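The clock-vs-energy point is worth making concrete. A first-order CMOS dynamic-power model (P ≈ C·V²·f, with supply voltage rising roughly in step with clock speed near the top of the frequency curve) shows why a small frequency bump costs a lot of energy. The numbers below are illustrative placeholders, not measurements of any real chip:

```python
# First-order CMOS dynamic power model: P ≈ C * V^2 * f.
# Near the top of the frequency range, the supply voltage must rise
# roughly in step with clock speed, so power scales roughly with f^3.
# Illustrative numbers only, not measurements of any real chip.

def dynamic_power(freq_ghz, volts, cap=1.0):
    """Relative dynamic power for a given clock and supply voltage."""
    return cap * volts**2 * freq_ghz

base = dynamic_power(3.0, 1.0)      # baseline: 3.0 GHz at 1.0 V
boosted = dynamic_power(3.3, 1.1)   # +10% clock needing ~+10% voltage

print(f"{boosted / base:.2f}x power for a 1.10x clock")
```

Under this simple model, a 10% clock increase costs roughly 33% more power, which is why comparing two chips run at different points on their frequency curves says little about their intrinsic efficiency.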
> Wrong, there is no way in the world the spring Macs will be based upon the 3nm process that is just entering production.

If you can predict the future, you should really be winning the lottery every week. 🤣🤣🤣
> - AMD and Intel have reached 2200 in Geekbench single-core (while Apple's newest chip just hit 1880).
> - Qualcomm has reached 1550 in single-core, and the Snapdragon 8 Gen 2 has surpassed the A16 in graphics by a big margin.
> - Apple lost many of their best chip designers.

[Citation required] for this claim that "Apple lost many of their best chip designers".
> The consumer never wins if we give these corporations a free pass. Reputable reports have shown that Apple has lost many of its best designers, and in-depth reviews of the M2 show that it's less efficient than the M1.
>
> Regarding the A16, it's based on a 5nm+++ process. Apple tried to be slick by saying 4nm, but that's just marketing.

You realize that it is TSMC that names its nodes, not Apple?
> Wrong, there is no way in the world the spring Macs will be based upon the 3nm process that is just entering production. There is too much work needed to prove in the new production line to ever allow it to be used in such a high-volume, high-profile product that will be offered in a few months; it takes 8+ weeks to complete a wafer run.
>
> There could be a multitude of issues with chips on a new line: power consumption, data integrity, fab-process issues, backside issues, etc. So no way would Apple place a bet that this will go without a hitch, and even if it did go well at TSMC, Apple would need to conduct its own quality and reliability studies on the new process line.

What do you think it *means* when, for every new TSMC process, Apple is the first to use that process...?
> Not at all. You're right, if you limit yourself to just counting CPU and GPU cores, the M1 seems like what an A14X would have been. But the M1 is a lot more than that. Most obviously, it's got a ton more I/O (though still a bit less than I'd like). It's *not* what the iPad Pro would have gotten if Apple weren't making Macs.

Oh no, I definitely agree that putting an M1 in the iPad was the right call; it makes everything easier for all the reasons you mentioned. I'm saying that a theoretical A14X chip is the same as an M1 chip in terms of power specifically; I know the A-series chips didn't have Thunderbolt controllers, etc. I'm more pointing out that people think the iPad is crazy powerful because it "got the M1 chip", whereas if it "only got an A14X chip", people would not think it's this crazy powerful thing, even though it would've been the same amount of power. So many people imply that Apple went all out by going from the A12Z chip to the M1 when in reality it was the natural progression. I'm speaking purely about how the marketing of the name affects people's perception.
> Oh no I definitely agree that putting an M1 in the iPad was the right call, it makes everything easier for all the reasons you mentioned. I'm saying that a theoretical A14X chip is the same as an M1 chip in terms of power specifically, I know the A-series chips didn't have Thunderbolt controllers etc. I'm more pointing out the fact that people think the iPad is crazy powerful because it "got the M1 chip", whereas if it "only got an A14X chip", people would not think it's this crazy powerful thing even though it would've been the same amount of power. So many people imply that Apple went all out by going from an A12Z chip to the M1 when in reality it was the natural progression. I'm speaking purely on how the marketing of the name affects people's perception.

Consider the cost of shipping ten million iPad Pros and Airs with M1s instead of a hypothetical A14X with equivalent core counts. You're paying for extra area, perhaps 10-15% in total. On the other hand, you have the cost of designing, producing, and verifying a whole new chip. That's probably enough right there to just go with the M1, but on top of that you have to consider the opportunity cost of having a chip team with finite resources working on an A14X instead of the M2/M3/etc. And then you get the marketing benefit of using a chip with a fearsomely good reputation in your iPad ("It's got too much power" is a *good* problem to have). It's really a no-brainer.
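The die-area vs. new-design tradeoff discussed above can be sketched numerically. The ten million units and 10-15% extra area come from the comment; the per-SoC cost and the design/verification (NRE) cost are made-up placeholders, purely to show the shape of the comparison:

```python
# Back-of-envelope for the die-area vs. new-chip tradeoff.
# The 10M units and 10-15% extra area are from the comment above;
# the $50 per-SoC cost and $100M design/verification (NRE) cost are
# hypothetical placeholder figures, not real Apple numbers.

units = 10_000_000
soc_cost = 50.0            # hypothetical cost per M1-class SoC
extra_area_frac = 0.125    # midpoint of the 10-15% extra area

# Rough extra silicon cost of shipping the bigger M1 everywhere
extra_silicon_cost = units * soc_cost * extra_area_frac

new_chip_design_cost = 100_000_000.0  # hypothetical NRE for an A14X

print(f"Extra silicon: ${extra_silicon_cost/1e6:.1f}M "
      f"vs. new-chip NRE: ${new_chip_design_cost/1e6:.0f}M")
```

Under these placeholder assumptions, eating the extra die area across the whole run costs less than designing and qualifying a separate chip, before even counting the opportunity cost on the chip team.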