Yay. Who knows, by 2023 maybe some of the big but lazy software vendors (you know who you are) will have updated their code to run Apple Silicon natively!

I stubbornly refuse to install Rosetta, as a matter of principle.
 
I get you. On one hand, you're absolutely right. Translation and conversion layers can be bloaty.

But on the other hand, why NOT use translation and conversion to save on developer labor, money, or end-user hassle? After all, that is having computers do what computers are GREAT at doing: mundane processing!

But I'm with you, any code I write for Mac/iOS/iPadOS will be M1 native.
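(And if you ever want to verify that a build really is running natively rather than under Rosetta 2, macOS exposes a sysctl for exactly that; a minimal check:)

```swift
import Darwin

// sysctl.proc_translated is 1 when the process is being translated by
// Rosetta 2 and 0 when native; the call fails on systems that don't
// know the key, which we treat as "native".
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let rc = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0)
    return rc == 0 && translated == 1
}

print(isRunningUnderRosetta() ? "Translated by Rosetta 2" : "Running natively")
```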
 
... But then they can neglect optimising the Mac version of the app, expecting AS to compensate.
Yeah, but at least most of the code is probably shared. Also I was thinking more of web stuff where there's no distinction between platforms.
 
3nm is insane. We were at 800nm in 1987... and to think man was driving around in a horse and buggy just over 100 years ago. Thank the aliens for crashing their ships on our planet... without their technology we would be stuck at hand calculators.
 
It’s going to be interesting to see if the Mac consumer upgrade cycle decreases as a result of the Apple Silicon change.

Back in the nineties and early 2000s, it seemed uncommon for a laptop to stay useful for more than about two years. The rate of change was such that anything became unusable within that period, and often didn't even have the necessary specs to run new software.

That hasn't been the case for close to a decade now. Average computers still run great with between 8GB (really!) and 16GB of RAM, maybe 32GB for power users. My last computer, a 2016 MBP, was great for five years; my brother-in-law just informed me that he was finally replacing his 2013 MBP with an M1(!!).

I don't expect software needs to catch up with Apple Silicon speed any time soon (if ever), but one of the fascinating and under-reported things about M1 processors is that they don't just run faster than Intel chips, they do different things. There is no Intel "Neural Engine," for example. Afterburner cards haven't exactly taken off, but it's an interesting concept that has only just been tried.

While I can pretty much promise that I won't be replacing this M1 Pro with an M2 Pro next year, I can't say the same for the M3 Pro. It might not just be faster — it could also be new in a way that wasn't possible in the Intel era.
It's definitely true that most consumer-level software has been overtaken by the power of standard CPUs/GPUs, and its data needs have been caught up with as RAM requirements steadied around 8GB to 16GB. However, some consumer needs in graphics processing and gaming are still growing, especially on the GPU side, and there are certainly many industrial and scientific fields that haven't found the end point.

Also, the producers seem to have reacted by stagnating the price of said parts. For those who need more than 8GB RAM and 512GB SSD, it is a frustratingly expensive exercise, especially with the 400% Apple Tax applied. One feels that the base sizes are being deliberately kept small, simply so that a premium can be charged for larger sizes, which long ago should have become the minimum.
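On the Neural Engine point in the quote above: apps don't program it directly; they state a preference through Core ML and let the framework decide where each layer runs. A minimal sketch (the model name is hypothetical):

```swift
import CoreML

// Let Core ML schedule work across CPU, GPU, and Neural Engine;
// .all is the broadest preference (.cpuAndNeuralEngine also exists
// on newer OS releases, to skip the GPU).
let config = MLModelConfiguration()
config.computeUnits = .all

// "SomeClassifier" stands in for any model compiled into the app bundle.
// let model = try SomeClassifier(configuration: config)
```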
 
It seems each year this is reduced by 1. Once they do hit 1nm, what happens next? I know it's not zero; it must become a decimal of the size, or is there a smaller unit of measure to switch to? Must be option two, now that I type it out.
The SI prefixes go nano, pico, femto, atto, zepto, and yocto.
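To put that on a ladder (with the caveat, which I believe is widely accepted, that modern node names are marketing labels rather than literal feature sizes), a quick Swift sketch:

```swift
// SI length prefixes below nano, as powers of ten of a metre.
let prefixes: [(symbol: String, exponent: Int)] =
    [("nm", -9), ("pm", -12), ("fm", -15), ("am", -18), ("zm", -21), ("ym", -24)]
for p in prefixes { print("1 \(p.symbol) = 10^\(p.exponent) m") }

// A "3 nm" node restated one prefix down:
print(3e-9 / 1e-12, "pm")   // 3000.0 pm

// A silicon atom is ~0.222 nm across, so a literal 3 nm feature
// would span only about 13 atoms.
print(3.0 / 0.222)          // ~13.5
```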
 
It’s going to be interesting to see if the Mac consumer upgrade cycle decreases as a result of the Apple Silicon change. […] Average computers still run great with between 8GB (really!) and 16GB of RAM, maybe 32GB for power users. […]
I can still name a few common websites that struggle on old computers, e.g. the 2014 base-tier MBP I'm typing this on. Twitter is the worst offender. Uses tons of CPU and RAM and visibly lags after scrolling too much.
 
Okay, seeing as it's Christmas and just as a bit of fun, could someone with some time and more knowledge than me estimate just how much more powerful an M3, M3 Pro and M3 Max will be compared to the M1, M1 Pro and M1 Max? And seeing as the M1s can handle 8K video without breaking stride, what will the practical advantages of the M3 variants be? It's like the future has landed. Indulge us with benchmark estimates!!! Wow us with battery life gains! Glad tidings to all!
Sure, the M1 can process 8K. But if you've got 120 mins of 8K to process, it will take a while, so incremental advances are welcome. Plus gaming hasn't even realistically started with 8K yet, and games continue to grow more complex and their graphics more realistic, so the needs of gaming have far from peaked.
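To put rough numbers on "it will take a while", here's a back-of-the-envelope sketch; the format assumptions (uncompressed 10-bit 4:2:2 at 24 fps) are mine, purely for illustration:

```swift
// Rough throughput for uncompressed 8K footage. Every parameter here
// is an assumption for illustration, not a real pipeline's numbers.
let width = 7_680.0, height = 4_320.0   // 8K UHD
let bytesPerPixel = 2.5                 // ~10-bit 4:2:2
let fps = 24.0
let minutes = 120.0

let bytesPerSecond = width * height * bytesPerPixel * fps
let totalBytes = bytesPerSecond * minutes * 60
print(String(format: "~%.1f GB/s, ~%.1f TB for 120 min",
             bytesPerSecond / 1e9, totalBytes / 1e12))
// ~2.0 GB/s, ~14.3 TB for 120 min
```

Even heavily compressed, multiply that kind of volume by every effect and export pass and the appetite for faster silicon is obvious.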
 
So some of you are saying that the M2 is just a minor upgrade over the M1, and that the M3 is the "real" upgrade?
 
Could this be the chip that goes into the Mac Pro? Apple could announce the Mac Pro with very limited availability at the end of 2022 and basically release it to the general public in early 2023. Unless these are consumer-grade chips that will go in lower-end devices. It just seems to me like only the most expensive models get the new stuff first while the rest get speed bumps for several years.
 
I think the M1 and the Pro and Max chips are here for the foreseeable future. They can get speed-bumped to much faster clocks and unlock more cores over the next several years. That covers the consumer Macs and what Apple today calls Pro.

That entire Pro line will lose the Pro name and be eclipsed by a new Mac Pro speed tier (with similar price points), becoming Apple's new Pro line: desktop, iMac, and laptop, all running at Mac Pro speeds and far more expensive than today's models.

Then whatever new chips come after that will basically replace the current ones with cheaper-to-produce models that are slightly faster than the speed bumps we'll have been getting, buying another few years of speed bumps on the same exact chips. They will just keep clocking them up instead of replacing them, which could save a lot of money without much work. If not, it wouldn't take much to use some of the spare space on the die to add a new component that serves some purpose the older chips can't, giving folks a reason to upgrade already-fast computers.

I'm just saying they won't be replacing the chips they have with anything radically different, just slight modifications, for the foreseeable future. The M3 is probably going to be Apple's top-of-the-line chip for the next several years, just with more cores every year till it gets replaced.
 
I was wondering, since the RTX 3000 series are notorious power hogs and it’s been said that being manufactured by Samsung instead of TSMC played a part in that.

It’s not a 1:1 comparison, since no x86 CPUs are made on Samsung’s processes. But I’ll take your word for it because I’m no expert.
It definitely did. Samsung's 8nm process is not great, and neither is Intel's 10nm process.

Everyone is far, far behind TSMC.

Of the processes available and in use right now, TSMC is leading the pack with its N5 (which includes N5P, N4, etc.); behind that is Samsung's 5LPE (5nm), which is very, very close in performance and density to TSMC's N7FFP process; then it's a toss-up between Intel's 10nm and Samsung's 8nm (both are pretty poor).

TSMC is basically a generation ahead of Samsung, who's a generation ahead of Intel. We will see how the 3nm nodes shake out. My guess is TSMC will crush everyone, with Samsung's GAA 3nm being a bit of a wild card, as I've struggled to find good info on how GAA is performing; it could be spectacular, it could struggle. TSMC's N3 process is looking very good.

I'm a TSMC shareholder and have made quite a lot of money off of them simply by understanding what's going on and who's delivering.
 
The good thing about the Intel era was that you could buy a MacBook Pro and use it for at least 5 good years before it became sluggish and you got a new one… On Apple Silicon I expect generations will become outdated faster, like on iPhone.
 
The symbol for angstrom is Å, so no confusion with ampere.

A silicon atom has a diameter of about 2.22 Å (0.222 nm).
Thank you for the lesson in symbology. But how do I type that? No, don't bother, because I'm not going to start typing it. And neither will 98.9% of MacRumors people.

You see, I was responding to RealityCK's post and he or she already did precisely what any human is going to do for a character that looks like an "A" but is not on the keyboard: Type "A". Which is Amps.

Engineers and scientists may be left-brained to the point where they'll figure out the key sequence for that character, and commit it to memory. But everyday folks aren't going to do that. We're going to type "A" or "Amps". When they took the "cent sign" off of modern keyboards, we started typing "cents" or ".xx dollars" or "$.10". We did NOT find the key sequence for the old cent sign, even though it surely still exists today. And no, I'm not going to look it up.

Maybe if we need specificity, we can go to 10ths or 100ths of a nanometer.
Or maybe just type out "angstrom", or "angs" for short. Either way, problem solved, with no hieroglyphics sacrificed or harmed.
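(For anyone who does want the glyph: on a Mac keyboard it's just Shift-Option-A, and the old cent sign is Option-4. Unicode even carries a dedicated angstrom sign that is defined to match the letter; a quick Swift check, just for fun:)

```swift
// Unicode has both a dedicated ANGSTROM SIGN and the Nordic letter Å.
// U+212B canonically decomposes to U+00C5, and Swift compares strings
// by canonical equivalence, so the two count as equal.
let angstromSign = "\u{212B}"  // Å ANGSTROM SIGN
let letterARing  = "\u{00C5}"  // Å LATIN CAPITAL LETTER A WITH RING
print(angstromSign == letterARing)  // true
```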
 
The good thing about the Intel era was that you could buy a MacBook Pro and use it for at least 5 good years before it became sluggish and you got a new one… On Apple Silicon I expect generations will become outdated faster, like on iPhone.

Why? 2018 iPad Pro still runs super fast. 2017 iPhone still runs super fast. My 2016 MBP runs iffy.
 
It seems each year this is reduced by 1. Once they do hit 1nm, what happens next? I know it's not zero; it must become a decimal of the size, or is there a smaller unit of measure to switch to? Must be option two, now that I type it out.
It is probably going to be referenced in pm (picometres) for the process measurement.
 
This may be the LAST and GREATEST chip.
I really doubt they can make a chip that small, at 2nm or 1nm, while keeping lots of processing power.

The future from here is more cores.
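If more cores is the path, it's worth remembering that Amdahl's law caps the win for anything that isn't fully parallel. A quick illustration (the 90% parallel fraction is just an assumed figure):

```swift
// Amdahl's law: with parallel fraction p, n cores give a speedup of
// 1 / ((1 - p) + p / n). The serial 10% dominates surprisingly fast.
func speedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

for n in [4.0, 8.0, 16.0, 32.0] {
    print("\(Int(n)) cores -> \(String(format: "%.2f", speedup(parallelFraction: 0.9, cores: n)))x")
}
// 4 -> 3.08x, 8 -> 4.71x, 16 -> 6.40x, 32 -> 7.80x
```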
 