Apple chipmaker TSMC will begin test production of 2nm chips next week
So we move from 3nm to 2nm. As I keep predicting, we should be at zero nm by the end of the decade. After that, the 2030s will be the era of negative feature size.

But maybe I'm wrong. I doubt it, but possibly, rather than going negative, the curve flattens, goes horizontal, and runs forever at some feature size just above a natural limit. So when they hit 0.8 or whatever, it stays there.

What will the era of non-shrinking feature size be like?
 
Intel was talking about 18A/14A and claiming they’d regain leadership by 2026, and TSMC was like okay, enough of all that talk, full steam ahead on 2nm!
I hope Intel does come out with their 1.8nm and 1.4nm equivalent processes. Competition rocks :)
 
So we move from 3nm to 2nm. As I keep predicting, we should be at zero nm by the end of the decade. After that, the 2030s will be the era of negative feature size.

But maybe I'm wrong. I doubt it, but possibly, rather than going negative, the curve flattens, goes horizontal, and runs forever at some feature size just above a natural limit. So when they hit 0.8 or whatever, it stays there.

What will the era of non-shrinking feature size be like?
It's a Zeno's-paradox kind of thing: about a 20% shrink each time.

The numbers themselves are kinda arbitrary, as the density is increasing partially by going vertical these days.
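A minimal sketch of the Zeno-like point above: if each new node shrinks the quoted number by a fixed ~20%, the labels form a geometric sequence that gets ever smaller but never actually reaches zero. The 20% figure and the starting 3nm label are just illustrative, taken from the back-of-the-envelope claim in this thread, not from any foundry roadmap.

```python
# Model node labels as a geometric sequence: each generation
# multiplies the previous "feature size" by (1 - shrink).
def node_sequence(start_nm: float, shrink: float, steps: int) -> list[float]:
    """Return the starting node plus `steps` successive shrinks."""
    nodes = [start_nm]
    for _ in range(steps):
        nodes.append(nodes[-1] * (1.0 - shrink))
    return nodes

# Starting at a "3nm" label with a 20% shrink per generation:
sizes = node_sequence(3.0, 0.20, 5)
print([round(s, 2) for s in sizes])  # [3.0, 2.4, 1.92, 1.54, 1.23, 0.98]
```

Five generations in, the label is still just under 1nm, and no finite number of steps ever gets to zero, which is why "zero nm by the end of the decade" can only be a joke about the marketing numbers, not the physics.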
 
Loving my M4 with OLED screen

Hope you all enjoy your last gen Macs 🙃
I have an M4 iPad with OLED and a “last gen” MacBook Pro with 128GB of RAM. Only one of them can run Fusion 360 at all, or Illustrator effectively. Putting an M4 processor in an iPad and calling it better than a Mac is like putting 40” tires on a Lamborghini and calling it the best off-road vehicle of all time.
 
“Trial production” = engineering run

This is pure clickbait by ET News… of course they are running lots already, part of development
 
So we move from 3nm to 2nm. As I keep predicting, we should be at zero nm by the end of the decade. After that, the 2030s will be the era of negative feature size.

But maybe I'm wrong. I doubt it, but possibly, rather than going negative, the curve flattens, goes horizontal, and runs forever at some feature size just above a natural limit. So when they hit 0.8 or whatever, it stays there.

What will the era of non-shrinking feature size be like?
Intel already moved to angstrom terminology, and “2 nm” is much more of a marketing term nowadays than a real measurement… shrinking will continue…
 
Intel already moved to angstrom terminology, and “2 nm” is much more of a marketing term nowadays than a real measurement… shrinking will continue…
Well, “angstrom” is just the same kind of marketing term, though Intel's actual density increases may not match the 1.8nm and 1.4nm densities that the 18A and 14A names imply.

Time will tell.
 
30% power savings with 10-15% more performance? Holy hell, the next gen iPhones are going to have incredible battery life and performance. Going to be interesting once that fab gets down to the Apple Watch.
 
The chip technology since the iPhone, and now Apple Silicon, has been one of the most exciting things to follow. It's incredible that we are about to surpass nanometers as the unit of measure in the very foreseeable future. Taking a moment to think about what an A17 Pro is capable of, and how much productivity and computing one could do with just that, is really astonishing, as is how powerful the M3 and M4 are in a desktop operating system. The future is looking very bright.
“Nanometer” as a unit of measurement has been gone for a decade. It’s just a marketing term which many people believe is actually indicative of the node size.
 
That assumes that one believes AI is needed. Which it is not.
You believe Siri is perfect as is? Most people here find it to be a joke, and AI will give it a chance to improve… but you’re not interested in that right? 😂
 
I have an M4 iPad with OLED and a “last gen” MacBook Pro with 128GB of RAM. Only one of them can run Fusion 360 at all, or Illustrator effectively. Putting an M4 processor in an iPad and calling it better than a Mac is like putting 40” tires on a Lamborghini and calling it the best off-road vehicle of all time.

Forties on a Lambo Urus? Gonna need to flare those fenders...!
 
Definitely expecting to see it on next year's iPhone. Wonder whether the iPad Pro will get it first.
 
I’m enjoying Mac OS and my desktop browser, that’s for sure ;)
I prefer iPadOS and my iPad.

The only Macs from Apple I find interesting are the desktops, and unfortunately they don’t pay enough attention to them. Design-wise, the Mac Mini, the Studio, and the iMac are great devices.

Apple’s laptops are nothing special, and macOS isn’t anything special either. I “enjoy” using Windows just as much.
 
I have an M4 iPad with OLED and a “last gen” MacBook Pro with 128GB of RAM. Only one of them can run Fusion 360 at all, or Illustrator effectively. Putting an M4 processor in an iPad and calling it better than a Mac is like putting 40” tires on a Lamborghini and calling it the best off-road vehicle of all time.
You can definitely do more on a Mac (mostly). It's the best device if you want to use your computer more traditionally. But I just enjoy my “simple” iPad so much more. And it has a better screen 🙃🙃🙃
 
You can definitely do more on a Mac (mostly). It's the best device if you want to use your computer more traditionally. But I just enjoy my “simple” iPad so much more. And it has a better screen 🙃

Better screen than the Pro Display XDR?!? I think not. That said, I love the screen on the iPad as well, though I didn’t notice any difference coming from the mini-LED-backlit 12.9”.
 
You believe Siri is perfect as is? Most people here find it to be a joke, and AI will give it a chance to improve… but you’re not interested in that right? 😂
Nope, never said that, but I love people with binary decision making. Evidently, the world is more detailed than you can understand.

Siri responding to specific requests to control things is not AI. That technology was commonplace 10 years ago. Apple just never "got it". Having Siri connected to ChatGPT is not an improvement.
 
I prefer iPadOS and my iPad.

The only Macs from Apple I find interesting are the desktops, and unfortunately they don’t pay enough attention to them. Design-wise, the Mac Mini, the Studio, and the iMac are great devices.

Apple’s laptops are nothing special, and macOS isn’t anything special either. I “enjoy” using Windows just as much.
I love iPadOS. Do I wish it could do more? Absolutely. But it's great as is for casual use/consumption.

Agree on your Mac preference and point on Windows. I've also been a desktop person. I have a THICC gaming PC running Windows 11 and have owned several Mac minis and expect to own several more. I've been tempted by the Studio but simply don't need that much power.
 
Nope, never said that, but I love people with binary decision making. Evidently, the world is more detailed than you can understand.

Siri responding to specific requests to control things is not AI. That technology was commonplace 10 years ago. Apple just never "got it". Having Siri connected to ChatGPT is not an improvement.
Siri with Apple Intelligence will be on device and protect your privacy. All those companies that “got it” 10 years ago have been selling your information to make their devices work. If you want to use ChatGPT, you have to give it permission every single time.
 
You believe Siri is perfect as is? Most people here find it to be a joke, and AI will give it a chance to improve… but you’re not interested in that right? 😂
😂 you need to lower your expectations man. It’s perfect for sending messages, checking the weather, setting a timer, playing a banging playlist, etc.

To everyone that finds it a joke: go back to the '80s, when the idea of talking to a computer was science fiction.
 