Man, now I want me some ARM chips built on a 5mm process. Just think of the power handling capabilities. Plus, you could use it for heating houses. Heck, put a pressurized water vessel over it and hook up a turbine and you could generate power with it.
Just had to look it up: the 6502 processor used in the Apple I and Apple ][ computers was built on an 8 µm process; that is 1,600 times larger than the 5nm of the A14. I can't find a reference to the A14 die size, but the A13 was 98.5 mm², i.e., about 10mm on a side. At the 6502's process, 1,600 times larger, we'd get a square 16 meters on each side.
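Here's a quick sanity-check script for that arithmetic, using the figures quoted above (not official die specs):

```python
# Rough scaling check using the figures quoted above (not official specs).
process_6502_nm = 8_000           # 8 µm process of the MOS 6502
process_a14_nm = 5                # TSMC 5 nm process of the A14
linear_scale = process_6502_nm / process_a14_nm
print(linear_scale)               # 1600.0 -- linear feature-size ratio

a13_side_mm = 98.5 ** 0.5         # ~9.92 mm side for a ~98.5 mm^2 die
scaled_side_m = a13_side_mm * linear_scale / 1000
print(f"~{scaled_side_m:.0f} m on a side")   # ~16 m at the 6502's process
```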
Not Arm. AMD (x86 and x86-64), Sun (SPARC) and Exponential Technology (PowerPC). Before that I worked on a MIPS-like architecture called F-RISC.
Sorry, meant AMD, not ARM.
 
this could be the last GREAT ARM chip
I mean how small can they make a processor?
Seems like 5mm is the limit
next chip 2.5mm? I think not
would have to start piggybacking two 5mm chips together at this point

I would never buy a professional MAC with an ARM chip

Just think, INTEL and AMD are much better at this game of chip design for heavy intensive computing.

Your chips are off-scale by a factor of a million. Try nanometers (one thousand-millionth of a meter) and not millimeters :)
 
I am slightly afraid that the A14 CPU for laptops will just be a slightly better iPad CPU. I was expecting big gains thanks to active cooling... but knowing Apple, they might just go with low-power chips to make the new laptops even thinner.

We will see. As a creative pro who has used Macs on and off for 20 years, I'm excited to see their new direction. If it's performant and offers great speed with software like the Adobe suite and LPX, then I'm on board; if not, I'm jumping ship to a custom AMD build. I figure I'll know which direction this is headed by spring.
 
As someone who has a 16-core rack-mount Mac Pro, a really nice 5GHz Intel i9-9900K game system, a 12.9" iPad Pro and various other devices - gaming "bragging rights" has frack all to do with productivity, and WTH does a 144Hz monitor have to do with speedrunning? Speedrunning rarely has anything to do with the twitch-reflex games where a 144Hz monitor does have value.

I can tell you right now my Mac Pro beats the pants off my gaming system at some things, the game system beats the Mac Pro at others, and so on. Use the best tool for the job, and that's not always about pure performance either. There is usability and any number of other factors you need to consider.

I honestly don't expect Apple to have much of a pure performance margin, if any, at the top end, but they aren't going to be left in the dust either. However, if you're talking performance per watt, then yes, I expect Apple to have a substantial margin with their AS-series chips.

FWIW, my first computer gaming experience was on a Mac Plus.
 
Compiling Docker for ARM shouldn't be too bad; at most a couple of weeks of work.

I wonder how hard that is. I mean, docker won't run on MacOS/x86 without a VM assist, so maybe they're really more bound to linux than we'd think.

OTOH they do have Docker on the Raspberry Pi, but those generally run a linux variant as well.

I don't know a lot about Docker's back-end, and what the environment actually looks like to whatever's in the container. It's less than a VM, obviously. I always thought they were more like Solaris Zones, but apparently they are not.
 
Attirex, that's unlikely. However, cmaier _did_ indeed design CPUs for AMD and for other companies before that.

(Sorry, originally wrote ARM by mistake).

Oh, I believed him, which is why I had to ask. :) In my mind, that's the sort of thing that goes on with chip engineering--a different plane of intelligence/existence. My existence = I watch TV and try not to get run over when I cross the street.
 
I wonder how hard that is. I mean, docker won't run on MacOS/x86 without a VM assist, so maybe they're really more bound to linux than we'd think.

OTOH they do have Docker on the Raspberry Pi, but those generally run a linux variant as well.

I don't know a lot about Docker's back-end, and what the environment actually looks like to whatever's in the container. It's less than a VM, obviously. I always thought they were more like Solaris Zones, but apparently they are not.

Docker already exists on ARM Linux, and is available on MacOS and Windows via their respective OS built-in hypervisors (xhyve and Hyper-V), which run a Linux kernel. Docker needs a Linux kernel, which is then shared across all containers, which is why they can be so small (unlike VMs).

I would imagine (but don't know) that Docker on AppleSilicon would also use a native hypervisor to run.

The issue for developers is how to use or build x86 Docker containers. You can build multi-architecture containers that support ARM and x86, but it involves extra steps.
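For the curious, those extra steps look roughly like this; a minimal sketch driving the real `docker buildx` CLI from Python, where the image name `example/myapp` is just a placeholder:

```python
import subprocess

# Build one image for both architectures with docker buildx (a real Docker
# CLI plugin). "example/myapp" is a hypothetical image name for illustration.
subprocess.run([
    "docker", "buildx", "build",
    "--platform", "linux/amd64,linux/arm64",  # target both architectures
    "--tag", "example/myapp:latest",
    "--push",          # multi-arch manifest lists must be pushed to a registry
    ".",
], check=True)
```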
 
Huh? Is the MacRumors Article Writing Software set to “maximum word count”? Hard to read, man, hard to read.
 
Nvidia’s purchase of ARM is looking like a smarter and smarter buy.

ARM is no longer relegated to the realm of underpowered devices. They are really crushing everything in terms of performance; hell, the world's most powerful supercomputer is based on ARM.
I really think the future is ARM, and it will eventually replace x86_64.

Intel's bread and butter (server chips) is getting eaten from all sides by ARM/AMD. I wouldn't like to be a stockholder of theirs.

I’m very curious to see how Intel vs ARM MacBook Pros stack up in outright performance and efficiency. As an EEE it never ceases to amaze me just how good Apple’s chip designers are. I hope they can work that same magic on their own modem.
I would be so down for a documentary, tech-story piece or similar to walk me through how all of this happened... how ARM went from underpowered but efficient devices to what seems to be basically a silver bullet in computation.
How come Intel didn't see it coming, or if they did, why didn't they steer the boat, or was there actually anything to be done... why RISC made sense, then CISC, and then back again to a RISC-like architecture.
I'm guessing we still have to see what's really going to happen; I don't think we can write off Intel just like that. And has Intel shown any hints of feeling threatened? Because I haven't seen anything.

So many questions.
 
Just funny you keep crapping on gamers, not realizing how much money many Twitch streamers make.

Certainly a ton more than I (a civil engineer) make.

Being able to make big bucks sitting around playing games is amazing. That's someone who's dominating the game of life and laughing at all the schmucks making fun of them.

Ignorance is bliss though, I suppose.

I'm sure there are some gamers who do make a living from it, but I wouldn't expect it to be a large number.

If I were giving career advice to a young person, I wouldn't suggest playing games 12 hours a day as a good starting option. Even if we generously put gaming into the same category as professional sport, it must surely be something with a limited money-making period, as trends change and reflexes wane. A gamer would do well to develop some other income-generating skills.
 
So for you guys, it's the money that makes a man serious or more important to society? Gamers are locked in their own world without any real social life. They know just one thing: obsession with games.
So a gamer is more important than a doctor on a crappy salary trying to fight this COVID issue?
No wonder, when people think so small.

Again, gamers often have no knowledge of anything else (like our user who made that hilarious statement, so there's proof), because they are too obsessed with that one thing and in their entire lives can't learn, travel, meet people or make money from anything else... some of them can't even speak their own language properly.
So kudos to you, gamer.
Do you think the same of experts who know only a single thing about their craft, and ONLY that thing? Like talking to a hardcore physicist with multiple PhDs?
What about soccer, football and basketball players? They live and breathe their sport, so much so that they are allowed to skip actual school and studies; there are tons of documentaries on that.

I do agree in part with what you have to say, and our gamer person over here happened to hit all the gamer cliché points, but I don't think it's such a clear-cut case. For example, they do have to have tons of communication and synchronization skills for team-based games like LoL or Rainbow Six. And being isolated at home with their games would actually help keep the virus contained.
Just throwing in a bit of devil's advocacy.
 
Just had to look it up: the 6502 processor used in the Apple I and Apple ][ computers was built on an 8 µm process; that is 1,600 times larger than the 5nm of the A14. I can't find a reference to the A14 die size, but the A13 was 98.5 mm², i.e., about 10mm on a side. At the 6502's process, 1,600 times larger, we'd get a square 16 meters on each side.
Okay, now I'm curious... So, per Wikipedia, the A13 is 98.48 mm^2, using a 7 nm feature size...

... and a 5 mm process size would be 5,000,000 nm, which is about 714,285.714 times the feature size of the A13. So, the 98.48 mm^2 (9.924 mm on a side) of the 7 nm A13 ... becomes 7,088,363.560 mm (4.405 miles) on a side. Which works out to a die covering 12,415.785 acres, or 19.400 square miles.

Conclusion? A 5mm process A13 would be... really big.

Also, I wonder what kind of clock speed one could manage, given that a straight path from one corner to the other (assuming a square die) is 6.229 miles - if I'm fudging the numbers right, that's about 33 microseconds for light to get from one corner to the other, in a straight line (and nothing in a processor ever goes in a straight line).
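Here's a little script that reproduces those numbers, in case anyone wants to play with the assumptions (figures are the Wikipedia ones quoted above):

```python
import math

# Reproducing the back-of-the-envelope numbers above (Wikipedia A13 figures).
a13_area_mm2 = 98.48
a13_feature_nm = 7
side_mm = math.sqrt(a13_area_mm2)               # ~9.924 mm per side

scale = 5e6 / a13_feature_nm                    # 5 mm vs 7 nm: ~714,285.7x
big_side_mm = side_mm * scale
print(f"side: {big_side_mm / 1000 / 1609.344:.3f} miles")   # ~4.405 miles

diagonal_m = big_side_mm / 1000 * math.sqrt(2)  # corner to corner, ~10 km
c = 299_792_458                                 # speed of light, m/s
print(f"light delay: {diagonal_m / c * 1e6:.0f} µs")        # ~33 µs
```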
 
Yep, my 2018 iPad Pro blitzes my 16-inch MacBook Pro at 4K editing, so much so that I edit all my drone and GoPro footage on the iPad instead of the Mac! I'm really looking forward to the A14X, or whatever the chip will be called.
Damn man, LumaFusion, would it be?
I do still need the full ease of use and layering better provided by a multi-screen, high-RAM desktop; however, I have heard of those comparisons... some codecs and resolutions (4K and above) can actually play back in real time on iPad Pros and not on iMacs or Mac Pros. Mind-blowing.
 
You would be surprised how many developers use Macs and use "non-traditional" VM tools such as Docker, Valet, Homestead and other container OSes. Software like MAMP and WAMP is a thing of the past, given the many advantages of containerization-based development.

This is going to be a huge struggle for developers if they are unable to run Linux micro-OSes efficiently without passing them through Rosetta 2. That will increase hardware costs and decrease developer productivity.

For example, with the latest version of Magento, 2.4+, you cannot even host the software on a Mac or Windows desktop... it instead REQUIRES a VM or a dedicated box.

This is a question that concerns me too. If the development tools are not available or are too hard to use, then Apple Silicon will struggle to attract new users who have depended on x86 tools. I'm sure Xcode will work well, but support for popular languages, IDEs, databases, web/app servers, containers and maybe VMs is also required. Yes, we can run a lot of these on cloud services, but there needs to be a solid local development capability.
 
Very interesting, I was not aware so many developers used VM tools. I hope for the developer's sake that some sort of solution comes around. It would be unfortunate to require two computers to complete a single task if that were even a viable option.

Some sort of solution for running x86 VMs locally would be very welcome. I've personally moved nearly all my VMs to cloud services, but there is a cost involved, and you need a good network connection.
 
Okay, now I'm curious... So, per Wikipedia, the A13 is 98.48 mm^2, using a 7 nm feature size...

... and a 5 mm process size would be 5,000,000 nm, which is about 714,285.714 times the feature size of the A13. So, the 98.48 mm^2 (9.924 mm on a side) of the 7 nm A13 ... becomes 7,088,363.560 mm (4.405 miles) on a side. Which works out to a die covering 12,415.785 acres, or 19.400 square miles.

Conclusion? A 5mm process A13 would be... really big.

Also, I wonder what kind of clock speed one could manage, given that a straight path from one corner to the other (assuming a square die) is 6.229 miles - if I'm fudging the numbers right, that's about 33 microseconds for light to get from one corner to the other, in a straight line (and nothing in a processor ever goes in a straight line).
Signals also don’t go corner to corner in a real die :)
 
With NVIDIA buying ARM, and AMD not performing well in the MacBook Pro 16 (with an external monitor at least), could NVIDIA return as a dGPU in the next ARM MacBook Pro?
 
With NVIDIA buying ARM, and AMD not performing well in the MacBook Pro 16 (with an external monitor at least), could NVIDIA return as a dGPU in the next ARM MacBook Pro?
Nope. Apple is never going back to NVIDIA. And ARM MacBook Pros will use Apple GPUs, not AMD or NVIDIA.
 
this could be the last GREAT ARM chip
I mean how small can they make a processor?
...
I would never buy a professional MAC with an ARM chip
...


This. Is. Not. An. ARM. Chip.

ARM (Advanced RISC Machine) does several things, but they don't make chips. At the high end they sell "reference designs" from which you can contract with a fab and have a CPU built for you.

At the lower end (where Apple is a customer) they provide an instruction-set reference. From this Apple has, essentially from the ground up, designed not just a CPU but an entire system on a chip, including memory controllers, GPU, neural engine, etc. I won't begin to list all the neato stuff Apple has crammed into this package.

According to the laws of physics we could go smaller; the only really "hard" limit is the speed of light. At some point you simply cannot move information from one part of the chip to another fast enough to avoid delays. I think Apple's engineers see this on the horizon and aren't trying to increase single-point performance (clock speed, single-core performance) anywhere near as much as they are starting to implement larger solutions that provide alternative processing approaches to real-world compute problems. They are, literally, designing a processing system.

To solve complex real-world problems you could shrink and clock up (massively, on both counts) the primary CPU cores, and/or add more cores; Intel worked hard at this for years, look up the MHz wars. Or you could find better ways to improve FP math on the GPU and offload math that tolerates errors to that system; we've done this, and are probably in the middle of it on most platforms. Apple seems to be choosing route three: FPGAs and neural engines, where the hardware isn't coupled to one specific task but provides amazing performance for a variety of use cases and can change state in a few clock cycles.
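As a toy illustration of that speed-of-light budget (back-of-the-envelope, nobody's actual design numbers):

```python
# How far light travels in one clock cycle -- the hard ceiling on how far
# a signal could possibly reach across a die per tick. Real on-chip signals
# are much slower than c, so the practical budget is tighter still.
C = 299_792_458  # speed of light in a vacuum, m/s

for ghz in (1, 3, 5):
    cycle_s = 1 / (ghz * 1e9)
    print(f"{ghz} GHz: ~{C * cycle_s * 1000:.0f} mm per cycle")
# 1 GHz: ~300 mm, 3 GHz: ~100 mm, 5 GHz: ~60 mm -- about the size of a die
# plus its package, with zero margin left for gates or wire delay.
```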
Nothing ARM specs out comes close to what Apple Silicon does. So let's just stop calling these ARM chips/processors. Right now.

...

There was a time when no one would ever buy '...a professional ___ with a RISC chip...'. You could replace RISC with any number of words, but that one came to mind instantly. There was a die-hard group that said RISC was a gimmick and CISC was the only viable way forward. Here we are with every major platform running on RISC, and Intel keeping CISC alive by emulating CISC on RISC hardware. Don't EVER listen to anyone who thinks that technologies outside the "mainstream" have no long-term value; they'll very likely silently eat their words within a decade. I recall a time when we all thought 1GHz was impossible, when 10nm was impossible. The engineers working on this stuff eat impossible for breakfast.

What Apple is doing in designing a new platform and compute design may well be the last of the discrete-CPU systems; after this we may move to quantum or light-based computing. Who knows. But CISC running over RISC, burning bits in a 40-year-old instruction set, is NOT sustainable for very much longer. Just like vacuum tubes, discrete transistors, backplanes, etc., the idea of a "CPU" will fade into history.
We know the goal: an electronic "brain" that can operate at the level of a mammal's but is completely controllable and reconfigurable to a single purpose on the fly. Nothing in production from any of the major chip fabs gets us close to that.

You can sit by and whine as the world goes by, or you can be amazed, develop for the new ideas and see how far you can push them, and in turn push the hardware folks to push harder. Choose wisely.
 
One thread? 1984 is calling and wants its comment back. Seriously, most apps are multi-threaded and have been for a long, long time. You can see the threads running: just start a long-running task like a HandBrake encode and call up the performance monitor on a 4-core/8-thread machine, and you can see all 8 queues maxed out. That is from the multi-threading. End of discussion.
You've picked the textbook multiprocessing app in HandBrake; it's even used as a CPU benchmark. The funny thing is that video transcoding is such a parallel task that the CPU is the wrong tool for the job. Most (but not all) tasks that will scale to 8 CPU cores will also scale to thousands of GPU cores.

Most apps aren't so optimized and may use multiple threads or processes without effectively spreading the load across them... especially ones written in JavaScript.
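A toy demonstration of that "spreading the load" point; an embarrassingly parallel, CPU-bound task sketched in Python (the workload is made up, purely illustrative):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def busy(n: int) -> int:
    # CPU-bound stand-in for one chunk of a transcode-style parallel job
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [5_000_000] * 8           # 8 independent chunks of work
    for workers in (1, 8):
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(busy, chunks))
        print(f"{workers} worker(s): {time.perf_counter() - start:.2f}s")
    # On an 8-core machine the 8-worker run is several times faster;
    # work that splits this cleanly is also a candidate for GPU cores.
```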
 

The issue for developers is how to use or build x86 Docker containers. You can build multi-architecture containers that support ARM and x86, but it involves extra steps.

Once QEMU gets ported to MacOS/ARM, then you could use this process to get Docker/x86 on ARM:


I've been trying (not very hard) to get an ARM system to run on MacOS, so I'll use this docker method to get it. It'll hopefully be faster than dealing with actual ARM hardware for now.
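In the meantime, one known way to run foreign-architecture containers on a Linux host (an assumption on my part; not necessarily the exact process linked above) is to register QEMU's user-mode emulators with binfmt_misc, sketched here from Python against the real Docker CLI:

```python
import subprocess

# Register QEMU user-mode emulators via the kernel's binfmt_misc handler so
# the Docker daemon can run foreign-architecture containers. tonistiigi/binfmt
# is a real image maintained by the Docker buildx author.
subprocess.run([
    "docker", "run", "--rm", "--privileged",
    "tonistiigi/binfmt", "--install", "amd64",
], check=True)

# With the handler installed, an x86-64 image runs under emulation on ARM:
subprocess.run([
    "docker", "run", "--rm", "--platform", "linux/amd64",
    "amd64/alpine", "uname", "-m",    # prints x86_64 under emulation
], check=True)
```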
 
this could be the last GREAT ARM chip
I mean how small can they make a processor?
Seems like 5mm is the limit
next chip 2.5mm? I think not
would have to start piggybacking two 5mm chips together at this point

I would never buy a professional MAC with an ARM chip

Just think, INTEL and AMD are much better at this game of chip design for heavy intensive computing.
There's nothing sacred about CPU, RAM, disk, GPU, etc. as we know them. Computers can be designed in many ways.
 
AMD is laughing their butts off at the amateurs at Apple. Serious people like me need serious CPUs and GPUs for serious productivity. I’m pwning people at Overwatch all day and night on the highest settings.
You forgot “/s”, Shirley.
 