Not quite. They usually do one per power envelope: e.g. an A12 for phone-sized batteries and TDP, and a bigger A12X for tablet-sized batteries and TDP, plus even smaller dies for the Watch and AirPods. More than one.

The A12 and A12X are two different dies. Yes, they share some common functional units, but they are not manufactured as one line. The difference there is not just power; they also aren't in the same product line.

Apple has clock-tweaked some of these when used in different products, but that largely misses what I'm highlighting here: trying to loop back to a "one implementation will work for everything" stance doesn't hold for the Mac lineup. It doesn't even work in the iOS product space, and the Mac lineup is 2-3x more diverse than that.
 
I wouldn't be shocked at all. But the XCC, HCC, and LCC variants along the Intel Xeon W product line are not "blown fuses" but different dies. Yes, there is some Core i9 stuff that is a "fused off" variant of an Intel Xeon W. Apple won't need the massive CPU package lineup that AMD and Intel have. However, Apple's track record is basically having just one. Every Apple Watch sold in the current generation, $250-2,500, is just one CPU package SKU. All iPad Pro 2018 models: just one CPU package SKU. iPhone XR, XS, XS Max: just one CPU SKU. Apple could make more SKUs by fusing off features, but they largely don't have the volume to do that profitably (largely because they can't use them to fill products for other folks; they are a single consumer silo).

Intel and AMD have the opposite problem. They need to fill as many products as possible with as many variations as possible. They have a large product volume space to fill. Apple doesn't.


"Every possible product space" is not the core design approach Apple is following. They take every opportunity to talk about how they craft their SoC solutions to fit exactly the specific product they are designed for on that first iteration of use. The A12X is finely tuned for the iPad Pro 2018. Only after that do they find other uses for it. (E.g., it wouldn't be surprising for it to turn up in an Apple TV update, like the last iPad Pro processor, to squeeze out more volume. It will be fast enough, and affordable enough thanks to the large bow wave it is riding, not to need a specific design.)

Apple's primary focus is not at the top end of power; it is at the bottom. That is largely a gross mismatch with what AMD and Intel are doing in the top half of their lineups. (Power saving isn't completely last in their priorities, but it is far from first.)

The primary issue is that Apple is NOT doing top-end stuff at all. If they took most of the talent off the "bottom" and put them to work on a new "top", who would be working on the "bottom"? If iPhone SoC development stalls for 12 months, what kind of strategic impact would that have for Apple? (Pretty high.) So how likely are they to take those folks off of iterating there? (Pretty low.)


As the basic ARM server designs mature over 2020-2023, Apple could take some other party's implementation and just substitute it in at the higher end without having to do much in-house design at all. The N1 is supposed to get a chiplet baseline reference, and the E1 brings in SMT. Iterate that two more generations, and if only concerned about being "fast enough", Apple could buy that almost off the shelf and couple it with an even more custom T-series to narrow it specifically to a Mac. If Intel continues to stumble and AMD gets more on track, that is just as viable an option for most of the Mac lineup.


Apple's "in-house" development, though, is almost exactly the opposite of what a robust, "top to bottom" Mac lineup needs. Apple could shift completely off of x86, but it is just as likely that would only mean changing vendors for the substantive portion of the Mac lineup that isn't mobile-obsessed. That the iPhone is over 50% of Apple's overall revenue is extremely likely to keep Apple's in-house work fixated on extreme mobile. It is far more fundamentally strategic to the company.

I worked with many of Apple's CPU folks at AMD and Exponential (...which spun off into EVSX, which was renamed Intrinsity). They can do more than one thing at once. A lot of the time the difference between high and low end is all chip-integration stuff (floorplanning to deal with bigger caches and more cores). It's not that hard. They also need a lot fewer designers than Intel. We had a dozen at AMD where Intel had 200 on an equivalent chip.
 
In that scenario, wouldn't practically all existing Windows apps have to be rebuilt for ARM by developers, because they won't run otherwise?

I recall that there were indeed some ARM-based Microsoft Office applications, but there are already comparable Mac versions of those. Other than those, were there ever any important Windows ARM programs anyway?

Meaning, not many will do that, at least for free, and one's old Intel-based application purchases would become obsolete. A developer who even considered doing the conversion would be better off making a proper ARM version for Mac or Linux than targeting random Boot Camp or virtual-OS users.

I would suggest that a catalog of ARM-based Android/Chromebook/Ubuntu/other Linux apps is a more likely scenario than a catalog of ARM-based Windows apps. But even in that scenario, the quality of the applications may not be decent.
No, they won't be, because there is a built-in emulator which will run x86 apps. It has good performance, according to Microsoft. Also, rest assured that if Apple goes ARM, so will everyone else in the industry; everyone has a silicon team working out the details of an ARM-based architecture. The quality of Linux apps will be as good as on x86; I trust OSS to be diligent in cross-platform efforts. Look at Ubuntu for Raspberry Pi: those are pretty good apps on an inferior processor.
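Whatever layer ends up doing it, running x86 binaries on ARM comes down to interpreting or translating the foreign instructions. Here is a minimal interpreter sketch in Python, with an invented mini-ISA standing in for x86 (real emulators like Microsoft's translate hot code into native blocks for speed, but the principle is the same):

```python
# Toy interpreter for an invented x86-like subset, illustrating how an
# emulator executes "guest" machine code on a host with a different ISA.
# All opcodes and register names here are invented for illustration.

def run(program, regs):
    """Execute a list of (opcode, *operands) tuples against a register dict."""
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "mov":                      # mov dst, immediate
            dst, imm = args
            regs[dst] = imm & 0xFFFFFFFF
        elif op == "add":                    # add dst, src (32-bit wraparound)
            dst, src = args
            regs[dst] = (regs[dst] + regs[src]) & 0xFFFFFFFF
        elif op == "dec":                    # decrement register
            (dst,) = args
            regs[dst] = (regs[dst] - 1) & 0xFFFFFFFF
        elif op == "jnz":                    # jump to index if register non-zero
            reg, target = args
            if regs[reg] != 0:
                pc = target
                continue
        pc += 1
    return regs

# Guest program: sum 5+4+3+2+1 into eax using a countdown loop.
guest = [
    ("mov", "eax", 0),
    ("mov", "ecx", 5),
    ("add", "eax", "ecx"),   # index 2: loop body
    ("dec", "ecx"),
    ("jnz", "ecx", 2),       # loop back while ecx != 0
]
print(run(guest, {"eax": 0, "ecx": 0})["eax"])  # 15
```

The per-instruction dispatch overhead is why interpretation alone is slow, and why production emulators cache translated native code instead.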
 
I don't think Mac users will go through this again. People just don't have that loyalty or tolerance anymore, in my opinion.
I think this will spell the end of the Apple Macintosh and Mac OS as we know them, turned into completely disposable devices for an Apple iPhone/iPad iOS world designed for simpletons. :(

One man’s apocalypse is another man’s utopia.
 
Apple already makes its own A-series chips for the iPhone and the iPad, and there are also custom Apple chips in recent Macs -- the T2. The T2 chip, in the iMac Pro and 2018 MacBook Pro, MacBook Air, and Mac mini models, integrates several components including the system management controller, image signal processor, SSD controller, and a Secure Enclave with a hardware-based encryption engine.
The T2 chips are crap and have caused nothing but problems.
This doesn't bode well for Mac users who are already neglected enough in favor of gadget users.
 
I'm wondering if AppleScript will survive the transition to ARM chipsets.
AppleScript survived the transition from Motorola 68000-series chips to PowerPC, and then from PowerPC to Intel.
We've used AppleScript for automating applications like QuarkXPress and the Finder since System 7.
I have a bad feeling they will drop AppleScript once they roll out ARM, crippling macOS for good.
 
I'm wondering if AppleScript will survive the transition to ARM chipsets.
AppleScript survived the transition from Motorola 68000-series chips to PowerPC, and then from PowerPC to Intel.
We've used AppleScript for automating applications like QuarkXPress and the Finder since System 7.
I have a bad feeling they will drop AppleScript once they roll out ARM, crippling macOS for good.

I think AppleScript is already on its way out in favor of Shortcuts.
 
I’ll probably just stay on my current machine until it dies and see what the landscape looks like in 6-8 years
This is what it comes down to for most people. What Apple releases next year only matters to people who NEED what Apple releases next year. If it comes and they don't like it, they still have this year's hardware and software, which will last until it dies.
 
This is what it comes down to for most people. What Apple releases next year only matters to people who NEED what Apple releases next year. If it comes and they don't like it, they still have this year's hardware and software, which will last until it dies.

Yeah, doing music and video production, the last thing I need is to be stuck in limbo with this transition.
 
Copyright. Yes, copyright. That's what Intel is using to stop x86-64 emulation and likely also translation.

Imagine the next J K Rowling writes a series of children's books where the characters "speak" in a strange dialect. Copyright could be used to prevent other authors from writing stories set in that universe - they wouldn't be able to use the dialect. The other authors could certainly write books, but they'll be unconnected stories of wizards, magic, or what not.

Back to CPUs: I'm sure Intel has plenty of patents on implementing their instruction set, but those can be worked around via a different implementation, and a software emulation is certainly a different implementation. But a specific instruction encoding is a copyright matter. ("Why did you encode your instruction set exactly the same as us?")
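To make the encoding argument concrete, here is a toy illustration in Python: two tables implement the same three operations but assign different byte values, so the same program assembles to different machine code. All opcode values below are invented, not real x86 or ARM encodings:

```python
# Two hypothetical encodings of the same tiny instruction set. The
# behavior (mov/add/sub) is purely functional; the specific bit pattern
# chosen for each opcode is the "creative choice" the copyright
# argument is about. All values are invented for illustration.

ENCODING_A = {"mov": 0x01, "add": 0x02, "sub": 0x03}
ENCODING_B = {"mov": 0xC0, "add": 0x71, "sub": 0x9A}  # same ops, different bytes

def assemble(mnemonics, table):
    """Turn a list of mnemonics into opcode bytes under a given encoding."""
    return bytes(table[m] for m in mnemonics)

prog = ["mov", "add", "sub", "add"]
print(assemble(prog, ENCODING_A).hex())  # 01020302
print(assemble(prog, ENCODING_B).hex())  # c0719a71
```

A clone or emulator that reuses ENCODING_A byte-for-byte is copying those specific choices; one that defines its own table implements the same functionality without them, which is the distinction the legal argument turns on.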

What about 32-bit x86? Why isn't Intel pursuing copyright claims there? In years past, nobody thought such matters were important, so such claims weren't pursued, and thus 32-bit has escaped into the public domain.

P.S. In a related matter, back in the '70s, Zilog came up with their own (much nicer!) mnemonics for 8080 instructions due to copyright issues.
You don't copyright the technologies in a CPU. You patent them. You also don't copyright code. You patent it. Look it up.
 
You don't copyright the technologies in a CPU. You patent them. You also don't copyright code. You patent it. Look it up.
Actually, you are both right. Some things are patentable, others are copyright. Intel, for example, has sued clone makers multiple times on copyright theories, saying that the instruction set (the choice of specific bit patterns to represent opcodes and the instruction fields) is copyrightable because it is not purely functional. They have also asserted that microcode is copyrighted.

https://jolt.law.harvard.edu/digest/intel-and-the-x86-architecture-a-legal-perspective
 
You don't copyright the technologies in a CPU. You patent them. You also don't copyright code. You patent it. Look it up.
Depends where you live. In Europe, for example, you cannot patent software, but it is protected by copyright (depending on your license; if you do not license your code explicitly, it's still copyrighted by default).
 
Depends where you live. In Europe, for example, you cannot patent software, but it is protected by copyright (depending on your license; if you do not license your code explicitly, it's still copyrighted by default).

The mask designs, microcode, netlist, etc. are always copyrighted, but you can implement the same functionality using different circuits and not infringe copyright (yet still infringe patents). But some things, like instruction set, may or may not be copyrighted, depending on country, and may or may not be patented depending on country.
 
"...but if Apple were to custom create an ARM chip designed to be used in a high-end desktop or laptop system with active cooling, they would likely produce the fastest consumer chip on the market."

I'm wondering how eGPUs may come into play here, and if they keep working to perfect ARM's viability (assuming all the hoopla is true), this could get interesting if a seamless transition emerges.

Would the differences in CPU instruction sets make all current eGPUs incompatible? If there are options if/when it happens, I'd like the ability to choose the eGPU I'm connecting, but I'm not holding my breath on that part.
 
What's wrong with Intel? And what will happen to GPUs?

Apple can't even sell up-to-date GPUs in their iMac Pros or iMacs. They ask a 300% premium for 3-year-old hardware in the case of GPUs, and AMD has shown it cannot deliver. I am worried.
 
What's wrong with Intel? And what will happen to GPUs?

Apple can't even sell up-to-date GPUs in their iMac Pros or iMacs. They ask a 300% premium for 3-year-old hardware in the case of GPUs, and AMD has shown it cannot deliver. I am worried.

It's because people have a distorted and unrealistic view of how fast the ARM chips Apple uses actually run in reality.
They see them running in an iPhone or iPad, giving amazing benchmark speeds, when those devices are very limited in what they need to do and run a very custom, lightweight OS, and then mistakenly compare that to speeds on a MacBook or Windows laptop and say, hey, why use Intel? Use these fast ARM chips instead.
What would probably happen, and why it hasn't been done yet, is that if you just fitted an iPhone/iPad ARM CPU into a real/full/proper computer running a full OS with desktop applications, it would slow to a crawl.
This almost never seems to be thought of.
I'm sure once the ARM chips are up to the task, THEN it will happen.
It's just a matter of time.
But running iOS and iOS apps is not the same as running macOS and macOS apps.
 
It's because people have a distorted and unrealistic view of how fast the ARM chips Apple uses actually run in reality.
They see them running in an iPhone or iPad, giving amazing benchmark speeds, when those devices are very limited in what they need to do and run a very custom, lightweight OS, and then mistakenly compare that to speeds on a MacBook or Windows laptop and say, hey, why use Intel? Use these fast ARM chips instead.
What would probably happen, and why it hasn't been done yet, is that if you just fitted an iPhone/iPad ARM CPU into a real/full/proper computer running a full OS with desktop applications, it would slow to a crawl.
This almost never seems to be thought of.
I'm sure once the ARM chips are up to the task, THEN it will happen.
It's just a matter of time.
But running iOS and iOS apps is not the same as running macOS and macOS apps.

The fact that people don't understand CPUs and yet offer opinions like this is humorous. Things ignored:

1) The ARM CPU in the Mac won't be an iPhone chip just stuck into a Mac.
2) Even the ARM CPU in your iPhone is capable of laptop-like speeds if given a sufficient thermal solution, so that it doesn't have to keep throttling the frequency down. And it just so happens that sticking it in a bigger case, like a laptop, allows you to have such a solution.
3) There is absolutely nothing that makes ARM ISA chips inherently slower than AMD64/x86-64 chips. In fact, given identically talented designers and modern compiler technology, ARM has a slight advantage, since you can use a significantly simpler instruction decoder, which takes less space, eliminates certain critical paths, and lets the pipelines run faster.
4) Apple's CPU designers are much better than Intel's. At AMD we never worried about the designers at Intel; we worried about their excellent fabs.
5) TSMC's fabs are at least as good as, if not better than, Intel's now.
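Point 3, the simpler decoder, can be illustrated with a toy sketch: with fixed-width (ARM-style) encodings every instruction boundary is known up front, while with variable-length (x86-style) encodings each instruction's length must be decoded before the next one can even be located. The opcode-to-length table below is invented for illustration:

```python
# Finding instruction boundaries: trivial for fixed-width (ARM-style)
# encodings, inherently serial for variable-length (x86-style) ones.
# This serial dependency is part of what makes a wide x86 decoder
# expensive in silicon. The opcode->length table is hypothetical.

def boundaries_fixed(code, width=4):
    """ARM-style: every instruction starts at a multiple of the width,
    so all boundaries are known up front and decoders can work in parallel."""
    return list(range(0, len(code), width))

LENGTHS = {0x90: 1, 0xB8: 5, 0x0F: 2, 0xE8: 5}  # invented opcode lengths

def boundaries_variable(code):
    """x86-style: each instruction's length must be determined before the
    next instruction can even be located -- a serial dependency chain."""
    offsets, pc = [], 0
    while pc < len(code):
        offsets.append(pc)
        pc += LENGTHS[code[pc]]          # must finish this decode to find the next
    return offsets

stream = bytes([0x90,                    # 1-byte op
                0xB8, 0, 0, 0, 0,        # 5-byte op
                0x0F, 0x05,              # 2-byte op
                0xE8, 0, 0, 0, 0])       # 5-byte op
print(boundaries_variable(stream))       # [0, 1, 6, 8]
print(boundaries_fixed(bytes(16)))       # [0, 4, 8, 12]
```

Hardware x86 decoders mitigate this with speculative length-decode at every byte position, which is exactly the extra area and critical-path cost the point above refers to.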
 
It's because people have a distorted and unrealistic view of how fast the ARM chips Apple uses actually run in reality.
They see them running in an iPhone or iPad, giving amazing benchmark speeds, when those devices are very limited in what they need to do and run a very custom, lightweight OS, and then mistakenly compare that to speeds on a MacBook or Windows laptop and say, hey, why use Intel? Use these fast ARM chips instead.
What would probably happen, and why it hasn't been done yet, is that if you just fitted an iPhone/iPad ARM CPU into a real/full/proper computer running a full OS with desktop applications, it would slow to a crawl.
This almost never seems to be thought of.
I'm sure once the ARM chips are up to the task, THEN it will happen.
It's just a matter of time.
But running iOS and iOS apps is not the same as running macOS and macOS apps.
In terms of overall performance, AnandTech emphasized Apple's lead in the mobile chip space, noting that the A13 posts almost double the performance of the next best non-Apple chip. The site also found the A13 "essentially matched" the "best that AMD and Intel have to offer" for desktop CPUs, at least based on SPECint2006, a suite of CPU-intensive cross-platform integer benchmarks.

This, combined with persistent rumours linking the transition to 2020 (including Intel's own executives believing Apple will move away from their platform) and Apple's big investment in a chip-related facility in Texas (including nabbing one of ARM's top chip designers), is pretty solid evidence it's happening in the not-too-distant future. If you're looking for an evidence-based prediction, it's all there: it's at least being seriously considered and developed behind the scenes. The key advantage of these ARM chips over anything Intel comes out with is, of course, optimisation. They can be built from the ground up to be exactly what Apple wants and needs to run their system.

 
They see them running in an iPhone or iPad, giving amazing benchmark speeds, when those devices are very limited in what they need to do and run a very custom, lightweight OS, and then mistakenly compare that to speeds on a MacBook or Windows laptop and say, hey, why use Intel? Use these fast ARM chips instead.
iOS and macOS are both Unix-based operating systems. Yes, iOS was optimized for smaller devices (mostly in terms of having less ancillary code running, as well as not having to support a lot of normal expected Unix services), but when you're running a computation-based benchmark, you don't care about all that other stuff, and, indeed, you try to minimize its impact on the benchmark (e.g. you don't run other intensive software while the benchmark is running). Computation-based benchmarks aren't measuring anything directly related to the OS, they're going to run roughly the same on an iPhone vs a MacBook, once you've quieted down anything that might interfere with the benchmark.
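That claim shows up in the shape of such benchmarks themselves: a hot loop of pure computation, timed with as little OS involvement as possible. A minimal sketch follows (not SPECint, just hypothetical code with the same character):

```python
import time

# A compute-bound micro-benchmark in the spirit described above: a
# pure-integer hot loop with no I/O and no system calls, so the OS
# underneath (iOS vs macOS) contributes almost nothing to the result.
# The kernel and its constants are invented, not taken from SPECint.

def kernel(n):
    """Deterministic integer-mixing loop; exercises only the ALU and caches."""
    acc = 0
    for i in range(1, n + 1):
        acc = (acc * 31 + i) % 1_000_003   # arbitrary integer arithmetic
    return acc

def best_time(n=100_000, repeats=3):
    """Best-of-N timing, a common trick to shave scheduler noise off the
    measurement -- the 'quieting down' mentioned above."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        kernel(n)
        best = min(best, time.perf_counter() - start)
    return best
```

Because `kernel` touches nothing but registers and memory, the same source produces essentially the same work on any Unix-like OS; only the CPU (and the compiler or interpreter) changes the score.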

What would probably happen, and why it hasn't been done yet, is that if you just fitted an iPhone/iPad ARM CPU into a real/full/proper computer running a full OS with desktop applications, it would slow to a crawl.
First, it's not at all clear that that is true. Second, putting an iPhone CPU in a "real" computer is not what Apple would do. One of Apple's huge advantages with their A-series of chips, is they can have their engineers target precisely the needs of their iPhones and iPads. The mobile phone environment is severely limited in terms of how much power you can use, and how much heat you can generate.

Those limits are much different in the laptop arena, and many orders of magnitude different in the desktop arena. So why do you assume that they would use chips designed for a phone in their other computers? Especially given that a huge advantage they have is the ability to precisely target their own needs? Other companies have to scrounge around for processors that fit (and mostly come in two sizes: too small and too large), while Apple has the equivalent of their own tailor making exactly what they need.

There's nothing stopping them from making a desktop ARM-based CPU with 5x as many cores, running at twice or more the clock speed. You haven't seen a chip like that from them yet, because it's more computational power than they need in a phone, and it couldn't stay within the thermal and power limits of the phone environment. But in a laptop, or desktop, it'd be fine. And I wouldn't be at all surprised if they've had at least designs for such chips in the lab for a few years.
 
Just to make it clear, as my viewpoint may not have come across the right way:
no one can want us to move forward on CPUs, or be more fed up with the slow pace of Intel, than me.
I am so, so fed up with the terrible lack of advancement over the past decade-plus when it comes to this.

Personally, I wish x86 had been dumped years ago for a totally fresh start, with all apps by now re-written for the new base.

So yes, if ARM can ramp up the cores and speed, with decent heatsinks and mains power, and blast past Intel, then yes please, it can't come soon enough.
 