Sure. The logic applies both ways.

What you *can* do is carefully study the ISAs and see if either of them offers an inherent advantage or disadvantage. And as someone who designed both RISC and x86-64 chips, I've done that.

Arm's advantages are a much simpler instruction decoder, which lets you shrink the core by a reasonable percentage, and the avoidance of terrible addressing modes and, especially, of all the gunk that goes into backwards x86 compatibility. And if you believe that a compiler thinking about a problem for a while can do a better job than a few million transistors that have to run in real time, then the simpler instruction set is an advantage too. Otherwise, if you feel like compilers do a bad job, x86-64 may have an advantage due to heftier instructions (though a lot of that goes away if you are running in pure 64-bit mode).

Any Intel trick to speed things up (branch prediction algorithms, multithreading, whatever) can be done just as easily on Arm as on x86. But since x86 will always have deeper pipelines, it's actually easier to do a lot of these things on Arm. (The penalty for guessing wrong, flushing the pipeline, and trying again is smaller when the pipeline is shallower. This means you can get away with less branch prediction accuracy, which means fewer transistors, and achieve the same performance.)
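To put rough numbers on that tradeoff, here is a toy calculation in C; the flush penalties and hit rates below are invented purely for illustration, not measurements of any real chip. The expected cost of a branch is roughly (1 - prediction accuracy) x flush penalty, so a shallower pipeline can tolerate a less accurate (cheaper) predictor at the same cost.

```c
/* Toy model of the pipeline-depth argument above.
 * Every number here is an illustrative assumption, not data from a real CPU. */
#include <stdio.h>

int main(void) {
    double deep_penalty    = 20.0;  /* cycles lost per mispredict, deep pipeline */
    double shallow_penalty = 12.0;  /* cycles lost per mispredict, shallow pipeline */
    double deep_accuracy   = 0.97;  /* branch predictor hit rate on the deep pipeline */

    /* Expected cost per branch = (1 - accuracy) * flush penalty.
     * Find the accuracy the shallow pipeline needs to match that cost. */
    double cost_per_branch  = (1.0 - deep_accuracy) * deep_penalty;
    double shallow_accuracy = 1.0 - cost_per_branch / shallow_penalty;

    printf("deep:    %.1f%% accuracy -> %.2f cycles lost per branch\n",
           deep_accuracy * 100.0, cost_per_branch);
    printf("shallow: %.1f%% accuracy -> %.2f cycles lost per branch\n",
           shallow_accuracy * 100.0, (1.0 - shallow_accuracy) * shallow_penalty);
    return 0;
}
```

With these made-up numbers, a 95% predictor on the shallow pipeline costs the same per branch as a 97% predictor on the deep one - which is the "fewer transistors for the same performance" point.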

In the end it’s all probably a wash other than the fact that x86-64 will always need more transistors to do the same job, which means more power consumption, and longer wires (which slow things down). How much of an effect that is will vary depending on lots of factors. But the one thing I can tell you for sure is that x86-64, with equal chip designers using equal fabs, does not have any advantage.

Thank you for this well thought out answer.

Am I correct in saying that x86 translates internally to RISC, whereas this isn't necessary for the ARM ISA? Thus calculation time is wasted on translation, which means x86 draws more power and requires a greater number of transistors to do the same job? Apologies - I haven't studied processor design and always wished this was available to me at university.

Why haven't we seen ARM chips as powerful as Xeons / Golds today? Is it a matter of innovation, marketing small iterations to sell more, or something else?
 

It's not too inaccurate to say that all processors are really RISC processors, some with a translation layer around them, but it's not quite right. It's a useful way of thinking about it, though.

Essentially, x86 instructions are hard to decode for a bunch of reasons. They can have various different lengths, a certain field can mean a register number in one case and a memory offset in another, etc. So first you have to create a complicated little machine to figure out what kind of instruction it is and what the arguments are. Then, if it's an instruction that cannot be executed in one pass through the execution pipeline (which, for integers, really only knows how to add, invert, multiply, shift, rotate, and a few other things), you look up the instruction in a microcode ROM and generate a sequence of microcode instructions (inverts, adds, shifts, whatever) - like little subroutines. Then you issue those, but you have to make various other parts of the chip more complicated to cope with it. You have to have some way of keeping track of the fact that multiple microcode instructions running in the pipelines may be part of the same ISA instruction. Etc., etc.

Then add in the fact that x86 instructions have more complicated behaviors. In RISC, you almost always perform operations on data in registers and store the result in registers. In x86, you can add two numbers where one comes from memory, etc. These get translated into memory loads/stores plus math instructions that use the values loaded from memory, but then you need special temporary registers to store these intermediate values so that you don't contaminate the ISA registers. Then you get into the fact that you can modify instructions on the fly (in traditional x86), which causes all sorts of problems with caches, etc. So it's more than just a wrapper - the poison runs throughout the processor.
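As a purely illustrative sketch of that "cracking" idea (not how any real decoder is implemented - the register numbers and the micro-op format here are made up), a memory-operand add such as add eax, [ebx+8] gets split into a load into a hidden temporary register plus a register-register add, so everything downstream of the decoder only ever sees simple, RISC-like micro-ops:

```c
/* Illustrative only: models "crack a memory-operand add into load + add micro-ops".
 * Register numbers, the micro-op format, and TMP0 are invented for this sketch. */
#include <stdio.h>

typedef enum { UOP_LOAD, UOP_ADD } UopKind;

typedef struct {
    UopKind kind;
    int  dst;     /* destination register (architectural or hidden temporary) */
    int  src;     /* source register (base register for loads) */
    long offset;  /* memory offset, used by loads */
} Uop;

enum { EAX = 0, EBX = 3, TMP0 = 100 };  /* 100+ = hidden temporaries */

/* "add eax, [ebx+8]"  ->  load TMP0, [ebx+8] ; add eax, TMP0 */
static int crack_add_mem(int dst, int base, long offset, Uop out[]) {
    out[0] = (Uop){ UOP_LOAD, TMP0, base, offset };
    out[1] = (Uop){ UOP_ADD,  dst,  TMP0, 0      };
    return 2;  /* one ISA instruction became two micro-ops */
}

int main(void) {
    Uop uops[2];
    int n = crack_add_mem(EAX, EBX, 8, uops);
    for (int i = 0; i < n; i++)
        printf("uop %d: kind=%d dst=r%d src=r%d offset=%ld\n",
               i, uops[i].kind, uops[i].dst, uops[i].src, uops[i].offset);
    return 0;
}
```

The temporary register (TMP0 here) plays the role of the "special temporary registers" mentioned above: the intermediate loaded value never touches an architectural register.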

As for why we haven’t seen ARM chips as powerful as Xeons, who would have the skill and motivation to make them? There are only a few companies with the skill - Intel, AMD, and Apple, probably. (Everyone else is probably still doing ASIC-style design, which isn’t going to get you there). Why would one of those companies have made such a chip before now? What OS would it run? Who would be the customers?

When x86-64 was designed at AMD, it took years for it to get legitimacy and be supported by lots of software. Who’s going to go out on that limb? Apple may be the first place where skill and need actually intersect. They control the software. They know how to design full-custom CPUs. They have a great fab in TSMC.
 
So it's a bad idea to buy an Intel Mac now?
Is it better to wait till 2021?

If you need now, buy now.

If Apple were to release an ARM-based Mac TODAY, there would be at least 12-36 months' worth of pain while application vendors get their stuff sorted out to run optimally on it, and your Intel Mac won't just suddenly stop working or perform badly.

If you don't need now, don't buy now. You're wasting money in that case.

If you can improve your work with a new machine by buying now - well, that's an ROI calculation for you to do for your specific use case.

By the time it's done (and assuming Apple does decide to phase out Intel Macs altogether), it will probably be time for you to upgrade anyway.

This, 100%.

Whilst the ARM move is exciting to many, myself included, that's because I'm excited about the potential long-term benefits - not specifically because I'm eager for a revision 1.0 ARM-based Mac.

It's a potential long-term benefit, and interesting for the Apple enthusiast (as opposed to someone who just wants something to work); you can be sure the first ARM-based machines will have significant compromises until some time has passed for the kinks to be ironed out.

Obligatory car analogy: buying a brand-new ARM-based Mac at launch would be like buying an electric car 5-10 years ago. Not for people who are unwilling to deal with some quirks until they get sorted out.
 

Same. I am excited by the idea of an ARM MacBook or even iMac, but I have little interest in being a beta tester for Apple's new pet project. I am still using my 2017 5K iMac, and you can be sure I will hang on to it till the ARM platform stabilises, because I rely on it for work too much to take any risks.
 
But currently my Mac is a MacBook Pro 7,1. It still works and it's almost fine - a little slow on Unity, though. The only reason I'd buy a new Mac is Xcode, because I've been looking around and I can get a gaming PC for the price of a MacBook Air.
I don't want it to be, for example, that I buy a Mac today and in 2 years Apple just drops it.
 
There is about zero good reason to do that, for two major reasons. First, the card does one and only one thing (decode the various formats of ProRes). Perhaps there will be a later upgrade so that it adds ProRes encoding and/or an updated ProRes format to the mix.

I was under the impression that it was re-programmable? I could be wrong, but if it only does ProRes at the moment, I thought that was purely a software issue - isn't it essentially just a big FPGA that can be re-purposed?
But currently my Mac is a MacBook Pro 7,1. It still works and it's almost fine - a little slow on Unity, though. The only reason I'd buy a new Mac is Xcode, because I've been looking around and I can get a gaming PC for the price of a MacBook Air.
I don't want it to be, for example, that I buy a Mac today and in 2 years Apple just drops it.

If you buy now I'm fairly sure that you'll be fine for the next 3-5 years at least. The Mac Pro just got revised, and I don't think Apple will be ready to move 100% off x86/x64 at the high end for at least a few years, which means they'll still be selling brand-new high-end machines using x86-based CPUs at that price point (which a company will expect 3-5 years of functionality out of).

macOS in some form will still be x86 friendly/supported for AT LEAST 5 years at this point - to support the pro-level hardware they are selling today.
MacBook Pro 7,1

That's a 2010 machine - you'll see a massive leap in performance jumping to something (anything) newer in Apple's current line-up. There was a big step in performance in 2011, and then several generations of roughly 5% improvements with every model revision for the past 9 years. Plus, your 2010 machine is missing hardware decode for various modern video codecs, has no crypto acceleration instructions, etc.; doing any of those things will be much, much faster.
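For a rough back-of-the-envelope sense of what that compounds to - the 30% jump and the 5% per generation below are assumptions for illustration, not benchmark results:

```c
/* Rough compounding of "a big 2011 jump plus ~5% per generation since".
 * All factors are guesses for illustration, not benchmarks. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double big_2011_step = 1.30;  /* assumed one-time jump in 2011 */
    double per_gen_gain  = 1.05;  /* assumed ~5% per model revision */
    int    generations   = 9;     /* roughly one revision per year, 2011-2020 */

    double speedup = big_2011_step * pow(per_gen_gain, (double)generations);
    printf("cumulative CPU-only speedup vs. a 2010 machine: ~%.1fx\n", speedup);
    /* This ignores SSD vs. spinning disk, codec/crypto acceleration, GPU, etc.,
       which in practice matter at least as much. */
    return 0;
}
```

Even with those conservative guesses it works out to roughly 2x on the CPU alone, before counting storage, GPU, and fixed-function hardware.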

In your specific case (due to the age of the machine and being behind the big 2011 step where the CPUs gained crypto support and much better integrated graphics, quicksync, etc.), I'd pull the trigger now, if something catches your eye, and not worry about support for at least 5 years.
 
Exceedingly few folks are buying an iMac Pro or Mac Pro to boost web browser JS times.

Few folks are buying an iMac Pro or Mac Pro (or any computer beyond $3k), period.

And yes, even a Mac Pro 28-core owner will frequently have a bunch of web pages open.
The reason I (and someone else in this thread) mentioned it is that it is proof that ARM can be more powerful than what we see on the iPhone.

Fair enough (I don't think it was in dispute).

I think that's what Apple is working towards with their rumored 12-core processor. I'm gonna call it the MC-1 (haha, MaC 1). The MC-1 - having 8 high-performance cores and 4 energy-efficient cores - would be perfect for the 12-inch MacBook. And the second rumored chip, based off next year's A15, will probably be for the Mac Mini or iMac. Or maybe both.

I don't know whether to trust such details in Gurman reports (he has a tendency to take real sources, but then embellish them with his own narrative), but, sure.

I don't think they will start work on any of their "Pro Macs" until after these consumer-level devices are released, and they will probably give a spec bump to at least one other MacBook Pro or iMac Pro until they are ready for their "Pro ARM Chips".

This will give pro app developers time to get their code running on ARM macOS.

Yup.
 
IMO they can't do it in one year; they'd be putting all of their eggs in one basket, and if the plan fails, there's no fallback.

Well, that is also one reason, but Apple isn't (historically) the type of company to have two types of processors in their computers. And if they switch to ARM for some of their machines, tell developers to change over their software, and then stop partway, that would be even more trouble.

But I don't think Apple will fail. I think the reason they haven't pulled the trigger is that they have been in the background preparing and designing "Mac Chips" and making sure they can do this. Just like they had been compiling OS X for Intel ever since OS X was released - the "Just in case" team, as Jobs called it.
 

There's always a "Just in case" movement. I am not saying Apple will fail, but they're not going to change all the configurations overnight, nor are they necessarily going to drip-feed things in. I guess we'll see what happens this WWDC.

Heck, for all we know, these chips could be cancelled at the last minute due to bad binning/yield.

There's no guarantee what adoption will look like for this new generation of chips.
 
But currently my Mac is a MacBook Pro 7,1. It still works and it's almost fine - a little slow on Unity, though. The only reason I'd buy a new Mac is Xcode, because I've been looking around and I can get a gaming PC for the price of a MacBook Air.
I don't want it to be, for example, that I buy a Mac today and in 2 years Apple just drops it.

Using the PowerPC to Intel transition as an example: Apple released the first Intel Macs in 2006, and the last OS X released for PowerPC was in 2009.

And because of how much work Apple will need to do, and how much they will have to ramp up ARM production, I don't feel that they will switch to ARM that quickly. Which in theory will give Intel Macs more support.

However, if your current Mac is a 2010 MacBook, you have already lost support. If you are using it for work, it wouldn't be a bad idea to get a new Intel MacBook now. Worst case, it'll get four years of software updates; best case, 4-6 years.

In either case, it sounds like you probably need a new computer. But if you can live with this MacBook for a couple of months (until WWDC 20), that might be best. Hopefully, if these rumors are true, Apple will tell us their plan and how long it will take. That will give you a better idea of what's best for you.
 
No, Leopard was the last OS released for PowerPC, in 2007. In 2011 it received its last security updates.
I haven't lost support on my 2010 MacBook. I still get security updates and have the latest Safari. That's why I've been tinkering around with PCs and a MacBook again. As I've said: you can get a powerful gaming PC for the price of an Air, and it'll probably be supported for ages.
 
I am sorry, but this is bad advice. The last OS that supported PowerPC was released in 2007. My MacBook from 2010 still receives security updates and thus is still supported.
 
There's always a "Just in case" movement. I am not saying Apple will fail, but they're not going to change all the configurations overnight, nor are they necessarily going to drip-feed things in. I guess we'll see what happens this WWDC.

Heck, for all we know, these chips could be cancelled at the last minute due to bad binning/yield.

There's no guarantee what adoption will look like for this new generation of chips.

I think we are really saying basically the same thing. I've said multiple times that they aren't going to switch to their own chips magically over one year. It's going to be a multi-year thing. It will allow them time to slowly ramp up and invest in their chip design teams, and to hire more people if they need to. It allows time for developers to switch their apps over, and it gives customers time to know that everything they need has been moved over, without keeping them on an older Mac if they need a newer Intel machine today. It just makes more business sense to do this over two or three years.
I am sorry, but this is bad advice. The last OS that supported PowerPC was released in 2007. My MacBook from 2010 still receives security updates and thus is still supported.

OS X Leopard was released in 2007, but it was last updated in 2009. So for security updates Apple continued to support PowerPC until 2009.

And your MacBook isn't getting new feature updates, right? My late 2011 iMac isn't getting updated anymore (it's why I bought my 2017 MacBook Pro last year). But this is also why I'm waiting to replace my iMac until they release an ARM iMac/Mac Mini. My kids still use the iMac and it definitely needs updating. The hard drive is on its last legs.

But again, it really depends on you, man. If your 2010 MacBook is doing good and you don't need to get a new one ... wait until at least WWDC 20. If they really are starting this next year, they will tell their developers this year, probably at WWDC, just like they did at WWDC 05. And just as Jobs did in 2005, they will tell us their timeline. It was 2006-2007 back then. I'm pretty sure this will be a longer transition this time.

So you do you man.
 
Well, that is also one reason, but Apple isn't (historically) the type of company to have two types of processors in their computers. And if they switch to ARM for some of their machines, tell developers to change over their software, and then stop partway, that would be even more trouble.

But I don't think Apple will fail. I think the reason they haven't pulled the trigger is that they have been in the background preparing and designing "Mac Chips" and making sure they can do this. Just like they had been compiling OS X for Intel ever since OS X was released - the "Just in case" team, as Jobs called it.
Historically Apple has been more prone to use two types of processors in its computers than perhaps any other company.

The Apple ][ line of computers used completely different processors than the Mac (and both were sold at the same time). Then, for a time, they sold Macs simultaneously with 68k chips and PowerPCs. Then simultaneously with Intel and PowerPC. Then simultaneously with 64-bit and 32-bit CPUs.

Probably only IBM and some old mainframe makers have a history of selling more machines with different CPUs at the same time.
 
I think we are really saying basically the same thing. I've said multiple times that they aren't going to switch to their own chips magically over one year. It's going to be a multi-year thing. It will allow them time to slowly ramp up and invest in their chip design teams, and to hire more people if they need to. It allows time for developers to switch their apps over, and it gives customers time to know that everything they need has been moved over, without keeping them on an older Mac if they need a newer Intel machine today. It just makes more business sense to do this over two or three years.

OS X Leopard was released in 2007, but it was last updated in 2009. So for security updates Apple continued to support PowerPC until 2009.

And your MacBook isn't getting new feature updates, right? My late 2011 iMac isn't getting updated anymore (it's why I bought my 2017 MacBook Pro last year). But this is also why I'm waiting to replace my iMac until they release an ARM iMac/Mac Mini. My kids still use the iMac and it definitely needs updating. The hard drive is on its last legs.

But again, it really depends on you, man. If your 2010 MacBook is doing good and you don't need to get a new one ... wait until at least WWDC 20. If they really are starting this next year, they will tell their developers this year, probably at WWDC, just like they did at WWDC 05. And just as Jobs did in 2005, they will tell us their timeline. It was 2006-2007 back then. I'm pretty sure this will be a longer transition this time.

So you do you man.

No, Leopard received the last security update in 2011, for both Intel and PowerPC. I remember because I had one. My MacBook would get new features if I bothered to install DosDude's Catalina patch on it (which worked better than High Sierra, and even had Night Shift etc., which Apple disabled because my Mac is "too old"). Apple should hire @dosdude1 - he knows how to add features to older Macs.

Of course. I'm waiting for the new MacBook Pro. I've heard too many negative things about newer Macs, which is why I haven't upgraded. I just don't want my new Mac to become obsolete after 3 years, though.

Thank you for your advice.
 

I absolutely love my 2017 MacBook Pro, although the thing gets really hot and the keyboard took a day to get used to. But the 2019 16-inch MacBook Pro pretty much fixed every issue (the overheating and the keyboard), so hopefully they keep that design going forward.

And yeah, I get it. Apple computers aren't cheap, and you definitely don't want to invest in one only for it to be obsolete next year.
 
PowerPC is still a powerful processor, and it is now open source, allowing anyone to make processors with PowerPC as a reference design. I don't think IBM is doing anything with it anymore, but they were developing it long after Apple stopped using them.

I have read a lot of this thread, and there do seem to be a lot of people saying ARM isn't good enough, and also stating that ARM Macs will be the end of the Mac for professionals (for a myriad of reasons, but I did pick up some implied belief that developers wouldn't be able to put some pro-level apps on the platform). In fact, one such post is quoted below. So yes, it is implied here over and over again that a Mac Pro with ARM could never be as good as an x86 one.

IBM has continued making Power series CPUs. Much more powerful than anything on the x86 front.
 
I was under the impression that it was re-programmable? I could be wrong, but if it only does ProRes at the moment, I thought that was purely a software issue - isn't it essentially just a big FPGA that can be re-purposed?

Huge difference between reprogrammable by Apple and reprogrammable by a random "Joe" developer. The Afterburner is not presented to end users to use. Programmers make a standard Apple A/V API call. Inside Apple's library, if the Afterburner is present, that work is handed to the card. If it isn't present, the same call is handled by software. The card itself is opaque to the applications.

The reprogrammability means Apple could add some other path, for some other existing API call, to redirect work to the card. For example, it can currently decode up to 4 8K RAW streams. So what if Apple reconfigured the transistor connections so that there was a 1-2 stream decoder and a 1-2 stream encoder on the card? Reconfigure it, and perhaps Apple's Compressor sends a simple job (no added render overlay) to the Afterburner card to transcode to another format - it goes into the card in one format and comes out in another Apple/open format.

Similarly, if you had two Afterburner cards installed - one doing decode and the other doing encode - you could do some quick transcodes that way.

There are a number of things Apple could do with just the existing, or a slightly expanded, A/V API that could be shuffled off to the different "fixed function" engines on the Afterburner card.

One of the primary issues is that Apple's libraries are probably going to rely on knowing what the computational state of the Afterburner card is. To have the fastest, cleanest path through the standard library call, you're not going to want to do:

1. Query what the FPGA state is. Is it loaded with the Afterburner configuration that matches this library call? No? Is there some other process on macOS using the card? Yes ... then do what? Wait, or use the software version? Nobody using it now? Are they completely done, or paused, or timesliced out in an in-between state? Request exclusive access to the FPGA card? Still too long? Punt to the software version? Etc.

versus radically simpler code:

2. Is the card present? Yes - use it. Otherwise, don't use it.

Once you make it a dynamically configured, shared resource, you open up a whole can of worms. The simple approach (sketched below) is way easier to deploy in a highly bug-free rollout. A focused mission and fewer bugs probably make it worth more to the target audience it is aimed at.
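A hypothetical sketch of that second, simple dispatch - none of these function names are real Apple APIs, and the stubs just stand in for the hardware and software ProRes paths:

```c
/* Hypothetical sketch of "is the card present? use it, otherwise use software".
 * No name below is a real Apple API; the stubs only stand in for the two paths. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { int width, height; } Frame;

static bool accelerator_present(void) { return false; }  /* stub: pretend no card installed */
static int decode_hw(const void *bits, Frame *out) { (void)bits; (void)out; return 0; }
static int decode_sw(const void *bits, Frame *out) { (void)bits; (void)out; return 0; }

/* The application just asks for a decode; the library picks the path.
 * No FPGA-state queries, no sharing, no exclusive-access negotiation. */
static int decode_prores(const void *bits, Frame *out) {
    if (accelerator_present())
        return decode_hw(bits, out);  /* the card does one job, so there is nothing to arbitrate */
    return decode_sw(bits, out);      /* same result from the software path, just slower */
}

int main(void) {
    Frame f;
    const char bitstream[] = "...";   /* placeholder input */
    printf("decode returned %d\n", decode_prores(bitstream, &f));
    return 0;
}
```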


Over time, ProRes rolled out new formats. The first non-RAW ProRes format wasn't the last; ProRes 4444 came later than the initial basic one. This ProRes RAW will probably get updates too - chroma/log changes or new ideas, substantively different data loads (a crazy 16K sensor with its scaling issues), etc. Three to four years from now, at a minimum, there will probably be another ProRes RAW addition to the list of formats. That is why it is an FPGA. If it were an ASIC, they'd have to tell folks to buy a whole other card.

It costs more to fab an ASIC in very low volumes. The low unit count is why you use an FPGA (more folks with different needs can buy the same part and lower costs through collective volume buying). Some folks buy FPGAs to debug their ASIC designs before they fab them. Just because it is an FPGA doesn't mean that a single user has to put it to such divergent uses.
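A back-of-the-envelope way to see that volume argument - every cost below is an invented placeholder, purely to show the shape of the trade-off:

```c
/* Illustrative FPGA-vs-ASIC cost comparison. All numbers are made up. */
#include <stdio.h>

int main(void) {
    double asic_nre  = 5e6;    /* assumed one-time mask/design (NRE) cost for an ASIC */
    double asic_unit = 50.0;   /* assumed per-chip cost once in volume */
    double fpga_unit = 900.0;  /* assumed per-part cost of a large FPGA */

    for (long units = 1000; units <= 100000; units *= 10) {
        double asic_total = asic_nre + asic_unit * (double)units;
        double fpga_total = fpga_unit * (double)units;
        printf("%7ld units: ASIC $%.0f vs FPGA $%.0f -> %s is cheaper\n",
               units, asic_total, fpga_total,
               asic_total < fpga_total ? "ASIC" : "FPGA");
    }
    return 0;
}
```

With these made-up numbers the FPGA wins at a few thousand units and the ASIC only wins once volumes get well past the break-even point, which is the low-volume argument above.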


I suspect there will be an "Afterburner 2" if the unit-sales numbers turn out decent over 2-3 years. A PCIe 4 interface and more programmable transistor connections would allow a future card to do more in the same space, in a future system with more bandwidth. The current Afterburner card will probably still be useful, though.



And for the Mac Pro system in general, if there were some huge, deep-seated need for direct programmer access to a substantial FPGA, then some 3rd party could sell an FPGA card (same as is done with Linux/Windows boxes). macOS doesn't particularly prohibit it; it just requires some driver work to be done.
 
What I find ironic about all the doubt about ARM is that when the ARM architecture was originally developed, the very first version was a couple of times faster than basically everything else in the microcomputer market.

It was a freaking powerhouse. If people hadn't been so wedded to x86 back in the day due to cheap clone PCs running DOS, things could have been a lot different.

I remember lusting over one of these as an Amiga owner.

 
Well, that is also one reason, but Apple isn't (historically) the type of company to have two types of processors in their computers. And if they switch to ARM for some of their machines, tell developers to change over their software, and then stop partway, that would be even more trouble.

But I don't think Apple will fail. I think the reason they haven't pulled the trigger is that they have been in the background preparing and designing "Mac Chips" and making sure they can do this. Just like they had been compiling OS X for Intel ever since OS X was released - the "Just in case" team, as Jobs called it.

I think that since the 68k situation, Apple have maintained several architectures for their software internally, irrespective of what they release. Apple do not want to be held back by a particular hardware vendor in terms of performance. Maintaining cross-architecture compatibility uncovers bugs in your code anyway; it's a good thing to maintain.

OS X, for example, had been running on Intel internally since the beginning. Apple already have ARM-based development tools to support iOS. You can be 100% certain that they have been running macOS internally on ARM/iPad-style hardware since before the iPhone/iPad was even released. iOS started as a fork/branch of macOS, after all, and there's a lot of similarity there already.

This is just a matter of Apple pulling the trigger when they decide the time is right, based on:
  • how their Apple CPU performs vs. what is available from Intel. Based on how well the iPads perform, I'd say they have the ability to scale up and get "good enough" performance out of their own platform for everything but the (i)Mac Pro right now.
  • progress of any x86 emulator/translator software they have developed internally. There will be a transition period; last time they had Rosetta, and this time around there will be some sort of similar or alternative strategy to enable the transition.
  • what Intel have in the pipe. If Intel are giving Apple good products then they don't need to move. But the writing is on the wall: Intel have been struggling for 5 years, and for the past 2-3 years AMD have been accelerating past them. Apple can either choose to go to AMD (and then maybe have AMD fall behind at any moment) or become masters of their own destiny with their own platform. This time around (unlike the PPC / 68k days) Apple is flush with cash - so much that they literally don't know what to do with it. Setting up control of their own CPUs for the Mac would be a sound strategy to give themselves a performance/integration/potential security advantage that people simply can't get from either a PC or a Hackintosh.
You can get a powerful gaming PC for the price of an Air, and it'll probably be supported for ages.

Sure. But if you want a portable lightweight Mac then why would you do that?

It's like buying a truck instead of a bicycle. Literally two entirely different products with different use cases. One doesn't replace the other (I have both for example).

Also... if you cast your mind back to the switch from PPC to x86, the original, initial x86 Macs were... not amazing. Performance was on par with the machines they replaced, and within a couple of years Apple went from 32-bit to 64-bit. The 32-bit machines were dropped relatively soon. I'm really not sure why they didn't just wait for 64-bit Intel processors in the first place, but I guess they were kinda screwed for mobile parts, as the G5 was a dumpster fire in terms of power consumption and laptops were their biggest segment. Hands tied, I guess.

The point, though, is that you can probably expect huge changes in the second wave of ARM-based machines that they release. Even if ARM is coming, unless you're keen on buying a short-term machine, the first wave of ARM is probably something you want to avoid.

Much like the original iPhone, original iPad, original MacBook Air, etc. - all the rev 1.0 products had their (significant) flaws and were better seen as tech previews than as the finished version of what came later.
 