RISC has an inherent advantage when dealing with memory because you can coalesce reads and writes (since only load/store instructions access memory, not every ALU instruction, etc.) and because you have more registers (so you need to read or write to RAM less frequently).
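
A minimal C sketch of the register point (hypothetical; the register counts are architectural fact, but whether spills actually occur depends on the compiler and optimization level):

/* x86-64 exposes 16 general-purpose integer registers; AArch64 exposes 31.
   With enough simultaneously live values, an x86-64 compiler has to spill
   some to the stack (extra memory traffic), while an AArch64 compiler can
   keep them all in registers. The loop below keeps 16 partial sums live
   at once, plus the pointer and index. */
long many_live_sums(const long *v) {
    long s0 = 0, s1 = 0, s2 = 0, s3 = 0, s4 = 0, s5 = 0, s6 = 0, s7 = 0;
    long s8 = 0, s9 = 0, s10 = 0, s11 = 0, s12 = 0, s13 = 0, s14 = 0, s15 = 0;
    for (int i = 0; i < 1024; i += 16) {
        s0  += v[i];      s1  += v[i + 1];  s2  += v[i + 2];  s3  += v[i + 3];
        s4  += v[i + 4];  s5  += v[i + 5];  s6  += v[i + 6];  s7  += v[i + 7];
        s8  += v[i + 8];  s9  += v[i + 9];  s10 += v[i + 10]; s11 += v[i + 11];
        s12 += v[i + 12]; s13 += v[i + 13]; s14 += v[i + 14]; s15 += v[i + 15];
    }
    return s0 + s1 + s2 + s3 + s4 + s5 + s6 + s7
         + s8 + s9 + s10 + s11 + s12 + s13 + s14 + s15;
}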

Can you comment on the challenges / disadvantages of RISC (or more specifically Apple Silicon)?

I get the argument that at any power level they are more efficient given natively compiled code.

What has been putting everyone off going to ARM? The lack of support?

What made Apple decide that now is the time to tackle all that? Is it because ARM's outperformance of x86 is that high? How high is it: 100x, 10x, 2x faster (again, in a world where all code runs natively) when it comes to desktop workloads (maxed-out Intels @ 125W)?
 

I always predicted on here that on the same process node, with the same quality designers, using the same techniques, the ARM chip would have up to a 20% advantage in performance per watt (you can choose whether to make it 20% faster for the same power, or 20% less power for the same speed, or mix and match).

At the moment TSMC also is a huge advantage over Intel. TSMC at 5nm is approximately what Intel would be at 7nm - the design rules are very similar. But TSMC is really making chips at 5nm and Intel is still pretty much at 10nm. So there’s maybe another 20% advantage there for Apple (and for AMD!) at the moment.

Then you take into account that Intel’s designers are not great - the designers at Apple came from AMD, Exponential, DEC, etc., and are among the best. I used to benchmark our designers against designers who worked at tool companies (Synopsys, Cadence, etc.) and found a 20% advantage to our designers. Figure Intel isn’t as bad as them, so call it 10%.

So that gets you to around 50% (compounding: 1.2 × 1.2 × 1.1 ≈ 1.58), which is the bottom range of what Kuo is claiming.

So what are the disadvantages? Well, we don’t know what the GPU will be like. The iPad GPUs are not bad, but clearly aren’t desktop class. That said, in polygons per watt they seem about on par. So can Apple scale those designs to be competitive with desktop/laptop dedicated graphics?

And what about ports? Will they support thunderbolt? Are they waiting for USB 4? What about PCI?

For certain obscure types of apps, ARM would be worse (code that modifies itself, for example, though that is not commonly used anymore). If you rely heavily on Intel floating point, ARM floating point is different (though, to be fair, ARM sticks to standard 64-bit IEEE arithmetic, while Intel’s legacy x87 keeps intermediates in 80-bit extended precision. It’s not that Intel is better - it’s just different, but some people may need to keep the identical behavior to compare new calculations to historical ones).
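
To make that concrete, here is a tiny hypothetical C example of the sort of divergence involved (behavior depends on compiler flags; classic x87 code, e.g. 32-bit x86 built without SSE math, keeps intermediates in 80-bit registers):

#include <stdio.h>

int main(void) {
    volatile double big = 1e308;
    /* 1e308 * 10 overflows a 64-bit IEEE double, so ARM (or x86 using SSE)
       computes inf / 10 = inf. Classic x87 can hold the intermediate in an
       80-bit register, avoid the overflow, and print 1e308 instead. */
    double r = big * 10.0 / 10.0;
    printf("%g\n", r);
    return 0;
}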

There are certainly a few instructions here and there where x86 may have an advantage over ARM, depending on the particular app. But overall ARM is a solid step upward.
 
I think the Mac gaming community is about to get turned on its head, to be honest. Apple is clearly building some great graphics going forward.

Obviously that will do nothing for our existing games (especially on Windows).

Highly doubtful. Apple hasn't been a gaming platform contender since the Apple II days, so it's been irrelevant for nearly four decades. Gaming development studios have finite resources, so there's no room for fourth place after Windows, consoles and Linux. It's debatable if there's even room for third place, but Linux is making headway, with performance on par with, if not faster than, Windows when running Windows games under the Wine compatibility layer.
 
In CISC you would:

ADD [memory A], [memory B] -> [memory C]

In RISC you would:

LOAD [memory A], R1
LOAD [memory B], R2
ADD R1, R2 -> R3
STORE R3, [memory C]

Even in this tiny example, the STORE can be postponed and other stuff can execute, as long as R3 isn’t needed for anything. Only when R3 is needed do you have to do the store - at that point perhaps you store multiple things at once, to minimize the penalty.
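
In C terms, a rough sketch of that scheduling freedom (illustrative only; the postponement happens in the out-of-order hardware, not in the source):

void add_then_work(const int *a, const int *b, int *c, int *x) {
    *c = *a + *b;   /* this STORE can be postponed in the store buffer... */
    *x = *x * 7;    /* ...while this independent work executes; buffered
                       stores can later be drained together (coalesced)
                       to minimize the penalty */
}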

And this also simplifies the hardware design tremendously. For x86 you may try to fix this by adding secret registers to hold the results of memory A and memory B and memory C. But then you need to keep track of what those registers hold (with tags), and have complex logic to decide when to deal with them vs when to just do a load/store.

You also, in any modern x86, essentially try to convert the first example to the second example. But the hardware to do that takes space and multiple pipeline stages. It’s simple in this example, but much harder for some of the goofier x86 things - for example, x86 lets you do:

ADD [memory A offset by the contents of register A], [memory B] -> [memory C]

Now you have to add just to figure out what memory address you are adding!
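
For comparison, a hypothetical C sketch of how that difference surfaces in compiled code (register names are approximate compiler output; note that real x86 allows at most one memory operand per instruction, but the indexed-addressing point stands):

/* x86-64 can fold the address arithmetic and the load into one ALU
   instruction:   add  eax, dword ptr [rdi + rsi*4]
   ARM64 must load first, then add:
                  ldr  w8, [x0, x1, lsl #2]
                  add  w0, w2, w8                                   */
int add_indexed(const int *a, long i, int b) {
    return b + a[i];
}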
I assume you are referring to RISC here, am I right?
 
Well you get your wish! macOS Big Sur has basically gone over to the iOS/iPadOS UX paradigm completely. They all look pretty much exactly the same now! ...

Just remember that UX is about a LOT more than look. The question is how differently they operate in terms of workflow and efficiency with keyboard/mouse vs. touch, along with workspace differences, multitasking, etc.

IMO, the look is positive (in the little I've seen). It is coming back from the stupid flat design, adding in some color and depth (which are actually important to UIs).

Work out the GPU shader texel rate and TOPS per watt and they are quite close. The main disadvantage is that they (currently?) share the power budget and heat sink with all the application CPUs on the same silicon die.

Interesting. That would be cool, but I thought there'd be a lot more complexity to that... but I'm way out of my depth.

... They specifically state "MacOS is going to continue working exactly how it has in the past", implying no locked up, walled garden people keep catastrophising about.

Well, even as they say that, macOS continues to incrementally change how it operates compared to the past.

... They are going to use a variant of the new A14, and they want to show their true performance cards on release of that chip with that machine and no earlier.

Yeah, and hopefully a DESKTOP variant, not just taking the iPad A14 and putting it in a mini case, etc.! It should be exciting.

I'm more and more convinced Apple will alienate pro users, but keep general consumers who like continuity etc. with the transition to ARM-based Macs. ...

Yes, this is the tricky part. It all comes down to software and target market. Lots of very advanced things don't get picked up because of the market. The question is whether it is so advanced that the general pro market (i.e., software devs and pro users) sits up and takes note. Or, it could just become a much bigger platform for the average user as an extension of the iPhone trend. BUT... as you noted, those users don't care that much about performance anyway, so even if Apple is 5x faster, does that matter to non-pros?

Apple has said they will transition the entire Mac lineup to custom silicon within two years. That includes Mac Pro. ...

But, does that mean FULLY transition, or just that they'll have a full range of Apple Silicon models? They could certainly still sell the Mac Pro in Intel format for pro users who need the x86 compatibility.

... Chip design is very complicated and just because someone created one kind of chips doesn't equate into superiority in other fields. ...

True, and I'm no expert on the matter. But, Apple has a TON of R&D $ and a lot of experts. Is this similar to the other phone makers saying Apple wasn't going to just walk in and make a better phone?

I'm just concerned that all Intel Macs will only receive OS updates until 2022. Which basically makes my 2019 iMac obsolete within 3 years...

Why? Is there any reason Apple would drop support for Intel based macs in 2022? Just because they'll (supposedly) have a full-line of Apple Silicon Macs by then doesn't even mean they won't still sell Intel-based Macs, let alone drop all software support for them!!!

... What I do not understand is what specifically the person I replied to, who made the point of identifying themselves as "a full stack web developer" needs a Windows x86 VM for.

I see someone already responded with .NET. But, I was kind of thinking the same thing. Most web dev work doesn't care about platform much at all besides certain tools.

The obsolescence for macOS users will not strictly be in terms of the processor architecture, but in battery consumption in laptops. The new Samsung Galaxy Book S, built on a Snapdragon ARM chip and running Windows 10 on ARM, lasts 18 hours.

Well, I'm not as much in the laptop market anymore, but does that really matter? If my laptop lasts 10+ hours (as they currently do), I plug it in overnight while I sleep. It mattered a LOT when it was 2-3 hours, but now it seems it becomes almost a spec-war for 99% of people.

The converse is also true... if your needs are so specialized, why do you attach to Macs in the first place? You would be better served by a Windows laptop, or a desktop if you don't mind being tied down to a desk. ...

It is mainly people who have to (or want to) work in both environments. For example, I spend a lot of time on Mac, but have to launch Revit or other apps. It is *really* handy to just launch Revit in Parallels and not have to reboot my Mac or switch to another machine.

If/when the time comes that I have to run Revit all day long, then yeah, I may as well be on a PC box, aside from liking some of the Mac peripherals better, or the overall machine quality.

Back to what I said earlier, it is all about the software (mostly). People need certain apps and workflows and the hardware kind of follows. It has been a big advantage for the Mac for decades now, to be able to run x86-world apps. The big question is whether the Mac market is now mature enough to ditch that (and take the consequences).

... I don't think a $75bn organisation is "saddled" with anything. You could equally say Apple are "saddled" with zero experience of making high-end desktop and professionally oriented CPUs. ...

They are 'saddled' to x86. And that experience is a lot easier to obtain, especially if you've already built the basis for it by blowing everyone else away in other related markets.

... ARM has been around for decades. If it is so superior to x86 then where has it been for the past 30 years? ...

Software compatibility.

... In other words, the Mac may be the reason that Windows on ARM actually happens, just as the iMac was the reason that USB finally happened. ...

Well, we can hope. But, I think the performance advantage would have to be substantial and that margin maintained for some time before you'd see a shift in the market (devs getting on board, and finally mature industries following). I think we're talking a decade or two from now.
 
... So that gets you to around 50% (compounding: 1.2 × 1.2 × 1.1 ≈ 1.58), which is the bottom range of what Kuo is claiming. ...

And, if that's the case, I doubt that would be enough to take the entire mass of legacy developers and industries using those apps to sit up and take note. It has to be bigger than that, IMO.
 
I really do not understand how Apple can jettison Windows for x86 from its new ARM-based Mac platform. This will seriously reduce their sales in the enterprise, where Windows is still king. They will have to come up with a solution for running Windows for x86 on the ARM architecture one way or another.
Maybe their own built-in hypervisor will handle the x86 emulation for Windows.

Windows on ARM is dead. It has no applications, not even MS Office.

This was done on stage. Nice way to obfuscate, though. I’m talking about the x86 games emulated under Rosetta 2 in their secret lab. Nothing was mentioned about the silicon they ran that emulation on, other than that it was Apple silicon. I’m 99% sure it was the A14X or whatever they are planning for the ARM Macs at the end of 2020.


Enterprise sales are iPads & iPhones - not macs. I suspect that Mac enterprise sales aren't even a rounding error for Apple.

The goal isn't to compete with Windows - they lost, decisively.

This is about locking down the platform and those sweet, sweet profit margins.
 
Not sure why running native x86 benchmarks matters.

The general point of benchmarks is to check for compatibility and performance. In an ideal world, benchmarks represent real applications, but I agree many benchmarks don’t matter.

Not sure why one would run these under Windows on ARM rather than the native macOS versions?

Cinebench for me personally is an important example, since my company uses the engine extensively. Of course we’ll want to use the native version, I just hope it’ll exist.
 
I always predicted on here that on the same process node, with the same quality designers, using the same techniques, the ARM chip would have up to a 20% advantage in performance per watt (you can choose whether to make it 20% faster for the same power, or 20% less power for the same speed, or mix and match).

At the moment TSMC also is a huge advantage over Intel. TSMC at 5nm is approximately what Intel would be at 7nm - the design rules are very similar. But TSMC is really making chips at 5nm and Intel is still pretty much at 10nm. So there’s maybe another 20% advantage there for Apple (and for AMD!) at the moment.

Then you take into account that Intel’s designers are not great - the designers at Apple came from AMD, Exponential, DEC, etc., and are among the best. I used to benchmark our designers against designers who worked at tool companies (Synopsys, Cadence, etc.) and found a 20% advantage to our designers. Figure Intel isn’t as bad as them, so call it 10%.

So that gets you to around 50% (compounding: 1.2 × 1.2 × 1.1 ≈ 1.58), which is the bottom range of what Kuo is claiming.

So what are the disadvantages? Well, we don’t know what the GPU will be like. The iPad GPUs are not bad, but clearly aren’t desktop class. That said, in polygons per watt they seem about on par. So can Apple scale those designs to be competitive with desktop/laptop dedicated graphics?

And what about ports? Will they support thunderbolt? Are they waiting for USB 4? What about PCI?

For certain obscure types of apps, ARM would be worse (code that modifies itself, for example, though that is not commonly used anymore). If you rely heavily on Intel floating point, ARM floating point is different (though, to be fair, ARM sticks to standard 64-bit IEEE arithmetic, while Intel’s legacy x87 keeps intermediates in 80-bit extended precision. It’s not that Intel is better - it’s just different, but some people may need to keep the identical behavior to compare new calculations to historical ones).

There are certainly a few instructions here and there where x86 may have an advantage over ARM, depending on the particular app. But overall ARM is a solid step upward.

Thanks for a very informative reply. TBH I was expecting a bigger step up; that said, a 30% increase in battery life for a MacBook Air over a generation would be pretty damn solid (the ~50% advantage minus ~20% for less efficient execution). Especially because that’s where Apple can limit the efficiency loss, as the apps are rather simple and already running on ARM (Pages, Excel, Safari).

I guess the iPad Pro being on par with an Xbox S or MacBook Pros was just very selective statistics on some benchmark / aspect of the platforms, then.

I am assuming scaling won’t be a problem, at least in the beginning. How about the need for dedicated hardware to handle certain tasks? The Neural Engine or Afterburner are examples of that. Will we move towards more proprietary dedicated hardware?
 
Highly doubtful. Apple hasn't been a gaming platform contender since the Apple II days, so it's been irrelevant for nearly four decades. Gaming development studios have finite resources, so there's no room for fourth place after Windows, consoles and Linux. It's debatable if there's even room for third place, but Linux is making headway, with performance on par with, if not faster than, Windows when running Windows games under the Wine compatibility layer.
It’s very clear from the wording used at things like the SOTU that Apple is making a big push into gaming.

My best guess, we’re going to see an AppleTV using Apple silicon that is every bit on par with the next gen consoles in 3ish years. They will tout that you’ll have the ability to run any apple ecosystem game on any device.

Their lead chip designer took special care to note that their graphics pipeline is what he’s MOST excited about. Given how exciting all the rest of Apple’s chip developments and roadmap already are, I have to think there’s a reason their GPU enhancements are what he explicitly emphasized.

They’ve clearly been taking a long strategic approach to this as things like Metal were clearly designed for the capabilities of Apple’s own GPU’s which is why they started migrating development over years ahead of any other parts.

That’s just my tea leaves reading on the matter.
 
I'm still in the return window for my $3k (including AppleCare/tax) 2019 16-inch MBP. What to do, what to do... I absolutely love this computer and have no other Mac to fall back on....
 
It’s very clear from the wording used at things like the SOTU that Apple is making a big push into gaming.

My best guess, we’re going to see an AppleTV using Apple silicon that is every bit on par with the next gen consoles in 3ish years. They will tout that you’ll have the ability to run any apple ecosystem game on any device.

Their lead chip designer took special care to note that their graphics pipeline is what he’s MOST excited about. Given how exciting all the rest of Apple’s chip developments and roadmap already are, I have to think there’s a reason their GPU enhancements are what he explicitly emphasized.

They’ve clearly been taking a long strategic approach to this as things like Metal were clearly designed for the capabilities of Apple’s own GPU’s which is why they started migrating development over years ahead of any other parts.

That’s just my tea leaves reading on the matter.

Candy Crush on the desktop.

There aren't that many games that use Metal.
 
Candy Crush on the desktop.

There aren't that many games that use Metal.
I understand that, but if you couple pulling in iOS gaming devs with the tools they already know PLUS what are going to be *massive* GPU leaps on the Mac, at some point Apple’s going to have some capabilities that will be awfully enticing.

I’m taking the long term approach here. I think anyone who thinks Apple isn’t going to be a contender in the gaming market in the coming years is going to be awfully shocked at the state of things in 5 years. Just my hunch from years of reading between the lines of WWDC presentations.
 
I'm still in the return window for my $3k (including AppleCare/tax) 2019 16-inch MBP. What to do, what to do... I absolutely love this computer and have no other Mac to fall back on....
Use the fantastic machine you have. What downside is there to that?
 
I'm worried that it won't be supported in a few years and/or will have little resale value. If Apple bought it back at a fair price from me in an exchange in a couple of years (as they do now with intel MBPs), I'd be fine with that. I guess I'll hold on but it kind of takes the shine off my purchase. I think the move to ARM is pretty cool, but at the end of the day I spent a boatload of cash on this laptop and it may be the last of its kind!
 
Then why switch now? ARM is not natively compatible with today's software.

Because they don't care so much about that any longer. Apple simply doesn't need the pro market that uses apps outside of the ones they have solidly in their camp. If they get them, eventually, that's just icing on the cake.

Up until a decade ago, that was pretty much everything to them. Their survival depended on it.

(Note, I don't like this... I'm just stating the reality of the situation.)

I'm still in the return window for my $3k (including AppleCare/tax) 2019 16-inch MBP. What to do, what to do... I absolutely love this computer and have no other Mac to fall back on....

What's the alternative? Return it and get some thing cheap to get by until fall... and then hope Apple transitions a model powerful enough to keep you happy? That might work, I suppose. But, I'm doubting the initial Apple Silicon model introductions will be at the level of your machine. Maybe in 2021?

I'm worried that it won't be supported in a few years and/or will have little resale value. If Apple bought it back at a fair price from me in an exchange in a couple of years (as they do now with intel MBPs), I'd be fine with that. I guess I'll hold on but it kind of takes the shine off my purchase. I think the move to ARM is pretty cool, but at the end of the day I spent a boatload of cash on this laptop and it may be the last of its kind!

I could be wrong, but I don't think I'd worry about that. Apple isn't just going to end software/OS support for Intel, especially when it is likely they'll be dependent on Intel in the high end for years yet.

I think ARM will be cool too... in several years, when I might be considering a new machine. I have a 2018 mini with eGPU and fully expect to still be using it several years from now.

IMO, it's a non-issue unless you have some specific reason you really, really want to be on (Apple Silicon/ARM) right away.
 
I tend to agree. I somehow doubt they will have an ARM-powered 16-inch MBP to rival the higher-end model I have within the next 18 months. But that's just a guess. I think people, including myself, may be underestimating just how intensely Apple is focused on making this shift happen.
 
I tend to agree. I somehow doubt they will have an ARM-powered 16-inch MBP to rival the higher-end model I have within the next 18 months. But that's just a guess. I think people, including myself, may be underestimating just how intensely Apple is focused on making this shift happen.

Agreed, and it already seems to be going faster than anticipated pre-announcement.

But, I think the fears of end-of-support are flatly wrong. Even if Apple Silicon Macs are 2x faster (which is highly unlikely) and the lineup fully transitions in 2 years (also unlikely, IMO), it isn't as if our Intel Macs suddenly become paperweights.

Unless you're in some business where speed directly equals $$$, you shouldn't worry about it. If you are, then you probably shouldn't worry about it either, as you'll just retire that 16" and happily spend the $ on buying the new model.
 
$3,000 is by far the most I've ever spent on a computer. It's a real investment, and I have no problem paying it (well, my bank account does, but anyway) because I know I'll be way more productive with this unit. But I had planned at least for it to hold its value for a few years. If it's only worth $1,000 or less next year, I will not be pleased. If the transition does happen faster than some think, I want to be able to upgrade to an ARM MBP 16 without shelling out another $3,000. I'm not sure why people don't mention Apple's buyback program more? It seems like a possible stopgap here for people with brand new Intel products to upgrade in a year or two. Apple is worth $1.5T, so they aren't really worried about me, but it would be a good gesture if they guaranteed to help with upgrades in that fashion. Then I could enjoy my Mac a hell of a lot more!
 
I understand that, but if you couple pulling in iOS gaming devs with the tools they already know PLUS what are going to be *massive* GPU leaps on the Mac, at some point Apple’s going to have some capabilities that will be awfully enticing.

I’m taking the long term approach here. I think anyone who thinks Apple isn’t going to be a contender in the gaming market in the coming years is going to be awfully shocked at the state of things in 5 years. Just my hunch from years of reading between the lines of WWDC presentations.

As I said - Candy Crush, not real games. Apple will be in contention with the mobile game markets (with all of the micro-transaction nastiness that will come with it). Real gaming - not so much.

What "massive" GPU leaps? We are talking Apple - the company that hasn't shipped a current GPU with it's hardware in over a decade. JHC, they are shipping 4 year old Polaris cards with their "flagship" desktop.

You are building a very large house on a very slender reed. I would really like to see how those SoC graphics drive a large 4K panel - I am thinking slide show, as opposed to playable frame rates.

The upcoming consoles will be 8-core/16-thread systems with 2080 Ti-level video performance. That will be the baseline for game development. I don't see that happening on a jumped-up iPad processor.

I won't even get into desktop productivity - I have a 16-core/32-thread system for a reason.
 
Then why switch now? ARM is not natively compatible with today's software.

It is in the nature of a switch that not all software will be available on day one. On the other hand, when would be the time to switch to a superior architecture?
 
It is in the nature of a switch that not all software will be available on day one. On the other hand, when would be the time to switch to a superior architecture?

Once that architecture has been proven superior for your use case.

All we have been shown is a jumped-up iPad processor, and Timmy's promises of great things coming. Problem is that I have seen this show before.
 
$3,000 is by far the most I've ever spent on a computer. It's a real investment, and I have no problem paying it (well, my bank account does, but anyway) because I know I'll be way more productive with this unit. But I had planned at least for it to hold its value for a few years. If it's only worth $1,000 or less next year, I will not be pleased. If the transition does happen faster than some think, I want to be able to upgrade to an ARM MBP 16 without shelling out another $3,000. I'm not sure why people don't mention Apple's buyback program more? It seems like a possible stopgap here for people with brand new Intel products to upgrade in a year or two. Apple is worth $1.5T, so they aren't really worried about me, but it would be a good gesture if they guaranteed to help with upgrades in that fashion. Then I could enjoy my Mac a hell of a lot more!

I hear you.... similar here. I suppose I spent a bit more back in earlier Mac days, as Macs have been fairly cheap over the last couple decades compared to previous decades. But, my mini setup is also intended to last quite some time. I think the reality is that things are just moving more quickly these days in terms of obsolescence, just based on software/OS/security issues... but that is across all computing industries/devices.

I'm a bit more concerned about the future of eGPU to be honest, but unless there are incredible leaps, I can just buy a used Blackmagic Pro or a non-quiet box I can put any GPU in, and greatly extend the life of my system. I think my 6-core i7 is going to be competitive, or at least adequate for me, for quite some time. And, if I suddenly get richer, then I won't have to worry about it.

Does that mean I won't be drooling over an Apple Silicon machine in a couple years? Probably not. But, will I really need one? Or, will my current machine not be adequate? I doubt it.

And, like I said, I really doubt Apple will cut off support. The more likely problem would be if we start seeing apps I really want that take advantage of Apple Silicon and are too compelling to pass up.
 
Once that architecture has been proven superior for your use case.

All we have been shown is a jumped-up iPad processor, and Timmy's promises of great things coming. Problem is that I have seen this show before.

Well, and even then... if I had a dollar for every time a proven superior technology was mostly ignored by a professional industry...
 
It is in the nature of a switch that not all software will be available on day one. On the other hand, when would be the time to switch to a superior architecture?
That's not what he said. He said "Software compatibility." That's it, no qualifications. As for this being a "superior" architecture, I repeat my question: Why now? Why not when they made the switch to PPC?
 
... What "massive" GPU leaps? We are talking Apple - the company that hasn't shipped a current GPU with it's hardware in over a decade. JHC, they are shipping 4 year old Polaris cards with their "flagship" desktop.

...

The upcoming consoles will be 8-core/16-thread systems with 2080 Ti-level video performance. That will be the baseline for game development. I don't see that happening on a jumped-up iPad processor.

I won't even get into desktop productivity - I have a 16-core/32-thread system for a reason.

As others have said... if they take the tech and are able to apply it to desktop designs. This is out of my wheelhouse, for sure, but some seem to think they can do it. It isn't about a jumped up iPad chip.

Why do you not think they could make a 20-core/40-thread system, for example? Maybe there is some reason, but I don't think AMD or Intel have some secret Apple can't have.

Isn't part of the problem that they have either not had the resources to do their own thing (early days) or been held back by the advancement of others? When they broke those ties on the phone, they blew everyone away. I guess the *assumption* is they might be able to do that on the desktop, too. Maybe, maybe not, but I haven't really heard good reasons why they won't be able to.
 
That's not what he said. He said "Software compatibility." That's it, no qualifications. As for this being a "superior" architecture, I repeat my question: Why now? Why not when they made the switch to PPC?

Because in those days, Apple was the underdog, with a brighter future in the potential of compatibility, and being dependent on others to make chips that would fuel their laptops, etc.

Things are MUCH different today.

Today, Apple *could* just move forward, leaving all the legacy behind. We'd (a small group of us) hate that, but they could do it.
 
All we have been shown is a jumped-up iPad processor, and Timmy's promises of great things coming. Problem is that I have seen this show before.
Exactly! This is PPC versus x86 all over again. The difference this time, at least as I write this, is that we haven't seen any supporting data. People keep touting ARM as the best thing to hit the processor world. I have no doubt these processors will be competitive and, in some instances, faster than x64. Just like PPC was back in the day. But to listen to people tell it, Intel is doomed. They may as well liquidate their processor business and focus on something else.
 