APUs might be good for a cheaper iMac or a MacBook Air

Or a Mac mini. Their higher end CPUs also make very enticing Intel alternatives for the rest of the iMac range these days.

My iMac really needs replacement (it's a maxed-out 2013 model with 32 GB), but Apple has been behind the curve the last couple of years... I've been waiting for 802.11ax, HDR, 10 nm Intel and for the storage situation to improve (non-ridiculously priced SSDs). If we get AMD APUs in the Mac mini with Wi-Fi 6, getting one with a nice 30-32" screen might be a very good alternative.

Of course, while I'm dreaming - if only Apple and Nvidia could make peace, we could get the GPUs too.
 
I was still surprised the new Mac Pro went with Intel. For the sort of market segment it is aimed at, AMD's Threadripper chips crush Intel.
Zen 3 must be really competitive if Apple is working to put it in their products.

Timing is everything! AMD was not ready to show off the newer generations of Ryzen or Threadripper. I'm sure they did a dog-and-pony show with Apple, but Apple needed a firm product, which AMD just didn't have at the time (3 years ago!).

Today it's clear AMD has jumped over Intel! Their gamble on the chiplet design and on TSMC's 7 nm process node is what made the difference!
 
So why haven't they moved to ARM years ago ...
Maybe it's because x86 is better for high performance than ARM?

Money, experience, and focus. But Apple has billions to spend on R&D, more than Intel and AMD combined. So the only question is where they will focus. Intel is currently the only big enough concern left experienced in HPC. So x86 will be a dead-end for HPC if or when Intel runs out of money before its competition (be that AMD, Apple, or some Chinese consortium). The ISA just wasn't designed for that kind of headroom. And RISC-V is the dark horse.
 
You can't do VLIW on CISC. ARM is a RISC architecture.

Really? Are you unaware that x86 chips from Pentium Pro onwards are all RISC underneath, with the microcode layer translating x86 to the underlying RISC?


The first step of software testing is putting AMD specific code into the OS build. That is what they did. Going from Intel to AMD is not a small MB change. It is a total redesign.
You are mixing up a bunch of different things. Linux, FreeBSD, OpenBSD, Windows, even Solaris x86 all run well on Intel and AMD CPUs. What makes you think Apple is so dumb that it would be that challenging for them?

And if there's any special crap that needs to be done, it is only in the kernel and the compiler. All the other applications don't need to be changed - that's the point of x86 compatibility.
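The "only in the kernel" point can be illustrated with a toy sketch (the quirk names below are invented, and this is obviously not real kernel code): vendor-specific handling lives in one low-level table keyed on the CPUID vendor string, while everything running above it is identical on Intel and AMD.

```python
# Toy sketch: vendor-specific handling confined to one low-level
# dispatch table; application code above it never changes.
# Quirk names and driver names are invented for illustration.

KERNEL_QUIRKS = {
    "GenuineIntel": {"errata_workaround": False, "power_driver": "intel_pstate"},
    "AuthenticAMD": {"errata_workaround": True,  "power_driver": "amd_pstate"},
}

def kernel_boot(vendor_string):
    """Pick vendor quirks once at boot; everything above is unchanged."""
    return KERNEL_QUIRKS[vendor_string]

def application_code(x, y):
    """Userland: plain arithmetic, identical on either vendor."""
    return x * y + 1

# The application gets the same result regardless of which vendor
# the kernel booted on - that is the point of x86 compatibility.
for vendor in KERNEL_QUIRKS:
    kernel_boot(vendor)
    assert application_code(6, 7) == 43
```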
 
Interesting. I depend on BootCamp, so I hope that whatever Apple does allows me to continue using Windows natively. (Anyone here remember the horrible SoftPC?)

Ah. Nostalgia ain't what it used to be. I remember SoftWindows...

I also remember running PC Emulator on an ARM 3 back when the 'A' in 'ARM' stood for Acorn.
Spoiler: it wasn't great - and that was in the days when ARM was substantially more powerful than a contemporary x86. Solution - first, plug in an expansion card with, effectively, a 'headless' x86 PC sans video/sound card on it. Later, the RISC PC had two processor slots - one for the ARM 6, one for a 386 or 486 (on a little circuit board with just one other custom chip).

Emulation has got better since then (e.g. Rosetta) and tends to use code translation rather than strict emulation but, yeah - that is the biggest problem* with any hypothetical switch to ARM - losing Windows virtualisation/bootcamp capability. Question is, does Apple care about supporting that? (YMMV but, personally, it is something I'm finding less and less useful, especially now Internet Explorer is losing relevance).
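The emulation-vs-translation distinction can be sketched with a toy stack machine (the guest "instructions" here are invented, and this is not how Rosetta works internally): an emulator decodes every guest instruction on every run, while a translator converts the program once into host operations and then runs those directly.

```python
# Toy sketch of emulation vs translation. The guest instruction set
# (a tiny stack machine) is invented for illustration.

GUEST_PROGRAM = [("push", 2), ("push", 3), ("add", None),
                 ("push", 4), ("mul", None)]

def emulate(program):
    """Interpreter: decode and dispatch every instruction on every run."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        elif op == "mul":
            stack.append(stack.pop() * stack.pop())
    return stack[-1]

def translate(program):
    """One-off translation: build a list of host operations (closures),
    so repeated runs skip the decode step entirely."""
    host_ops = []
    for op, arg in program:
        if op == "push":
            host_ops.append(lambda s, a=arg: s.append(a))
        elif op == "add":
            host_ops.append(lambda s: s.append(s.pop() + s.pop()))
        elif op == "mul":
            host_ops.append(lambda s: s.append(s.pop() * s.pop()))

    def run():
        stack = []
        for host_op in host_ops:
            host_op(stack)
        return stack[-1]
    return run

# Both routes compute (2 + 3) * 4, but the translated version only
# paid the decode cost once.
translated = translate(GUEST_PROGRAM)
assert emulate(GUEST_PROGRAM) == translated() == 20
```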

* given that getting all the software patched and re-compiled is something that Apple have already managed 4 times (68k->PPC, MacOS9->MacOSX, PPC->x86, x86-32->x86-64 - and anything that's survived all that should be fairly easy to switch to ARM)...
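For what it's worth, the "fat binary" mechanism Apple used in those past transitions can be sketched as a container holding one code slice per architecture, with the loader picking the slice that matches the host CPU (the slice bytes and container format below are invented for illustration):

```python
# Toy model of a fat (universal) binary: one file carries a code slice
# per architecture; the loader picks the slice matching the host CPU.
# The byte strings and the fallback policy are invented for illustration.

FAT_BINARY = {
    "x86_64": b"\x55\x48\x89\xe5",  # stand-in for x86-64 machine code
    "arm64":  b"\xfd\x7b\xbf\xa9",  # stand-in for AArch64 machine code
}

def select_slice(fat_binary, host_arch, translator_available=False):
    """Mimic a loader: prefer a native slice, optionally fall back to
    translating the x86-64 slice (Rosetta-style) if the host allows it."""
    if host_arch in fat_binary:
        return host_arch, fat_binary[host_arch]
    if translator_available and "x86_64" in fat_binary:
        return "x86_64 (translated)", fat_binary["x86_64"]
    raise RuntimeError(f"no runnable slice for {host_arch}")

# A hypothetical ARM Mac runs the native slice...
arch, code = select_slice(FAT_BINARY, "arm64")
assert arch == "arm64"
```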
 
Intel... AMD... all I want is an updated MacBook Air without a butterfly keyboard.

But given that AMD's integrated GPUs are way better than Intel's, I'd love to see an AMD APU in the new MacBook Air. A better webcam (at least as good as the one in the pre-retina Airs) would be nice, too.
 
But if your chipmakers are in-house, they don't need to raise their prices in order to survive ... see Apple's A-chips.

That's a different situation. There were two major players in both the desktop CPU (Intel/AMD) and GPU (AMD/Nvidia) markets, but not in ARM SoCs (Qualcomm is a jerk and MediaTek isn't at the same level). Apple had to make their own SoC because they simply had no cheaper option, so much so that they were even willing to acquire Intel's struggling 5G modem unit to avoid expensive modem fees in the future. See how Intel reacted on pricing when AMD finally caught up after 10 years? Why make more enemies just for something that won't save you much? (Buy AMD and you then have to compete with Intel and Nvidia.) By contrast, A-series SoCs don't compete directly with Snapdragon, because Apple doesn't sell them to others.

I do believe Apple could have done it better than anyone else with the capital they have, even on their own, like their crazy A-series line. But why invest so much money in a tiny and shrinking market when you have multiple options? Apple is better at squeezing its suppliers than at anything else, and they have investors to answer to.
 
All this hardware talk is fun...

But Apple has a hot garbage software quality problem that is holding back every single product line right now.

That’s where the majority of effort needs to be focused for them.
Intel... AMD... all I want is an updated MacBook Air without a butterfly keyboard.

So true..

I was at Best Buy the other day, and even a couple shopping there, who honestly seemed like just "normal consumers", knew to avoid the butterfly keyboards (they asked the salesman when the new keyboards would come to the smaller machines).

There must be a lot of stranded existing inventory out there, as more and more people know not to touch those things with a 10-foot pole...
 
Exactly why I think the Mac Pro is a failed release. And this year is only going to get worse for Intel, especially as far as laptops are concerned.
AMD 3990x crushing a Mac Pro

The problem is the 3990X's lack of RAM support (no ECC, nothing over 256 GB), and that's a no-go for many. The Mac Pro needs EPYC.
 
There is no way in hell AMD can scratch Intel in the laptop market anytime soon. AMD saw an opening on the desktop and took a swing at it only because Intel went all-in on laptop chipsets. I can see some AMD in iMacs and some low-end laptops, but that's about it. Intel is too far ahead in the game, and now that they have been battle-tested in the desktop market they will just grip the laptop market even tighter.
 
It's actually more likely that the AMD chips are there for compatibility reasons (so that the Mac is not completely and abruptly cut off from the PC world) while ARM chips take on more and more of a prominent role in the new generation of Macs. Replacing Intel with AMD doesn't offer that much of a performance boost and doesn't fit the trajectory of ARM vis-à-vis the Mac that we have seen for the past five years.

And judging from the gaming scene in the Mac community, there likely won't be a "gaming" Mac. Probably just a desktop Mac with higher specs or a new iMac model that's in between the current iMac and iMac Pro.
 
what the benefit of adding a gaming PC...
For people who want a high-powered, modular and easily upgradable Mac desktop with discrete graphics. I don't want an iMac, I'm not dealing with a Mac mini with an external GPU, and the new Mac Pro isn't realistic either (it's priced for business use only).
 
I expect this is for one particular audience: Intel.
Not likely. Large multinational companies have to plan FAR in advance. This isn’t a signal to Intel now, that’s not how negotiation is done. The negotiation was last year when Apple signed the contract for how many chips they’d need this year. It led Intel to believe that Apple’s going to transition this year.
Apple is rethinking that after the ARM32 -> ARM64 transition on iOS, as well as the dropping of 32-bit apps in Catalina. There's been a *lot* of pushback due to some people not being able to update. Not unwilling, but unable to.
There’s not been a lot of pushback. There’s been a tiny amount of pushback from a tiny number of users, certainly not enough for Apple to rethink anything.
"ARM transition period" can happen in the form of Co-Processors.
Not usually how Apple does things, though. Additionally, the ONLY way to get your developers on board is to show them that there’s no other way. If you release a product that still does the old thing perfectly well, but also does the new thing, developers are going to keep doing the old thing because that’s loads easier.
Isn't this what sank the Itanic?
I always thought that the fact that there was an option provided that yielded a far less impactful transition to 64bit. If Intel was the ONLY player, folks would have had to learn how to do things in the new way and we’d all be in a better 64-bit place today.
Do they care enough about the Mac and the PC industry as a whole to commit to the transition away from Intel or offer both Intel and AMD, which seems incredibly unlikely. To me, that is more important than the speculation of Apple moving to A-Series CPUs...
A good question, and I think the first part feeds into the second part. I personally feel they’re following Steve Jobs “milk the Macintosh for all it’s worth”. If macOS is important, they’ll go to ARM as that gives them long term control. If macOS is less important, then you just transition to a cheaper version of what you already have (AMD) so you can make more of a profit as sales drop.
And render their computers useless for anything meaningful?

Yeah, doubtful. ARM is not and never will be ready to replace the CPUs in the MacBook Pro, iMac or Mac Pro. Apple just made the Mac Pro for the filmmaking industry. And they would switch to ARM and achieve what? 4-5 times worse performance than a measly 28-core Intel CPU? It won't happen. ARM in any Mac apart from a Chromebook competitor is a pipe dream for the foreseeable future.
The vast majority of folks equate “meaningful” with “checking facebook”. The vast majority of folks have FAR more buying power than every Pro in the world, so I could see Apple concentrating on what’s “meaningful” to that much larger group over the smaller group.

And really, all a future Mac has to do is run macOS and run Mac apps (that have been compiled specifically for ARM). It doesn’t have to EVER approach the raw performance of Intel as long as, side by side, FCP (or Logic Pro) workflows on the ARM system work as well or better.
I hope this ends this ridiculous debate about ARM vs x86.
Of course not! :) Because no one has performed the same tests using Apple's ARM processors. Apple's are already several generations beyond anything that anyone else, ARM and Qualcomm included, is able to produce, so inferring Apple's performance from the performance of more poorly designed solutions is a mistake.
Apple does not use ARM's designs. They don't even use ARM's microarchitecture. They implement the whole thing themselves.
And, with the current iPhone processors, they even have unpublished operations allowed by the latest ARM licensing agreement. They could design a CPU with commands specially geared towards FCP and Logic Pro workflows, for example.
 
So why haven't they moved to ARM years ago, if they can do such things as you say? And why are they testing AMD APUs for a transition to x86 AMD-based Macs instead of ARM, if everything you say is so simple?

...because switching from x86 to ARM would still require all x86 macOS applications to be replaced with ARM versions. That's not as unthinkable as some people seem to think - but it is still a big deal, and some things (like Boot Camp) would probably be lost. Apple couldn't completely switch to ARM without a transition period of a few years, probably starting with a 12" MacBook or MacBook Air replacement which might even be viable with the existing A13 chips... It's not that a Mac Pro-class ARM is impossible, just that (a) Apple would have to design it and (b) those big creative pro apps and their armies of plug-ins are the most likely refuge of legacy x86-specific code.

AMD, on the other hand, is designed to run Intel x86 binaries without modification, and most tweaks could probably be confined to the OS and a very few applications with Intel-specific optimisations, so the switch would be trivial compared to changing to a totally different instruction set. Apple could replace their entire range with AMD systems (APUs for the 13" MacBooks and Minis, Ryzen/Threadripper for the 16" MBPs, iMac and the missing entry-level Mac Pro, EPYC for the 24/28-core Mac Pro...) overnight if they wanted to, or keep a mix of AMD and Intel without needing two versions of macOS or 'fat binary' apps.

There are different advantages, too: ARM offers more bang-per-watt and the opportunity for Apple to make their own SoCs customised for each model, while AMD offers more bang-per-buck and higher core counts in the midrange. Apple could even pursue both, with AMD as a short-term solution and ARM as a longer-term one.

Why now? Intel's much-publicised production and new-process problems, the increasing importance of power consumption (both for economic/green reasons and to pack lots of cores onto chips for modern, increasingly multithreaded workloads), and Apple's increasing experience in rolling their own A-series systems-on-a-chip. Also, each passing year means more and more applications are getting rid of lingering x86-specific code (and the 32-bit switch-off will have culled a lot of abandonware). 5 years ago, not having native MS Office on launch day would have been suicide - today, with the rise of online/cloud options, it would be merely 'courageous'.

Also - USB4 has only recently emerged, which is a prerequisite for AMD, ARM or Apple to be able to integrate a TB3-compatible interface into their chips.

But hang on a second - we're probably all getting steamed up because (a) some bod at Apple just pasted some AMD graphics driver code into macOS and didn't bother cleaning up all the junk and (b) someone at Intel who can't keep a secret saw an internal memo saying "I just read on MacRumors that Apple may be switching to ARM chips, should we put that as a risk in the 'due diligence' section of our next financial report?".

Apple - given their position and the resources they have - would be stark staring bonkers not to be looking into both ARM and AMD as future contingency plans, if only to strengthen their position in the next round of bargaining with Intel. What stage these developments might be at is anybody's guess (but the Hackintosh community already have MacOS running on AMD).

What's annoying here is people suggesting that the sky would fall if Apple used AMD and/or that using the ARM64 instruction set in anything bigger than a tablet would violate the laws of physics.
 
Or a Mac mini. Their higher end CPUs also make very enticing Intel alternatives for the rest of the iMac range these days.

My iMac really needs replacement (it's a maxed-out 2013 model with 32 GB), but Apple has been behind the curve the last couple of years... I've been waiting for 802.11ax, HDR, 10 nm Intel and for the storage situation to improve (non-ridiculously priced SSDs). If we get AMD APUs in the Mac mini with Wi-Fi 6, getting one with a nice 30-32" screen might be a very good alternative.

Of course, while I'm dreaming - if only Apple and Nvidia could make peace, we could get the GPUs too.
If you’re waiting for 10nm, the problem isn’t that Apple’s behind the curve.
 
The vast majority of folks equate “meaningful” with “checking facebook”. The vast majority of folks have FAR more buying power than every Pro in the world, so I could see Apple concentrating on what’s “meaningful” to that much larger group over the smaller group.
For these folks, the modern smartphone will fit all of their needs.
 
I always thought that the fact that there was an option provided that yielded a far less impactful transition to 64bit. If Intel was the ONLY player, folks would have had to learn how to do things in the new way and we’d all be in a better 64-bit place today.

Nah. As @RalfTheDog said, Itanic wasn't even conventional RISC; it was an exotic VLIW/explicitly-parallel design that relied on non-existent compiler technology to produce properly optimised code. Or, to put it another way, it was pants. Even if it had lived, it would probably never have been viable for anything smaller than a server or workstation.

Oh, and if Intel had ever been the only player we'd probably be running some benighted descendant of the Pentium 3 that needed liquid-nitrogen cooling, used 32-bit addressing, and relied on some nightmare segmented-memory kludge to access more than 4 GB of RAM.

What Intel forgot with the Itanic (and AMD remembered when they developed x86-64) was that the x86's unique selling point has always been backward compatibility - remove that requirement and better architectures have always been available (if not always better implementations, because Intel had the money to throw at R&D). At one stage there were DEC Alpha, MIPS and PowerPC versions of Windows NT, but MS killed support.

The 8086 started out as a 16-bit stopgap between the 8-bit 8080 and Intel's forgotten 'proper' 32-bit processor, at a point when the Zilog Z80 was eating the 8080's lunch. Its USP? Source code compatibility (and a familiar instruction set) with the 8080, while Intel's competitors Zilog and Motorola went with from-scratch 16-bit (Z8000) and 32-bit (68000) instruction sets. So enough 8-bit software got ported to CP/M-86 to tip the balance away from the 68000 when IBM went looking for a processor, and since nobody lost their job for buying IBM, x86/DOS/Windows got an unassailable back catalogue of software which - even when written in C - was x86-specific because of the kludgey 16-bit segmented addressing ('near' and 'far' pointers, anybody?). Yes, later x86 chips gained a clean 32-bit mode, but 16-bit mode is still cluttering up x86 processors to this day.
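For anyone who never suffered it, the 'near'/'far' pointer pain comes straight from the 8086's real-mode address arithmetic, which can be shown in a few lines (a worked example, not production code):

```python
# Worked example of 8086 real-mode addressing: a physical address is
# (segment << 4) + offset, so many segment:offset pairs alias the same
# byte, and a plain 16-bit ('near') pointer can only reach the 64 KB
# window of the current segment; crossing segments needs a 'far'
# segment:offset pair - which is why 16-bit C code was riddled with
# x86-specific pointer qualifiers.

def physical_address(segment, offset):
    """20-bit real-mode address from a 16-bit segment and 16-bit offset."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MB on the 8086

# Two different far pointers aliasing the same physical byte:
assert physical_address(0x1234, 0x0010) == physical_address(0x1235, 0x0000)

# A near pointer is just the 16-bit offset: without touching the
# segment register it can address at most 64 KB.
assert physical_address(0x0000, 0xFFFF) == 0xFFFF
```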

Intel's Core chips probably were the best choice for the Mac in 2005, but only because Wintel had killed the competition.
 
Really? Are you unaware that x86 chips from Pentium Pro onwards are all RISC underneath, with the microcode layer translating x86 to the underlying RISC?

Do you know what VLIW is? It means to do all the instruction scheduling at compile time. If you have "microcode" (actually the front-end) translating instructions and then having to deal with the scheduling of those micro-ops, you defeat the entire purpose of VLIW.

So yes, he is completely correct. You cannot do VLIW on a CISC architecture.

As has been pointed out, the translation of x86 into the underlying RISC-like architecture is not free. Intel's front ends today are about twice as large as the execution units. Intel throws engineering and fab money at the x86 tax; AMD historically tried and only recently figured it out. Everybody else went to RISC or GPUs.
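The conflict between a cracking front end and VLIW can be sketched in a few lines (a toy model with invented instruction names, not Intel's actual microcode): the number of micro-ops per instruction is only known at decode time, but a VLIW compiler needs the whole schedule fixed at compile time.

```python
# Toy front end: each 'CISC' instruction cracks into a variable number
# of RISC-like micro-ops at run time. The instructions and micro-ops
# are invented for illustration.

CRACK_TABLE = {
    "ADD r1, [mem]": ["load tmp, [mem]", "add r1, tmp"],  # load + ALU op
    "INC r2":        ["add r2, 1"],                       # one micro-op
    "PUSH r3":       ["sub sp, 8", "store [sp], r3"],     # two micro-ops
}

def decode(instruction_stream):
    """Run-time cracking: the micro-op count is only discovered here,
    which is exactly what a VLIW compiler would need to know statically
    in order to pack fixed-width bundles in advance."""
    micro_ops = []
    for insn in instruction_stream:
        micro_ops.extend(CRACK_TABLE[insn])
    return micro_ops

# Three CISC instructions become five micro-ops - a ratio the hardware
# only learns at decode time.
uops = decode(["ADD r1, [mem]", "INC r2", "PUSH r3"])
assert len(uops) == 5
```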
 
Apple should have bought AMD a couple of years ago when they were at $2 per share. Would have given them even more control over the Macs graphics and now (possibly) processors.
They can still buy them. ;)

But I hope not. It is good for competition in the PC market, that there are AMD CPUs.

AMD's x86 license is not transferable. Which means it would be useless for Apple if they wanted to retain x86 compatibility.

Nope, long story short, Thunderbolt 3 is no longer proprietary to Intel. You can buy an AMD machine with Thunderbolt 3 today.

Thunderbolt 3 certification is still Intel-only. And apart from the ASRock motherboard announced literally 24 hours ago, there are no AMD machines or parts using TB3 that don't rely on an additional PCIe add-in card (with an Intel chip).
 