Transitioning to AMD vs transitioning to ARM isn’t even in the same ballpark. Hackintoshes have run on AMD with very few modifications that Apple could probably have an engineer do in a day. The transition would be seamless for users.

Yep - Apple could flip over to AMD with an unbelievably low amount of work needed on their part..
Prepare for AMD powered Hackintoshes.

Already a thing and powerful as all get-out...
 
Transitioning to AMD vs transitioning to ARM isn’t even in the same ballpark. Hackintoshes have run on AMD with very few modifications that Apple could probably have an engineer do in a day. The transition would be seamless for users.
Nothing is ever “seamless” in these types of transitions, although Apple has “THE” best record at this, so I have to give them their due. However, if Apple does move to AMD, then it means Intel is gone after a period of transition and it also means that Apple is NOT transitioning to A-Series for at least another decade. Apple does not invest in multiple parallel paths, especially not on a product line that only accounted for 8% of total sales in 2019.
 
Already a thing and powerful as all get-out...
This is already a thing. Has been for about a year.

I wasn't aware of that.

I just built a new 3900X Windows PC, and it totally flies compared to anything Intel offers in even a remotely close price range. It's a shame that AMD's GPU drivers are complete garbage and my 5700XT crashes all the time.
 
I wasn't aware of that.

I just built a new 3900X Windows PC, and it totally flies compared to anything Intel offers in even a remotely close price range. It's a shame that AMD's GPU drivers are complete garbage and my 5700XT crashes all the time.

I picked up an AMD Asus laptop that I have been very happy with. They are certainly back in the game.
 
Yes. Apple's silicon team has been absolutely KILLING IT recently with the H1, A13, A12X, et cetera. Their processors are way faster than a lot of Intel CPUs, plus they have integrated Neural Engines and GPUs.


That's not what the article is saying. They're not replacing GPUs with AMD, the references are for combined CPU and GPU chips from AMD. And if Apple is developing their own ARM chips for the Mac, they would definitely be all-in-one CPU/GPU/Neural Engine/etc. chips, like the A13.
Apple's ARM-based CPUs are not faster than Intel CPUs because they have better cores. They don't. With each cycle, any 4 GHz x86 CPU will do 4-5 times more work than any ARM-based CPU. There is a reason why ARM is a RISC (Reduced Instruction Set) architecture.

The reason the iPad processes video faster than Intel CPUs is its integrated image processor, not the cores themselves. The cores are, and always will be, slower than any x86.

Guys, if you are not educated enough on the topic of CPU architectures, don't hype products up, because if Apple switches to ARM on anything other than basic services-access computers (competitors to Chromebooks, to put it plainly), you will hurt other customers' workflows.

ARM is not ready, and will never be ready, to replace x86 for anything high-performance. There is a very good reason why there does not exist a single supercomputer based on ARM; all of them are based on x86. Apple is not going to change this.
 
I'm shocked Apple hasn't gone to AMD by now. Intel is showing no innovation. Stagnation is the word du jour.
Changing from one x86 vendor to another x86 vendor has nothing to do with innovation... It’s essentially an operations issue, with a little marketing sprinkled in.
 
I wasn't aware of that.

I just built a new 3900X Windows PC, and it totally flies compared to anything Intel offers in even a remotely close price range. It's a shame that AMD's GPU drivers are complete garbage and my 5700XT crashes all the time.

I have to admit that I find it VERY interesting that we have gone from AMD Hackintoshes being a no-go and not recommended about 18 months ago to it seemingly being a 1-2-3 easy peasy lemon-squeezy operation. I get that there are a lot of talented people in the Hackintosh community, but AMD used to be NOT a thing.

Yes, apparently AMD’s Navi drivers are a hot mess on Windows and macOS. If this is the way Apple is going, the GPU drivers are the actual holdup, not the other possible AMD hardware pieces.
 
It looks like Apple really was just waiting for AMD to deliver the goods on laptop chips to finally begin sourcing x86 chips from AMD as well.

Indeed. Laptops make up around 80% of all Mac sales, so having a strong APU package would be beneficial for the MacBook Pro, MacBook Air and Mac mini families (with the MacBook Pro 16" also having a dGPU in addition to the APU).


Going to AMD would be a total redesign of the motherboard. It would also require a significant amount of software testing. This is not something they would do for BTO only.

As noted, Apple is constantly tweaking their motherboards for their current products as Intel and AMD change their CPU packages and other ancillary chips. So doing it for AMD would not be an issue. And AMD runs the same x64 code as Intel so there would be little to no recompiling needed for macOS and none for third-party applications running on top of macOS.


The amount of fragmentation that is in the pipeline for Apple's products...

This would actually de-fragment the Mac product line as everything would be from AMD instead of a mix of Intel and AMD.


No TB3, no AMD. (TB3 is a proprietary Intel protocol, right?)

Apple was a co-creator of Thunderbolt with Intel. The TB3 protocol has now been made part of the USB 4.0 standard so non-Intel Thunderbolt controllers will be coming online.
 
I'm shocked Apple hasn't gone to AMD by now. Intel is showing no innovation. Stagnation is the word du jour.
Because AMD processors haven't been competitive from a performance perspective until now. With Zen 2 and the upcoming Zen 3, now is the right time to do it.
 
If high end means starting at $2k and having user-replaceable components, shut up and take my money! If high end means $6k (like the new Mac Pro) or the $1k monitor stand, I have no choice but to look elsewhere.

If we are talking 'high end gaming' then it would need to be closer to $2k; you are not going to switch people from a gaming PC otherwise.
 
I think at this point everybody should get back to Mark Gurman's report that Apple will start phasing Intel CPUs out of its computers in 2020.

Finding AMD APU/CPU references in the macOS Catalina beta, which will be ready in time for the March event, should show you that the writing is on the wall quite clearly.
 
Intel's fabs are stalled for the next 18 months. It's not even a negotiation for Apple when Zen 3 will be at 5nm and Intel is stuck at 14nm.

Apple most certainly can maintain both. Their resources dwarf the rest of the industry. Apple would provide an upgrade path for first-generation Mac Pro customers.

Intel’s fab issues aside, the $64,000 question is WHY would Apple maintain both in a shrinking PC industry? Why would they move from Intel to AMD given that the Mac sits at 8% of its sales? Intel is not doing great, but they aren’t so bad that Apple couldn’t ride it out a while longer.

Apple’s upgrade path for Mac Pro customers is to buy a new Mac Pro; they aren’t going to take your trade-in if they introduce an AMD-based Mac Pro.

IF Apple is willing to maintain both AMD AND Intel product lines in tandem, it means that a quantum shift in Apple’s thinking about the PC industry has occurred. People need to adjust their expectations accordingly, myself included (IF it happens).
 
You can't do VLIW on CISC. ARM is a RISC architecture. In VLIW, the compiler realigns instructions so they can be loaded into the execution units as a very large block. Each spot in the block corresponds to a specific execution unit. The first 12 could be general instructions, the next 15 could be integer, while another 18 could be floating point... Using VLIW, it is not out of bounds to expect 64 instructions to be completed per clock, with a much higher clock than you could get with x86.
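The bundling idea described in the post above can be sketched in a few lines of Python. This is a toy illustration of how a VLIW compiler packs independent instructions into fixed-slot bundles, not any real compiler's scheduler; the instruction names, slot counts, and dependency graph are all invented for the example:

```python
# Toy VLIW bundler: pack independent instructions into fixed-slot bundles.
# Slot counts per bundle are invented for illustration.
SLOTS = {"int": 2, "fp": 2, "mem": 1}

# (name, execution-unit type, names of instructions it depends on)
program = [
    ("a", "int", []),
    ("b", "int", []),
    ("c", "fp",  ["a"]),
    ("d", "mem", []),
    ("e", "fp",  ["b"]),
    ("f", "int", ["c", "e"]),
]

def bundle(program):
    done, bundles, pending = set(), [], list(program)
    while pending:
        free = dict(SLOTS)
        current, deferred = [], []
        for name, unit, deps in pending:
            # An instruction may issue only if its inputs come from earlier
            # bundles and a matching execution-unit slot is still free.
            if all(d in done for d in deps) and free[unit] > 0:
                free[unit] -= 1
                current.append(name)
            else:
                deferred.append((name, unit, deps))
        done.update(current)
        bundles.append(current)
        pending = deferred
    return bundles

print(bundle(program))  # → [['a', 'b', 'd'], ['c', 'e'], ['f']]
```

The point the poster is making corresponds to the first bundle here: three instructions issue in one clock because the compiler, not the hardware, proved them independent ahead of time.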



The first step of software testing is putting AMD-specific code into the OS build. That is what they did. Going from Intel to AMD is not a small motherboard change. It is a total redesign.
What is so brilliant about VLIW?
Itanium was supposed to be a VLIW architecture and look how that floundered. It really didn't scale that well when compared to other CPUs.
Oh, and Intel CPUs are not CISC at the hardware level. They are RISC and use microcode to emulate the x86/64 instruction set. I'm sure that there is a lot of performance to be gained by just removing that layer and letting the raw CPU run free.
I suspect that AMD is much the same.
 
Only if you consider your data to be important.

Not really; the soft error rate is negligible on desktops with frequent reboots (once every few weeks vs. once a year on servers) and small amounts of RAM (16 GB vs. 512+).

Modern ECC is for availability. That is, it is designed to be tolerant to complete chip and bus failures (a.k.a. chipkill), which are caused by things like pins going bad or solder cracking on chips. IBM realized this was a significant failure mode decades ago.

Doing this has costs. You have to use low-density (ideally x4) DRAM, you pay for extra power in refresh and on the bus, and you pay in performance (lockstep mode).

So the question is whether you want to pay for all of this to keep your desktop computer running, or just pop out the bad DIMM and reboot.
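For reference, the single-bit correction that ordinary (non-chipkill) ECC performs can be illustrated with a textbook Hamming(7,4) code in Python. This is a sketch of the coding idea only, not how real DIMM controllers (which use wider SECDED codes over 64/72-bit words) are implemented:

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits; any single
# flipped bit can be located and corrected via the parity "syndrome".

def encode(d):                       # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # Codeword positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):
    c = list(c)
    # Recompute each parity over the positions it covers; the syndrome
    # bits spell out the 1-based index of the flipped bit (0 = no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]  # extract d1..d4

word = [1, 0, 1, 1]
code = encode(word)
code[4] ^= 1                         # simulate a soft error in one cell
assert correct(code) == word         # the error is transparently repaired
```

Chipkill-class ECC extends this idea from single bits to whole DRAM devices, which is exactly why it needs the low-density x4 parts and lockstep channels mentioned above.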
 
But WHY is the PC market shrinking, Zdigital2015?

Intel went on autopilot for a decade. People aren't going to replace their computers for a 3-5% performance increase. That is precisely why the 4,1s & 5,1s lasted as long as they did. Intel stopped innovating.

AMD is showing a 15-20% performance upgrade per generation (which is about 18 months). If you go from a 1st-gen Ryzen to a 3rd-gen Ryzen, you will see a major performance increase (for very little money).

One thing that no one has pointed out is that going with AMD means Apple wouldn't have to completely redesign a motherboard every time AMD releases a new CPU. The AM4 platform goes from 1st Gen Ryzen to 4th Gen Ryzen.

If AMD does the same thing with the AM5 socket (4 years or so), Apple wouldn't have to redesign a motherboard. Simply drop in the next-gen AM5 processor.
Not really; the soft error rate is negligible on desktops with frequent reboots (once every few weeks vs. once a year on servers) and small amounts of RAM (16 GB vs. 512+).

Modern ECC is for availability. That is, it is designed to be tolerant to complete chip and bus failures (a.k.a. chipkill), which are caused by things like pins going bad or solder cracking on chips. IBM realized this was a significant failure mode decades ago.

Doing this has costs. You have to use low-density (ideally x4) DRAM, you pay for extra power in refresh and on the bus, and you pay in performance (lockstep mode).

So the question is whether you want to pay for all of this to keep your desktop computer running, or just pop out the bad DIMM and reboot.

I'd first have to troubleshoot the system to discover the bad RAM, and then I would have to have a replacement DIMM handy.

Saving that time is worth the cost of ECC, but that's just me.
 
This would so be a WWDC moment.
It would be nice for it to be sooner, but you can so see Phil amping this one up during his time on stage.

iMac Pros with Threadripper.
Mac Pros with Epyc.
iMacs with Ryzen 5-9.
Minis with Ryzen 4000 series APU.
 
Nope, long story short, Thunderbolt 3 is no longer proprietary to Intel. You can buy an AMD machine with Thunderbolt 3 today.
My AMD PC from last year has Thunderbolt and PCIe 4. Intel is behind. AMD has been more interesting for about 24 months now, in my opinion. If you're going to game, do it on a PC; you will be happier, and I am a lifelong Mac user.
 