I'm shocked Apple hasn't gone to AMD by now. Intel is showing no innovation. Stagnation is the word du jour.
Transitioning to AMD vs. transitioning to ARM isn't even in the same ballpark. Hackintoshes have run on AMD with very few modifications that Apple could probably have an engineer make in a day. The transition would be seamless for users.
Prepare for AMD powered Hackintoshes.
This is already a thing. Has been for about a year.
Wouldn't it be better to go with ARM chips instead?
Nothing is ever “seamless” in these types of transitions, although Apple has “THE” best record at this, so I have to give them their due. However, if Apple does move to AMD, then it means Intel is gone after a period of transition, and it also means that Apple is NOT transitioning to A-Series for at least another decade. Apple does not invest in multiple parallel paths, especially not on a product line that only accounted for 8% of total sales in 2019.
Does ECC memory really matter that much? I mean, for a desktop user?
Would there be any issues with Thunderbolt licensing?
Already a thing and powerful as all get-out...
No TB3, no AMD. (TB3 is a proprietary INTEL protocol, right?)
I wasn't aware of that.
I just built a new 3900X Windows PC, and it totally flies compared to anything Intel offers in even a remotely close price range. It's a shame that AMD's GPU drivers are complete garbage and my 5700XT crashes all the time.
Yes. Apple's silicon team has been absolutely KILLING IT recently with the H1, A13, A12X, et cetera. Their processors are way faster than a lot of Intel CPUs, plus they have integrated Neural Engines and GPUs.

Apple ARM-based CPUs are not faster than Intel CPUs because they have better cores. They don't. With each cycle, any 4 GHz x86 CPU will do 4-5 times more work than any ARM-based CPU. There is a reason why ARM is a RISC (REDUCED Instruction Set...) architecture.
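As an aside for readers, the RISC/CISC distinction that post leans on is easiest to see in compiler output. Below is a minimal sketch in C; the instruction sequences in the comments are typical of what GCC/Clang emit for x86-64 and AArch64, the exact registers are illustrative, and none of this says anything about the per-clock throughput of any specific chip.

/* The same C statement, seen from the CISC and RISC sides.
 * x86-64 allows a memory operand, so one instruction can load,
 * add, and store; AArch64 is a load/store architecture, so the
 * same work is three simpler instructions that are easier to
 * decode and pipeline. */

void bump(int *counter, int delta) {
    *counter += delta;
    /* x86-64 (typical):  add dword ptr [rdi], esi
     * AArch64 (typical): ldr w8, [x0]
     *                    add w8, w8, w1
     *                    str w8, [x0]                */
}

int main(void) {
    int hits = 40;
    bump(&hits, 2);
    return hits == 42 ? 0 : 1;   /* exit code 0 on success */
}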
That's not what the article is saying. They're not replacing GPUs with AMD, the references are for combined CPU and GPU chips from AMD. And if Apple is developing their own ARM chips for the Mac, they would definitely be all-in-one CPU/GPU/Neural Engine/etc. chips, like the A13.
Changing from one x86 vendor to another x86 vendor has nothing to do with innovation... It’s essentially an operations issue, with a little marketing sprinkled in.
It looks like Apple really was just waiting on AMD to deliver the goods on laptop chips before finally sourcing x86 chips from AMD as well.
Going to AMD would be a total redesign of the motherboard. It would also require a significant amount of software testing. This is not something they would do for BTO only.
The amount of fragmentation that is in the pipeline for Apple's products...
Because AMD processors haven't been competitive from a performance perspective until now. With Zen 2 and the upcoming Zen 3, now is the right time to do it.
If high end means starting at $2k and having user-replaceable components, shut up and take my money! If high end means $6k (like the new Mac Pro) or the $1k monitor stand, I have no choice but to look elsewhere.
Intel's fabs are stalled for the next 18 months. Apple isn't negotiating when Zen 3 will be on 5nm and Intel is stuck at 14nm.
Apple most certainly can maintain both. Their resources dwarf the rest of the industry. Apple would provide an upgrade path for first-generation Mac Pro customers.
What is so brilliant about VLIW?

You can't do VLIW on CISC. ARM is a RISC architecture. In VLIW, the compiler realigns instructions so they can be loaded into the execution units as a very large block. Each slot in the block corresponds to a specific execution unit. The first 12 could be general instructions, the next 15 could be integer, while another 18 could be floating point... Using VLIW, it is not out of bounds to expect 64 instructions to be completed per clock, with a much higher clock than you could get with x86.
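To make the slot-per-execution-unit idea concrete, here is a toy sketch in C of how a VLIW bundle is organized. The three-slot layout and the op names are invented for illustration and do not correspond to any real VLIW ISA (Itanium, for instance, used 128-bit bundles of three instructions plus a template field); the point is only that the compiler, not the hardware scheduler, decides what issues together each cycle.

/* A toy model of the VLIW idea described above (names are made up,
 * not any real ISA): the compiler packs one operation per
 * execution-unit slot into a single wide bundle, and the core
 * issues every slot in the same cycle. */

#include <stdio.h>

enum op { NOP, ADD_INT, MUL_FP, LOAD, STORE };

/* One "very long instruction word": each slot is tied to one unit. */
struct bundle {
    enum op int_unit;    /* integer ALU slot     */
    enum op fp_unit;     /* floating-point slot  */
    enum op mem_unit;    /* load/store slot      */
};

static void issue(const struct bundle *b) {
    /* All three slots dispatch in the same cycle; empty slots are
     * NOPs the compiler inserted when it found nothing to schedule. */
    printf("cycle: int=%d fp=%d mem=%d\n",
           b->int_unit, b->fp_unit, b->mem_unit);
}

int main(void) {
    struct bundle program[] = {
        { ADD_INT, MUL_FP, LOAD  },   /* fully packed bundle */
        { ADD_INT, NOP,    STORE },   /* fp slot left empty  */
    };
    for (unsigned i = 0; i < 2; i++)
        issue(&program[i]);
    return 0;
}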
The first step of software testing is putting AMD-specific code into the OS build. That is what they did. Going from Intel to AMD is not a small motherboard change; it is a total redesign.
Have you seen the beasts AMD is cranking out?
Only if you consider your data to be important.
Not really, the soft error rate is negligible on desktops with frequent reboots (once every few weeks vs once a year on servers) and small amounts of RAM (16 GB vs 512+).
Modern ECC is for availability. That is, it is designed to be tolerant of complete chip and bus failures (a.k.a. chipkill), which are caused by things like pins going bad or solder joints cracking under chips. IBM realized this was a significant failure mode decades ago.
Doing this has costs. You have to use low-density (ideally x4) DRAM, you pay for extra power in refresh and on the bus, and you pay in performance (lockstep mode).
So the question is whether you want to pay for all of this to keep your desktop computer running, or just pop out the bad DIMM and reboot.
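For anyone wondering what ECC actually does at the bit level, here is a minimal SECDED-style sketch in C: a Hamming(7,4) code plus an overall parity bit, which corrects any single flipped bit and flags double flips as uncorrectable. It is only a toy for the correct-one/detect-two idea; real DIMMs protect 64-bit words with 8 check bits, and the chipkill-class codes mentioned above spread symbols across x4 chips, which this does not model. (__builtin_parity assumes GCC or Clang.)

#include <stdio.h>
#include <stdint.h>

/* Encode 4 data bits into 8: positions 1..7 hold Hamming(7,4),
 * bit 0 holds an overall parity bit so double errors are detectable. */
static uint8_t encode(uint8_t data) {
    uint8_t d1 = (data >> 0) & 1, d2 = (data >> 1) & 1;
    uint8_t d3 = (data >> 2) & 1, d4 = (data >> 3) & 1;
    uint8_t p1 = d1 ^ d2 ^ d4;        /* covers positions 1,3,5,7 */
    uint8_t p2 = d1 ^ d3 ^ d4;        /* covers positions 2,3,6,7 */
    uint8_t p3 = d2 ^ d3 ^ d4;        /* covers positions 4,5,6,7 */
    uint8_t word = (p1 << 1) | (p2 << 2) | (d1 << 3) |
                   (p3 << 4) | (d2 << 5) | (d3 << 6) | (d4 << 7);
    return word | (uint8_t)__builtin_parity(word);  /* parity in bit 0 */
}

/* Returns the corrected data nibble; flags uncorrectable double errors. */
static uint8_t decode(uint8_t word, int *uncorrectable) {
    uint8_t s1 = ((word >> 1) ^ (word >> 3) ^ (word >> 5) ^ (word >> 7)) & 1;
    uint8_t s2 = ((word >> 2) ^ (word >> 3) ^ (word >> 6) ^ (word >> 7)) & 1;
    uint8_t s3 = ((word >> 4) ^ (word >> 5) ^ (word >> 6) ^ (word >> 7)) & 1;
    uint8_t syndrome = s1 | (s2 << 1) | (s3 << 2);   /* position of flip */
    uint8_t overall  = __builtin_parity(word) & 1;   /* 1 if odd #flips  */

    *uncorrectable = (syndrome != 0 && overall == 0); /* two bits flipped */
    if (syndrome != 0 && overall != 0)
        word ^= (uint8_t)(1u << syndrome);            /* fix single error */

    return ((word >> 3) & 1) | (((word >> 5) & 1) << 1) |
           (((word >> 6) & 1) << 2) | (((word >> 7) & 1) << 3);
}

int main(void) {
    uint8_t stored = encode(0xB);      /* store the nibble 1011       */
    stored ^= 1u << 5;                 /* a cosmic ray flips one bit  */
    int bad = 0;
    printf("recovered 0x%X, uncorrectable=%d\n",
           (unsigned)decode(stored, &bad), bad);
    return 0;
}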
Nope, long story short, Thunderbolt 3 is no longer proprietary to Intel. You can buy an AMD machine with Thunderbolt 3 today.

My last year's AMD PC has it.