Apple Plans to Ditch Intel and Use Custom Mac Chips Starting in 2020

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Apr 2, 2018.

  1. alien3dx macrumors 6502a

    alien3dx

    Joined:
    Feb 12, 2017
    #1626
    If what you're saying is true, is it possible Apple will change to ARM?
     
  2. CodeJoy, Aug 31, 2018
    Last edited: Aug 31, 2018

    CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1627
    Sorry, this is incorrect from start to finish.

    All processors are Turing complete, and as such they can all run the same type of software, regardless of complexity.

    Simply put, in modern CPUs, the difference between RISC and CISC is the instruction decoder. For x86 this takes up roughly 20% of the die area. That area translates directly into either making the chip smaller and cheaper to manufacture, or into packing more onto the die (cores, caches, ...).

    This is also why RISC processors may appear to "shrink better": they can skip the extra complexity of the instruction decoder. Otherwise, CISC transistors shrink exactly the same as RISC transistors.
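
    To give a concrete feel for this, here's a trivial C function together with the kind of code compilers typically emit for each ISA (transcribed from memory, so treat the exact instructions as illustrative rather than authoritative):

        /* add3.c - the same C source runs on either ISA;
         * what differs is the encoding the decoder has to chew on. */
        long add3(long a, long b, long c)
        {
            return a + b + c;
        }

        /* Typical x86-64 output (variable-length instructions,
         * anywhere from 1 to 15 bytes each, hence the big decoder):
         *     lea rax, [rdi + rsi]
         *     add rax, rdx
         *     ret
         *
         * Typical AArch64 output (every instruction is exactly
         * 4 bytes, so the decoder is comparatively trivial):
         *     add x0, x0, x1
         *     add x0, x0, x2
         *     ret
         */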

    RISC processors were the first superscalar processors, had deeper pipelines, and as such were able to execute more instructions per clock than their CISC counterparts. Over time, CISC processors also got deep pipelines, multiple execution units, and became superscalar. In modern processors it's not clear to me that either technology would fundamentally be capable of higher IPC than the other. There's also no fundamental reason why one should be able to execute software faster than the other.

    For modern RISC architectures that are quite competitive with x86, look at SPARC or POWER, for example. It has been attempted with the ARM architecture as well; that hasn't yet reached the same level of performance, but not because of RISC vs. CISC. I'm sure these powerful RISC chips consume just as much power as their CISC counterparts.

    The one thing you got right was that both RISC and CISC processors are ultimately limited by the laws of physics.

    Edit: While not a chip designer, I do program these chips regularly at a low level, both x86 and ARM. And I guess I technically built a one-instruction CPU for a uni lab once, but it's not clear whether that counts for anything. And I think I have the 20% figure from cmaier in an earlier discussion, any mistakes there would be mine.
     
  3. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #1628
    I believe they will at some point.
     
  4. TheMacDaddy1 macrumors regular

    Joined:
    Aug 17, 2016
    Location:
    Merica!
    #1629
    My explanation was simple and high-level. SPARC and PowerPC were outclassed in their day by cheaper Intel CISC CPUs, and both ran into thermal problems trying to compete with Intel's chips. Where are the RISC workstations today? Why do we not see high-end applications and games running on RISC?

    RISC is great, especially if you completely control the OS and the software, so you can optimize for them. Hence small IoT devices running a stripped-down Linux and a single application can do well. If you need to throw a bunch of different software at it in real time, CISC is better.



    Here is a good explanation of both for the audience.

    https://www.allaboutcircuits.com/news/understanding-the-differences-between-arm-and-x86-cores/

    "For example, many RISC-based machines perform operations between registers, which commonly requires the program to load variables into registers before performing an operation. A CISC-based machine, however, can (or should) be able to perform operations between registers, between a register and a memory location, and even between memory locations. Other common operations include multiplication with floating point numbers, barrel rolls, single instruction loops, complex memory manipulation, memory searches, and much more. "
     
  5. curmudgeonette macrumors 6502

    Joined:
    Jan 28, 2016
    Location:
    California
    #1630
    CISC/RISC have nothing to do with the ability to shrink silicon geometry.

    RISC IPC is likely higher than CISC IPC, all else being equal: with less instruction complexity, it is easier to execute more instructions at once. The flip side is that some operations will take more instructions. On the other hand, higher-level CISC instructions could provide more hints to the scheduler.

    The real question today regarding RISC v CISC is the power draw of the more complex instruction decoding needed by CISC. Does this extra expense return the investment? Also, what if a RISC team can spin their design every year, while the CISC team needs two years?

    The reason it looks like CISC won is that for many years Intel's silicon fabrication technology was a generation ahead of everyone else.
     
  6. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1631
    Not just simple and high level, your explanation was incorrect.

    If I recall correctly, SPARC M8 claimed 2x Intel performance when it launched. I haven't used one in a good while, so I don't know exactly what they were comparing, but whether it's 2x or 0.5x, it's still the same order of magnitude as Intel. It's true that RISC workstations were replaced by Intel counterparts, but that's down to economies of scale rather than anything else: Intel sold more chips, so they could put more into R&D, which paid off, and they got faster quicker than the RISC chips, and here we are. Intel x86 was not particularly impressive back in the day, but nowadays I think it is. The instruction set is still weird, but that's arguably true of SPARC and ARM as well. I did write a... erm... Lisp-to-SPARC compiler back in the day, which is a weird thing to do, but you do get exposed to the instruction set.

    RISC servers are still a thing, though maybe a bit niche these days.
     
  7. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #1632
    TODAY you can buy PowerPC and SPARC CPUs with higher IPC than Intel’s best.

    RISC CPUs do floating point. CISC CPUs have to break up instructions that access memory into separate internal operations that include a load or store. You have no idea what you are talking about.

    x86 has instructions that operate directly on memory: a single instruction can add a register to a number in RAM and put the result back in RAM. Internally, the core breaks that into micro-ops: one to fetch the operand, one to add, and one to put the result back in memory. So it still takes several operations and many cycles. Same as RISC.
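
    To spell that out with a sketch (micro-op details vary from core to core, so take the breakdown as schematic):

        /* accumulate.c - one "powerful" CISC instruction vs. explicit
         * RISC load/store, for the statement *total += x; */
        void accumulate(long *total, long x)
        {
            *total += x; /* read memory, add, write memory */
        }

        /* x86-64 can encode the whole statement as one instruction:
         *     add [rdi], rsi
         * but a modern core cracks it into RISC-like micro-ops:
         *     load  tmp   <- [rdi]
         *     add   tmp   <- tmp + rsi
         *     store [rdi] <- tmp
         * AArch64 simply exposes the same steps as instructions:
         *     ldr x2, [x0]
         *     add x2, x2, x1
         *     str x2, [x0]
         * Either way, the machine performs a load, an add and a store. */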
     
  8. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1633
    To make things more complex and interesting, it's possible to build a chip that supports multiple ISAs, a bit like POWER can switch endianness. So a chip could actually be built that is both RISC and CISC in the same silicon. AMD were presumably working on such a chip a few years back, but I don't know what came of it.

    You wouldn't happen to know more about any of that, would you?
     
  9. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #1634
    We weren’t doing that at AMD. Our chips all used RISC cores, but there was never a plan to allow direct access to them as a separate architecture.

    The original plan at Exponential Technology was to do exactly that.

    What this dude seems to believe is that Intel has more “powerful” instructions that are magically just as fast as reduced instructions. It just doesn’t work that way. If an instruction is three times as powerful, it takes roughly three times as long to run.
     
  10. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1635
    Yeah, that's clearly not the case, though with a naive understanding of processors I can see how one might get that idea. When a single x86 instruction does a multiply, an add and a move in one go, it might appear to do more work than 2-3 separate instructions. But if that was ever a good mental model for how processors work, it certainly isn't anymore.
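
    A concrete case is fused multiply-add (the cycle counts in the comments are ballpark figures, not from any datasheet):

        /* lerp.c - "one instruction" does not mean "one cycle". */
        #include <math.h>

        double lerp(double a, double b, double t)
        {
            /* fma(x, y, z) computes x*y + z with a single rounding.
             * It may compile to one fused multiply-add instruction
             * (x86 vfmadd*, AArch64 fmadd), yet that one instruction
             * still occupies an FP pipeline for several cycles; it is
             * one operation in flight, not free extra work.  Tellingly,
             * both the CISC and the RISC ISA offer it. */
            return fma(b - a, t, a); /* a + (b - a) * t */
        }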

    RAM also doesn't work the way people generally think (don't know about dude). Neither do caches. Or probably just about anything in a modern computer :)

    And neither does macOS. It's weird when people say macOS and iOS will eventually merge, when they have been merged from the very start. At the core it's the same OS, and this is why it doesn't take any leaks at all to know that macOS has been running on ARM for at least half a decade, probably more. Making ARM Macs is about performance and compatibility, and most of all about whether it makes financial sense (which it probably does).

    The press reported Skybridge as dead in May 2015, though maybe that was just the socket-compatibility part. It doesn't matter: multi-ISA is a cool idea in theory, but the fact that nobody does it is evidence that it doesn't make sense in practice.
     
  11. curmudgeonette macrumors 6502

    Joined:
    Jan 28, 2016
    Location:
    California
    #1636
    PPC "died" because the AIM alliance failed. Each of the three members needed to take PPC in a different direction. Apple had neither the money nor the expertise to continue developing PPC for the desktop market. The architecture is still alive as IBM's POWER and (formerly) Motorola's embedded chips. The lack of development funds also meant that Apple was forever one DRAM generation behind, all the way from the PC133 era through to PC2-4200.
     
  12. arkitect macrumors 603

    arkitect

    Joined:
    Sep 5, 2005
    Location:
    Bath, United Kingdom
    #1637
    Well it will be interesting if it does happen.

    A great opportunity for smaller and faster moving software companies.

    The behemoths like Adobe might find it similar to the meteor that did for the dinosaurs.

    The software world is in many ways very different today compared to 2005.

    If Apple do lose their collective heads and make the desktop OS useless for actual production of content… then the exodus to Windows will be massive.
     
  13. curmudgeonette macrumors 6502

    Joined:
    Jan 28, 2016
    Location:
    California
    #1638
    That's actually a lousy explanation.

    An assembly-language programmer might be thrilled to have this type of CISC instruction. However, it is not all that useful: there could be hundreds of lines of code between instances of this instruction class. That's because you usually want to do something else with that sum right away, and you'd like it to be in a CPU register. Therefore you'll usually code almost as if you were on a RISC processor (see the sketch below).

    So: this instruction adds a lot of complexity to a CPU while providing very little usefulness.
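
    A quick sketch of what I mean (hypothetical source, but this is the pattern compilers actually produce):

        /* clamp_sum.c - the sum is wanted in a register immediately,
         * so even on x86 the natural code is RISC-style. */
        long clamp_sum(long a, long b, long limit)
        {
            long s = a + b;               /* sum lives in a register */
            return s > limit ? limit : s; /* ...and is used at once  */
        }

        /* Typical x86-64 output, with no memory-destination add in sight:
         *     add    rdi, rsi    ; s = a + b, register to register
         *     mov    rax, rdx    ; assume the limit is the answer
         *     cmp    rdi, rdx
         *     cmovle rax, rdi    ; keep s when s <= limit
         *     ret
         */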
     
  14. Howard2k macrumors 68000

    Howard2k

    Joined:
    Mar 10, 2016
    #1639
    I know we're probably drifting way off course here, but you're talking about Darwin, right?
    So in practice, how easy or difficult is cross-compatibility, performance constraints aside?
     
  15. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #1640
    Exactly. CISC was great when we were compiling code by hand. There’s no advantage now that we aren’t. You still convert to RISC, but you do it less efficiently, and you do it every time you run instead of once when you compile.
     
  16. Mainyehc, Aug 31, 2018
    Last edited: Aug 31, 2018

    Mainyehc macrumors 6502a

    Mainyehc

    Joined:
    Mar 14, 2004
    Location:
    Lisbon, Portugal
    #1641
    It's most definitely not just Darwin. Yes, there's the whole UIKit vs. AppKit thing, but some of the frameworks and APIs are either the same or very similar across both OSes. Also, don't forget that Apple is coming up with an official way to easily port iPhone apps to the Mac, using an implementation of UIKit that runs on macOS, which by their own admission they used internally to port Home, Stocks and Voice Memos to Mojave; should they take the plunge and release ARM-based Macs a few years from now, ARM-compatible versions of those very apps would be just a recompile away. In fact, cross-platform game engines would be easier to make compatible with both the Mac and the iPhone/iPad, which might somewhat offset the disadvantages of deprecating OpenGL in favor of Metal. The writing is on the wall, as all of Apple's recent moves seem to be a strategy to prime developers for that very scenario.

    Also, you'd be crazy to think Apple doesn't have a full version of macOS running on an ARM-based machine hidden in an R&D office somewhere (like, say, an Apple TV, which, being a sort of shrunken Mac mini complete with native 4K HDMI output, is the most obvious candidate), and has had one ever since iPhone OS 1, just as they secretly maintained the x86 Mac OS X branch from the NeXTSTEP/Rhapsody days all the way to the first public release of Tiger for x86.

    Sure, they deprecated PowerPC, and I don't believe they have a secret build compatible with whatever POWER-based processors are available these days. But ARM on Macs? They would be foolish not to have it in the pipeline, or at least as a plan B; they are a full-blown ARM licensee, they design their own custom A-series chips, and as such they control the whole stack, as per Jobs' and Cook's philosophy.

    After developing their own integrated graphics and M-series motion coprocessors for the iPhone, and bringing a T-series chipset and a modern, flash-tuned filesystem to the Mac as well, swapping the x86 processor for an ARM-based one and stacking a desktop OS on top of it seems to be the next obvious step. They are doing the transition right in front of our eyes, one small piece at a time, and they're down to the last physical one… My guess is: they will release a round of new Intel-based Macs with T-series chips (perhaps even a lower-power T3 chip? Don't forget that, to this moment, only the iMac Pro and the 2018 MacBook Pros have one…), just to iron out the kinks and make sure they work fine, and only then start a new architecture transition (which fits perfectly with the rumoured 2020 date, if I may add).

    If I had to guess, there will come a time when two Macs are very similar to one another except for their processor architecture (sorry, guys: no case redesigns until *after* the transition, save for the new Mac Pro, which may remain an Intel-based machine for a loooooong time), kind of like the Rev. C iMac G5 and the Early 2006 Intel iMac. Remember those? They were so similar on the inside that it seemed as if they were developed simultaneously, with the former released as a stopgap, or the latter as just a minor redesign (check the image attachment to see what I mean; if it weren't for the key I've added and the Intel chips on the Intel board, you could easily confuse which was which, and even the screws are in similar places!).

    And if I had to bet on a model, the transition would start from the bottom up, with the 12'' Retina MacBook; it might even take a while, just to test the waters. That's not a professional computer (in the sense of raw computational power) anyway, so it's not as if the software its target market wants to run (say, productivity suites like Office, plus said apps converted from iOS) wouldn't be available at launch. In fact, most of the software for that machine is already available for the iPad Pro, which one could argue is just a keyboard-less MacBook, or vice-versa.

    By the way, for the sake of comparison, let's look at when each of Apple's big transitions started and finished. 68k to PowerPC lasted from 1994 to 1998. Classic Mac OS to Mac OS X/OS X/macOS lasted from 2000 to 2007 (yes, Tiger for PowerPC still ran the Classic environment, and it's patently obvious why an OS transition would take longer: it was much harder to port software from classic Mac OS, and much of it simply had to be replaced with alternative software from different companies). PowerPC to Intel lasted from 2006 to 2009. The iPhone was both a “distraction” of sorts and the catalyst for the next one (in and of itself it amounted to a shadow “OS transition”, as many of the technologies developed for iOS ended up being used for an internal revamp of macOS), and that next transition now seems to be reaching its logical moment. If we consider these transitions inevitable (and survivable, as Apple has proved time and time again!), it will likely be announced next year and take place over 2-3 years, so from 2020 to 2022-23, possibly with an exception made for higher-end machines, depending on how the market and Intel's development evolve. (I mean, a dual-architecture lineup could be workable; even though Apple likes to chuck away as much legacy cruft as they can, they are still the biggest company in the world, and they could certainly manage it if they wanted.)
     

    Attached Files:

  17. gavroche macrumors 6502a

    gavroche

    Joined:
    Oct 25, 2007
    Location:
    Left Coast
    #1642
    Please explain how your takeaway from what I said is that macOS is worth $1500?
    --- Post Merged, Aug 31, 2018 ---
    Suggesting that people who want to dual boot could easily buy one of many affordable laptops is not rude. Certainly no more rude than the minority of people who dual boot demanding that Apple bend over backwards, at the expense of the majority of its users, to give them exactly what THEY want. That’s selfish.
     
  18. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1643
    Yes, Darwin is, to the best of my understanding, the underlying core operating system for both macOS and iOS. It's a Unix-like OS that is a mix of Mach (the kernel), BSD, NeXTSTEP heritage, other free software, and Apple's own code. It covers things like the file system, networking, core OS services, threading, memory management, security, and basic inter-process communication. Cocoa and Cocoa Touch sit on top of Darwin rather than inside it: iOS adds frameworks like UIKit, and the macOS counterpart is AppKit. What's interesting about all this is that Darwin is an open-source OS.

    In terms of cross-compatibility, apps are (in general) written to APIs. A lot of iPhone/iPad app code is written to UIKit and similar frameworks, and that is effectively why you can't run iOS apps on macOS. Today this means you have to write separate user-interface code for macOS and iOS apps, even though code targeting the lower-level APIs can stay mostly the same. However, Apple is bringing the iOS UI APIs to macOS, making it possible for developers to target both platforms with the same code. This wouldn't necessarily have to include ARM emulation on x86 Macs, since the apps can simply be recompiled, and it would mean you could run iOS apps under macOS seamlessly.
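
    To make that concrete, here's a minimal sketch of single-source, dual-platform code using Apple's TargetConditionals.h header (the header and macros are real; the toy split is mine):

        /* platform.c - one source file, both platforms. */
        #include <stdio.h>
        #include <TargetConditionals.h>

        static const char *ui_framework(void)
        {
        #if TARGET_OS_IPHONE
            return "UIKit";  /* iOS / iPad builds */
        #elif TARGET_OS_OSX
            return "AppKit"; /* macOS builds */
        #else
            return "unknown";
        #endif
        }

        int main(void)
        {
            /* Lower-level code (files, sockets, threads) hits the same
             * Darwin APIs on both platforms; only the UI layer forks. */
            printf("UI layer: %s\n", ui_framework());
            return 0;
        }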

    I assume by cross-compatibility you meant app compatibility, but if you mean Darwin itself, it is also designed to be easy to port to other architectures. This has been done with PPC, x86 and 32-bit ARM, and if they wanted to put it on another architecture in the future, that would be relatively easy (to whatever extent writing low-level kernel code can ever be thought of as easy). But as already stated, for 64-bit ARM they don't need to, because they already have it. If I remember correctly, Apple's ARM chips have been 64-bit since the A7 in the iPhone 5s, back in 2013, and Apple would have had Darwin running on 64-bit ARM for some time before then.

    Note, this is all from my own understanding, and I'm not really an Apple developer. If others can improve on this, please feel free.
     
  19. jerwin, Aug 31, 2018
    Last edited: Aug 31, 2018

    jerwin macrumors 68020

    Joined:
    Jun 13, 2015
    #1644
    You were the one suggesting that we all use a $300 computer from Walmart for our Windows needs. That suggests to me that you think there is little to no value associated with Apple hardware.

    Personally, I think a 5K display is useful, even when I have to use Windows. It's not as if I downgrade my expectations of what's possible when I use a Windows computer. CAD is CAD.
     
  20. CodeJoy, Aug 31, 2018
    Last edited: Aug 31, 2018

    CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1645
    The main CISC architectures I'm familiar with are DEC's PDP-11 and VAX, x86, the Motorola 68k and the MOS 6502. These are designs from the '60s and '70s. From the '80s onward, to my knowledge, essentially all new CPU architectures have been RISC. Even Intel would probably do a RISC-like design if they were starting from scratch today, and in fact they already tried a clean break two decades ago with Itanium (which was VLIW/EPIC rather than classic RISC). Itanium didn't get much of anywhere, for whatever reasons, but I think everyone agrees that CISC isn't necessarily the best way to go anymore; x86 is going to stay the way it is because of its massive market share.
    --- Post Merged, Aug 31, 2018 ---
    In terms of running iOS apps on macOS, this has technically been possible (for developers) forever in the Xcode Simulator. I believe the Simulator is actually iOS compiled for x86, with the apps compiled to x86 code as well. That is of course the exact opposite of what we're discussing in this thread, but conceptually it's the same thing. It's also not streamlined for end users at all, but it shows that, tech-wise, the two operating systems are interoperable.

    Anyway, from a tech perspective, Apple could have had ARM Macs a long long time ago. Whether it ends up happening, completely or partially, is going to depend on other things entirely. The tech is already there, at least for a low end laptop like the 12" MacBook.
     
  21. Mainyehc, Aug 31, 2018
    Last edited: Aug 31, 2018

    Mainyehc macrumors 6502a

    Mainyehc

    Joined:
    Mar 14, 2004
    Location:
    Lisbon, Portugal
    #1646
    Sure, they could. But they would still be dependent on Intel or other companies for their chipset…

    That's why I believe they waited until they had all their ducks in a row (meaning all the main custom chips) before taking that leap. The only reason they haven't done it earlier, I believe, is that they want to thoroughly test the newer components before fully committing their whole production chain to the new architecture… And all those issues with the T2 firmware are a clear indication that the people in the small niche that is their professional market are, rather unfortunately, being used a bit like public beta testers for the main event: the consumer machines.

    In a sense, the transition has already started with the ancillary components and technologies, from the top down, and it will start officially and visibly, with the processor itself, from the bottom up. If you really stop and think about it, it makes huge sense and explains a lot of Apple's recent actions. (Those used to become obvious only in hindsight, but Apple has done so many transitions by now that they are becoming a bit predictable. Remember when they started pushing heavily for devs to switch to Xcode and Cocoa? Yes, they pulled the rug out from under Adobe when they deprecated 64-bit Carbon at the last minute, which wasn't very cool, but those lazy frenemy bastards should have known better, since the x86-transition writing was indeed on the wall. And do you know who's ready for an ARM transition this time? Their competition, Serif… Or do you think they ported their Affinity suite to the iPad just because? They're using modern, platform- and architecture-agnostic C code in their graphics engine for a reason ;) )

    By the way, the fact that ARM chip manufacturing would have to be spread across the iPhone, iPad, Apple TV, Apple Watch, HomePod *and* Mac lines might become a bit of an issue. Are there any other chip manufacturers around besides TSMC and Samsung that could rise to the task? AMD? Or even, Jobs forbid, Intel itself? :D Either way, we should really pay attention to supply-chain rumours, especially those about backstage deals, as they may be telling of things to come. I mean, all those processors have to come from somewhere.
     
  22. lederermc macrumors 6502

    lederermc

    Joined:
    Sep 30, 2014
    Location:
    Seattle
    #1647
    Yes.
     
  23. Mainyehc, Aug 31, 2018
    Last edited: Aug 31, 2018

    Mainyehc macrumors 6502a

    Mainyehc

    Joined:
    Mar 14, 2004
    Location:
    Lisbon, Portugal
    #1648
    Are you sure? I mean, I understand that they painted themselves into a thermal-envelope corner with the Mac Pro, and the entire MacBook and MacBook Pro lines are now so thin that they do indeed have to wait for faster chips at the same TDP, but… the Mac mini? The iMac? The former could and should have been upgraded three or four times already, and the latter could easily receive speed bumps between the minor internal redesigns it underwent.

    It's ridiculous, and Intel is not the only one to blame here. Apple is making a conscious choice in selling severely outdated processors in machines priced as if they were just released. It's insulting, and I'm guessing they get huge sales spikes whenever new machines are released. That's a bit stupid, because if they released upgrades more frequently, demand would be more easily manageable: people wouldn't rush to buy new Macs, and Apple would lose some margin, but they would probably make up for it in goodwill and in numbers.
     
  24. lederermc macrumors 6502

    lederermc

    Joined:
    Sep 30, 2014
    Location:
    Seattle
    #1649
    The A11 is much cheaper and less power-hungry than Intel's chips. Apple could put 4 or 8 ARM CPUs in a laptop.
     
  25. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1650
    Indeed, the transition makes a lot of sense, though possibly more to Apple than to end users. I think you're spot on w.r.t. the chipset; what they're doing there is actually quite interesting. With the T2 and its built-in SSD controller and disk encryption, they only have to add the NAND chips. One subtle benefit is that they can spread the NAND across the logic board instead of putting it all in one place, and they already do this. That doesn't necessarily benefit end users, who would prefer a replaceable M.2 device, but I suspect it helps Apple with board design and manufacturing costs. Another benefit is that they can offer disk encryption with no slowdown. One can easily imagine other functionality being moved into custom chips over time. One thing I don't think they have yet is a TB3 controller, but that's easy enough to either buy from Intel or eventually integrate themselves.

    As far as chip manufacturing goes, they're already selling well over 50M ARM devices per quarter, versus something like 3.5M Macs. Only a fraction of those Macs would move to ARM in the first wave, so I don't think manufacturing would be a major issue initially. And y'know... I wouldn't be very surprised to eventually see Apple end up with their own fabs...

    For the foreseeable future, though, I suspect that if and when ARM Macs do appear, it will still be a mixed lineup, with Intel chips at the higher end of the performance range for some time. That's not how it happened last time, but back then the Intel chips were quite far ahead of the PPC chips, whereas ARM chips are still at the lower end of the spectrum for macOS devices. (And they're not just suddenly going to pop 18-core Xeon-class ARM chips out of nowhere.)
     
