Apple Plans to Ditch Intel and Use Custom Mac Chips Starting in 2020

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Apr 2, 2018.

  1. alien3dx macrumors 6502a

    alien3dx

    Joined:
    Feb 12, 2017
    #1651
    I don't think so, given the graphics card fiasco in Mojave. It's the end user's choice to run Windows on 100% of the hardware. It's about choices, not limitations. "Think outside the box."
     
  2. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1652
    I really don't think you should be expecting 4 or 8 ARM CPUs in a laptop anytime soon. Or probably ever.
    --- Post Merged, Aug 31, 2018 ---
    What is the fiasco?
     
  3. alien3dx macrumors 6502a

    alien3dx

    Joined:
    Feb 12, 2017
    #1653
    An ex-client of mine was using Itanium. We tested a PHP application on a Xeon and it worked as expected. When we tested in production, everything got weird. It turned out the client had bought an Itanium server from HP, claiming it was good for their billing system. It took quite a while to figure out what was going on.
    --- Post Merged, Aug 31, 2018 ---
    Maybe I'm wrong, but doesn't Nvidia lack support in Mojave?
     
  4. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1654
    I haven't heard about it, but I haven't been looking either. I've been using nVidia drivers for High Sierra though, and that works fine. I'd have to assume they will have drivers for Mojave too, once it's released. Apple also might be doing what they can to block nVidia cards, but there would probably be workarounds available pretty quickly. Not ideal, would be better if Apple didn't mess with which card I want to use, but whatever.
     
  5. cmaier macrumors G4

    Joined:
    Jul 25, 2007
    Location:
    California
    #1655
    AMD doesn’t have a foundry. And GlobalFoundries has backed off 7nm. So it’s Samsung and TSMC.
     
  6. gavroche macrumors 6502a

    gavroche

    Joined:
    Oct 25, 2007
    Location:
    Left Coast
    #1656
    I was suggesting that PCs can be used for Windows, as intended, if whatever path Apple takes moving forward impacts dual boot. It's not prohibitive, as they are very inexpensive. That has absolutely nothing to do with the value I perceive Apple to have. I just really don't think Apple should choose a future path based on appeasing people who want to use the machine for non-Apple purposes. That, to me, would be like ripping a computer maker for eliminating the CD tray because they liked using it as a cup holder.

    And let me hypothesize something to you. Imagine you walk outside tomorrow, a wormhole opens up, and a person steps through... from 20 years in the future. And that person hands you a computer from that time. Would you prefer it to be essentially the same as today's: an Intel processor some 30% faster than today's, macOS, SSD storage, etc.? Or would you be more excited if it had some completely different architecture, made by some company that might not even exist today? Just saying... tech advancement is much easier without reliance on backward compatibility, or on compatibility with products that aren't even made by that company.
     
  7. jerwin macrumors 68020

    Joined:
    Jun 13, 2015
    #1657
    Depends on what you mean by Xeon 18-core "level"...

    https://www.anandtech.com/show/12694/assessing-cavium-thunderx2-arm-server-reality

    Here's the Xeon Platinum 8176
    (8719 USD)
    https://ark.intel.com/products/120508/Intel-Xeon-Platinum-8176-Processor-38_5M-Cache-2_10-GHz

    and here's the Xeon Gold 6148

    https://ark.intel.com/products/120489/Intel-Xeon-Gold-6148-Processor-27_5M-Cache-2_40-GHz
    (3072 USD)
    The Xeon W-2191 is an Apple-only chip. Its closest non-Apple equivalent is this one.

    https://www.anandtech.com/show/1311...w-2195-w-2155-w-2123-w-2104-and-w-2102-tested

    (2553 USD)



    ARM is a pretty scalable architecture. If you want to design a high-core-count chip that competes with multi-thousand-dollar server-grade Xeons, you can.

    I'm not sure that an iMac needs to be designed around performance per watt, though. Unlike a MacBook, it doesn't need a battery. Unlike a server farm or a supercomputer, it doesn't need to be built in consultation with the local power company.
     
  8. alien3dx macrumors 6502a

    alien3dx

    Joined:
    Feb 12, 2017
    #1658
    PC, not Windows... and a Mac doesn't mean it cannot run Linux or Windows.
    --- Post Merged, Aug 31, 2018 ---
    For me, I have tried running httpd/MySQL/PHP on my 7-inch Samsung Android tablet. It's great just for showing customers how things work. I'm not sure the iPad even has this type of application. I have also seen HP sell ARM servers. I do hope Google Chrome OS or the iPad can replace a normal laptop someday.
     
  9. Mainyehc, Aug 31, 2018
    Last edited: Aug 31, 2018

    Mainyehc macrumors 6502a

    Mainyehc

    Joined:
    Mar 14, 2004
    Location:
    Lisbon, Portugal
    #1659
    Interesting analysis you've got going there. I concur with all points, and I completely forgot about the TB3 controller. Does the fact that Intel recently made TB3 royalty-free play into that, or would they still have to rely on them in some form?

    As for having a mixed lineup, I hinted at that but I hadn't gone as far as explicitly mentioning it. I've always envisioned a split between professional, Intel machines (especially desktops/workstations) and professional/prosumer/consumer machines on the low end with ARM processors. In fact, Apple could very well offer an ARM MacBook Pro with insane battery life alongside the regular Intel model (they could just give it a different name and call it a day, as long as they made very clear what software it was compatible with).

    I know it wouldn't be very “Apple-like” and would definitely break their “good-better-best” tiers, but seeing how they threw the four-quadrant matrix out of the window a long time ago (and that the iPhone and iPad lineups have a lot of different SKUs and, yet, people don't seem very confused when buying them), it wouldn't be that big of a deal. I mean, having a 12'' Retina MacBook and a cheaper 13'' MacBook Air (which is heavier than the regular MacBook) seems to be already as confusing as it gets (as are the rumours about its upcoming replacement), and you don't really see people complaining. It's not like Apple would suddenly resort to giving them customer-facing SONY-sounding (or Quadra/Centris/Performa-era-sounding) product names…

    By the way, do any of you CPU architecture experts in this thread reckon Apple might also be able to/willing to develop at least some dual-architecture models, like the old x86 upgrade cards you could plug into those old beige Macs? Would that be feasible and useful for, say, power-saving measures, kind of like the T2 chip already does but in a more pervasive fashion? Or would they just be useless, power-hogging Frankenstein monsters instead? I mean, with the return of fat binaries and the advent of UIKit for macOS, the app situation wouldn't be a problem, but I wonder about the strictly computational and energetic side of things. If you could have a kind of “super MacBook Pro” with both chip architectures inside, and if you were able to outright turn off all of the x86 cores (on the fly or otherwise) like you can turn off the dedicated GPU, and if it just worked™, that would be the best of both worlds.
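    (For context on what “fat binaries” actually are at the file level: a Mach-O universal binary is just a small header table pointing at one complete executable slice per architecture, and the loader picks the matching slice. Below is a minimal, illustrative sketch of reading that table, using the real structures from <mach-o/fat.h> on macOS; the file name list_archs.c and the stripped-down error handling are assumptions for illustration only, not anything from this thread.)

        /* list_archs.c - print the slices inside a Mach-O universal ("fat") binary.
           Fat header fields are stored big-endian on disk, hence the ntohl() calls. */
        #include <arpa/inet.h>      /* ntohl */
        #include <mach-o/fat.h>     /* struct fat_header, struct fat_arch, FAT_MAGIC */
        #include <stdint.h>
        #include <stdio.h>

        int main(int argc, char *argv[]) {
            if (argc != 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }

            FILE *f = fopen(argv[1], "rb");
            if (!f) { perror("fopen"); return 1; }

            struct fat_header fh;
            if (fread(&fh, sizeof fh, 1, f) != 1 || ntohl(fh.magic) != FAT_MAGIC) {
                printf("not a fat binary (or not one this sketch handles)\n");
                fclose(f);
                return 0;
            }

            uint32_t n = ntohl(fh.nfat_arch);
            printf("%u architecture slice(s):\n", n);
            for (uint32_t i = 0; i < n; i++) {
                struct fat_arch fa;
                if (fread(&fa, sizeof fa, 1, f) != 1) break;
                printf("  cputype 0x%x, offset %u, size %u bytes\n",
                       (unsigned)ntohl((uint32_t)fa.cputype),
                       ntohl(fa.offset), ntohl(fa.size));
            }
            fclose(f);
            return 0;
        }

    A 2005-era PowerPC+Intel universal app and a hypothetical future Intel+ARM one would look the same on disk, which is why fat binaries make the app side of a dual-architecture machine the comparatively easy part.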

    As for having the users' best interests in mind or not, well… I'm one of those users who would rather tinker with his own Mac as if it were a PC (and with good reason; I was raised on the PC, so I guess I never really let go of that tinkerer spirit), but some day I will have to cave in. And the fact of the matter is that if their choices make their computers faster and more secure, it will be a tough choice and they may end up winning over a lot of old-timer converts. I mean, if their decisions were that terrible, the Mac market share would've completely collapsed by now (though I hope they are paying close attention to the hard data and not just to the total net gain of users… In the long run, losing a few influencers can be much worse than winning over a thousand wannabes/nouveau riche who buy expensive models without any need for them, just to show off; in a sense, it feels a bit like consumer electronics gentrification, and it hurts our wallets almost as much as the real-estate kind).
     
  10. CodeJoy macrumors 6502

    Joined:
    Apr 3, 2018
    #1660
    This has been discussed before. It could be done, but it's not worth the trouble. If you wanted them to share RAM, then they'd have to figure out a way for Intel and ARM chips to agree on memory consistency. While perhaps technically possible, it's most likely too much work to get there. You could put the other CPU on the PCIe bus though, and talk to it just like you talk to a graphics card. Then it would need its own RAM, and you'd copy data back and forth. This is perhaps easier (and similar to the old x86 cards), but you end up duplicating some hardware and probably lose some performance. At the end of the day, if this was something that would sell them another 50M devices per quarter, I'm sure they'd do it. But would we even be talking 500K devices? I'm pretty sure it's not worth it.
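    To make the memory-consistency point concrete: x86 gives you fairly strong ordering (TSO) by default, while ARM is weakly ordered, so any shared-memory handoff between the two sides would have to be spelled out with explicit acquire/release semantics rather than relying on what either core happens to do. A minimal C11 sketch of that kind of handoff, shown here between two ordinary threads (purely illustrative; not anything Apple has shipped or announced):

        #include <pthread.h>
        #include <sched.h>
        #include <stdatomic.h>
        #include <stdbool.h>
        #include <stdio.h>

        static int payload;            /* data written by one side       */
        static atomic_bool ready;      /* handoff flag, starts out false */

        /* Producer: write the data, then publish the flag with release
           semantics so the payload store cannot be reordered after it. */
        static void *producer(void *arg) {
            (void)arg;
            payload = 42;
            atomic_store_explicit(&ready, true, memory_order_release);
            return NULL;
        }

        /* Consumer: spin on the flag with acquire semantics; once it sees
           true, it is guaranteed to also see the fully written payload. */
        static void *consumer(void *arg) {
            (void)arg;
            while (!atomic_load_explicit(&ready, memory_order_acquire))
                sched_yield();
            printf("payload = %d\n", payload);
            return NULL;
        }

        int main(void) {
            pthread_t p, c;
            pthread_create(&c, NULL, consumer, NULL);
            pthread_create(&p, NULL, producer, NULL);
            pthread_join(p, NULL);
            pthread_join(c, NULL);
            return 0;
        }

    Between two cores of the same ISA, the compiler and the cache-coherency hardware make this cheap; in a mixed Intel+ARM box the hard part would be getting two different memory systems to honor the same ordering rules across whatever interconnect ties them together, which is exactly the "too much work" part.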

    You don't *have* to. You can run macOS on a delidded 5.2GHz 8700K, undervolted Vega 56, maxed out with RAM and SSD, just to take a completely random and unrelated example :) That's really the Mac Midi that they should have released a year ago, but didn't. As far as tinker boxes from Apple, I think those days are gone. I think everyone hoping for a tinker friendly Mac Pro will be very disappointed.
     
  11. Mainyehc, Sep 1, 2018
    Last edited: Sep 1, 2018

    Mainyehc macrumors 6502a

    Mainyehc

    Joined:
    Mar 14, 2004
    Location:
    Lisbon, Portugal
    #1661
    Yeah, I thought about the implications of that. Two processors sharing the same memory would be a recipe for disaster unless some memory management subsystem was in place, and that would definitely add some overhead, of course.

    What if… accepting hardware duplication as a fact of life (hey, such a machine would always be more expensive, so… as long as it was worth it, people would pay for it; just imagine the use cases – besides the potential power-saving aspect of such a config if done on a portable system –, like full-speed virtualization which would be more like having a KVM switch and two real systems in a box), the ARM side of things had, as is common on iOS devices, not only its own integrated graphics but also Package-on-Package memory? Heat dissipation-wise, would such an arrangement still be compatible with overclocking and active cooling, or would the memory, being sandwiched between the CPU+GPU and the heatsink, be outright fried? Questions, questions…

    I know I don't. That's why I'm still using a Late 2009 27'' iMac with a 2.93 GHz Core i7, 32 GB of RAM, a DIY Fusion Drive and a BT4+802.11ac card. I could also be using a Hackintosh (I'm guessing the config you suggested would only be feasible on such a machine, and not on an official, Apple-branded Mac). The thing is, while I'm widely known across my circle of friends, acquaintances, former Mac room goers (I used to be a monitor at my Uni, and I was responsible for the upkeep of 30+ Macs spanning two labs) and even current students as “the Mac guy” (I don't know, I guess I'm kinda famous on a local level…?), I'm seriously trying to pivot to academia (also, as of late, I've been neglecting my design career a bit as well).

    This crap is fun, but way too time-consuming, and I feel I should leave my ersatz repair & upgrade business to someone else (it's not like you need a degree or to be a very special person to do it; any PC geek can do the same things I do and much more) and even skip the DIY aspect of it altogether (though I loathe the idea of paying someone else to do anything I can do on my own, and I've had enough bad experiences even with AASPs to be unable to trust, well, anyone else but me – save perhaps for Louis Rossmann or someone else with his level of skill, of course; as such, I intend to buy the next 4K iMac revision and give it the SnazzyLabs treatment two years down the line, once the EU-mandated two-year warranty expires, but that would be kind of a one-time thing for each desktop machine I own in the future). ;)

    Case in point (because I always believed laptops would inevitably – and sadly, but such is life – turn into a different kind of [monolithic] beast down the road): I also own a Mid-2011 13'' MacBook Pro, which I also want to get rid of (and I still haven't managed to do so… I tried our local equivalent to Craigslist and only got answers from blatant scammers) because it can't officially run Mojave (or unofficially with proper GPU support, at least); I decided to try a dual-SSD RAID 0 config before buying a 2012 machine, and it worked perfectly and was wicked fast for its age and price (I would have to resort to Carbon Copy Cloner in order to even be able to update the OS, but I wouldn't mind that in the least, as I have enough spare HDDs lying around).

    Guess what: I bought the 2012 machine, and when trying out the same config on it I found that the ODD connector/controller, supposedly of the SATA III 6 Gbps kind, always craps out with whatever SATA III devices you connect to it (I tried everything: SSDs and HDDs, both old and new, to no avail). To this day, I'm still not sure whether this is an issue with the machine itself, with the entire model range, with the firmware, or what; I seriously thought of buying another one to test that config out, but I just can't afford to go around buying old computers, at €600 apiece, that I may then have a hard time flipping on scammer-ridden websites, so I decided to wait until a few trustworthy friends and colleagues are willing to be my guinea pigs (that time will eventually come, and I may indeed resell both my 13'' MacBook Pros and buy one – maybe from one of said friends, even – that is known to be working with SATA III in both bays).

    So, anyway, I had to return the SSDs and make do with a regular Fusion Drive made up of a crappy Toshiba SATA II HDD instead, and since it had to be stuck in the ODD bay so I could make use of the SATA III connection in the HDD bay for the SSD, I also had to do away with the screw-in ODD plastic caddy/adapter and replace it with self-adhesive sponge bits and blu-tack, both on the bottom cover and on the underside of the keyboard, for shock absorption. It's one hell of a jury rig, but it works. Also, I've been running Mojave PB on it from an external USB 3 dock, but sometimes it refuses to boot and gives me the forbidden sign (it's just a matter of trying again until it does boot, but… it's the kind of issue I never got before and which does give me pause).

    I also shoehorned Mountain Lion into a 2008 MacBook back in the day, and it was fun, and usable, and all, but… having to make sure each and every update was compatible with MLPostFactor and to reapply the kext patches was, to put it mildly, yet another chore. At that point, you might as well cave in and switch to a PC, or install some version of Windows or Linux on it. :p Gone are the days when you could install Leopard on an 800 MHz G4 eMac just by putting it into Target Disk Mode, running the DVD installer from a G5 tower and calling it a day…

    To be honest, I'm sick of these constant little snags, and especially of wasting so much time on these projects. They are fun, yes (even when they get seriously frustrating… I must be a masochist or something), but I have much more to contribute to the world elsewhere… Besides, I am savvy enough to deal with T2-equipped machines, which you can't recover data from if they die, as I keep on-site, off-site (both encrypted) and cloud-based backups. I also nearly had my computer stolen once (back then, I only had my dependable – until it died from capacitor rot, that is – iMac G5, and a single external FW400 HDD), when I was still living with my parents (they had a break-in, and I entered the place while the robbers were still in there – I never saw them, as they made a silent escape through the same window from which they broke into the house – so I guess I saved my computer and my backups in the nick of time), so… the safer my data is, the better.

    All things considered, and if my budget allows for it, I, for one, welcome our new anorexic, glued- and soldered-on Mac overlords. :p Buuuut… thanks for your suggestion anyway. If I ever hit a financial rut and need a new machine while in it, I may seriously consider that option. Even though it's a bit of a hassle, I'd still rather deal with the known issues and shortcomings of running macOS on PC hardware (such as, say, not being able to use iMessage on it, or having to wait a few weeks to perform updates; it's not like many regular professional Mac users don't do that already anyway) than have to deal with *gasp* Windows on a daily basis. ;) I already had to finish my BFA on a Toshiba running Vista when my G5 iMac crapped out (hey, I was waiting for the rumoured 27'' iMac I am still using to this day to be announced, and the computer was a loaner from my dad's small business, so… while excruciating, it was definitely worth it), and it's not something I want to go through again if I can help it.
     
  12. crsh1976 macrumors 6502a

    Joined:
    Jun 13, 2011
    #1662
    One has nothing to do with the other.

    Not sure how Apple could emulate x86 faster than what Intel chips do natively, either.

    Die size means nothing in this case.
     
  13. jerwin macrumors 68020

    Joined:
    Jun 13, 2015
    #1663
    10 nm? Pah! I'd be satisfied with 600 mm², if it meant that Apple could pack in the cores and the cache.
     
  14. gavroche macrumors 6502a

    gavroche

    Joined:
    Oct 25, 2007
    Location:
    Left Coast
    #1664
    If you wish to use that hardware for another purpose, "thinking outside the box" as you choose to call it... that's great. No problem. I take no issue with that. But if Apple decides to go another route, such as deciding to stop using Intel processors... well, then you would need to find another solution. Heck... wouldn't you have to admit that it's Apple, at that point, "thinking outside the box" and not going the same Intel processor route as everyone else? Can't SOME computer company, at some point in time, decide to use a different processor? Or are we stuck with Intel for the rest of eternity?
     
  15. alien3dx, Sep 3, 2018
    Last edited: Sep 3, 2018

    alien3dx macrumors 6502a

    alien3dx

    Joined:
    Feb 12, 2017
    #1665
    That is why I think Apple (in the non-Steve Jobs era) doesn't like long-term support (LTS). If they stop, they lose more developers, except iOS developers. I'd rather stick to Fedora or Ubuntu if not macOS or Windows.

    I have never used VIA or AMD processors. I have just used Intel since the 80286 era, and there's nothing wrong with it. As for phones, I have Huawei (Kirin), Samsung, some cheap MediaTek, iPhone, iPod. The limitation is yours, not mine.
     
  16. jerwin macrumors 68020

    Joined:
    Jun 13, 2015
    #1666
    I'm not sure that Apple is that company. Meanwhile, the latest generation of x86 chips looks rather more capable than the last.

    People may look at Apple's chips and think-- "wow, by the standards of a dubious benchmark, they're really catching up to Intel"-- but what makes you think they'll surpass x86-- on a metric that isn't power related?
     
  17. SactoGuy18 macrumors 68030

    SactoGuy18

    Joined:
    Sep 11, 2006
    Location:
    Sacramento, CA USA
    #1667
    Look, Intel has made great strides in recent years in lowering the power consumption of their x86-64 CPUs. That will likely keep Apple in the Intel camp.

    Besides, getting the ARM architecture to the point where it could compete against the latest x86-64 CPUs may require a far more complex CPU design that would make the enhanced ARM CPU use effectively as much power as the Intel CPU. Why bother?
     
  18. ActionableMango macrumors G3

    ActionableMango

    Joined:
    Sep 21, 2010
    #1668
    It's not because Intel is lacking. Apple just really likes control and vertical integration. The article itself states "With its own chips, Apple would not be forced to wait on new Intel chips before being able to release updated Macs, and the company could integrate new features on a faster schedule."
     
  19. Sydde macrumors 68020

    Sydde

    Joined:
    Aug 17, 2009
    #1669
    ARM appears to be bothering. MS has real Windows running on ARM, not that silly RT thing. The target is the notebook market, which is not to be taken lightly. If the A76 can compete on a level with the i5-U parts and offer greatly improved battery life, we could start to see some nontrivial ARMward migration across the board. Get a foot into the notebook door and the workstation sector will ultimately not be far behind.

    They are banking on big gains from the 7nm process, about which I have my doubts – Intel has backed up to 22nm, but ARM is somewhat structurally different from x86-64, so smaller processes may work better for them.

    SoC design is ideal for portables, and Intel is kind of squeezed in SoC development. The diversity in the ARM arena allows for greater flexibility in product design than Intel can manage when Intel is stuck producing large volumes of generic CPUs.

    To me, it looks a lot like ARM is in a darn good position here.
     
  20. Mainyehc, Sep 24, 2018
    Last edited: Sep 24, 2018

    Mainyehc macrumors 6502a

    Mainyehc

    Joined:
    Mar 14, 2004
    Location:
    Lisbon, Portugal
    #1670
    If ANY computer company would do that, it would surely be Apple. Apple and Microsoft are the only companies that produce both OSes and hardware, but Apple is the only one that oh-so-conveniently has a full-blown in-house chip design unit. Conversely, it is the ONLY chip-designing company that actually develops OSes (no, their competition, with their embedded processor+OS combos, doesn't really count, as their wares are extremely specialised solutions and not mass-market, über-versatile stuff like the chips Apple churns out on a yearly basis).

    As I said in this thread before, Apple is plugging all the holes on their logic boards, one by one, with their own custom silicon. It's not a matter of if, but of when, and in which order. I'd say the CPU will be the last piece of the puzzle, for obvious reasons, but like everything else, it will be a gradual-at-first – we're currently in the “gradual” phase, with the T1 and T2 chips – and then sudden process (besides some other obvious Intel-related stuff that Apple would do well to implement in-house before having a potential falling out with them, such as Thunderbolt [3/4/whatever], it would indeed make sense for Apple to replace the Intel CPU and the Intel Iris integrated graphics in one fell swoop, as it already has its own CPU+GPU solutions, and Metal 2 is an obvious stepping stone in that direction; if you fail to acknowledge that, and the implications of – and motivations behind – the deprecation of OpenGL, you know nothing about Apple's MO, as this is Carbon 64 vs. Cocoa 64 all over again…).
     
  21. Matrix_uk macrumors newbie

    Matrix_uk

    Joined:
    Dec 12, 2018
    #1671
    Probably zero, as arm64 versions of Windows 10 and Linux exist...
     
  22. Janichsan macrumors 65816

    Janichsan

    Joined:
    Oct 23, 2006
    #1672
    And what use are these when the vast majority of available software is still for Intel?
     
  23. noxivs macrumors newbie

    noxivs

    Joined:
    Sep 16, 2015
    Location:
    ist
    #1673
    Good luck finding ARM-compiled versions of many of the popular pro-grade Windows programs, let alone games.
     
  24. KPOM macrumors G5

    Joined:
    Oct 23, 2010
    #1674
    There are Windows PCs running on Snapdragon 845 chips. They emulate a 32-bit Intel processor.
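    (The x86 emulation in Windows 10 on ARM works by dynamic binary translation with caching rather than pure interpretation, but the fundamental overhead, decoding and dispatching another ISA's instructions in software, is easiest to see in a toy interpreter loop. A minimal sketch for a made-up three-instruction ISA, purely for illustration:)

        /* Toy fetch-decode-dispatch loop for a made-up 3-instruction ISA,
           just to show the per-instruction software overhead emulation adds. */
        #include <stdint.h>
        #include <stdio.h>

        enum { OP_LOADI = 0, OP_ADD = 1, OP_HALT = 2 };   /* toy opcodes */

        int main(void) {
            /* "guest program": r0 = 2; r1 = 3; r0 = r0 + r1; halt */
            const uint8_t code[] = { OP_LOADI, 0, 2,  OP_LOADI, 1, 3,
                                     OP_ADD,   0, 1,  OP_HALT };
            int32_t regs[4] = {0};
            size_t pc = 0;

            for (;;) {
                uint8_t op = code[pc];                    /* fetch            */
                switch (op) {                             /* decode, dispatch */
                case OP_LOADI: regs[code[pc + 1]]  = code[pc + 2];        pc += 3; break;
                case OP_ADD:   regs[code[pc + 1]] += regs[code[pc + 2]];  pc += 3; break;
                case OP_HALT:  printf("r0 = %d\n", regs[0]); return 0;    /* prints 5 */
                default:       fprintf(stderr, "bad opcode %u\n", (unsigned)op); return 1;
                }
            }
        }

    Every guest instruction becomes several host instructions plus a branch; translation caching amortizes the decode cost, but the overhead never fully disappears, which is why 32-bit x86 apps on those machines run noticeably slower than native code.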
     
  25. Janichsan macrumors 65816

    Janichsan

    Joined:
    Oct 23, 2006
    #1675
    ...very badly and slowly.
     
