A11 is much cheaper and less power hungry than Intel. Apple could put 4 or 8 ARM CPUs in a laptop.
I really don't think you should be expecting 4 or 8 ARM CPUs in a laptop anytime soon. Or probably ever.
I don't think so, given the graphics card fiasco in Mojave. It's the end user's choice to use Windows on 100% of the hardware. It's about choices, not about limitations. "Think outside the box."
What is the fiasco?
 
The main CISC architectures that I'm familiar with are DEC's PDP-11 and VAX, x86, the Motorola 68k and the MOS 6502. These are designs from the '60s and '70s. From the '80s onward, to my knowledge, all new CPU architectures have been RISC. Even Intel would probably do a RISC-style architecture if they were to design a new one today, and in fact they already tried to move past x86 two decades ago with Itanium (technically VLIW/EPIC rather than classic RISC). Itanium didn't get much of anywhere, for whatever reasons, but I think everyone agrees that CISC isn't necessarily the best way to go anymore. x86 is going to stay the way it is, though, because of its massive market share.
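To make that CISC/RISC distinction concrete, here's a toy sketch (not modeled on any real ISA): a CISC-style instruction can combine a memory access with arithmetic, while a pure load/store (RISC) design splits the same work into separate instructions:

```python
# Toy model (no real ISA): CISC-style ISAs let one instruction combine
# a memory access with arithmetic; RISC (load/store) ISAs decompose it
# into an explicit load plus a register-to-register add.
mem = {0x10: 5}
regs = {"r1": 2, "r2": 2, "tmp": 0}

# CISC-ish: a single instruction reads memory and adds in place.
def add_from_mem(reg, addr):
    regs[reg] += mem[addr]

# RISC-ish: only load/store touch memory; arithmetic is reg-to-reg.
def load(reg, addr):
    regs[reg] = mem[addr]

def add(dst, src):
    regs[dst] += regs[src]

add_from_mem("r1", 0x10)   # CISC: r1 = 2 + 5 = 7, one instruction
load("tmp", 0x10)          # RISC: same effect on r2 needs two
add("r2", "tmp")           # r2 = 2 + 5 = 7
assert regs["r1"] == regs["r2"] == 7
```

Part of why fixed-length load/store designs won out is that they are far easier to decode and pipeline; modern x86 chips actually crack their complex instructions into RISC-like micro-ops internally anyway.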
In terms of running iOS apps on macOS, this has technically been possible (for developers) since forever in the Xcode simulator. The simulator is essentially iOS compiled for x86, and apps are compiled to x86 code as well. This is of course the exact opposite direction from what we're discussing in this thread, but conceptually it's the same thing. It's also not streamlined for end users at all, but it shows that, tech-wise, both operating systems are interoperable.

Anyway, from a tech perspective, Apple could have had ARM Macs a long long time ago. Whether it ends up happening, completely or partially, is going to depend on other things entirely. The tech is already there, at least for a low end laptop like the 12" MacBook.
A former client of mine used Itanium. We tested a PHP application on Xeon and it worked as expected. When we tested in production, everything got weird. Only then did the client mention they had bought an Itanium server from HP, claiming it was good for billing systems. It took quite a while to figure out what was going on.
What is the fiasco?
Maybe I'm wrong, but isn't Nvidia unsupported in Mojave?
 
Maybe I'm wrong, but isn't Nvidia unsupported in Mojave?
I haven't heard about it, but I haven't been looking either. I've been using Nvidia drivers on High Sierra, though, and that works fine. I'd have to assume they'll have drivers for Mojave too, once it's released. Apple might also be doing what they can to block Nvidia cards, but workarounds would probably be available pretty quickly. Not ideal (it would be better if Apple didn't mess with which card I want to use), but whatever.
 
Sure, they could. But they would still be dependent on Intel or other companies for their chipset…

That's why I believe they waited until they had all their ducks in a row (meaning, all the main custom chips) to take that leap. The only reason they haven't done it earlier, I believe, is that they want to thoroughly test the newer components before they fully commit their whole production chain to the new architecture… And all those issues with the T2 firmware are a clear indication that the people from the small niche that is their professional market are, rather unfortunately, being used a bit like public beta testers for the main event, the consumer machines.

In a sense, the transition has started already, with the ancillary components and technologies, from the top down, and will start officially and visibly, with the processor itself, from the bottom up. If you really stop and think about it, it makes huge sense and explains a lot of Apple's recent actions (those used to become obvious only in hindsight, but Apple has done so many transitions already that, by now, they are becoming a bit predictable; remember when they started pushing heavily for devs to switch to Xcode and Cocoa? Yes, they pulled the rug out from under Adobe when they deprecated Carbon 64 at the last minute, and that wasn't very cool, but those lazy frenemy bastards should've known better, since the x86-transition writing was indeed on the wall. And do you know who's ready for an ARM transition this time? Their competition, Serif… Or do you think they ported their Affinity suite to the iPad just because? They're using modern, platform- and architecture-agnostic C code on their graphics engine for a reason ;) ).

By the way, the fact that ARM chip manufacturing would have to be spread out across the iPhone, the iPad, the Apple TV, the Apple Watch, the HomePod *and* the Mac lines might become a bit of an issue. Are there any other chip manufacturers around besides TSMC and Samsung that could rise to the task? AMD? Or even, Jobs forbid, Intel itself? :D Either way, we really should pay attention to supply chain rumours, especially those about backstage deals, as they may be telling of things to come. I mean, all those processors have to come from somewhere.

AMD doesn’t have a foundry. And GlobalFoundries has backed off 7nm. So it’s Samsung and TSMC.
 
You were the one suggesting that we all use a $300 computer from Walmart for our Windows needs. That suggests to me that you think there is little to no value associated with Apple hardware.

Personally, I think that a 5k display is useful, even when I have to use windows. It's not as if I downgrade my expectations of what's possible when I use a windows computer. CAD is CAD.

I was suggesting that PCs can be used for Windows, as intended, if whatever path Apple takes moving forward impacts dual boot. It's not prohibitive, as they are very inexpensive. That has absolutely nothing to do with the value I perceive Apple to have. I just really don't think Apple should choose a future path based on appeasing people who want to use the machine for non-Apple purposes. That, to me, would be like ripping a computer maker for eliminating the CD tray because people liked using it as a cup holder.

And let me pose a hypothetical. Imagine you walk outside tomorrow, a wormhole opens up, and a person steps through... from 20 years in the future. And that person hands you a computer from that time. Would you prefer it to be essentially the same as today's (an Intel processor some 30% faster than today's, macOS, SSD storage, etc.), or would you be more excited if it had some completely different architecture, made by some company that might not even exist today? Just saying... tech advancement is much easier without reliance on backward compatibility, or on compatibility with products that aren't even made by that company.
 
and they're not just suddenly going to pop out Xeon 18 core level ARM chips out of nowhere)
depends what you mean by Xeon 18 core "level"...

https://www.anandtech.com/show/12694/assessing-cavium-thunderx2-arm-server-reality

The ThunderX2 delivers 87% of the performance of the twice as expensive EPYC 7601. Since this benchmark scales well with the number of cores, we can estimate that the Xeon 6148 will score around 4.8. So while the ThunderX2 can not really threaten the Xeon Platinum 8176, it gives the Gold 6148 and its ilk a run for their money.
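The ~4.8 estimate for the Gold 6148 is just linear core-count scaling down from the 28-core Platinum 8176. A back-of-the-envelope sketch (the Platinum score below is an assumed placeholder, not AnandTech's published number; the core counts are real):

```python
# Back-of-the-envelope version of the estimate quoted above.
# Core counts are real (8176: 28 cores, 6148: 20 cores); the Platinum
# score is an assumed placeholder, not AnandTech's measurement.
platinum_8176_score = 6.7          # assumed benchmark score, 28 cores
platinum_cores, gold_cores = 28, 20

# If the benchmark scales roughly linearly with core count, a 20-core
# part at similar clocks should land near 20/28 of the 28-core score.
gold_6148_estimate = platinum_8176_score * gold_cores / platinum_cores
print(round(gold_6148_estimate, 1))   # 4.8 with these assumed inputs
```

The same one-liner works for any pair of chips on a benchmark that scales with cores; it breaks down once clocks, memory bandwidth or NUMA effects dominate.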

Here's the Xeon Platinum 8176
(8719 USD)
https://ark.intel.com/products/120508/Intel-Xeon-Platinum-8176-Processor-38_5M-Cache-2_10-GHz

and here's the Xeon Gold 6148

https://ark.intel.com/products/120489/Intel-Xeon-Gold-6148-Processor-27_5M-Cache-2_40-GHz
(3072 USD)
The Xeon W-2191B is an Apple-only chip. Its closest non-Apple equivalent is this one.

https://www.anandtech.com/show/1311...w-2195-w-2155-w-2123-w-2104-and-w-2102-tested

(2553 USD)



Arm is a pretty scalable architecture. If you want to design a high core count chip that competes with multi thousand dollar server grade Xeons you can.

I'm not sure that an iMac needs to be designed around performance per watt, though. Unlike a MacBook, it doesn't need batteries. Unlike a server farm or a supercomputer, it doesn't need to be built in consultation with the local power companies.
 
I was suggesting that PCs can be used for Windows, as intended, if whatever path Apple takes moving forward impacts dual boot. It's not prohibitive, as they are very inexpensive. That has absolutely nothing to do with the value I perceive Apple to have. I just really don't think Apple should choose a future path based on appeasing people who want to use the machine for non-Apple purposes. That, to me, would be like ripping a computer maker for eliminating the CD tray because people liked using it as a cup holder.

PC not windows.. AND mac doesn't mean cannot run linux,windows.
depends what you mean by Xeon 18 core "level"...

For what it's worth, I have tried running httpd/MySQL/PHP on my 7-inch Samsung Android tablet. It's great just for showing customers how things work. I'm not sure the iPad even has this type of application. I have also seen HP sell ARM servers. I do hope Chromebooks or the iPad can replace a normal laptop someday.
 
Indeed, the transition makes a lot of sense, though possibly more so to Apple than to end users. I think you're spot on w.r.t. the chipset; what they're doing there is actually quite interesting. With the T2 and its built-in SSD controller and disk encryption, they only have to add the NAND chips. One subtle benefit is that they can spread the SSD chips out across the logic board instead of having them all in the same place, and of course they already do this. This doesn't necessarily benefit end users, who would prefer a replaceable M.2 device instead, but I suspect it benefits Apple in terms of design and manufacturing costs. Another benefit is that they can offer disk encryption with no slowdown. One can easily imagine other functionality being moved into custom chips over time. One thing I don't think they have yet is a TB3 controller, but that's easy enough to either buy from Intel or to eventually integrate themselves.

As far as chip manufacturing, they're already selling well over 50M ARM devices per quarter, and something like 3.5M Macs. Only a fraction of those Macs would be moving to ARM in the first wave, so I don't think the manufacturing would be a major issue initially. And y'know... I wouldn't be very surprised to eventually see Apple end up with their own fabs...

For the foreseeable future, though, I suspect that if and when ARM Macs do appear, it's still going to be a mixed lineup, with Intel chips at the higher end of performance for some time. That's not how it happened last time, but back then the performance of the Intel chips was quite far ahead of the PPC chips, whereas the ARM chips are still at the lower end of the spectrum for macOS devices. (And they're not just suddenly going to pop out Xeon-18-core-level ARM chips out of nowhere.)

Interesting analysis you've got going there. I concur with all points, and I completely forgot about the TB3 controller. Does the fact that Intel recently made TB3 royalty-free play into that, or would they still have to rely on them in some form?

As for having a mixed lineup, I hinted at that but I hadn't gone as far as explicitly mentioning it. I've always envisioned a split between professional, Intel machines (especially desktops/workstations) and professional/prosumer/consumer machines on the low end with ARM processors. In fact, Apple could very well offer an ARM MacBook Pro with insane battery life alongside the regular Intel model (they could just give it a different name and call it a day, as long as they made very clear what software it was compatible with).

I know it wouldn't be very “Apple-like” and would definitely break their “good-better-best” tiers, but seeing how they threw the four-quadrant matrix out of the window a long time ago (and that the iPhone and iPad lineups have a lot of different SKUs and, yet, people don't seem very confused when buying them), it wouldn't be that big of a deal. I mean, having a 12'' Retina MacBook and a cheaper 13'' MacBook Air (which is heavier than the regular MacBook) seems to be already as confusing as it gets (as are the rumours about its upcoming replacement), and you don't really see people complaining. It's not like Apple would suddenly resort to giving them customer-facing SONY-sounding (or Quadra/Centris/Performa-era-sounding) product names…

By the way, do any of you CPU architecture experts in this thread reckon Apple might also be able or willing to develop at least some dual-architecture models, like the old x86 upgrade cards you could plug into those old beige Macs? Would that be feasible and useful for, say, power-saving measures, kind of like the T2 chip already does but in a more pervasive fashion? Or would they just be useless, power-hogging Frankenstein monsters instead? I mean, with the return of fat binaries and the advent of UIKit for macOS, the app situation wouldn't be a problem, but I wonder about the strictly computational and energetic side of things. If you could have a kind of “super MacBook Pro” with both chip architectures inside, and if you were able to outright turn off all of the x86 cores (on the fly or otherwise) like you can turn off the dedicated GPU, and if it just worked™, that would be the best of both worlds.
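On the fat-binaries point: the concept is simple enough to sketch as a toy model (purely illustrative, not Apple's actual Mach-O universal-binary format): one file carries a code slice per architecture, and the loader picks whichever matches the host.

```python
import platform

# Toy model of a fat (universal) binary: several per-architecture
# "slices" in one container, with the loader selecting the host's one.
# Illustrative only; the real Mach-O universal format is more involved.
fat_binary = {
    "x86_64": "<x86-64 machine code>",
    "arm64": "<arm64 machine code>",
}

def select_slice(binary, host_arch):
    # A real loader would refuse to launch; the toy just raises.
    if host_arch not in binary:
        raise ValueError(f"no slice for {host_arch}")
    return binary[host_arch]

# Map the running machine to one of the toy slice names.
machine = platform.machine().lower()
host = "arm64" if machine in ("arm64", "aarch64") else "x86_64"
code = select_slice(fat_binary, host)
```

The price is a bigger file (every slice ships to every user), which is exactly the tradeoff Apple accepted in the PPC-to-Intel transition.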

As for having the users' best interests in mind or not, well… I'm one of those users who would rather tinker with his own Mac as if it were a PC (and with good reason; I was raised on the PC, so I guess I never really let go of that tinkerer spirit), but some day I will have to cave in. And the fact of the matter is that if their choices make their computers faster and more secure, it will be a tough choice and they may end up winning over a lot of old-timer converts. I mean, if their decisions were that terrible, the Mac market share would've completely collapsed by now (though I hope they are paying close attention to the hard data and not just to the total net gain of users… In the long run, losing a few influencers can be much worse than winning over a thousand wannabes who buy expensive models without any need for them, just to show off; in a sense, it feels a bit like consumer-electronics gentrification, and it hurts our wallets almost as much as the real-estate kind).
 
By the way, do any of you CPU architecture experts in this thread reckon Apple might also be able or willing to develop at least some dual-architecture models, like the old x86 upgrade cards you could plug into those old beige Macs?
This has been discussed before. It could be done, but it's not worth the trouble. If you wanted them to share RAM, then they'd have to figure out a way for Intel and ARM chips to agree on memory consistency. While perhaps technically possible, it's most likely too much work to get there. You could put the other CPU on the PCIe bus though, and talk to it just like you talk to a graphics card. Then it would need its own RAM, and you'd copy data back and forth. This is perhaps easier (and similar to the old x86 cards), but you end up duplicating some hardware and probably lose some performance. At the end of the day, if this was something that would sell them another 50M devices per quarter, I'm sure they'd do it. But would we even be talking 500K devices? I'm pretty sure it's not worth it.
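To put rough numbers on the copy-back-and-forth cost of a PCIe-attached second CPU, here's a crude model (the bandwidth figure is an assumption in the ballpark of PCIe 3.0 x16, not a measurement):

```python
# Crude model: total time for an offloaded job = transfer out + compute
# + transfer back. All numbers are illustrative assumptions only.
PCIE_GBPS = 16.0   # assumed usable PCIe 3.0 x16 bandwidth, GB/s

def offload_time(data_gb, compute_s):
    transfer_s = 2 * data_gb / PCIE_GBPS   # copy over, then copy back
    return transfer_s + compute_s

# A 4 GB working set with 1 s of compute pays 0.5 s in copies, a 50%
# overhead. Sharing RAM would remove that overhead, but then the two
# ISAs have to agree on memory consistency, as noted above.
total = offload_time(4.0, 1.0)
print(round(total, 2))   # 1.5 with these assumed numbers
```

The model also shows when offload still wins: if the compute saving on the other CPU exceeds the copy time, the duplication pays for itself.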

As for having the users' best interests in mind or not, well… I'm one of those users who would rather tinker with his own Mac as if it were a PC (and with good reason; I was raised on the PC, so I guess I never really let go of that tinkerer spirit), but some day I will have to cave in.
You don't *have* to. You can run macOS on a delidded 5.2GHz 8700K, undervolted Vega 56, maxed out with RAM and SSD, just to take a completely random and unrelated example :) That's really the Mac Midi that they should have released a year ago, but didn't. As far as tinker boxes from Apple, I think those days are gone. I think everyone hoping for a tinker friendly Mac Pro will be very disappointed.
 

Yeah, I thought about the implications of that. Two processors sharing the same memory would be a recipe for disaster unless some memory management subsystem was in place, and that would definitely add some overhead, of course.

What if, accepting hardware duplication as a fact of life (hey, such a machine would always be more expensive, so as long as it was worth it, people would pay for it; just imagine the use cases, besides the potential power-saving aspect of such a config on a portable system, like full-speed virtualization, which would be more like having a KVM switch and two real systems in one box), the ARM side of things had, as is common on iOS devices, not only its own integrated graphics but also package-on-package memory? Heat-dissipation-wise, would such an arrangement still be compatible with overclocking and active cooling, or would the memory, sandwiched between the CPU+GPU and the heatsink, be outright fried? Questions, questions…

You don't *have* to. You can run macOS on a delidded 5.2GHz 8700K, undervolted Vega 56, maxed out with RAM and SSD, just to take a completely random and unrelated example :) That's really the Mac Midi that they should have released a year ago, but didn't. As far as tinker boxes from Apple, I think those days are gone. I think everyone hoping for a tinker friendly Mac Pro will be very disappointed.

I know I don't. That's why I'm still using a Late 2009 27'' iMac with a 2.93 GHz Core i7, 32 GB of RAM, a DIY Fusion Drive and a BT4+802.11ac card. I could also be using a Hackintosh (I'm guessing the config you suggested would only be feasible on such a machine, and not on an official, Apple-branded Mac). The thing is, while I'm widely known across my circle of friends, acquaintances, former Mac room goers (I used to be a monitor at my uni, responsible for the upkeep of 30+ Macs spanning two labs) and even current students as “the Mac guy” (I don't know, I guess I'm kinda famous on a local level…?), I'm seriously trying to pivot to academia (and, as of late, I've been neglecting my design career a bit as well).

This crap is fun, but way too time-consuming, and I feel I should leave my ersatz repair & upgrade business to someone else (it's not like you need a degree or to be a very special person to do it; any PC geek can do the same things I do and much more) and even skip the DIY aspect altogether (though I am loath to pay someone else to do anything I can do on my own, and I've had enough bad experiences even with AASPs not to trust, well, anyone but me, save perhaps for Louis Rossmann or someone else with his level of skill, of course; as such, I intend to buy the next 4K iMac revision and give it the Snazzy Labs treatment two years down the line, once the EU-mandated two-year warranty expires, but that would be kind of a one-time thing for each desktop machine I own in the future). ;)

Case in point (because I always believed laptops would inevitably, and sadly, but such is life, turn into a different kind of [monolithic] beast down the road): I also own a Mid-2011 13'' MacBook Pro, which I also want to get rid of (and I still haven't managed to do so… I tried our local equivalent of Craigslist and only got answers from blatant scammers) because it can't officially run Mojave (or unofficially with proper GPU support, at least). I decided to try a dual-SSD RAID 0 config before buying a 2012 machine, and it worked perfectly and was wicked fast for its age and price (I would have to resort to Carbon Copy Cloner just to be able to update the OS, but I wouldn't mind that in the least, as I have enough spare HDDs lying around).

Guess what, I bought the 2012 machine and when trying out the same config on it I found the ODD connector/controller, supposedly of the SATA III 6 Gbps kind, always craps out with whatever SATA III devices you connect to it (I tried everything: SSDs and HDDs, both old and new, to no avail). To this day, I'm still not sure whether this is an issue with the machine itself, with the entire model range, with the firmware, or what; I seriously thought of buying another one to test that config out, but I just can't afford to go around buying old computers, at €600 a piece, that I may then have a hard time flipping on scammer-ridden websites, so I decided to wait until a few trustworthy friends and colleagues are willing to be my guinea pigs (that time will eventually come, and I may indeed resell both my 13'' MacBook Pros and buy one – maybe from one of said friends, even – that is known to be working with SATA III on both bays).

So, anyway, I had to return the SSDs and make do with a regular Fusion Drive made up of a crappy Toshiba SATA II HDD instead, and since it had to be stuck in the ODD bay so I could make use of the SATA III connection in the HDD bay for the SSD, I also had to do away with the screw-in ODD plastic caddy/adapter and replace it with self-adhesive sponge bits and blu-tack, both on the bottom cover and on the underside of the keyboard, for shock absorption. It's one hell of a jury rig, but it works. Also, I've been running Mojave PB on it from an external USB 3 dock, but sometimes it refuses to boot and gives me the forbidden sign (it's just a matter of trying again until it does boot, but… it's the kind of issue I never got before and which does give me pause).

I also shoehorned Mountain Lion into a 2008 MacBook back in the day, and it was fun, and usable, and all, but… having to make sure each and every update was compatible with MLPostFactor and to reapply the kext patches was, to put it mildly, yet another chore. At that point, you might as well cave in and switch to a PC, or install some version of Windows or Linux on it. :p Gone were the days when you could install Leopard on a 800 MHz G4 eMac just by putting it into Target Disk Mode, running the DVD installer from a G5 tower and calling it a day…

To be honest, I'm sick of these constant little snags, and especially of wasting so much time on these projects. They are fun, yes (even when they get seriously frustrating… I must be a masochist or something), but I have much more to contribute to the world elsewhere… Besides, I am savvy enough to deal with T2-equipped machines which you can't recover data from if they die, as I keep on-site, off-site (both encrypted) and cloud-based backups. I've also nearly had my computer stolen once (back then, I only had my dependable – until it died from capacitor rot, that is – iMac G5, and a single external FW400 HDD), when I was still living with my parents (they had a break-in, and I entered the place while the robbers were still in there – I never saw them, as they did a silent escape through the same window from which they broke into the house –, so I guess I saved my computer and my backups in the nick of time), so… the safer my data is, the better.

All things considered and if my budget allows for it, I, for one, welcome our new anorexic, glued- and soldered-on Mac overlords. :p Buuuut… thanks for your suggestion anyway. If I ever hit a financial rut and need a new machine while in it, I may seriously consider that option. Just because it's a bit of a hassle, I'd still rather deal with the known issues and shortcomings of running macOS on PC hardware (such as, say, not being able to use iMessage on it, or having to wait a few weeks to perform updates; it's not like many regular professional Mac users don't do that anyway already) than having to deal with *gasp* Windows on a daily basis. ;) I already had to finish my BFA on a Toshiba running Vista when my G5 iMac crapped out (hey, I was waiting for the rumoured 27'' iMac I am still using to this day to be announced, and the computer was a loaner from my dad's small business, so… while excruciating, it was definitely worth it), and it's not something I want to go through again if I can help it.
 
I think Apple will have a fast x86 emulator to maintain compatibility and still outperform Intel chips. That is, unless Intel finally figures out how to make 10nm and 7nm chips by then.
One has nothing to do with the other.

Not sure how Apple could emulate x86 faster than what Intel chips do natively, either.

Die size means nothing in this case.
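For context on why emulating x86 at native speed is such a tall order, here's a toy contrast (not any real emulator) between interpreting every instruction each time and translating a block once and caching the result, the way dynamic binary translators do:

```python
# Toy "emulator": interpretation pays decode-and-dispatch cost on every
# execution, while dynamic binary translation pays it once per block
# and reuses the cached translation. Illustrative only.
program = [("add", 3), ("add", 4), ("mul", 2)]

def interpret(prog, acc=0):
    for op, val in prog:               # decode + dispatch every time
        if op == "add":
            acc += val
        elif op == "mul":
            acc *= val
    return acc

translation_cache = {}

def translate_and_run(prog, acc=0):
    key = tuple(prog)
    if key not in translation_cache:   # translate the block once...
        ops = {"add": lambda a, v: a + v, "mul": lambda a, v: a * v}
        translation_cache[key] = [(ops[op], val) for op, val in prog]
    for fn, val in translation_cache[key]:  # ...then reuse it
        acc = fn(acc, val)
    return acc

assert interpret(program) == translate_and_run(program) == 14
```

Even with a translation cache, the translated code still carries overhead, so emulated x86 generally only beats native x86 if the host chip is substantially faster to begin with.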
 
I don't think so, given the graphics card fiasco in Mojave. It's the end user's choice to use Windows on 100% of the hardware. It's about choices, not about limitations. "Think outside the box."

If you wish to use that hardware for another purpose, "thinking outside the box" as you choose to call it... that's great. No problem. I take no issue with that. But if Apple decides to go another route, such as deciding to stop using Intel processors... well, then you would need to find another solution. Heck... wouldn't you have to admit that it's Apple, at that point, "thinking outside the box" and not going the same Intel-processor route as everyone else? Can't SOME computer company, at some point in time, decide to use a different processor? Or are we stuck with Intel for the rest of eternity?
 
If you wish to use that hardware for another purpose, "thinking outside the box" as you choose to call it... that's great. No problem. I take no issue with that. But if Apple decides to go another route, such as deciding to stop using Intel processors... well, then you would need to find another solution. Heck... wouldn't you have to admit that it's Apple, at that point, "thinking outside the box" and not going the same Intel-processor route as everyone else? Can't SOME computer company, at some point in time, decide to use a different processor? Or are we stuck with Intel for the rest of eternity?
That is why I think Apple (in the non-Steve Jobs era) doesn't like long-term stability (LTS). If they stop, they lose more developers, except iOS developers. I'd rather stick with Fedora or Ubuntu if not macOS or Windows.

I've never used VIA or AMD processors. I've just used Intel since the 80286 era, and nothing is wrong with it. As for phones, I have Huawei (Kirin), Samsung, some cheap MediaTek, iPhone and iPod. The limitation is yours, not mine.
 
And that person hands you a computer from that time. Would you prefer it to be essentially the same as today's... an Intel processor some 30% faster than today's, macOS, SSD storage, etc... or would you be more excited if it had some completely different architecture, made by some company that might not even exist today?

I'm not sure that apple is that company. Meanwhile, the latest generation of x86 chips looks rather more capable than the last.

People may look at Apple's chips and think, "wow, by the standards of a dubious benchmark, they're really catching up to Intel," but what makes you think they'll surpass x86 on a metric that isn't power-related?
 
Look, Intel has made great strides in recent years lowering the power consumption of their x86-64 CPUs. That will likely keep Apple in the Intel camp.

Besides, getting the ARM architecture to compete against the latest x86-64 CPUs may require a far more complex CPU design that makes the enhanced ARM CPU use effectively as much power as the Intel CPU. Why bother?
 
Look, Intel has made great strides in recent years lowering the power consumption of their x86-64 CPUs. That will likely keep Apple in the Intel camp.

Besides, getting the ARM architecture to compete against the latest x86-64 CPUs may require a far more complex CPU design that makes the enhanced ARM CPU use effectively as much power as the Intel CPU. Why bother?

It's not because Intel is lacking. Apple just really likes control and vertical integration. The article itself states "With its own chips, Apple would not be forced to wait on new Intel chips before being able to release updated Macs, and the company could integrate new features on a faster schedule."
 
... getting the ARM architecture to compete against the latest x86-64 CPUs may require a far more complex CPU design that makes the enhanced ARM CPU use effectively as much power as the Intel CPU. Why bother?

ARM appears to be bothering. MS has real Windows running on ARM, not that silly RT thing. The target is the notebook market, which is not to be taken lightly. If A76 can compete level with i5-U and offer greatly improved battery life, we could start to see some nontrivial ARMward migration across the board. Get a foot into the notebook door and the workstation sector will ultimately not be far behind.

They are banking on big gains from the 7nm process, about which I have my doubts (Intel, for its part, is still stuck on 14nm), but ARM is somewhat structurally different from x86-64, so smaller processes may work better for them.

SoC design is ideal for portables, and Intel is kind of squeezed in SoC development. The diversity in the ARM arena allows for greater flexibility in product design than Intel can manage when Intel is stuck producing large volumes of generic CPUs.

To me, it looks a lot like ARM is in a darn good position here.
 
If you wish to use that hardware for another purpose, "thinking outside the box" as you choose to call it... that's great. No problem. I take no issue with that. But if Apple decides to go another route, such as deciding to stop using Intel processors... well, then you would need to find another solution. Heck... wouldn't you have to admit that it's Apple, at that point, "thinking outside the box" and not going the same Intel-processor route as everyone else? Can't SOME computer company, at some point in time, decide to use a different processor? Or are we stuck with Intel for the rest of eternity?

If ANY computer company would do that, that would surely be Apple. Apple and Microsoft are the only companies which produce both OSes and hardware, but Apple is the only one that oh-so-conveniently has a full-blown in-house chip design unit. Conversely, it is the ONLY chip designing company that actually develops OSes (no, their competition, with their embedded processor+OS combos, doesn't really count, as their wares are extremely specialised solutions and not mass-market, über-versatile stuff like the chips Apple churns out on a yearly basis).

As I said in this thread before, Apple is plugging all the holes on their logic boards, one by one, with their own custom silicon. It's not a matter of if, but of when, and in which order. I'd say the CPU will be the last piece of the puzzle, for obvious reasons, but like everything else it will be a gradual process at first (we're currently in the “gradual” phase, with the T1 and T2 chips) and then a sudden one. Besides some other obvious Intel-related stuff that Apple would do well to implement in-house before a potential falling out with them, such as Thunderbolt [3/4/whatever], it would indeed make sense for Apple to replace the Intel CPU and the Intel Iris integrated graphics in one fell swoop, as they already have their own CPU+GPU solutions, and Metal 2 is an obvious stepping stone in that direction. If you fail to acknowledge that, and the implications of (and motivations behind) the deprecation of OpenGL, you know nothing about Apple's MO, as this is Carbon 64 vs. Cocoa 64 all over again…
 
Well, so it's beginning. I wonder how many people will choose to move on from Apple due to the lack of x86 support. I, for one, will not be buying a Mac that I cannot run Windows on.

Probably zero, as arm64 versions of Windows 10 and Linux exist.
 