This would be the end of Mac as we know it... or even Apple.

The transition from PPC to x86 was difficult, but it also made the Mac compatible with what the rest of the world was doing (i.e., the Intel/Windows monopoly). Because it made the Mac easier to work with and let people realize the potential of using a Mac without having to make many compromises, Apple had the full support of software devs.

Google, with much less burden on its shoulders, can't really seem to crack Windows' dominance. I can't see how Apple would be able to pull it off.

And quite honestly, I have the latest iPad Pro... and it's so far from replacing my Mac that I don't know where to begin.

I'd just put it this way... some things are meant to stay the way they are. The way I feel is that Cook & Co are betting that by merging both OSes, they can get devs to develop for one thing and make it work on the other relatively easily.

Nevertheless, they have to check first where the market is. Unless Microsoft is also calling it quits on the x86 architecture, going without the full support of devs will easily create a scenario that I can already envision:
The mobile version (UI-wise) will be too underpowered, while the desktop version will be too dumbed down due to the common denominator effect.

Devs will either create a core targeting desktop users (plenty of power), with the mobile version severely slowed down, or a core targeting mobile devices, with a desktop version that's overly simplified.

The end of Apple?

Even if apple stopped making macs altogether it would hardly be a blip.
 
I'd just put it this way... some things are meant to stay the way they are. The way I feel is that Cook & Co are betting that by merging both OSes
Well, I think that Cook has addressed that point quite nicely, with a resounding "Never".
For this to be acceptable, Microsoft would have to offer Windows 10 ARM retail and only support pure .NET even on x86. Oracle would have to provide ARM Java.

I thought there was a downloadable Windows 10 for ARM64 from MS, but I couldn't find it.

As for .NET, MS seems to have a really spasmodic "policy".


However, it seems things are rosier for the JDK:

JDK 8 for ARM via Oracle:

https://www.oracle.com/technetwork/java/javase/downloads/jdk8-arm-downloads-2187472.html
 
This sounds like an unmitigated disaster. There is a ton of software that will never be ported to ARM. Will it also use a browser that can't run JavaScript, or just not have a browser? This rumor has me questioning whether I am wasting my time reading this blog, and I just created an account to post here. As others have stated, why would a developer purchase a Mac if this rumor turns out to be true? To develop for Apple products on an almost completely closed system? This would likely kill macOS for all other professional use, including mine. If the rumor is accurate, it will be a death spiral for macOS users who depend on any third-party tools, particularly those that modify functionality in any way.

You can count on Apple using this opportunity to require all macOS apps to go through their App Store.
 
Next you'll say that there are no FM chips in iPhones! HAHA! :D
There haven't been for quite some time now. Ajit Pai is a liar.
It's not about whether macOS is better or not. Sure, A-chips could run Windows software, but none of it is really available for ARM.
You're wrong about that.

Due to the WoW64 layer in Windows 10 on ARM64, no "ARM port" of the x86 code is needed. It is generated "on the fly":

https://docs.microsoft.com/en-us/windows/uwp/porting/apps-on-arm-x86-emulation

They CALL it "Emulation", but it is more correctly called "Just-In-Time Cross-Compiling". The code that actually RUNS is ARM64 NATIVE, NOT EMULATED.
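The "just-in-time cross-compiling" idea above can be illustrated with a toy sketch. To be clear, this is purely illustrative: the fake "instruction sets", the `translate_block` helper, and the block cache here are all invented for the example and have nothing to do with Microsoft's actual WOW64 internals. The point it demonstrates is that each block of foreign code is translated to native code the first time it runs, and after that the cached native translation is reused, so the steady-state cost is native execution, not interpretation.

```python
# Toy illustration of JIT binary translation with a block cache.
# The "instructions" are invented stand-ins, not real x86 or ARM opcodes.

translation_cache = {}

def translate_block(block_id, foreign_ops):
    """Translate a block of foreign ops to 'native' ops, caching the result."""
    if block_id not in translation_cache:
        # Expensive step, done only once per block ("compile on first use").
        translation_cache[block_id] = [op.upper() for op in foreign_ops]
    return translation_cache[block_id]

def run(block_id, foreign_ops):
    native_ops = translate_block(block_id, foreign_ops)
    return native_ops  # in a real system, these would execute natively

first = run("blk1", ["add", "mov", "ret"])
second = run("blk1", ["add", "mov", "ret"])  # served from the cache
print(first)            # ['ADD', 'MOV', 'RET']
print(second is first)  # True: the cached translation object is reused
```

The cache is what separates this scheme from pure instruction-by-instruction emulation: translation cost is paid once per block, not once per execution.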
Unfortunately I believe they’ll completely lock down macOS
Based on what, other than rank paranoia?
Perhaps rather than starting with MacBooks and iMacs as in the Intel transition, they're starting with the Mac Pro.
I think it will be "low-end to high-end" Transition, not the other way around.
That's honestly my greater concern also...

The #1 thing I love about my Mac over my iOS devices is the immense flexibility in what I'm able to do with it, in a way I want to do it, etc
And why, oh, why would Apple ever mess with that flexibility?

An iPhone is not a Mac, and never will be, and iOS is not macOS, and never will be.
 
macOS benefits *most* from an SSD. If you have an HDD or even a fusion drive, your Mac will feel sluggish, even with a great CPU and lots of RAM.

This was me. My old iMac was a pain in the arse to use. Then I put an SSD in... new machine. Night and day. Then a month later Apple announced that my iMac can't run the new OS. Booooo :(
 
Nope, Windows for ARM is gimped and does not run "normal" Windows apps. Windows apps, too, need to be rewritten for the ARM architecture.
Wrong.

You are thinking of Windows RT or 10 S. Windows 10 on ARM is NOT THAT.

Read THEN Post:

https://www.pcmag.com/news/353637/windows-10-on-arm-runs-all-win32-apps-unmodified

You (as a Software Dev.) certainly CAN recompile/rewrite Windows Applications for ARM, and can develop Native ARM Windows Applications; but, thanks to their JIT-Compiler-Based "Emulation", you don't HAVE to!

See:

https://docs.microsoft.com/en-us/windows/arm/
I like this, but I know it will be a mess for a few years. So it's probably not a bad idea to buy something now and wait for them to bridge it. The 2019 products will probably be the best for the next few years :)
Why will it be a mess for a few years? Because the past TWO Architecture changes were?

Oh, wait...
before Mr. Jobs had the good sense to bring Apple into the REAL world... putting the Mac on the Intel platform being one sensible [should have been a no-brainer] long overdue change.
And of course REAL COMPUTERS ONLY RUN INTEL, right?

Please.
Which Apple would LOVE to do I'm sure...
Wrong.

Hackintoshes remain a rounding error as far as percentage goes. Apple couldn't care less, so long as they remain a rounding error.
 
The market has changed so much since the PPC to Intel transition.

Serious question: what things have changed that have made system architecture less relevant?

I didn't pay much attention to this and then sort of "woke up" a couple years ago and realized that absolutely everything I use daily is either built-in to every OS or is part of a subscription I pay for that allows me to download the software on whatever computer I use. I am accustomed to using Mac OS and like it. It would slow me down with my work flow for a few weeks if I switched to Windows. But otherwise it would not matter.

Same goes for Android vs. iOS, except for all the digital media purchases... and even those are becoming less important now as the music subscription model fixed half of that for me.

To a large extent, the only thing that matters is a diverse app ecosystem. Apple will instantly enjoy that with this transition, since the software packages that people really need will be ported over almost instantly, as far as 99% of the population is concerned.

I could be wrong. But I think the 1-2% of specialty users who are more likely to come to sites like this one are not very indicative of the overall market. There will be a 12-18 month period of minor pain. After that, it'll be fine and no one will talk about it.
 
This sentence should read: “This transition will greatly increase the number of Mac apps available, and it will greatly cut down on the overall quality of Mac apps.”
Why would it have to?

Because YOU think that ARM == iOS?
No. Because I think iOS does not equal macOS.

Read again the sentence from the original article to which I was referring. It came immediately after this paragraph:
Apple's transition to a single app for all devices has already begun. Last year, Apple ported several of its iOS apps, such as Voice Memos, Stocks, and Home, to macOS. This year, Apple plans to let developers transition iPad apps to macOS, and in 2020, that will include iPhone apps. In 2021, then, developers will be able to make just one app that users can download on any of Apple's platforms.
This is the transition to which I was referring, not the rumored transition from Intel to Apple CPUs.

KazKam already explained the point I was making perfectly clearly:
Processing power isn't the problem with this... it's app functionality and UI scalability.

Write once, compile for multiple platforms is a cool idea... but, and it's a BIG but (and I cannot lie), the Mac and even the iPhone and iPad are radically different devices when it comes to scale and user interface/interaction.

It's already difficult for developers to conceive of or create apps just in terms of two form factors on the same OS with the same input options, let alone desktop computing where the screens are huge, the mouse gives precise control, and menus allow access to powerful hidden features and complexity.

There are very few apps I can think of that could succeed as a fit for all three device sizes/uses. Also, the complexity of coding exceptions for the different devices, use-cases, and features is so prohibitive you might as well stick with separate code bases and SDKs.
 
What in the HELL are you blathering about?!?
I assume what he's trying to say is that Apple is not allowed to implement chips using the x86-64 instruction set. That would be true unless they used a fab that had a license. IBM used to have one, for example, so maybe GlobalFoundries inherited it from them or from AMD - no idea. Or they would have to get Intel to license its contributions and AMD to license the AMD64 extensions. There used to be some other licensees floating around, and other clones (Cyrix, Rise, National, etc.); some had licenses, some used licensed fabs.
 
That more or less eases my mind: they will probably stick with Intel for the more "serious" workhorses and support both architectures. (I hope.)
I think they’ll start with the little MacBook and work their way up.
I’m totally fine with the transition to Arm chips but as soon as they’ll decide to lock down macOS like they do with iOS, I’m out. I want to have the freedom to install 3rd party apps, to switch back to a previous MacOS when there are problems with the latest version or to install another OS (e.g. an arm based version of Linux), so when Apple drops support for the device I still can use it for lighter tasks. Unfortunately I believe they’ll completely lock down macOS, which might be fine for iOS devices, but is a complete no-go for a PC imo...
I agree—I do not want to be locked into an App Store in a Mac, the way Windows is doing with its Arm Windows thingy.
 
Here we go again, back to the endless incompatibilities of the '90s, where everything always worked perfectly on the Windows machines and nothing but incompatible this-and-that on the Macs.
I lost count of how often I couldn't open, play, see, or hear email attachments that Windows users opened up just fine.
 
Apple had better be really careful with this one. I transitioned to Mac from Windows in part because I could virtualize Windows and run all of my old programs while I found Mac-based replacements. If corporate software gave me issues, I could always virtualize Windows and run things there. Once I had settled on the Mac, I went all-in with Apple to get the full benefit of the "Apple ecosystem" - iPhone, iPad, Airport router, Watch, Apple TV...

But at the end of the day, my computer is still #1 among all of those devices in its importance. If Apple goes full ARM and it's a messy transition that doesn't offer any major benefits, I'll be forced back to Windows. And once that's done, the hold that Apple has on me with all of their other products will be at risk, because my most important computing device will no longer be in their ecosystem.

I hope it doesn't happen, or that we're somehow pleasantly surprised.
 
And how big will that logic board get at that rate? Better yet, how big a heatsink will it need? All speculation based on...? I don't really care, but if Apple proves me wrong, then I'll explore spending the money, not you. Everyone's got an opinion, and yours is not mine. I chose the lesser of my processors to compare, but if you'd like, we can compare to my Xeons.

BTW, 8 x $30 = $240, and it would cover an entire logic board, mostly. Nice use of Apple pricing on one and retail pricing on the other.

I only used the info I could find readily on Google. I made a point of noting that the Intel pricing was retail because I have literally no idea what Apple is paying them and wanted to make that clear, so as not to obfuscate.
As to your other queries: I don't think the A12 in the iPad Pro has an inordinately sized heatsink, given the slim/sealed nature of the device, yet the latest I've heard is that it runs faster than 90% of all notebook computers sold in the last two years. As such, I scarcely think 8 of them would be a necessity. It seems that 2 would come in (with 5nm die shrinkage) the same size as or smaller than an i7 and handily best it in raw power (is there any Xeon that benches twice as fast as an A12?). Further, I think that even not knowing the pricing Intel gives Apple, we could assume it would be greater than $60, which would be roughly Apple's cost for a dual-A13 setup in a 2020 MacBook.
Benchmarks are great, but if you don't have software to run on said system, it's a fancy calculator.

Plus, I absolutely guarantee these benchmarks would be hit by thermal throttling. How do you cool an iPad running some processor-intensive application without throttling?

Who’s talking about an iPad???
The article is talking about 2020 Macs running Apple’s A-Series chips...
Yes, we all know that they’d be thermally throttled in an iPad, but given the vastly larger amount of space in a MacBook, why would they also be thermally throttled there?
 
As to your other queries: I don't think the A12 in the iPad Pro has an inordinately sized heatsink, given the slim/sealed nature of the device, yet the latest I've heard is that it runs faster than 90% of all notebook computers sold in the last two years.


Yes, we all know that they’d be thermally throttled in an iPad, but given the vastly larger amount of space in a MacBook, why would they also be thermally throttled there?

That is Geekbench at an equivalent standard on iOS vs. macOS; that is not real-world performance or user experience, and it only measures peak, not sustained, performance.

The laws of physics will always thermally throttle any processor; it's more important that the machine doesn't become thermally saturated, so the performance is sustained.
 
You can count on Apple using this opportunity to require all MacOS apps go through their app store.

Nah. The first releases are likely to be developer Macs. Which means that any Mac developer can write and load any app they can build from code onto their ARM computer. (Same as with iOS devices currently.)

Plus Unix/server terminal command-line application development.
 
This Gary Explains video suggests a probable reason for Apple to want ARM-based Macs: it would make developing apps for iOS and iOS cloud services easier.

 
You want me to describe what a TDP of a chipset is? Quit trolling.
You said all chips, by the laws of physics, must throttle. That is certainly not true. It is only true when the ability of the cooling solution to remove heat is less than the maximum heat flux generated by the chip running at its maximum speed. In other words, in SOME systems the maximum clock rate is determined by cooling, but certainly not in all systems.

In many chips there is a critical delay path that limits clock frequency to something below the cooling ability of the system. In other words, the chip inherently has a maximum frequency F, and power dissipation is F times the square of the voltage times the effective capacitance switched each cycle times some constant. If the power dissipated by the chip can be removed by the cooling system (which is often the case, particularly in RISC workstation-class systems, and was always the case in the old days before throttling was a thing), then the CPU never needs to throttle.

It's actually not that hard to cool things if you have enough volume: 10 W/cm^2 of heatsink was easily doable with a light breeze even back in the 1990s.

You make these grand pronouncements about what physics demands and that all chips behave a certain way, but your physics is simply wrong.
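The power relation described above (dynamic power proportional to switched capacitance times voltage squared times frequency) can be checked with a quick sketch. All the numbers here are invented for illustration and are not measurements of any real chip; the point is only that when the computed dissipation is below what the cooler can remove, there is no physical need to throttle.

```python
def dynamic_power(c_eff_farads, voltage, freq_hz, activity=1.0):
    """Classic CMOS dynamic power estimate: P = a * C * V^2 * f."""
    return activity * c_eff_farads * voltage**2 * freq_hz

# Illustrative numbers only: 1 nF effective switched capacitance,
# 1.0 V supply, 3 GHz clock.
p = dynamic_power(1e-9, 1.0, 3e9)
print(f"{p:.1f} W")  # 3.0 W

# If a heatsink can remove, say, 10 W/cm^2 over 5 cm^2 (50 W total),
# this hypothetical chip never needs to throttle at full clock.
heatsink_capacity_w = 10 * 5
print(p <= heatsink_capacity_w)  # True
```

The voltage-squared term is also why small supply-voltage reductions buy disproportionate power savings, which is part of how mobile-class chips stay within slim enclosures.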
 
That is Geekbench at an equivalent standard on iOS vs. macOS; that is not real-world performance or user experience, and it only measures peak, not sustained, performance.

Yup. Many real-world usage situations do not saturate processor functional units nearly as much as benchmarks do, which allows the machine to feel even faster to the user under that real-world usage than the benchmarks indicate.
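The peak-versus-sustained distinction can be sketched with a toy thermal model (all constants invented for illustration): a fully loaded benchmark run generates more heat than the cooler can remove and forces a lower steady-state clock, while a lighter real-world load never hits the thermal limit at all.

```python
def sustained_clock(load, cooling_w=10.0, watts_per_ghz=4.0, max_ghz=3.0):
    """Toy model: steady-state clock is capped by what the cooler can remove.

    load: fraction of cycles doing work (1.0 = benchmark, <1.0 = typical use).
    Steady state requires heat generated (load * watts_per_ghz * clock)
    to stay at or below cooling_w.
    """
    thermal_cap_ghz = cooling_w / (load * watts_per_ghz)
    return min(max_ghz, thermal_cap_ghz)

print(sustained_clock(1.0))  # benchmark load: 2.5 (GHz, throttled)
print(sustained_clock(0.5))  # typical load: 3.0 (GHz, never throttles)
```

A bigger enclosure raises `cooling_w` in this toy model, which is the earlier point about the same chip throttling in an iPad but possibly never needing to in a MacBook-sized chassis.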
 
You said all chips, by the laws of physics, must throttle....It’s actually not that hard to cool things if you have enough volume - 10W/cm^2 of heatsink was easily doable with a light breeze even back in the 1990’s.

You make these grand pronouncements about what physics demands and that all chips behave a certain way, but your physics is simply wrong.

And yet here we are in agreement: heat must be removed at an effective rate or even the most powerful processors available will falter. You just seem to take the attitude of puking your degrees all over the thread, like it matters which processors we are talking about. It doesn't. I hope they cool these new ARMs well so it's not a repeat of the 6,1. Better?
 
And yet here we are in agreement: heat must be removed at an effective rate or even the most powerful processors available will falter. You just seem to take the attitude of puking your degrees all over the thread, like it matters which processors we are talking about. It doesn't. I hope they cool these new ARMs well so it's not a repeat of history.

When you’re wrong you’re wrong. And the fundamental point is an A-series processor may very well NEVER have to throttle when placed in an enclosure as big as a MacBook or iMac or the like. If you don’t want to be corrected, don’t say one thing, get it wrong, and then pretend you said something completely different.
 
The laws of physics will always thermally throttle any processor

Not true by example.

The 6502 processor in the Apple II did not throttle.

Even the 150-kilowatt Cray-1 did not throttle (although it was capable of an emergency shutdown if a leak in the liquid Freon cooling system was detected). All you need is enough cooling to meet specified performance.
 
Not true by example.

The 6502 processor in the Apple II did not throttle.

Even the 150-kilowatt Cray-1 did not throttle (although it was capable of an emergency shutdown if a leak in the liquid Freon cooling system was detected). All you need is enough cooling to meet specified performance.

Offhand, I can't recall any of the 8-bit or 16-bit systems throttling. Early 32-bit systems didn't either. Many industrial 32-bit systems still don't.
 