I know that Mathematica currently runs on ARM (Raspberry Pi). Please see the picture attached to my original post.

Compare the WolframMark benchmarks.

So is your assumption that Steve and his team can cross-compile for cheap off-the-shelf 32-bit CPUs, but are too stupid or egotistical to target some hypothetical 64-bit Apple A14/A15XYZ before you need to upgrade to a faster Mac?
 
  • Disagree
Reactions: Atlantico
I was a Windows/WinCE/Java developer 10 years ago. When I had an x86 MBP, I had a chance to play with Xcode, and now I am an iOS developer. I think x86 was the opportunity to bring in more developers. I don't think a lot of developers work on iOS development only. In a mixed development environment, such as full-stack development, a Mac (x86) is a good choice.

In my experience, Windows x86 and ARM64 share most of the major APIs, so migration is less painful.

They pretty much have the same API (minus OpenGL and a few other deprecated APIs). So porting a Windows application from x86_64 to ARM64 is just a matter of recompilation; in fact, there is no porting involved as far as the source code is concerned.

And once you have a Windows ARM64 machine, you have Linux ARM64 as well via WSL2. So I do hope that Windows will be supported via Boot Camp.
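A minimal sketch of that claim (my own illustration, not from the posts above): architecture-neutral C recompiles unchanged for either target, and the only visible difference is which predefined compiler macros are set.

```c
/* arch_probe.c - sketch: the same source builds for x86_64 or ARM64 with
 * no changes; only the toolchain/target selected at build time differs.
 *
 *   MSVC:  cl arch_probe.c        (from an x64 or ARM64 developer prompt)
 *   clang: clang arch_probe.c -o arch_probe
 */
#include <stdio.h>

int main(void) {
#if defined(_M_ARM64) || defined(__aarch64__)
    puts("built for ARM64");
#elif defined(_M_X64) || defined(__x86_64__)
    puts("built for x86_64");
#else
    puts("built for some other architecture");
#endif
    /* 8 bytes on both 64-bit targets, so pointer-size assumptions carry over */
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    return 0;
}
```

The code that does need real attention is whatever touches the architecture directly: inline assembly, x86 SIMD intrinsics, and the like.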
 
Last edited:
  • Like
Reactions: 09872738
An end user should never notice any difference when using an ARM Mac except for performance.
Do you mean that an ARM-based Mac, or, as far as I'm concerned, a MacBook Air equipped with an ARM processor, will be less or more powerful than the Intel-equipped ones?
Thanks
 
Last edited:
"Easy" means costs and users experiences. I think it is easier to migrate software from Windows to MacOS.

There are a lot of iOS developers because macOS uses x86 CPUs. Developers from other platforms have a chance to try Xcode and migrate their apps.
There are a lot of iOS developers because iOS is by far the most profitable mobile platform to develop for anywhere. It’s where the money is at.

I don’t see how this has any bearing on CPU architecture. Developers for iOS/iPadOS/watchOS/tvOS/macOS use Macs because Xcode, not because x86.

Now, developers for multiple platforms (most of them) will need to see what options they have to be able to use an ARM Mac for Android/Linux/Windows (x86_64 or ARM) development, and that will depend on the tools and capabilities Apple decides to put into these new machines. I have a feeling that there will be very acceptable emulation/virtualization options from the start.

But, iOS/iPadOS is definitely a first choice for developers, so they are going to need a Mac. Or if Xcode is really coming to iPadOS along with Final Cut Pro and Logic Pro, then we are really going to see some shake-up.

And, I think we need to concentrate on the core issue here. Software already developed for macOS. Not software that hasn’t yet been developed. All software already on macOS will be perfectly fine. Apple will ensure this.

Now, as far as the general ARM movement in consumer devices goes, this has of course already started. As noted, Linux has supported ARM for a long time. I run an rpi4 and a couple of other ARM Linux boxes, and software options for the purpose are not that limited; I can pretty much find what I need. Some stuff is not available...yet. Windows on ARM is mostly a problem because of the nature of Windows. It is precisely because they have a policy of backwards compatibility and legacy support that developers drag their feet and raise their voices in protest, and corporate entities and financial institutions are, in a lot of cases, still running software originally written for Windows 3.1 or 95!!!

Now, you don't see that kind of behavior on macOS because everyone knows that Apple doesn't do backwards compatibility or legacy support. So people move with it because they have to. With Apple going all in on the ARM movement, and telling everyone that this is the future of macOS and Apple, this will actually, IMHO, boost and accelerate the Windows ARM and ARM Linux platforms. Microsoft will most likely have to up their own game as well, and Linux is already ahead in this space, so its user base will most likely become more mainstream.
 
One line in this Verge article leapt out at me:

Another thing I’ve learned is that using a Windows computer with an ARM processor actually requires a higher level of technical expertise, because you need to know what won’t work and why going in.

Basically, 32-bit Windows apps can be emulated in ARM, but more modern 64-bit apps cannot. And short of Googling (or, uh, Binging) around for a decent chunk of time, it’s difficult to know if an app you need will work.

What Windows can teach the Mac about the switch to ARM processors
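For what it's worth, Windows does give a program a way to ask whether it is running under emulation. A rough sketch (my own, and from memory) using the IsWow64Process2 API, which I believe is available on Windows 10 1709 and later and reports both the process architecture and the native machine architecture:

```c
/* emulation_check.c - sketch: ask Windows whether this process is running
 * emulated on an ARM64 PC. Build it as a 32-bit x86 .exe and run it on a
 * Windows-on-ARM machine to see the emulated case.
 */
#include <windows.h>
#include <stdio.h>

int main(void) {
    USHORT processMachine = IMAGE_FILE_MACHINE_UNKNOWN;
    USHORT nativeMachine  = IMAGE_FILE_MACHINE_UNKNOWN;

    /* Reports the architecture this process was built for (UNKNOWN if it is
     * running natively, i.e. not under WOW64) and the machine's real one. */
    if (!IsWow64Process2(GetCurrentProcess(), &processMachine, &nativeMachine)) {
        fprintf(stderr, "IsWow64Process2 failed: %lu\n", GetLastError());
        return 1;
    }

    if (nativeMachine == IMAGE_FILE_MACHINE_ARM64) {
        if (processMachine == IMAGE_FILE_MACHINE_UNKNOWN)
            puts("native ARM64 process on an ARM64 PC");
        else
            puts("emulated process on an ARM64 PC");
    } else {
        puts("not an ARM64 version of Windows");
    }
    return 0;
}
```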
 
@firewood ,

Please carefully read my original post. I made no assumption.


richmlow
Guys, let's not get all hostile for no reason. All @richmlow said in his original post, which is getting hacked up and quoted in little bits out of context, is that he hopes Apple's ARM solution will fare better than his current setup for his niche and specific use case, and if that ends up being true, then he is happy to give them his money next time around.

He may have shown some healthy skepticism in his original post, but he also showed open-mindedness. Performance is his metric, if I understand him correctly, not ISA.
 
Sorry if this question has already been asked, but what does this mean for Intel Macs? If you've just bought a 2020 Mac, will it become obsolete next year?
 
A question I'm not really seeing asked/answered is: what does this mean for GPUs? In the PPC days, Apple had to have special versions made by Nvidia and ATI (now part of AMD), which was kind of a pain and was cited to us as part of why Macs cost more.

Even if Apple can push their ARM chips into performing in the range of an Intel (x86) chip, I'm not sure they are ready to take on the GPU world yet. The GPU Apple bundles with the iPad/iPhone may be fine for basic usage, but a lot of software requires a more substantial graphics solution. Is Apple's GPU tech ready to take on at least Intel's integrated graphics as a bare minimum?
 
An end user should never notice any difference when using an ARM Mac except for performance.

And his bank account lighter by another $5,000 for a computer that was supposed to be a ne-plus-ultra workstation, rendered obsolete less than a year after its premiere.
 
ARM is a shared architecture, developed by a company in Britain with input from many users/manufacturers. x86 CPUs are made by Intel and AMD, with some input from their customers, but the designs are limited by the small number of sources (basically just two).

Using the ARM specification, Apple designs custom SoCs for their devices, which are code-compatible with other ARM processors but tend to be loaded with additional features, like neural network logic (without which Face ID would simply not exist) and dynamic logic arrays (FPGAs) that can selectively accelerate some workloads (and possibly facilitate emulating Intel chips).

So, Apple's custom SoCs may be designed with some unique performance enhancements that you will not find in an Intel processor, and the customization could, theoretically, offer better security than your generic Intel.
Thank you for the detailed explanation. I kept hearing about this but never quite understood why so many people want it.
 
A question I'm not really seeing asked/answered is: what does this mean for GPUs? In the PPC days, Apple had to have special versions made by Nvidia and ATI (now part of AMD), which was kind of a pain and was cited to us as part of why Macs cost more.

Even if Apple can push their ARM chips into performing in the range of an Intel (x86) chip, I'm not sure they are ready to take on the GPU world yet. The GPU Apple bundles with the iPad/iPhone may be fine for basic usage, but a lot of software requires a more substantial graphics solution. Is Apple's GPU tech ready to take on at least Intel's integrated graphics as a bare minimum?
The rumor that started this says Apple is designing its own GPU. Recall that they have a team that did GPUs from scratch after the Imagination divorce, but recently they entered into some sort of new license with Imagination again. I believe it may just be a patent license, and they are still using Apple's own designs. I suspect the reason for the license is that Apple wanted to use something that Imagination invented for its Mac GPU.

Keep in mind that GPUs are much easier to design than CPUs. Unlike CPUs, GPUs can largely be designed with logic synthesis and automated tools.
 
Let’s get real here a bit about what might likely happen.

1. A Series Macs will not be any cheaper than their Intel predecessors. Did Intel Macs get any cheaper after switching from PowerPC? Nope! Heck, we have $1500 iPhones from Apple with A Series.

2. It's up in the air what the likely hardware developer kit will be, but I'm still putting a strong bet on the recently released iPad Pro - it's kinda too obvious, with the keyboard/mouse support and the extra core. But I also believe they could actually provide a modified 16 inch MacBook Pro that developers can order for $1000.

Why a MacBook Pro and not a modified Mac Pro or iMac?
1. Too costly to ship something so big, remember, this company is still kind of a cheapskate
2. Apple is all about mobility, the A Series is really about making mobility the key to the Macs future.
3. An A Series 16 inch MacBook Pro, likely maxed out, should be sufficient to write and compile code. Based on reviews of the Intel Mac Pro, it's not really a great machine for software development, and you actually get more bang for your buck going with an iMac or MacBook Pro.

3. The first retail A Series Macs are likely to be up in the air this time. Last time, in 2006, it was the MacBook Pro and iMac. Looking at consumer trends, I suspect the first candidates will be the MacBook Air, MacBook Pros and iMac, all at the same time.

The Mac Mini and Mac Pro will likely complete the transition by June 2021.

4. I believe Adobe, Microsoft and all the major developers are already on board, and Wolfram or some other big dev will be on stage to show how to get your code ready for the A14 (see the sketch at the end of this post for a rough idea of what that involves).

5. Where does this leave virtualization? There are a couple options:

- If you need Windows, then you have the option of investing in Microsoft Azure or AWS to spin up a VM. It’s kinda the new reality to be honest and I see Apple and Microsoft partnering on this.

- Apple might commission Microsoft to do a port of Windows to A-Series with Office 365, but that is very unlikely at this point: considering the market reach, native Windows with a small batch of apps doesn't make economic sense. So that's why I see Apple recommending cloud providers like AWS and Azure as the best way forward for virtualization.

Yes, there are concerns around bandwidth and all that stuff, but then again, computing is cheap and heavily commoditized; if you need to run that one obscure Windows app, get a cheap Windows PC or notebook to do it. Heck, I have a MacBook Pro, Surface Pro and iPad Pro all on my desk. This is not a 1986 or 2002 one-PC-with-multiple-user-accounts household situation.

6. Support for existing Intel Macs. I hope Apple doesn't go in the same direction as PowerPC, where Tiger and Leopard were the last two supported releases. This one is gonna require them going the extra mile: meaning 5 years of macOS updates. With their resources, I think they can do it. At the end of the day, these are investments. When I look at how long I have been going with my 2015 MBP, and others I know with even older models, they need to support what will be a legacy architecture for years to come.

7. How ready is all of this? I suspect Apple has been running macOS on A Series since 2013, when they launched the iPhone 5s with the 64-bit A7 processor. I remember it was Phil Schiller who described it as having desktop-class performance. That stayed with me for a while, and it's obvious that's when they started serious testing. What further accelerated it was 2015, with the disastrous roadmap from Intel that was not delivering as promised.

8. What do consumers who are going to buy these machines get out of it if it's not even gonna be any cheaper?
- Obviously lighter, thinner designs, longer battery life, and more security due to the further locked-down nature of the platform. I also think the first A-Series MacBooks will offer 5G modems as a built-to-order option. Again, it's all about mobility and the changing landscape of work. It's also access to a wider variety of apps on the Mac. I love using the YouTube app on my iPad vs. the web.

9. What about all my existing software and hardware utilities? Good question, but I believe I'm in a niche group here, still holding onto that last perpetual copy of Adobe CS6 for dear life. Given how pretty much every major software vendor has switched to some kind of subscription model, that will lessen the blow of transitioning. Use Office 365 or Adobe CC or AutoDesk or even QuarkXPress: just download and install it on your new A-Series Mac. No thinking about buying boxed software or upgrading to the new version for optimizations. This ain't 2007.

Peripherals might be trickier, but I look at it this way: if you are AirPrinting from your iPad today, you will likely be doing that from your A-Series Mac too. But for peripherals that you need to hook up for particular functionality, at the end of the day you are gonna need drivers. For a company the size of Apple, I hope they have at least been actively researching and engaging with third-party hardware vendors at a high level, and the year ahead will give hardware developers a little time to do so. But I don't think it will be a deal breaker. Just don't expect any drivers for your HP 840c printer.

So, internally, everything at Apple is ready for this, the key developers are ready for this, and of course some lesser-known ones are likely going to be informed and shipped hardware kits under extremely strict NDA to test their code.

On top of all of this, we likely won’t see the first A-series Mac until at least March of 2021. So, this will give developers some time to start the work to transition.

Takeaways: it will be a faster transition and a more open one. I am sure that internally they have weighed the pros and cons, and developer transparency is one thing I think they realize needs to be at the top of the list this year in order to make this really work.
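As a rough illustration of what "getting your code ready" for an A-Series/arm64 Mac tends to mean in practice (my own sketch, not anything Apple or Wolfram has shown): most plain C/C++/Swift just recompiles, and the work concentrates in architecture-specific corners such as x86 SIMD intrinsics, which need an ARM equivalent or a portable fallback.

```c
/* add4.c - sketch: keep an SSE fast path for x86_64 builds, add a NEON
 * path for arm64 builds, and fall back to plain C everywhere else.
 */
#include <stdio.h>

#if defined(__x86_64__)
  #include <immintrin.h>   /* x86 SSE intrinsics */
#elif defined(__arm64__) || defined(__aarch64__)
  #include <arm_neon.h>    /* ARM NEON intrinsics */
#endif

/* Add two arrays of four floats element-wise. */
static void add4(const float *a, const float *b, float *out) {
#if defined(__x86_64__)
    _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
#elif defined(__arm64__) || defined(__aarch64__)
    vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
#else
    for (int i = 0; i < 4; ++i)   /* portable fallback */
        out[i] = a[i] + b[i];
#endif
}

int main(void) {
    const float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40};
    float out[4];
    add4(a, b, out);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```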
 
The rumor that started this says Apple is designing its own GPU. Recall that they have a team that did GPUs from scratch after the Imagination divorce, but recently they entered into some sort of new license with Imagination again. I believe it may just be a patent license, and they are still using Apple's own designs. I suspect the reason for the license is that Apple wanted to use something that Imagination invented for its Mac GPU.

Keep in mind that GPUs are much easier to design than CPUs. Unlike CPUs, GPUs can largely be designed with logic synthesis and automated tools.
Not disagreeing in general, but I think you'll agree that designing best-in-class chips (whatever the class) is a challenge. There is a reason why we have only two major GPU vendors.
 
  • Like
Reactions: Atlantico
Sorry if this question has already been asked, but what does this mean for Intel Macs? If you've just bought a 2020 Mac, will it become obsolete next year?
I think people have this all confused. I think this is going to be a much, much slower transition than PowerPC to Intel was.
I don't think Apple is completely done with Intel processors. Hell, there are rumors going around that a new iMac with a new design, new Intel processors and AMD graphics is supposed to come out later this summer. I also wouldn't be surprised to see an update to the 16 inch MacBook Pro with 10th-gen Intel processors.
Then, starting early next year, we'll get an Apple ARM-powered MacBook. Then maybe later in the year, we'll get an ARM-powered Mac Mini. Then slowly, over the next 2 to 3 years, the transition will expand to the bigger products.
But Intel won’t completely disappear.
People are forgetting that over the last seven years, Apple has been supporting devices for longer and longer. Between 2009 and 2012, Apple went on a huge Mac and iOS device killing spree.
Snow Leopard cut off all PowerPC Macs, despite the fact that some were not even four years old yet.
iOS 4 cut off iPhone 2G and first generation iPod touch support.
Then, not even 9 months later, iOS 4.3 cut off the iPhone 3G and the second gen iPod touch.
Lion cut off all early 2006 MacBooks, Mac Minis and iMacs.
Then in 2012, Mountain Lion cut off several 2007 machines. iOS 6 cut off the first iPad, even though it had literally just turned two years old.
However, starting in 2013, they slowed the pace of dropped support dramatically. Mountain Lion, Mavericks, Yosemite, and El Capitan retained the same support list, and the latest release, Catalina, only cut off support for one machine from 2010.
So I'd bet money that these 2020 Intel Macs will still be supported with the latest software for at least the next 5 to 7 years.
 
If this is true, I've bought my last Mac.

Apple has broken backwards compatibility for the last time for me. I'm already holding at 10.14 on virtually all my machines because I can't give up 32-bit compatibility, and THERE IS NO REASON TO GIVE IT UP. 10.15 is a downgrade in every way.
As far as that goes, everything since 10.6 has had some sort of major downgrade in it. Not a day goes by where I don't at least once want my scroll arrows back.
MicroSloth writes sh*tty operating systems, but at least they know how stupid it is to not maintain backwards compatibility; software that's decades old will still run on Windoze 10. Apple used to be able to do that (the //e was in production for 10 years), but they've forgotten that "ooo shiny" isn't nearly as good as "it just works".
 
firewood,


Please carefully read my original post. I made no assumption.


richmlow

Please run that benchmark on a Pi 4 with a reasonable amount of RAM.
My 8GB Pi 4 isn't coming until next month, but I will be happy to test it when it arrives. If you could do it now, then we can have a result sooner.

You did make an assumption -- your assumption is that an ARM11 Pi 1 with 256MB of RAM (or 512MB with the Model B) is a good representation of today's mobile CPU performance.

I guess a Xeon Platinum with 256MB of RAM would run as slowly as that Pi in most situations.
 
Hi All,


Assuming that the ARM rumors come to fruition in the near future, it will certainly affect different users differently. This, of course, depends on what a person typically does with his/her computer. If a person's workflow is already quite smooth on an iPad, then I don't think the ARM transition of Apple laptops/desktops will be a big deal. However, if a person cannot "get stuff done" solely on an iPad, then the ARM transition will indeed be disruptive (until all of their major software is ported over to ARM). Of course, there will be some software which will never be ported over.

For myself, I use Mathematica extensively in my workflow. As it stands now, I would not be able to use Mathematica on an ARM platform. See picture below.


richmlow

Please run that benchmark on a Pi 4 with a reasonable amount of RAM.
My 8GB Pi 4 isn't coming until next month, but I will be happy to test it when it arrives. If you could do it now, then we can have a result sooner.

You did make an assumption -- your assumption is that an ARM11 Pi 1 with 256MB of RAM (or 512MB with the Model B) is a good representation of today's mobile CPU performance.

I guess a Xeon Platinum with 256MB of RAM would run as slowly as that Pi in most situations.

Richmlow is right to show healthy skepticism. But I agree the picture he posted isn't meaningful, since it references an ARM chip without specifying which one. I did some Googling, and as best I can tell, that processor was used in the early Raspberry Pis, specifically the A, B, and B+.

Much more meaningful, as suggested by MikeZTM, would be to perform the Mathematica (MMA) benchmark on both a known Intel processor and the current Raspberry Pi 4+. According to Wolfram's own figures with MMA 12.0, the Raspberry Pi 4+ takes 77.8 seconds to complete the benchmark (they didn't give the score, just the individual times, which I summed). By comparison, my Mid-2014 BTO MBP with a 2.8 GHz i7 (4980HQ Haswell/Crystalwell), also running MMA 12.0, takes 5.50 seconds*, which is 77.8/5.50 = 14.1x faster.

So, to equal the MMA benchmark performance of a 2014 MBP, Apple's "upcoming ARM Mac" would need to be 14 times faster (on the MMA benchmark) than the ARM-based Raspberry Pi 4+.

*This corresponds to a score of 2.52.

N.B.: Richmlow's benchmark was run with MMA 12.1. On 12.1, my MBP scores 2.91 (apparently MMA 12.1 is better optimized than 12.0), so it's somewhat faster than Richmlow's older Mac Pro, which scores 2.60. However, since Wolfram benchmarked their Raspberry Pi 4+ with MMA 12.0, I also used 12.0 in my comparison.
 
Last edited: