
ozdy

macrumors newbie
Original poster
Oct 5, 2022
I love all the possible benefits of ARM architecture in Macs as they are excellent computers for general use but...

For a lot of professional software, such as CAD/CAM (CATIA, SolidWorks), electrical engineering (Altium Designer, Proteus), and CFD analysis (Ansys, etc.), Windows dominates, and in scientific computing it's Linux. Since backwards compatibility is not the strong suit of macOS, I understand the reluctance of companies to produce Mac versions of their products. In the era of Intel Macs, Boot Camp was the workaround; currently virtualization is the only option, and it is reported to be buggy. Even for a computer company like Apple, Windows is the sole option during the chip design and production process (analysis, machining, design, factory production). Despite year-by-year growth in Mac sales, it is a shame that such a high-quality product has not gained broader ground the way the iPhone has against Android phones.

1) Is there a possibility that ARM becomes the dominant architecture, and that we will finally be able to use native macOS versions of professional software, or run them via Windows or Linux through Boot Camp? I mean not only in graphic design or music production, but also in broader areas like mechanical engineering, factories, science, robotics, or proprietary software for specific fields. Will the Apple Silicon Mac Pro encourage developers to do so?

OR

2) Will most products move to the cloud, finally making the computer or chip architecture irrelevant?
 
Well, I believe that once Apple finally brings the new Mac Pro to Apple Silicon, Apple's apps will go from Universal to pure Silicon builds and we will see the end of Intel-only software! It will be just like when we went from PPC chips to Intel apps: many old programs never kept up and died on the vine!
 
1) For Boot Camp: Asahi has made good progress on Linux, but they still have a ways to go before it's production-ready. For Windows, it sounds like it's not a technical limitation but a licensing issue with Qualcomm.

2) More software is moving to the Cloud, but there are still valid use cases for running apps locally.

As for virtualization being buggy, I haven't experienced that using my M1 Max and Parallels with Win11. If anything, my VM has been much more stable and fast than my previous dedicated Windows laptops. I'm sure it varies depending on the software being run (I mostly use my VM for coding and some light gaming), though.

While I would love to see more dedicated macOS-targeted apps, companies tend to move at a glacial pace, so I don't expect the landscape to change much in the next 5 years or so.
 
1) Is there a possibility that ARM becomes the dominant architecture, and that we will finally be able to use native macOS versions of professional software, or run them via Windows or Linux through Boot Camp?
If that software becomes available on Windows or Linux for ARM, then you may be able to run it on AS using Parallels or other virtualization tools. Apple has given no indication they're bringing Boot Camp to AS.

But as far as having native AS versions of software that was previously Windows-only, I don't see why that would happen. After all, for many years both PCs and Macs used x86, and vendors still didn't port that software from Windows to macOS. So why would they do it if both PCs and Macs ran ARM? I think your only chance would be if the Mac user base became much larger than it currently is (which is possible).
 
I suspect that, just because of where computing is going, quite a few programs will be written specifically for ARM in the future, including Windows programs, one would assume and in my case hope. Especially given, for me, the piss-poor game selection on the Mac. I'm not asking for a lot; I just want at least what worked with Macs before they required all applications to be 64-bit. I don't think that's really much to ask.
 
I suspect that, just because of where computing is going, quite a few programs will be written specifically for ARM in the future, including Windows programs, one would assume and in my case hope. Especially given, for me, the piss-poor game selection on the Mac. I'm not asking for a lot; I just want at least what worked with Macs before they required all applications to be 64-bit. I don't think that's really much to ask.

It would be on the individual developers to transition their apps to 64-bit...
 
Is there a possibility that ARM becomes the dominant architecture...
Apps aren't written to CPU architectures any more. They are written to run on specific operating systems. That's why you'll find apps for macOS, Windows, and Linux. That won't change even if all three were running on the ARM CPU architecture.
 
Ultimately 2).
Give it 25 years at the very most, and even the OS's will be cloud-based. Computers running OS's natively on a built-in storage device will be a thing of the past. I'm unsure yet how much AR/VR will change the way we interact with future computers, or how fast, but that will impact it too.
 
Apps aren't written to CPU architectures any more. They are written to run on specific operating systems. That's why you'll find apps for macOS, Windows, and Linux. That won't change even if all three were running on the ARM CPU architecture.
Not entirely true.

There's plenty of software that requires instructions unique to specific architectures.

This is why, on Windows on ARM for example, not all software runs: it can't be emulated if there is no equivalent ARM instruction.
 
Ultimately 2).
Give it 25 years at the very most, and even the OS's will be cloud-based. Computers running OS's natively on a built-in storage device will be a thing of the past. I'm unsure yet how much AR/VR will change the way we interact with future computers, or how fast, but that will impact it too.
Not necessarily. It depends on performance and cost. Right now an AWS EC2 Mac2 instance, which gives you a 16 GB Mini, is $474 per month. That works for a large organization with highly variable compute demands, but is a ridiculously bad deal for a consumer, who can buy a 16 GB/1TB Mini for $1300. Who knows if that will change substantially in the future. Yes, cost per GB transmitted will come down, but the bandwidth of programs will go up, perhaps cancelling that out.

Having a laptop that you can take with you anywhere without having to worry about delays because of bad internet is worth a lot too. Sure, internet bandwidth will go up enormously, but so will the bandwidth demands of programs.

And for home use, what's the bandwidth needed to run three 8K@360Hz or 16K@360Hz displays (or whatever the standard is in 25 years) without compression?

Plus, if you're running multiple high-resolution monitors, isn't your local device going to need a fairly beefy GPU anyway?
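A quick back-of-the-envelope sketch of both points. The cloud figures are the ones quoted above; the display assumptions (8K = 7680x4320, 24-bit colour, uncompressed) are illustrative only, not a claim about future standards:

```python
# Back-of-the-envelope numbers. Cloud figures are from the post above;
# display assumptions (8K = 7680x4320, 24-bit colour, uncompressed)
# are illustrative only.

EC2_MAC2_MONTHLY = 474   # USD/month, AWS EC2 Mac2 (16 GB Mini) instance
MINI_PRICE = 1300        # USD, 16 GB / 1 TB Mac mini

breakeven_months = MINI_PRICE / EC2_MAC2_MONTHLY
print(f"Renting costs more than buying after ~{breakeven_months:.1f} months")

WIDTH, HEIGHT = 7680, 4320   # 8K resolution
BYTES_PER_PIXEL = 3          # 24-bit colour
REFRESH_HZ = 360
DISPLAYS = 3

bits_per_sec = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * REFRESH_HZ * DISPLAYS
print(f"Three uncompressed 8K@360Hz streams: ~{bits_per_sec / 1e9:.0f} Gbit/s")
```

That works out to roughly 860 Gbit/s for the displays alone, i.e. close to a terabit per second before you've opened a single app.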
 
Not entirely true.

There's plenty of software that requires instructions unique to specific architectures.

This is why, on Windows on ARM for example, not all software runs: it can't be emulated if there is no equivalent ARM instruction.
Apps are written in high-level languages. Very few user-facing applications are written in machine-specific code. The reason some software won't run on Windows for ARM is that Microsoft has not yet produced a translation layer that handles all Intel instructions.
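To illustrate that split: the high-level code below runs unchanged on any CPU, and only the choice of a native binary is architecture-specific. The library names are hypothetical, made up for this sketch:

```python
# High-level code is architecture-independent; only the choice of a native
# binary depends on the CPU. The library names below are hypothetical.
import platform

arch = platform.machine()  # e.g. 'arm64' on Apple Silicon, 'x86_64' on Intel
if arch in ("arm64", "aarch64"):
    native_lib = "libfast_arm64.dylib"    # hypothetical ARM build
elif arch in ("x86_64", "AMD64"):
    native_lib = "libfast_x86_64.dylib"   # hypothetical Intel build
else:
    native_lib = None                     # fall back to a pure-Python path
print(arch, native_lib)
```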
 
I think it's too early to predict how things will develop. Apple probably won't gain significant market share any time soon, that's simply not their focus, but they will pretty much own the business (and to a large degree workstation) mobile market. Macs attract professionals (they are already the de facto standard for many domains of software development) and users with disposable income, so there will be an interest in providing commercial software.

As to the rest, it kind of depends. Apple has good potential to gain a strong position with the machine learning and data mining community once they improve the hardware a bit and streamline their software interfaces. There are also considerable gains to be made in 3D and rendering. I can definitely see Apple becoming a very popular choice here once they add some of the missing hardware features and improve software compatibility.

Finally, one of the problematic areas in my opinion is the current lack of a top-tier Apple performance desktop. While it is true that most people buy laptops and the desktop is becoming more of a niche, I believe that enthusiast-class desktops are an important psychological anchor that symbolises computing capability. Apple not offering a very fast desktop is interpreted as Macs being "bad for performance", which harms adoption in the community that is a driver of change. But we'll see what Apple has to offer on this front.


2) Will most products move to the cloud, finally making the computer or chip architecture irrelevant?

I wouldn't bet on that yet. There is a strong industry push for Cloud based software, sure, but many users are suspicious of this model (for a good reason, see Google Stadia) and I think that Apple can use that to their advantage. Strong on-device machine learning, excellent performance and great mobility of their products can give the users a sense of ownership and control they don't have with cloud stuff. Many services will move to the Cloud of course — which makes sense — but there are things where cloud software just doesn't give you much.

1) For Boot Camp: Asahi has made good progress on Linux, but they still have a ways to go before it's production-ready. For Windows, it sounds like it's not a technical limitation but a licensing issue with Qualcomm.

Just drop it. There won't be native Windows running on ARM Macs any time soon or ever. Too much work and investment with very little payout. Apple couldn't care less about Mac users running Windows natively and Microsoft already sells all the products it wants to sell either via Mac versions or via cloud offerings. And for those who do need to run Windows there are always virtual machines.
 
Not necessarily. It depends on performance and cost. Right now an AWS EC2 Mac2 instance, which gives you a 16 GB Mini, is $474 per month. That works for a large organization with highly variable compute demands, but is a ridiculously bad deal for a consumer, who can buy a 16 GB/1TB Mini for $1300. Who knows if that will change substantially in the future. Yes, cost per GB transmitted will come down, but the bandwidth of programs will go up, perhaps cancelling that out.

Having a laptop that you can take with you anywhere without having to worry about delays because of bad internet is worth a lot too. Sure, internet bandwidth will go up enormously, but so will the bandwidth demands of programs.

And for home use, what's the bandwidth needed to run three 8K@360Hz or 16K@360Hz displays (or whatever the standard is in 25 years) without compression?

Plus, if you're running multiple high-resolution monitors, isn't your local device going to need a fairly beefy GPU anyway?
We're both hypothesising, but based on how fast technology is developing, I don't see any of that as a hurdle. Twenty-five years ago, here in the UK, people were still using 33K/56K modems (if they were on the net at all). Now look where we are: discounting my little forgotten part of England, stuck in a timewarp with my 24-megabit broadband, gigabit internet exists. So adding as many years again will see us connected to the internet at what were once considered SSD speeds.

Power users (the definition of which will change) will likely be able to argue a case for powerful computers running locally-stored OSes and apps, but for most users that just won't be necessary, and it won't be the norm. You won't need to download the latest and greatest OSes and apps to your computer; you'll buy a licence, probably through some kind of perpetual payment programme (e.g. Apple One), and run your OS and programs from a centralised server via the internet, with everything automatically kept up to date. Of course you'll need a different subscription for each software provider, but we're heading that way now; the only difference is they still store the apps on your drive. When people have access to internet speeds that rival current SSDs, that will no longer need to be a thing (for most).

We already started down the path with Chromebooks, where you're encouraged to do literally everything in the Google cloud.

Obviously all 100% hypothesis on my part. The way it's going at the moment, in 25 years time it's just as likely what's left of humanity will be trying to rebuild itself following WWIII.
 
We're both hypothesising, but based on how fast technology is developing, I don't see any of that as a hurdle. Twenty-five years ago, here in the UK, people were still using 33K/56K modems (if they were on the net at all). Now look where we are: discounting my little forgotten part of England, stuck in a timewarp with my 24-megabit broadband, gigabit internet exists. So adding as many years again will see us connected to the internet at what were once considered SSD speeds.

Power users (the definition of which will change) will likely be able to argue a case for powerful computers running locally-stored OSes and apps, but for most users that just won't be necessary, and it won't be the norm....
Except many people are going to want their internet speeds to equal contemporary SSD speeds before they switch to the cloud, not SSD speeds from 25 years ago. Otherwise, you get a big performance hit. So I'd expect only the lowest-end users would be happy to switch to "dumb terminals". It won't just be power users that want to stay local; it will be many of the broad mid-range users as well.

I.e., in 2022, typical SSD speeds are ~1000 MB/s, and typical internet speeds are ~10 MB/s (=80 Mb/s). So you hear people arguing that in, say, 25 years, internet speeds will routinely be about 100x faster than they are now, thus equalling current SSD speeds, giving enough performance from the cloud. But that argument is wrong, because it forgets that, in 2047, you need to compare internet speeds in 2047 to SSD speeds in 2047, not to SSD speeds in 2022.

Without further info, my default assumption is that SSD speeds will advance about as quickly as internet speeds, keeping the 100-fold performance gap about the same.
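Put as a quick sketch, using the 2022 figures above (the 100x growth factor is an assumption for illustration; the point is that it applies to both sides):

```python
# If internet and SSD speeds improve by the same factor over ~25 years,
# the relative gap never closes. The growth factor is an assumption.

SSD_2022 = 1000   # MB/s, typical consumer SSD
NET_2022 = 10     # MB/s (~80 Mb/s), typical home internet
GROWTH = 100      # assumed common improvement factor by ~2047

net_2047 = NET_2022 * GROWTH  # 1000 MB/s: matches a 2022 SSD...
ssd_2047 = SSD_2022 * GROWTH  # 100000 MB/s: ...but not a 2047 SSD

print(f"2047 internet {net_2047} MB/s vs 2047 SSD {ssd_2047} MB/s "
      f"({ssd_2047 // net_2047}x gap, unchanged)")
```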
 
Well, thank you for your insights! 🙂 Let's hope the future will be bright for all those areas... Certainly, as many forum users have mentioned before, competition benefits the overall customer experience in both the Apple and Windows ecosystems. For me, my Mac is more than enough for my needs as a doctor... However, while I prefer not to be a fanboyish person, I would be happier if my machine expanded its spectrum of compatible areas :)
 
To be honest, I am surprised that something like CATIA even runs on Windows.
For this kind of work, UNIX/Linux is where it's at, and probably where it will remain.
 
Ultimately 2).
Give it 25 years at the very most, and even the OS's will be cloud-based. Computers running OS's natively on a built-in storage device will be a thing of the past. I'm unsure yet how much AR/VR will change the way we interact with future computers, or how fast, but that will impact it too.
Just as long as there are no:
- roaming fees / small roaming speed caps
- data caps
- deprioritization after a small amount of ISP data
- non-free wifi on airplanes
- oversubscribed local nodes at ISPs
- wifi networks that need logins (unless the device has a local OS with the software needed to log in to that network)
 
Since backwards compatibility is not the strong suit of MacOS, I understand the reluctance of companies to produce Mac versions of their products.
What are you talking about? Rosetta 1, Rosetta 2. You know Apple is the master of silicon transitions, because they even reuse the same terminology. This problem is understood within Apple and they know exactly what to do.

PPC ➔ X86 ➔ ARM
 
What are you talking about? Rosetta 1, Rosetta 2. You know Apple is the master of silicon transitions, because they even reuse the same terminology. This problem is understood within Apple and they know exactly what to do.

PPC ➔ X86 ➔ ARM
Probably complaining about the one-time removal of 32-bit support. Many applications were left behind because their developers had abandoned them. From my perspective, it was mostly games that didn't get updated for 64-bit.
 
Probably complaining about the one-time removal of 32-bit support. Many applications were left behind because their developers had abandoned them. From my perspective, it was mostly games that didn't get updated for 64-bit.
Exactly… And there is no guarantee that won't happen again in, say, 10 years, due to a modification of their ARM chip architecture or some other transition, like from ARM to RISC-V… I understand that Apple's way of leaving older tech behind is more efficient for progress, but this is why developers and even bigger companies become hesitant to port their software from other OSes. And yes, I too believe that Apple is more interested in their margins and in being a premium product rather than the dominant, but somehow mediocre, platform.
 
As to the rest, it kind of depends. Apple has good potential to gain a strong position with the machine learning and data mining community once they improve the hardware a bit and streamline their software interfaces. There are also considerable gains to be made in 3D and rendering. I can definitely see Apple becoming a very popular choice here once they add some of the missing hardware features and improve software compatibility.

Yes, please...!

Finally, one of the problematic areas in my opinion is the current lack of a top-tier Apple performance desktop. While it is true that most people buy laptops and the desktop is becoming more of a niche, I believe that enthusiast-class desktops are an important psychological anchor that symbolizes computing capability. Apple not offering a very fast desktop is interpreted as Macs being "bad for performance", which harms adoption in the community that is a driver of change. But we'll see what Apple has to offer on this front.

I really think Apple could have a top-tier DCC workstation if they make desktop/workstation class SoCs specific to the ASi Mac Pro...
 
I really think Apple could have a top-tier DCC workstation if they make desktop/workstation class SoCs specific to the ASi Mac Pro...
Top tier at 2x or more the price of other systems?
Even more so if they lock you into their RAM and storage (RAID 0-only storage, to boot).
 
Exactly… And there is no guarantee that won’t happen again in, say 10 years.
I can absolutely guarantee you there won't ever be a 128-bit CPU. Nobody calculates with numbers that large. Your hardware will become obsolete for other reasons.

 