I understood your question - I don't think you understood my answer.

1. Benchmarks: For a quick back-of-the-envelope comparison, I'd use Cinebench (I do 3D art; tile-based rendering is critical). I would NEVER use benchmarks for a purchasing decision, however. For a purchase decision, I'd need to:

A. Have native software - which won't be happening anytime soon, not for what I do, anyway. No ability to run Windows apps - no sale. Once you see what all is available outside of the walled garden, you aren't going back. Timmy doesn't seem to understand that Mac Pro owners use more than Final Cut and Logic.

B. Stress test the entire system. Benchmarks don't run long enough to stress a system. My software pegs the CPU for hours at a time. Can an ARM Mac handle that? Based on past decisions made by Apple, I would not trust the silicon to handle that kind of performance requirement.

C. Know what cannot transition to the new machine - As an example, with the 6,1 Mac Pro it would have cost me an extra $2,000 to replace the functionality that Sir Idiot Boy removed. So a base 6,1 would have actually cost me $6,000, and I would have seen no performance increase over what I already had with my flashed 4,1. That is why a lot of us took a pass on the trashcan - and that was before the thermal throttling issues and the D700 GPUs dying by the bucketload. The same would apply to an ARM Mac - if I have to replace $2,000 of capability, I won't be buying.

2. x86 Comparison: I would compare the ARM CPU to either a top-of-the-line Ryzen 9 or a midrange Threadripper. Things the ARM system will need: 128GB ECC RAM support (as a minimum) and lots of PCIe 4.0 lanes (minimum - PCIe 5.0 by the end of 2021).

4-way SMT is coming on either Zen 4 or Zen 5 - the top of the Ryzen stack will probably be a 16-core/64-thread system. Intel is going a different direction - each core will have an Atom CPU in place of the 2nd thread. Either way, we will see a significant performance increase by the end of 2021.

3. High-end GPUs? Please. No one knows what direction Apple is going in - including Apple. Tomb Raider isn't graphically intensive. Can that ARM Mac run Crysis? By the time a "high performance" ARM Mac launches, we will be at least 2 generations past the current RTX series of cards from Nvidia, and probably on RDNA 3 or 4 on the AMD side. A low-end GPU will need to provide RTX 2080 Ti performance (AKA the performance of a game console). I don't see Apple delivering on that. Apple has a very, very long history of not caring about GPU performance. Oh, and I'd need the ability to drive multiple 4K monitors, as a minimum.

4. Mac Computer Purchases: I have been on Mac Pros (and Power Macs before the PPC-Intel transition). You don't appear to be very familiar with the Mac Pro saga, so let me bring you up to speed...

2009 - Mac Pro 4,1 released.
2010 - Mac Pro 5,1 released - by flashing the firmware of the 4,1, anyone can make their 4,1 a 5,1. Flash your firmware, replace your Nehalem CPUs with a pair of Westmere CPUs, and Bob's your uncle.

SATA III arrives - Apple takes a pass.

Intel has a performance breakthrough with Sandy Bridge - Apple takes a pass.

Nvidia's CUDA pushes GPU computing to a new level (especially in 3d art) - Apple takes a pass.

2013 - Mac Pro 6,1 released - Ivy Bridge CPUs, thermally throttled, so you don't actually see a performance increase over the 5,1. Apple goes with Thunderbolt 2 rather than SATA III - trashcan users have a rat's nest of power bricks (of indeterminate quality), cabling, and extra boxes sitting on their desks. Thunderbolt 2 is EOLed - no support from Apple and no way to move to Thunderbolt 3 (whereas I can drop a Thunderbolt 3 card into my Ryzen system today - if I weren't actually interested in performance, anyway).

Haswell Xeon Family released - Apple takes a pass.
Broadwell Xeon Family released - Apple takes a pass.
Skylake Xeon Family released - Apple takes a pass.
Kaby Lake Xeon Family released - Apple takes a pass.
Coffee Lake Xeon Family released - Apple takes a pass.
Cascade Lake Xeon Family released - Apple takes a pass.

Do you see a pattern here? You can't buy a new Mac Pro if Timmy & Sir Idiot Boy take over 2,000 DAYS between releases.

2019 - Mac Pro 7,1 announced - $6,000 gets you the performance of a $1,400 Ryzen 7 system. Every subsystem in it is obsolete on day 1. But hey, you do get a cool case to hold your 4-year-old video card.

2020 - Apple announces ARM Macs - the investment in a 7,1 isn't looking like such a good idea anymore. No new software for you.
 

Would love to see all these “Timmy”-posting folks if they met Mr. Cook in real life.
 
He wouldn't be the 1st - As one of my commanders once said:

Sir, don't ask SFC Bryan for his opinion if you don't actually want to hear it. He doesn't care what you think.
 
No, they weren’t addressed, because, yes, I watched it. Once Apple decides to dump the x86 emulation, it’s all going away. Developers won’t port to ARM - certainly not while emulation is there as a stopgap.

The only people who are taken care of here are average users and people who replace their Mac every few years. They mostly don’t use heavy-duty engineering, design, or content-creation software, and certainly don’t have lots of third-party tools and hardware.

Apple will ditch Rosetta just as quickly as they think they can. Just like they did last time. So much stuff was lost last time. It’s taken ten years to fill in those holes.
Why will developers not port to Apple Silicon? Really curious about this... Targeting the customers with the deepest wallets seems obvious to me, and when Apple’s tools also allow access to an iPad user base of 300M+ at compile time, it’s even more so.

People are missing the point if they see only the Mac as the primary beneficiary of this transition. The iPad will likely go pro in a big way software-wise, and very quickly, once Apple Silicon Macs are released.
 
Expressing one’s opinion is different from disrespectfully using a diminutive of someone’s name. Tell Tim, or Mr. Cook, your opinions. Don’t be needlessly insulting.
 
1. There is no indication as to what an ARM Mac will be capable of. Very little of what I do (3D art) can be done on an iPad. That isn't going to change.
2. Macs are a very small part of the total PC market.
3. It is Apple - it won't be price competitive with PCs. Apple products, by and large, are overpriced and underperform in comparison to their PC brethren.

Moving to ARM is about control, not performance.
 
Well, that doesn’t really answer the question now does it?
 
I thought it did - It is a chicken and egg situation.

No one is going to jump until everyone else has moved. It isn't like moving from PPC to x86. We aren't getting access to more software and more performance - we are getting access to Candy Crush and iFart apps on the desktop.
 
It is no more of a chicken-and-egg situation than any transition before.
 
How about you get on the record with answers to these questions:
... What would be required for you to purchase an Apple Silicon-based system?

I'm a software developer, so (5) I'll try to get one as soon as units are shipping to my customer base, as I'll need one for testing my applications on the Macs that millions of potential customers will be buying over the next few years. I imagine tens to hundreds of thousands of macOS app developers are in the same boat.

Better for me simply means building large Xcode projects noticeably faster than current equivalently priced Macs, or perhaps equal with far better battery life for something smaller and lighter. A boost there would sell to millions of iOS developers and students.
 
Incredible performance, huh?

I expect this performance will be no different from what the iPhone/iPad, already on ARM, have always had. What leads anyone to believe that Macs on the "same chips" would somehow perform better?

I don't think they will. You could say the GPU would be - but then, that's not exactly the CPU. That's the GPU.
I go by what's already available, and use that as the template.
 
Apple said that they are making a “family of SoCs, custom to the Mac” - it was specifically mentioned in the State of the Union video. That implies that they won’t be using the iPad’s 8-core SoC (4 efficiency/4 performance). As has been “leaked”, it’s likely a 12-core part - 4 efficiency/8 performance at the bottom end, working upwards from there.
 
It is not more of a chicken-egg situation than with any transition before.

It is only chicken-egg if a content creator is silly enough to stay with Apple:

Adobe CS - welcome to software as a service.
MS Office - welcome to software as a service.
Maya - welcome to software as a service. (assuming this is ported to ARM)
Cinema 4D - welcome to software as a service. (assuming this is ported to ARM)

When I moved from OSX to Windows 10, I didn't lose software; I gained it. I am not just talking Steam games. My copy of Adobe CS doesn't run on OSX anymore - my Windows version does.

Moving to Windows, I only needed to replace ZBrush. It was time to upgrade and was worth every penny.

Let me give some other examples from my workflow.

I do 3D art - I am at the hobbyist level, not professional. Bottom of the stack software.

Poser 12 - we should have this in the next couple of months. We won't see an ARM version for at least 2 years, if we see one at all. There aren't very many OSX users to begin with - they are limping along with either 4,1, 5,1, or 6,1 Mac Pros. I don't see them moving to ARM, especially when Superfly (their Cycles-based PBR engine) is moving to GPU compute - which rules out Apple. I have been suggesting to the development team to run with the AMD ProRender engine - it is available in a lot of 3D software, and there is a Cycles plug-in for Blender.

I don't see AMD doing a lot of work for the ProRender engine for ARM.

Daz Studio - its render engine is also CUDA-based (Iray). OSX users are likewise stuck with a CPU-based solution.

My modeler is Hexagon - it is a crappy modeler, but I grok the interface and I have all of the key functions memorized. Runs just fine in Windows - no longer runs in OSX.

Marvelous Designer - No longer supported on OSX.

Apple isn't a computer company - they are a luxury phone company that dabbles (poorly, imo) in hardware and software.

Incredible performance huh ?

I see this performance no different than iPhone/iPad already on ARM always has. What lead s anyone to beleive Macs being on the "same chips" but the only difference would be Mac performance would be ""somhow"better?

I don't think they will. You could say GPU would be.. but then. that's not exactly CPU..That's GPU

Because they have seen that an A12Z has a Geekbench 5 score of 4615 (about the same as a low-power Intel/AMD laptop - or less than 1/3 of my current desktop). That one data point is what is driving the speculation.
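To make the "less than 1/3" claim concrete, here is a quick back-of-the-envelope check. The 4615 figure is the A12Z score cited above; the desktop score is a hypothetical stand-in, since the post doesn't give an exact number:

```python
# Rough ratio check for the Geekbench 5 multi-core numbers discussed above.
a12z_score = 4615        # A12Z score cited in the post
desktop_score = 14500    # hypothetical desktop score, assumed (not from the post)

ratio = a12z_score / desktop_score
print(f"A12Z at {ratio:.0%} of the desktop score")  # prints: A12Z at 32% of the desktop score
```

Any desktop score above roughly 13,850 makes the "less than 1/3" characterization hold.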

P.T. Barnum may be dead - but the Reality Distortion Field is functioning.
 
It won’t be the “same chips.“ Where’d you get that silly idea?
 
From everything you’ve said, you’re not really an Apple customer now, are you? So you’re getting all worked up about a transition that doesn’t even affect you.
 
So, recently, I decided to replace that external fan with a 140mm hydraulic bearing model (connected via a USB to 3-pin adapter cable).

These are pretty incredible too (magnetic bearing):

I've had a few of them for years, cooling various things. They, like yours, don't move a ton of air as they run at a lower speed too, but they are pretty quiet. I also like that you can pull them apart and clean everything, then snap them back together.

Exactly. See this history. To be clear, he never responded to my post, because it makes it clear that his argument is wrong, and that would force him to stop raising the same points over and over. Oh well. :-D

Amplifying this point: while the current system ranked number 1 on the TOP500 list is ARM SoC based, the next two are IBM POWER based. Intel is not in the top 5 (I did not look to see how low they fall).

And, he keeps ignoring the point that the G5 was faster than anything Intel had at the time.

Where are you getting these figures from? Idle conjecture doesn't help anything. You can't know how many users use Boot Camp because Apple hasn't talked about it. I think 50% of users rely on Boot Camp. I made up that number just like you did, so our statistics are equally valid.

Someone (AppleInsider?) did some kind of study that put it at around 3%. That sounds a bit low, but I doubt a lot of people use Boot Camp. What that study doesn't tell us, though, is how many people use the x86 aspect for virtualization and running Windows apps (i.e., Parallels, VMware, etc.). I'd bet that is a much bigger percentage.

... None of the other Macs can do much in the way of 3d art (Ask me how I know......) ...

I know people doing significant 3D work on Macs.

A. Have native software - which won't be happening anytime soon, not for what I do, anyway. No ability to run windows apps - no sale; Once you see what all is available outside of the walled garden, you aren't going back. Timmy doesn't seem to understand that Mac Pro owners use more than Final Cut and Logic.

B. Stress test the entire system. Benchmarks don't run long enough to stress a system. My software pegs the CPU for hours at a time. Can an ARM Mac handle that? Based on past decisions made by Apple, I would not trust the silicon to handle that kind of performance requirement.

The software aspect is a concern, especially for those of us running Windows software that isn't made at all for the Mac.

I hear you on the thermals. I hope this will help, but I fear Apple will just use any gain to make it smaller or something like that. They seem to reserve adequate thermals for the Pro models. Frustrating, for sure (and expensive, as I've killed a couple MBPs over the years).

3. High end GPUs? Please. No one knows what direction Apple is going in - including Apple. Tomb Raider isn't graphically intensive.

The point of that demo wasn't to show how well the new Apple Silicon Macs will run games. The point was to show a pretty intensive example of an Intel Mac app running on the Apple Silicon machine (the pre-release machine, basically an iPad with more RAM).

Nvidia's CUDA pushes GPU computing to a new level (especially in 3d art) - Apple takes a pass.

I think the key is to find software that doesn't care about CUDA anymore.

Why will developers not port to Apple Silicon? Really curious about this....Targeting the customers with the deepest wallets seems obvious to me, and when Apple’s tools also allow access to an ipad user base of 300m+ at compile time, it’s even more so.

I'd guess most of the developers making Mac apps will eventually port to Apple Silicon (though it will probably be the more specialized pro apps that take longer). The problem is more all the Windows apps a lot of us Mac users currently run under Boot Camp or virtualization (i.e., Parallels).
 
I have an iPhone and an iPad - I actually am a typical Apple customer. Mac desktop users are the outlier. It is why they don't report the number of units sold anymore.

I am annoyed that I wasted 3 years waiting on the 7,1 - If the 3 stooges had just come out and said they were exiting the workspace the Mac Pro used to occupy, I could have moved on in 2017, rather than waiting on a $1,400 computer with a $4,600 case.

Why would I want less performance for the money? I don't understand that mindset. What is available on OSX has been shrinking for a while now. Until you stick your toes in the Winpool, you really won't understand how far behind you are with what is actually state-of-the-art software.

In render engines, you have 3 choices: CUDA with Nvidia GPUs, the AMD ProRender engine with AMD GPUs, or CPU-driven engines like LuxRender - which no longer has an SDK available for OSX, so I don't see it making a transition to ARM.
This is how worked up you are over a hobby? Sweet baby Jesus, I can't imagine your response if you made your income this way.

It can be an intense hobby - it is certainly expensive. I stopped counting when I realized I had crossed the 5 figure mark a few years back.
 
It is only chicken-egg if a content creator is silly enough to stay with Apple:
.....

Most of your examples are totally irrelevant if the question at hand is why developers wouldn't port to Apple Silicon.
The question was not whether certain developers will support the Mac platform with their products.
 
There is a surreal, but humorous, mixture of uninformed panic, angry undue speculation, and uninformed technology-denial going on in this thread.

I will fully admit, I was very skeptical about the prospects of this (i.e. Macs using Arm/Apple SoCs) being more than a rumour, pre-WWDC. Not to the extent of assuming that Bootcamp is a mission critical tool for a large swathe of the audience, but I was quite skeptical. The source of the rumours was a big factor for me: every 'new' article about it all eventually referenced back to a single original one. Combined with the often referenced Bloomberg, the 'rumour' part seemed very unbelievable to me.

From the view of the technology itself, very short term this doesn't affect me at all. I have 2018-era hardware, and as we know, there's the current lineup plus a few more Intel models coming, which will be supported for the life of the hardware for business purposes. Medium term is a little less certain - in 3 years' time, an Intel Mac may be less appealing, ARM won't have taken over the datacenter (I imagine its use will increase, but it's not going to just replace x86 wholesale), and it's uncertain what level of tooling will be available.

So, what do I do? Do I continue to argue that it's the wrong decision (I did very much believe it would not happen, and tried to explain what would be lost) and, I dunno, hope to win some internet points for convincing other people Apple are wrong? Or do I accept a change is coming, and either adapt, or go elsewhere?

I think it's that last part that a lot of people can't seem to manage. The "ugh this <insert latest reason to claim to switch to windows> is it, I'm going to windows"... and then proceeds to repeat the same rant/whine for weeks if not months. Possibly after having switched, possibly not. But what's the point. If you've "left" why do you need to keep telling everyone about it? Is there another kind of internet points I'm not aware of? I assume they're called crying over spilt milk points??


Anyway. Yes, there is, as with practically anything Apple announces (or maybe Microsoft too; I don't really use their products, so I don't know how their user base reacts to whatever it is they announce), a lot of uneducated hand-wringing. The "I read that Apple is doing <X>, does that mean <technically unrelated Y>??" I can kind of understand - it's at least a question. It's the "Welp, now that Apple is doing <X>, it's only a matter of time before <unrelated, officially denied, claimed multiple times over and never eventuated Y>, so I'm going to become an Amish hermit and rub sticks together in the mountains" deliberate overreactions that you read and can't help but roll your eyes at.

Until you stick your toes in the Winpool, you really won't understand how far behind you are with what is actually state of the art software.
Apparently not state-of-the-art enough to stop you from constantly complaining about a line of computers that you don't own, and haven't owned a current model of for at least a decade.

I'm starting to suspect that the 3D art is merely a crutch to assist in your true hobby: complaining about computers you no longer own or use.
 
we are getting access to Candy Crush and iFart apps on the desktop.
I know you're not the first or only person to make this correlation, but I'll address your comment because it's the most recent.

The CPU (change) isn't hugely important to a developer unless they're utilizing a hardware-specific feature, such as SSE3, or working on IDE/software-development tooling. A call in Swift (or any other programming language) to display an image on a screen isn't going to be different whether the device has an AMD, Intel, or Apple CPU. In the same way, the CPU doesn't care if a developer is using Swift, Java, Obj-C, etc., because it doesn't know any of those. It's the compiler/interpreter that bridges the gap. What matters is that the language frameworks, libraries, core services, etc. are compatible.

In other words...

Replacing an Intel Core i7 in a Mac with an Apple A13 doesn't mean you can then simply launch an iPhone application (.ipa) and have it operate the same as on an iPhone. The translated (i.e., compiled) code would now be generally understandable by the (new) CPU, but the software dependencies resolved before the compile are missing or wrong. So the app won't execute properly anyway.

Catalyst is Apple's temporary resource to fix or rather prevent incorrect dependency calls that currently exist between iOS and macOS. However, as made evident with the demo of Big Sur, Apple is syncing the underlying frameworks of iOS, iPadOS, and macOS.

Basically, the replacement of Intel CPUs with Apple Silicon in Macs has little to do with app cross-compatibility, even in Apple's own ecosystem. That burden falls on their OS and Xcode engineers.
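The architecture-agnostic point above can be sketched in a few lines (Python rather than Swift, purely for portability; the function name and strings are mine, for illustration only). The same source runs unchanged on any CPU, and it only notices the architecture if it explicitly asks the runtime:

```python
# Illustrative sketch: high-level code doesn't change per CPU; the
# compiler/interpreter bridges the gap. The code only learns which
# architecture it targets if it deliberately probes for it.
import platform

def describe_host() -> str:
    """Return a human-readable note about the CPU the runtime was built for."""
    arch = platform.machine()
    if arch in ("arm64", "aarch64"):   # Apple Silicon and other ARM systems
        return "running on an ARM CPU"
    if arch in ("x86_64", "AMD64"):    # Intel/AMD systems
        return "running on an x86-64 CPU"
    return f"running on {arch}"

print(describe_host())
```

The same logic applies whether the branch happens at run time, as here, or at compile time via conditional compilation: everything outside that explicit probe is identical on both architectures.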

P.S. More experienced software developers/engineers feel free to politely correct any mistaken understandings.
P.P.S. Not trying to step on the toes of anyone who already has responded to this misunderstanding.
 
Put simply, it will mean long, long sleepless hours for developers at Apple. Marketing expectations and real development timelines always differ.
 

There are a couple of developer sessions from WWDC that go through the steps in porting an app. But in essence you are correct. Probably the biggest effort will be re-running tests.

Port your Mac app to Apple Silicon

iPad and iPhone apps on Apple Silicon Macs
 