And when you combine that with the vertical integration of hardware and software, you magnify the performance effect. One thing Apple was able to do was look at specific tasks the OS does a lot of and optimize those operations in silicon. One example was reference counting, which happens with practically every object operation. Apple said that was an area where they made sure Apple Silicon was doing it as quickly and efficiently as possible.
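For anyone wondering why such a tiny operation is worth hardening in silicon, here's a minimal Swift sketch (the Node class and the loop are made up, purely illustrative) of how much retain/release traffic perfectly ordinary code generates:

```swift
// Every assignment of a class instance in Swift triggers an ARC retain,
// and every reference going out of scope triggers a release. These tiny
// operations happen constantly, which is why speeding them up in hardware
// pays off across the whole OS.

final class Node {              // placeholder class, not from any real API
    var value: Int
    init(value: Int) { self.value = value }
}

func churn(_ nodes: [Node]) -> Int {
    var total = 0
    for node in nodes {         // each iteration retains `node`...
        let alias = node        // ...this copy is another retain...
        total += alias.value
    }                           // ...and the matching releases run as scopes end
    return total
}

let nodes = (0..<1_000).map { Node(value: $0) }
print(churn(nodes))
```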
I don't think many people know this, but Intel actually has its own little Linux distribution called Clear Linux. It's not too impressive in that its software library is nowhere near Ubuntu levels, but it does one thing: it flies. It's mind-blowingly fast and responsive, at least when you run it on recent-ish Intel hardware (meaning CPU and GPU). Even Gentoo felt pretty sluggish compared to it. Now... the amazing thing is that M1 macOS is even snappier, though not by much. Really goes to show what you can do when you know the hardware you're coding for. And well, in your example it's basically the other way around. Apple really has a point with their whole integration thing.

The only potential problem I see with this: the more "secret sauce" they add to their hardware and operating system, the harder it will be to properly implement applications for it. macOS doesn't have the same market share as iOS, not by a long shot. Curious to see how software support will develop.

Bit off topic: I know a dev working at a small Polish indie game studio. Their reaction to Apple Silicon, and he claims other studios basically mirror this, was pretty knee-jerk: if you port your game to Intel macOS now, you're doomed to run in lackluster pseudo-emulation forever, and your macOS port will pale next to your Windows version. But if you port it to ARM macOS now, half the people, if not more, can't run the game yet, so you'll never recover your investment, which isn't guaranteed on macOS anyway. And doing both is simply too expensive. So they will just wait.

My point is kind of this: if we're not talking about simple applications (which are all Electron apps anyway :rolleyes:), macOS really is in a pretty weird state of purgatory right now. Apple would do well to stabilize the platform, go light on new features in both hardware and software, and actively help developers port their applications to ARM, simply because it is such an island solution.

That's kinda the drawback this has. But yeah, the advantages are equally obvious.
 
Apple always plays the long game. I’m not excited for Apple Silicon because of what we have now. I’m excited because the real reason Apple made this transition is because of the products we will see 3-5 years from now. In 5 years noone is going to care about old Macs running Intel.
 
The transition seems to be going fast in the sense that Mac sales volumes have been quite good by Apple standards since the introduction of AS Macs. (Since Apple themselves do not provide numbers, this is based on estimates, obviously. But nevertheless.)
The old Intel Macs won't disappear; they will migrate down the food chain. But if you are developing a game, it will take time to get it to market, and it is probably a safe assumption that your target demographic by then will be using AS Macs. (Particularly if it's a somewhat demanding 3D game, in which case you would have had trouble catering to the older Intel integrated-graphics models anyway. If it's not, not having optimal performance on future AS Macs is a non-issue.)
 
So the timeline is looking something like this:

2022 - A16 (4nm)
2022/2023 - M2 family chips (5nm)
2023 - A17 (3nm)
2023/2024 - M3 family chips (4nm)
2025 - A18 (2nm)
2025/2026 - M4 family chips (3nm)

In other words, if you own an iPhone 13 and already own an M1-based Mac, wait until the 2023 iPhone and the 2023/2024 Macs for the best bang-for-your-buck upgrade.
I get something close to that timeline:

2020 Sept - 2022 WWDC, A14, (5nm) M1, A15
2022 Sept - 2024 WWDC, A16, (4nm) M2, A17
2024 Sept - 2026 WWDC, A18, (3nm) M3, A19
2026 Sept - 2028 WWDC, A20, (2nm) M4, ...

September 2026 (2nm devices)
 

Tbh native gaming on Mac is dead to me, apart from tiny browser games. I now pay for the RTX3090 tier on GeForce Now cloud gaming and enjoy 3080-tier gaming on my 2015 MBP 15" and on all my other devices. And from what I see on Reddit, that's what a lot of Mac owners are doing now. It only costs me $150-odd a year instead of $1500+ for a card.

Last year I was about to buy an M1 MBP 16", then realised the only functional advantage over my 2015 MBP 15" was better native gaming - apart from that I don't push the 2015 machine much - so I bought the RTX3090 tier instead.

To be clear, GeForce Now does not cover all of Steam / Epic / GOG / Ubisoft. Far from it. But they host a decent selection from those stores, and my backlog of purchased games and freebies is big enough to keep me busy for years.

So thanks to GeForce Now for saving me around $2000 on an M1 16". And thanks also to my ISP for rolling out cheap gigabit optic fibre to my street a few months ago, which made GFN so much better. Not everyone has that, I know.

It's a hard life being an indie studio. I can't blame them for not releasing a Mac native version. Hopefully they can get their game onto GeForce Now and make it accessible to Mac owners that way.
 
I just brought in the game studio to illustrate what the issue is for some developers. Could have swapped it for an audio sequencer or 3D modelling software.

But since we're there... GFN is pretty amazing. Personally I'm not gaming that much, and what I do play (Factorio, Minecraft) isn't exactly GeForce 30999 Ti material. But if one really wants to play Windows games on their Mac, Moonlight is a pretty amazing alternative if you have a halfway decent PC at home. Just stuff it in a cabinet in the basement and stream in-home.
 
What he had to do to create an image of computational power progression was to continuously reduce applicability. As you move to vector processing, to multicore vector processing, to GPUs, to NPUs, you can demonstrate nice progression in computing "power" or operations per second. What you are also doing is reducing the applicability of the technology to ever more limited use cases.

There is a certain amount of co-evolution going on of course - if computing power is only significantly growing in a certain small area of computing, software guys are going to see if they can leverage that for at least a portion of their code. That's better than seeing almost no progress at all, but not only does it suffer from the narrowing applicability of the underlying hardware resource, it also runs into the mathematical fact of Amdahl's law.
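To put a rough number on Amdahl's law: if only a fraction p of the workload can use the accelerator, the overall speedup is capped at 1 / ((1 - p) + p/s), no matter how large the acceleration factor s is. A purely illustrative Swift calculation (the fractions and factors are made-up numbers):

```swift
// Amdahl's law: overall speedup when a fraction p of the work is
// accelerated by a factor s and the rest stays serial.
func amdahlSpeedup(parallelFraction p: Double, accelerationFactor s: Double) -> Double {
    1.0 / ((1.0 - p) + p / s)
}

// Hypothetical example: a 100x accelerator barely helps if only 30%
// of the code can actually use it.
print(amdahlSpeedup(parallelFraction: 0.3, accelerationFactor: 100))  // ≈ 1.42x
print(amdahlSpeedup(parallelFraction: 0.9, accelerationFactor: 100))  // ≈ 9.17x
```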

Bottom line - we old-timers have already seen the rate of progress slow down tremendously, and there is nothing really indicating that this "pushing your way up an exponential rise" won't be the future as well. Smaller improvements, arriving more slowly, at higher cost.

To what extent this constitutes a problem depends on where your interest in the industry lies.

Excellent post, thank you.

It's interesting to see where software is going. Like you, my first few computers were not internet connected, so everything was an app, and there was no such thing as a browser.

Now for most non-content-creation specialists, by far the heaviest app they run is the browser and the various tasks they do in it. It's remarkable to see how many individual apps now exist in both app form and as an in-browser window (mostly to cater for individual preferences and slightly different use-cases).

(Side note 1: Kind of a return to Jobs's original vision of an app-less device, everything running through the browser. It was the wrong time for the iPhone and thankfully he abandoned that vision, but it looks to be coming somewhat true in disguised form.)

(Side note 2: Outside the office, it looks like most of the world's computing is now done on single-window devices - also a callback to computing's early days. :) )

Heavyweight specialist apps still exist for content creation, film editing, CAD (if it still exists on macOS), etc. But these are the specific use cases that benefit from the specialised on-chip hardware Apple is putting in. Nothing new about that; Intel & AMD have done it for ages, especially for video codecs.
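On the codec point, you can actually poke at that dedicated hardware from user space. A small sketch (just VideoToolbox, nothing exotic assumed) that asks whether decode for a couple of codecs is handled by the media engine rather than the CPU:

```swift
import VideoToolbox

// Ask VideoToolbox whether these codecs decode in dedicated hardware
// on this machine, rather than falling back to the CPU.
let codecs: [(String, CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC",  kCMVideoCodecType_HEVC),
]

for (name, codec) in codecs {
    let hw = VTIsHardwareDecodeSupported(codec)
    print("\(name): \(hw ? "hardware decode" : "software decode")")
}
```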

What it points to is that likely there's nothing left in general purpose mass-market computing (*with one exception) that really pushes a M1 chip. Numerous reviews of the M1 machines have commented on how hard it is to make them chug. So from this point forward, what's left for Apple M-series chips? More on that later.

The exception? Games. All devices - desktop, laptop & mobile - struggle with high-powered gaming, especially in 4K / VR.

(Side note 3: It's been a weird journey to see iOS become probably the biggest gaming platform in the world, while Mac gaming remains dead & will probably stay dead for the future.)

How is that being solved? Cloud gaming. Instead of spending $1500+ on an RTX3080 GPU, subscribing to a service like GeForce Now's 'RTX3090 Tier' gets you RTX3080-level performance on all your devices for $150-odd a year. I have a sub and it was excellent on cable, but it improved dramatically when cheap gigabit fibre with 1ms latency came to my street a few months ago. I cannot see myself ever spending $1000 on a big GPU again. It's still early days for cloud gaming and it's not suitable for or accessible to everyone, but it's been a big change in the last 12 months.

So what does the future hold for Apple's M-series chips? General mass computing is solved. Gaming is close to being solved by cloud gaming. Specialised content creation workflows are being boosted by specialist hardware paths in the chips. They always need more power, but they are coming close to being effectively solved.

The only 'hard' domains I can see left are power / battery consumption, AI, and VR.

VR: I have a VR headset and like it, but it is far, far from becoming a mass market general purpose tool. Apple has a long way to go on this. Check back in 5-10 years.

AI: Apple is pumping huge resources into on-device AI. I studied AI many moons ago and have forgotten all of it; compared to then, AI today is night and day. I type 'cat' on my phone and it finds all my personal photos that have cats in them. Fookin magic. General-purpose AI that arranges my day for me and does my shopping for me is still far from reality (though there are apps and frameworks that try). Check back in 5-10 years.
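That 'cat' search is the kind of thing the on-device ML hardware is built for. A rough sketch (assuming a recent SDK; the file path is a made-up placeholder) of the sort of on-device classification Apple's Vision framework exposes:

```swift
import Foundation
import Vision

// Classify a single image entirely on-device, the building block behind
// "find my cat photos" style search. Error handling kept minimal.
let imageURL = URL(fileURLWithPath: "/tmp/example.jpg")   // placeholder path

let request = VNClassifyImageRequest()
let handler = VNImageRequestHandler(url: imageURL, options: [:])

do {
    try handler.perform([request])
    // Each observation pairs a label (e.g. "cat") with a confidence score.
    for observation in (request.results ?? []) where observation.confidence > 0.5 {
        print(observation.identifier, observation.confidence)
    }
} catch {
    print("Classification failed: \(error)")
}
```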

Power / battery consumption: The M-series chips are a huge jump forward, but still a long way to go. I want an Apple Watch / iPhone / MacBook that can last a whole week without charging. They will come but will need insanely powerful, insanely power-sipping M-series chips with insane software and hardware integration. Apple's on the right path here. But check back in 5-10 years.
 
Audio sequencer - probably not cloud friendly, but for 3D modelling, cloud GPUs could work. I rented a high-powered cloud machine from PaperSpace for a while. Costs were too high for non-income-generating use and paying by the minute/hour was deeply annoying, but these things can only improve.

When I started work in IT, every small company had a server in the corner for email etc. When I left IT, that was all gone, moved into the cloud via O365, Dropbox etc. With WFH, if I were running a small distributed 3D team I'd rather hand out logins to cloud-based 3D-modelling machines; easier than being responsible for maintaining home machines everywhere and running around installing new GPUs and upgrades. Also easier than renting space for servers possibly several hours' drive away and trying to look after them. Pros and cons, of course.

I actually do have a tiny Windows gaming machine in a cupboard, holding a 1060 GPU and a 1TB NVMe SSD backed by a 4TB Steam HDD. After getting GFN, I haven't turned it on in months.

Frankly the Windows machine was annoying. Windows and various Steam games complained so much about running headless that I was always having to remote in to sort out issues, which took the fun out of it. It was also irritating to walk over to the cupboard to turn it on and wait for boot-up when I wanted to game. With GFN I can be in a game within seconds, probably faster than a game would load on the Windows machine even if it were already booted.
 
In order to create sub-2nm chips we need to go from EUV to High-NA EUV machines. EUV machines cost $150m apiece and High-NA EUV machines will set you back over $200m apiece. This is the most incredible tech ever created by mankind.
 
What do you mean by this? 2nd hand market?
Yes, for instance. Or to relatives, or… they won’t be tossed in the bin, typically, but will have an extended service life doing something, somewhere.
If I were in the process of developing new software, however, there is no question I would target the new architecture and regard supporting Intel Macs as a stretch goal.
 