You're assuming everything written for the Mac is written in Xcode. I can assure you... that that is not the case.

Java-based apps still exist, Electron-based apps are emerging (the Slack app is one such thing, Autodesk Fusion 360 is also Electron-based), etc... to name a few of the popular alternatives. Then we have a lot of the C++ based tools and code that are no longer maintained, and that cannot be very easily re-compiled for ARM without at least some small tweaks.

And then we have the various Mac drivers that must be compatible with all of your current Mac peripherals, and those are beyond Apple's control.

If an ARM MacBook comes out this year or the next, it'll most likely have:

1. Abysmal performance due to severe lack of native apps, as outlined above. There is no remedy for Java-based apps and Electron-based apps, and apps built on platforms that Apple does not control.
2. It'll have very poor or almost non-existent backwards compatibility because Apple is not in control of drivers. Manufacturers will have to go back in time and at least recompile all of their drivers for ARM if you just want to be able to connect your printer. How likely do you think they'll want to do that for hardware that's more than a year old?
3. You simply can't connect an external GPU to it. There is no support for that in ARM unless Apple basically invents a whole new connector standard.
4. Heck, you lose Thunderbolt 3 support altogether since ARM does not support Thunderbolt 3 at all. Intel is in control of a portion of Thunderbolt 3 and they won't very easily let Apple or ARM implement it.

Those are major technical roadblocks that will "for now" hold back an ARM MacBook. This isn't speculation; it's just the reality.

If you want basically an iPad Pro running a non-touch-based OS that can't run a lot of apps, can't connect to anything you currently own, and can't even do what the cheapest MacBook can do right now, then... sure, I guess that makes one. But I don't think the market will very easily accept something like that. Apple has a lot of kinks to work out.

I didn’t assume everything, but there is a lot written in Xcode. There is also a lot more code elsewhere that is chip architecture agnostic.

You appear to have missed that Thunderbolt 3 is part of the USB 4 specification after Intel made it royalty-free.
 
I can see where you are coming from. Apple has always been really secretive for the most part when new products are about to be released. Just earlier this month we were hit with the new iPad and the Magic Keyboard, and barely anyone expected that keyboard to even exist.

Kuo has had a 50-50 track record on being right about a product and its actual release date. And with this world pandemic happening, it's probably been affecting the leaks and rumors throughout.

Aside from having a beefier heatsink and better fans, I thought that the thermals were also better in the new 16" MacBook Pro because they were using the new 7nm Navi GPU instead of the more power hungry 14nm 560x in the 15" MacBook Pro.

15" MacBook Pro GPU: [screenshot]

16" MacBook Pro GPU: [screenshot]


The CPU seemed to be the main source of heat. Aren't both still using the same 9th-gen Intel chips?

I was thinking that an improved 13" (or maybe 14") MacBook Pro design could follow the same path as the 16", since the 13" MacBook Pro also thermal throttles? Even the dual-core ones weren't able to avoid throttling.

If they did keep the 13" design for now, it would probably still be possible for a refresh to come really soon with the new 10th-gen Intel chips, since those are available now and the new Airs got them.
 
2) There is no benefit to 14" - Apple didn't grow the 15" model to 16" for fun, they did it because they were struggling with thermal limitations in the 15" chassis. The 16" throttles less than its predecessor and has a much superior graphics part. This is why Apple didn't wait until the next redesign. A 14" Pro doesn't enable any meaningful change in performance or solve a major problem, so there is no point retooling mid-life.

An Ice Lake part would come with a significantly faster GPU and AVX-512 support. Could make a massive difference for some creative workflows, depending on software support.


Aside from having a beefier heatsink and better fans, I thought that the thermals were also better in the new 16" MacBook Pro because they were using the new 7nm Navi GPU instead of the more power hungry 14nm 560x in the 15" MacBook Pro.

The new Navi part is a more power hungry card compared to old Polaris (50W vs 35W TDP). But it is a new, much improved architecture, and it comes with GDDR6 that's over 3x faster. That said, the new GPU would also work in the old chassis (the Vega Pro 20 with similar TDP worked as well), but the new chassis is better equipped for workflows where the CPU and the GPU are used simultaneously.
 
1) Kuo says there is a 14" mini-LED Macbook being developed. That doesn't cancel out the possibility of increasing the size to 14" in May and adding mini-LED/ARM at a later date. It would follow the 16" design and differentiate the MBP from the Air.

2) Other reasons include the design (smaller bezels) and space for a larger battery.

3) It's much more likely that Apple incorporates ARM in non-Pro machines at first, like the MacBook, iMac, Mac mini for example. The Pro machines will probably be last.
I hope you are right! 🤞
 
Java-based apps still exist, Electron-based apps are emerging (the Slack app is one such thing, Autodesk Fusion 360 is also Electron-based), etc... to name a few of the popular alternatives. Then we have a lot of the C++ based tools and code that are no longer maintained, and that cannot be very easily re-compiled for ARM without at least some small tweaks.

Java apps are completely processor independent - Java is compiled to byte-code for a virtual processor: in effect, all Java runs under emulation.

Likewise, Electron apps are mostly written in Javascript (Electron itself is basically a version of the Chromium browser engine) - yes, some of the third party libraries include C/C++ code, but developers download that as source, and the vast majority of that is processor independent anyway.

Yes, that assumes that the Java and Electron runtimes get released for ARM MacOS - but both of them are already available for ARM64 so that isn't going to be the Manhattan project.

It's Apple's job to identify critical technologies like that and sweet-talk key developers. (There are, of course, a million ways for Apple to drop the ball and completely stuff up the transition, but since this is all just rumours at the moment I'm not being cynical).

The reality is that even operating systems and drivers are predominantly written in C/C++ (and ObjC on the Mac) and the vast majority of C/C++/ObjC code is completely processor independent. The main thing that can cause problems is switching between 32/64bit and big/little-endian processors, or moving to a different compiler toolchain - none of which apply to an x86-64 to ARM64 transition. Heck, we wouldn't even be discussing this if MacOS itself had to be substantially re-written for ARM.

That leaves a few cases where applications involve processor-specific optimisation or parts written in assembly language for speed - which is going to be a tiny subset of existing code (because even without a processor switch it is fragile, time-consuming and a last resort). Removing such code in favour of calling operating system frameworks (which will automatically support future hardware improvements) is generally a Good Thing anyway.

The reality is, if your workflow depends on abandonware, chances are it is going to be killed by some future software or hardware change anyway (if it wasn't already killed by Catalina).

The big problem is going to be pro apps with huge ecosystems of third-party plug ins and legacy code - which may indeed take a couple of years to sort out - but people relying on those aren't going to be switching to a 12" ARM MacBook on day one. Apple have just released a new Mac Pro so they can't drop Intel support for 3 years or so.

If any new ARM Mac looks likely to sell, developers will support it (they do like getting paid). For most, it's going to be on a par with testing and fixing their products for the annual MacOS upgrade.
 
I didn’t assume everything, but there is a lot written in Xcode. There is also a lot more code elsewhere that is chip architecture agnostic.

You appear to have missed that Thunderbolt 3 is part of the USB 4 specification after Intel made it royalty-free.

You still haven't addressed the vast majority of drivers for existing peripherals that are now pretty much "legacy". Even if they could just be "recompiled", how will Apple ask these companies to go back and re-compile all of these?

I did not miss Thunderbolt 3 being a part of USB 4 specs. You are still missing the fact that AMD (and maybe nVidia?) will have to provide drivers for all of their graphics cards for ARM if you want to use said external GPU.

Mac OS is more than just the apps you run.

Java apps are completely processor independent - Java is compiled to byte-code for a virtual processor: in effect, all Java runs under emulation.

Likewise, Electron apps are mostly written in Javascript (Electron itself is basically a version of the Chromium browser engine) - yes, some of the third party libraries include C/C++ code, but developers download that as source, and the vast majority of that is processor independent anyway.

Yes, that assumes that the Java and Electron runtimes get released for ARM MacOS - but both of them are already available for ARM64 so that isn't going to be the Manhattan project.

So you're saying somehow that the UI in both Java and Electron will already be ready on day one. I think that's a bit optimistic. Having JRE and Electron available on ARM64 doesn't mean they'll immediately be available on this mythical Mac OS ARM platform that doesn't even exist yet. Heck, neither Java nor Electron is available on Windows 10 on ARM and that has had... at least 10 years of development. Edit: yes, Electron > 6.0.2 is available on Windows 10 on ARM, but trust me when I tell you this: not all developers will want to go back and make their Electron apps compatible with Electron > 6.0.2 immediately. There are still a lot of Electron apps that are stuck on older versions because "if it ain't broke, don't fix it". i.e.: Slack:

It's Apple's job to identify critical technologies like that and sweet-talk key developers. (There are, of course, a million ways for Apple to drop the ball completely stuff up the transition, but since this is all just rumours at the moment I'm not being cynical).

I am being cynical because the fact is that Microsoft has had, again, at least 10 years of development for Windows on ARM, and even now, they have not achieved that. What is Apple doing that makes them so different from Microsoft?

The reality is that even operating systems and drivers are predominantly written in C/C++ (and ObjC on the Mac) and the vast majority of C/C++/ObjC code is completely processor independent. The main thing that can cause problems is switching between 32/64bit and big/little-endian processors, or moving to a different compiler toolchain - none of which apply to an x86-64 to ARM64 transition. Heck, we wouldn't even be discussing this if MacOS itself had to be substantially re-written for ARM.

Yes, I agree on this point. C/C++ code should be portable assuming no extra architecture-specific tweaks. But... as I have mentioned, you have these problems:

1. Code that is no longer maintained, meaning... there is no one there to re-compile them anymore.
2. Code that has a maintainer but that is more than one year old. Will Apple pay these maintainers to go back and re-compile their code specifically for this mythical MacBook? Will it make... any sense at all, both for Apple and for the maintainers?
3. Code that is OpenGL-based that now needs to be transitioned over to Metal and/or Apple will have to provide legacy OpenGL compatibility because... oh... Apple has been STELLAR at supporting OpenGL, no? Sorry, didn't mention this, but it is actually something you can't overlook. There are still quite a number of apps that are OpenGL-based that haven't transitioned over to Metal yet.

My question is this: why would Apple go through all of this effort? Why not just spend the time on improving the iPad Pro and iPad OS instead?
 
You still haven't addressed the vast majority of drivers for existing peripherals that are now pretty much "legacy". Even if they could just be "recompiled", how will Apple ask these companies to go back and re-compile all of these?

Same way they recently asked them to go back and re-compile them all for Catalina after dropping 32 bit support.

You are still missing the fact that AMD (and maybe nVidia?) will have to provide drivers for all of their graphics cards for ARM if you want to use said external GPU.

Apple will probably need to include AMD drivers in MacOS - just like they do now. AMD will help because they want to sell GPUs to Apple. Last I looked, NVIDIA drivers were pretty much dead anyhow. Again, you're assuming that porting anything to ARM is some colossal undertaking.

So you're saying somehow that the UI in both Java and Electron will already be ready on day one.

No, I said nothing of the sort - anyway, phase one would almost certainly involve making a prototype system available to developers for 6 months or so before "day one" - and if the rumours are true, Apple will (or should) already be talking to key developers. That said, I wouldn't bet against Java getting deprecated on the Mac anyway.

1. Code that is no longer maintained, meaning... there is no one there to re-compile them anymore.

...which, if it wasn't killed off by Catalina and the end of 32-bit support, is highly likely to stop working with a future MacOS release anyway.

3. Code that is OpenGL-based that now needs to be transitioned over to Metal and/or Apple will have to provide legacy OpenGL compatibility because...

Apple announced the end of support for OpenGL on MacOS 2 years ago - starting with MacOS 10.14 - so it is likely to be gone in the next year or two anyway.

If you want a platform that is likely to keep running abandonware forever - I give you MS Windows, an OS that has been perpetually nobbled by its inability to clean out the dead wood.
 
Same way they recently asked them to go back and re-compile them all for Catalina after dropping 32 bit support.

You mean the same way they broke support and blind-sided major developers, forcing patches that barely work?

Apple will probably need to include AMD drivers in MacOS - just like they do now. AMD will help because they want to sell GPUs to Apple. Last I looked, NVIDIA drivers were pretty much dead anyhow. Again, you're assuming that porting anything to ARM is some colossal undertaking.

You're assuming things written for x86 can just be converted to ARM without any colossal undertaking. I'd say history is on my side in this particular argument.

No, I said nothing of the sort - anyway, phase one would almost certainly involve making a prototype system available to developers for 6 months or so before "day one" - and if the rumours are true, Apple will (or should) already be talking to key developers. That said, I wouldn't bet against Java getting deprecated on the Mac anyway.

Apple won't be able to deprecate Java. Too many things depend on it (not on Apple's sides), and many modern apps are still written in Java for easy cross-platform support.

...which, if it wasn't killed off by Catalina and the end of 32-bit support, is highly likely to stop working with a future MacOS release anyway.

You do realize that from now until next year, it's basically a time span of... 1 year? That's almost a lifetime in the software development world.

Apple announced the end of support for OpenGL on MacOS 2 years ago - starting with MacOS 10.14 - so it is likely to be gone in the next year or two anyway.

It can't. Reasons:

Chrome will have to specifically use Metal only for one platform (Mac). That's insanity. And yes, this will break all Chromium-based platforms. Browsers and Electron alike. WebGL, by extension, will also run into problems.

If you want a platform that is likely to keep running abandonware forever - I give you MS Windows, an OS that has been perpetually nobbled by its inability to clean out the dead wood.

So you're saying the current Mac OS plus the Java apps and Electron apps are abandonware? Coolio. I guess VS Code users need to use something else too, then.
 
Presumably they're going to need to re-tool to support the slightly different case needed for the scissor-switch keyboard and the revised layout of the Touch Bar etc., so I'm assuming they'll do the 14" screen at the same time because, why wouldn't they?

As to the lack of rumours, I don’t think that means much on the Mac side of things as they seem to manage to keep that under much tighter control compared to iOS. Presumably due to the much smaller number of third party suppliers and manufacturers of things like cases.
 
Chrome will have to specifically use Metal only for one platform (Mac). That's insanity. And yes, this will break all Chromium-based platforms. Browsers and Electron alike. WebGL, by extension, will also run into problems.

Yes they will. I'm surprised that they are not using Metal yet, to be honest. Unity and friends have worked in Metal for a while now. And yes, WebGL on Mac will be implemented via Metal (I'm sure Safari probably already does it). The upcoming WebGPU uses Metal from the start (and its API is heavily based on Metal).
 
Apple had better look into the Ryzen 4000 mobile CPUs. I mean, a Ryzen 4800U is on par with the i9-9880H of the 2019 MacBook Pro while being a tad more power efficient! And that's not even the fastest CPU available.
 
I am convinced that there will be a major refresh to the 13 inch line in 2020.

Why?

1. The keyboard issue is hurting Apple's bottom line and reputation and needs to be changed
2. The 8 series CPUs are starting to be withdrawn from sale by Intel
3. The proposition of an 8 series CPU vs a 10 series CPU already looks bad and will get worse (even though in use it's not all that different)
4. The design language of competing laptops has become more competitive - e.g. slim bezels, better screens
5. People are not going to be upgrading their 2016 to 2019 models for a light refresh and those owning 2016 and 2017 models are ripe as being customers for replacements - Apple doesn't want these people going to Dell etc. for their next laptop

All of the above was also true in 2019. By 2021 Apple will be looking like an also-ran in the core of the Mac lineup unless it gets moving.

Of course, Apple can disappoint - the time it took to update the Mac Mini and Mac Pro is shocking and the iMac design has been surprisingly stagnant.

In terms of whether certain features are ready or not, Apple can cut these if need be. I don't believe a 14 inch design has to be mLED to be worthy of inclusion.
 
You mean the same way they broke support and blind-sided major developers, forcing patches that barely work?

...yes, they totally blindsided them by announcing the deprecation of 32-bit years in advance, and adding "this software will stop working in future versions of MacOS" warning boxes in MacOS 10.14 /s. You're right - nobody saw that one coming.

You're assuming things written for x86 can just be converted to ARM without any colossal undertaking. I'd say history is on my side in this particular argument.

What history? Apple have successfully switched processor architecture twice - and also switched from Classic Mac OS to the not-remotely-compatible OS X (which was a far, far bigger deal than switching processor).

The Surface Pro X has only been around for 6 months, Chrome/Chromium were apparently ready on day one but delayed for some political reason as is Office (despite the widespread FUD to the contrary). Adobe have announced that they will be supporting CC. That's on the strength of one or two, fairly obscure, ARM Windows machines that aren't likely to replace x86 PCs any time soon - whereas if Apple releases an ARM Mac it will be (a) Front-page news and (b) probably come with an announcement of a deadline for the end of Intel Macs.

Meanwhile, the majority of the big Linux open source packages are all on ARM Linux - even the Raspberry Pi has everything from Libreoffice through Minecraft to Mathematica... when all but the latest Pis are built on industry surplus "appliance" chips that are barely adequate to run them.

You do realize that from now until next year, it's basically a time span of... 1 year? That's almost a lifetime in the software development world.

Plenty of time for developers to fix their software for ARM then...

It can't. (Drop OpenGL) Reasons:

They have: https://www.macrumors.com/2018/06/05/apple-deprecates-opengl-opencl-gaming/ - that's not a rumour, it was announced as deprecated (i.e. no more support) 2 years ago and the only speculation involved is how many more OS/hardware updates are left before it breaks or someone finds a critical security flaw. If Apple do a U-turn on that then they could also decide to include it in MacOS for ARM.

So you're saying the current Mac OS plus the Java apps and Electron apps are abandonware?

No, you're straw-manning again. I'm just observing that Java is just the sort of slowly dying but still-widely-used technology that Apple have a track record of killing off.

As for Electron/Chromium - if the developers had been burying their heads in the sand and ignoring the deprecation of things like OpenGL then yes it would be abandonware. However, from what I can tell that's simply not the case - there are already vestiges of forthcoming support - such as an unused "Metal" option in chrome://gpu and a cross-platform OpenGL-to-whatever implementation with Metal support "in progress" (https://github.com/google/angle). No, it's not "done" yet but - as you say - they've got a "lifetime in the software development world" before it is needed.

The "day one" business is pretty irrelevant too. Hardly a year has gone by without a new Mac OS release that comes with a long list of software incompatibilities on "day one". Best guess is that the first announcement would be a developers-only system, 6 months or so before the first ARM Mac becomes available - just like the PPC to Intel switch (and that was rushed because PPC development for desktop/mobile had pretty much stopped and the existing machines were already getting sand kicked in their faces). Plus, most of the rumours are for a 12" MacBook or other low-end ultraportable - the target market for which are unlikely to sweat the lack of Adobe CC, VS Code or Pro Tools on day #1.
 
...yes, they totally blindsided them by announcing the deprecation of 32-bit years in advance, and adding "this software will stop working in future versions of MacOS" warning boxes in MacOS 10.14 /s. You're right - nobody saw that one coming.

10.14 to 10.15 is barely a year. 10.13 is where this all started. "Years in advance" sounds like 2 to me, but sure, 2 is still plural.

Let's just say... agree to disagree?

What history? Apple have successfully switched processor architecture twice - and also switched from Classic Mac OS to the not-remotely-compatible OS X (which was a far, far bigger deal than switching processor).

The Surface Pro X has only been around for 6 months, Chrome/Chromium were apparently ready on day one but delayed for some political reason as is Office (despite the widespread FUD to the contrary). Adobe have announced that they will be supporting CC. That's on the strength of one or two, fairly obscure, ARM Windows machines that aren't likely to replace x86 PCs any time soon - whereas if Apple releases an ARM Mac it will be (a) Front-page news and (b) probably come with an announcement of a deadline for the end of Intel Macs.

You're ignoring what I wrote.

Windows has been in development for ARM for years.

Developers have had years, decades even, to work on their apps for ARM on Windows. So yeah, Google has had years to develop Chrome for Windows on ARM. Of course it would have been ready on day one. By the time the Surface Pro X came out, the Surface Go and (edit: sorry, did not realize the Surface Go was not an ARM device) the Surface RT that came before it had already been on the market for years. Not "a year" but "years".


Windows RT has been around for at least 8 years now. Google did not just start developing 6 months ago. Also, Chrome is in a special situation where it can be open source and not at the same time, so it is entirely possible for Google to develop it in secret, as opposed to some fully open source projects.

Meanwhile, the majority of the big Linux open source packages are all on ARM Linux - even the Raspberry Pi has everything from Libreoffice through Minecraft to Mathematica... when all but the latest Pis are built on industry surplus "appliance" chips that are barely adequate to run them.

I think you also understand that ARM on Linux has been around a long time. Let's just say it's not a surprise that there are ports of things. I have a Pi 3 at home as well. Despite its usefulness, I wouldn't say it runs the desktop environment adequately. It's good for my small coding projects, though.

Plenty of time for developers to fix their software for ARM then...

Well, sure... assuming this mythical Mac OS ARM device exists now. But I don't think it does.

They have: https://www.macrumors.com/2018/06/05/apple-deprecates-opengl-opencl-gaming/ - that's not a rumour, it was announced as deprecated (i.e. no more support) 2 years ago and the only speculation involved is how many more OS/hardware updates are left before it breaks or someone finds a critical security flaw. If Apple do a U-turn on that then they could also decide to include it in MacOS for ARM.

Deprecation is not the same as ending support. I'm merely pointing out the fact that despite the warnings, some developers can't very simply rewrite their entire graphics stack to Metal. This is not the same situation as 32-bit to 64-bit. Ending OpenGL support takes time and requires a rewrite.

No, you're straw-manning again. I'm just observing that Java is just the sort of slowly dying but still-widely-used technology that Apple have a track record of killing off.

I'm not. You're the one basically saying Java is abandonware. Read your own sentence, please? Let's just say my work experience contradicts your statement. And yes, I do work in the software industry. I still see Java projects actively being developed. It's not going away anytime soon. If Apple decides to drop support, good for them but other platforms still fully support Java extensively. It's not the same situation as Flash where it was proprietary, had poor performance, and also poor multi-platform support.

As for Electron/Chromium - if the developers had been burying their heads in the sand and ignoring the deprecation of things like OpenGL then yes it would be abandonware. However, from what I can tell that's simply not the case - there are already vestiges of forthcoming support - such as an unused "Metal" option in chrome://gpu and a cross-platform OpenGL-to-whatever implementation with Metal support "in progress" (https://github.com/google/angle). No, it's not "done" yet but - as you say - they've got a "lifetime in the software development world" before it is needed.

I don't think you get it.

Electron is Chromium-based, which means whatever they do must depend on Chromium. ANGLE does not support Metal yet. There is an effort to make a Metal translation layer for ANGLE, but it's not merged into the main ANGLE branch yet, and it's not even complete, with only partial OpenGL 3.0 support and nothing yet for 3.1 and 3.2.

Source:

Electron developers cannot do anything at all to support this mythical next version of Mac OS until this Metal effort is merged into the mainstream branch. This developer's pull request against the main branch has been pending for 6 months now. Still nothing.

...and I'm not going to entertain conspiracy theories about Google intentionally holding back things. But let's just say things move very slowly in large software stacks. Hence... "a lifetime".

The "day one" business is pretty irrelevant too. Hardly a year has gone by without a new Mac OS release that comes with a long list of software incompatibilities on "day one". Best guess is that the first announcement would be a developers-only system, 6 months or so before the first ARM Mac becomes available - just like the PPC to Intel switch (and that was rushed because PPC development for desktop/mobile had pretty much stopped and the existing machines were already getting sand kicked in their faces). Plus, most of the rumours are for a 12" MacBook or other low-end ultraportable - the target market for which are unlikely to sweat the lack of Adobe CC, VS Code or Pro Tools on day #1.

At least I can agree that a developers-only device makes sense with the current way things are going. Also, yes, I'd agree a 12" MacBook is more sensible.

See, my beef with this is actually not that I'm shooting down the idea of an ARM device running Mac OS. It's mainly that people are overzealous about the next MacBook Pro being an ARM device. Even if that were the case, it's not a transition that Apple can successfully pull off within a year. 3-4? Maybe.

The main problem is still how Apple can slowly steer the ship away from x86 and towards ARM. One thing that helped them with the last architecture change was that x86 turned out to be far more efficient and outperformed PowerPC multiple times over, which allowed Apple to successfully run PPC apps in a translation layer without too much of a performance hit. ARM is "almost as fast" as midrange x86 now, but it's still a far cry from being able to run x86 apps in a translation layer without a significant performance hit.

I'm sure you have seen this?

That's the state of what's currently possible. Assuming the A14 CPU that Apple will come out with next will be 2x faster again, then... sure, it may just run Leopard at "normal" speed with most apps. That sounds like it'll be ready for a 12" MacBook. But it remains to be seen if it's competitive enough to even go toe to toe against the MacBook Air currently.
 
Windows has been in development for ARM for years.

[...]

Developers have had years, decades even, to work on their apps for ARM on Windows.

[...]
Windows RT has at least been around for 8 years now.

Windows RT is quite a different beast from normal Windows. And I suppose there are massive differences in ABI between ARM Windows and x86 Windows.

With the Mac ARM transition there won't be much difference from the average developer's perspective. Sizes and alignment of the basic types are identical, the APIs are identical. You are only in trouble if you are using CPU-specific intrinsics or inline assembly.

And as to OpenGL deprecation and Chrome/Chromium/Electron... once Apple stops shipping an OpenGL implementation (which will probably happen this year), I assure you that Metal support will appear in that stack literally overnight. Once there is a pressing need, issues will be solved quickly. It is not a major engineering issue to begin with, and the most complicated part (shader transcoding) is already done.
 
I disagree about going to 16" purely for thermal reasons, and do see a 14" happening... but I also hope for an 18" MBP :)
 
Windows RT is a quite different beast from normal Windows, and I suppose there are massive differences in ABI between ARM Windows and x86 Windows.

With the Mac ARM transition there won't be much difference from the average developer's perspective. The size and alignment of the basic types are identical, and the APIs are identical. You are only in trouble if you are using CPU-specific intrinsics or inline assembly.

And as to OpenGL deprecation and Chrome/Chromium/Electron... once Apple stops shipping an OpenGL implementation (which will probably happen this year), I assure you that Metal support will appear in that stack practically overnight. Once there is a pressing need, issues get solved quickly. It is not a major engineering issue to begin with, and the most complicated part (shader transcoding) is already done.

There is a good tech video on how Microsoft accomplished Windows 10 on ARM:

And the emulator is probably also the biggest difference between Windows RT and Windows 10 on ARM. The emulator is likely also the reason why Windows 10 on ARM can't be retrofitted onto devices running Windows RT.

Also, even with native Metal support appearing overnight, there are still other issues... and yes, this one relates to architecture-specific intrinsics.

Some may argue "oh, it's just a recompile" but sometimes it's a bit more complicated than that.

I don't doubt that some things can be solved overnight, but I think you and many others are overestimating how fast the software industry can move. If something moves too fast, it risks bugs.

Even now, Catalina still has bugs that are not going to be fixed in the next update.
 
See, my beef with this is actually not that I'm shooting down the idea of an ARM device running Mac OS. It's mainly that people are overzealous about the next MacBook Pro being an ARM device.

Well, that's moving the goalposts a bit. I completely agree that it would be ridiculous for Tim Cook to stand up at virtual WWDC and announce the immediate discontinuation

ARM is "almost as fast" as the midrange x86 now,

You mean the A12X, an ultra-low-power, passively cooled chip designed for a tablet, is "almost as fast" as the 45W, actively-cooled (and alarmingly hot-running) i7 in a 15" MacBook Pro!? Whether that translates into real-world MBP 15" performance is moot, because the only Mac that a mere A12X (or A12Z now) is likely to turn up in (if at all) would be a 12" MB or 13" MBA replacement. Anybody want a MacBook Air that is "almost as fast as" a 15" MacBook Pro?

If Apple replace the Pro Macs with ARM-based machines, they will most likely be running a new "Pro" processor with more/faster cores and extra acceleration gizmos on-chip.

That's the thing with ARM - it might not always offer faster per-core performance than Intel, but it offers similar performance at lower power. Since almost all of the Mac range is thermally constrained (maybe not the Mac Pro, but even that has 'quiet running' as a selling point), same speed + lower power == faster.

I'm sure you have seen this?
That's the state of what's currently possible.

Now you're being ridiculous - that video shows Mac OS X running completely under a software emulator (not even hardware virtualisation) - the entire OS - which is always going to be horribly inefficient compared to an ARM-native version of Mac OS (which is almost certainly already running somewhere in the depths of the Apple flying saucer). On a native OS, even apps running under an emulator will be faster, since all the OS services and GUI will be running native. It's actually impressive that an iPad Pro can get that close to running x86 MacOS usably.

And the emulator is probably also the biggest difference between Windows RT and Windows 10 on ARM.

That and the fact that Windows RT wasn't really Windows - it was completely locked down to the MS App store (which had stuff-all applications at the time) and could only run UWP/"Metro" apps, which (a) everybody hated at the time because they totally dumped the familiar Windows UI (this was around the time of the Windows 8 debacle) and (b) used a different API and could absolutely not be produced from existing Windows apps by just re-compiling. Windows RT was more like the Microsoft equivalent of iOS/Android, except that iOS/Android both had better UIs and more software.
 
Well, that's moving the goalposts a bit. I completely agree that it would be ridiculous for Tim Cook to stand up at virtual WWDC and announce the immediate discontinuation

Which, in all honesty, I know Apple has done before. So this is just us speculating against the unknown. I "know" immediate discontinuation would be bad because many apps are still dependent on OpenGL and would have to be either rewritten or moved to MoltenGL. But that's a different issue altogether. Let's just say there are solutions right now, but they are not that elegant.

You mean the A12X, an ultra-low-power, passively cooled chip designed for a tablet, is "almost as fast" as the 45W, actively-cooled (and alarmingly hot-running) i7 in a 15" MacBook Pro!? Whether that translates into real-world MBP 15" performance is moot, because the only Mac that a mere A12X (or A12Z now) is likely to turn up in (if at all) would be a 12" MB or 13" MBA replacement. Anybody want a MacBook Air that is "almost as fast as" a 15" MacBook Pro?

Yeah, I agree that it's moot to argue against all of these benchmark numbers, what they mean, what they translate to, etc... You and I both know it's a religion in and of itself. All I know as a software dev is... numbers lie and use case dictates what is most suitable. But I digress.

If Apple replace the Pro Macs with ARM-based machines, they will most likely be running a new "Pro" processor with more/faster cores and extra acceleration gizmos on-chip.

That's the thing with ARM - it might not always offer faster per-core performance than Intel, but it offers similar performance at lower power. Since almost all of the Mac range is thermally constrained (maybe not the Mac Pro, but even that has 'quiet running' as a selling point), same speed + lower power == faster.

I'm not going to pull out a crystal ball and start predicting how Apple's ARM design scales with higher thermal and power profiles but let's just agree that it's most likely not linear.

Also, there have been cases where a benchmarked A15 actually doesn't beat x86 even on power consumption.

But... again, nitpicky. There are times the A15 is indeed more efficient. Also... I agree, the A15 is not necessarily that close to what Apple has. We'll see in the end.

Now you're being ridiculous - that video shows Mac OS X running completely under a software emulator (not even hardware virtualisation) - the entire OS - which is always going to be horribly inefficient compared to an ARM-native version of Mac OS (which is almost certainly already running somewhere in the depths of the Apple flying saucer). On a native OS, even apps running under an emulator will be faster, since all the OS services and GUI will be running native. It's actually impressive that an iPad Pro can get that close to running x86 MacOS usably.

I'm simply showing that x86 emulation on ARM is not that "close" to what you'd wish it to be. Sure, it may be faster once you take away the overhead of having to emulate the rest of the system as well, but the jury is out on how much faster it can get.

And again, I'm not dismissing the idea that Apple's next A14 chip can run Mac OS just fine in a 12" MacBook. It's more that I don't see how it can replace a MacBook Pro yet... considering most apps may have to run in an emulation layer for a while. And even then, it'll be interesting to see how it compares against a MacBook Air.

Hardware virtualization can't happen from x86 to ARM due to architectural differences... but please feel free to prove me wrong. Perhaps there is a direct pathway I'm not aware of.

That and the fact that Windows RT wasn't really Windows - it was completely locked down to the MS App store (which had stuff-all applications at the time) and could only run UWP/"Metro" apps, which (a) everybody hated at the time because they totally dumped the familiar Windows UI (this was around the time of the Windows 8 debacle) and (b) used a different API and could absolutely not be produced from existing Windows apps by just re-compiling. Windows RT was more like the Microsoft equivalent of iOS/Android, except that iOS/Android both had better UIs and more software.

You could actually run non-MS Store apps on Windows RT. The "lockdown" MS put in place was simply a soft lock in the kernel, not an actual limitation of the platform. Once the "lock" was lifted, it could technically run anything. The problem was... there was nothing available outside of dev environments at the time that could run like that. So app availability was what ultimately killed it. Or, yeah, you could also say the dev platform made it too difficult. I'd argue it mostly just wasn't worth developing for.

Now, I'd even argue that an ARM Mac OS may run into the same limitation, and may even have the same soft "lock" imposed by Apple because it's not like they haven't tried that before. Ahem...

But anyways, I think this discussion is going to devolve into the same thing the ARM MacBook thread has: it'll just be speculation past this point as to how Apple will approach the problem. I know these things as facts:

1. x86 emulation on ARM is horribly slow, even on Apple's more efficient ARM chips right now. The jury is still out on whether Apple will take that route. If they don't, the ARM MacBook is initially going to be less useful for the portion of Mac users who depend on legacy (or not-yet-updated) apps. It sounds like such a machine will mostly be helpful to developers.

2. Others have tried to produce desktop environments on ARM that could become "alternatives" to the regular Windows/Linux/Mac platforms, and... they have all failed to some degree, at least as far as "consumer-ready" is concerned. I personally really love the Raspberry Pi, but it's not my main desktop device, and I can't see how I'd convince my grandpa to use it over his beloved Mac. I'm hoping Apple will buck the trend, but let's just say the landscape is bleak.

3. Chrome/Electron/Java, etc... the usual suspects still need to be updated. We'll see if it happens "overnight". I'm inclined to believe it won't, and you're perfectly entitled to say it could, but honestly, you won't be able to convince me, just as I know I can't convince you. This is out of our hands and up to the developers maintaining those projects.

4. The vast majority of Apple engineers are still working from home as of this moment... all the way up until the end of May. And then we'll see if the situation remains the same in June and July. I don't know if they can still design hardware efficiently at home, so we'll see if this impacts anything in terms of timeline and overall readiness of whatever their next platform may be.

And there, those are the things I think we can all agree on. The rest I'd say is all speculation and should be reserved for the ARM thread. I won't respond to specific points past this point, since I don't think I'll have any data to back anything up. But at least I have presented all the data I could. You can choose to interpret it any way you like, and I'd agree that's still perfectly valid.
 
I disagree about going to 16” purely for thermal reasons, and do see a 14” happening... but I also hope for an 18” MBP :)

We might as well customise our screen sizes at the rate it's going. I would like a 13.87463739" screen please :rolleyes:
 
We might as well customise our screen sizes at the rate it's going. I would like a 13.87463739" screen please :rolleyes:
that's not very nice ✌
I disagree about going to 16” purely for thermal reasons, and do see a 14” happening... but I also hope for an 18” MBP :)
I remember inheriting a 17" MacBook Pro from work around 2011. Omg, those things were boat anchors! I actually gave it back, it was so unwieldy!
 
I disagree about going to 16” purely for thermal reasons, and do see a 14” happening... but I also hope for an 18” MBP :)
Dell's upcoming XPS 17 uses a 16:10 display, and its overall dimensions and weight are very close to the MBP 16" - so that's an indication of what might happen if Apple continues down the bezel-minimisation route (eventually we could get 17" and 15" machines in a similar form factor to the current 16" and upcoming 14").
 