VMware Fusion has always been better for serious work than Parallels, so the announcement of the VMware Tech Preview with support for the macOS 11.0 Big Sur beta, coming in early July, is very encouraging. I suspect Parallels is further along in development than VMware (probably a smaller, more agile team).

For Windows 10 virtualization, Parallels has proven faster at certain operations than the Fusion version available at the same time; then Fusion would bake in some optimizations to even things up in its next release. But Parallels' accelerated graphics performance has remained superior to Fusion's across pretty much all versions for Windows guests, so for those brave souls doing Windows gaming in a VM, Parallels has always been the better choice. For serious business use, where 100% consistent stability is a highly prized attribute, VMware Fusion is the hands-down winner. Parallels' stability has improved in recent years, though.

I most often use virtualized Linux environments, but I do spin up a Windows VM once or twice a year for certain projects.
 
Windows 10 on the Surface Pro X runs great, except for the apps: you can only run the ones from the Windows Store, and 90 percent of them are crap.

I hope this will push MS, or even developers, to make more apps work with ARM CPUs. Or to add a Rosetta-like feature to Windows 10.

Gaming... I can definitely forget about that; I've been using Boot Camp for that part, since it needs full resources.
 
I think Apple will still launch Macs with Intel CPUs in 2027 :)

For professionals, the peace of mind of being able to run the odd work tool that does not run on macOS, or runs badly, is valuable, even if the ones who end up doing it are few compared to the rest.

I guess it will depend on how much of a performance breakthrough Apple is able to present with their next-gen macOS devices. If it's triple the speed at peak and consistently double at the current price, it's something the market will consider, I guess.

The move to Intel was different because of that peace of mind. Many Windows dev users migrated because of it ... including myself.

“why not try the Mac? I can always run Windows if ...” - There is an intangible value to this.
 
Well, first you need to understand that in my "day job" I'm working with Linux servers much of the time, so my #1 priority for my desktop is a useful terminal window and a text editor! I'm not doing much of anything with producing PDFs, so I'm the wrong guy to ask there.

A lot of people like Inkscape, but I have no idea how you'd get stuff from OmniGraffle into it. Again, I'm not the guy to ask. My general approach for "replacement programs on Linux" has been to download a bunch of them, try each one for a week or two and see what meets my needs best. There's always the option of migrating to a cloud-based tool like Lucidchart, since those are more or less platform-agnostic.

I'm less of a Gnome fan than I used to be, but I still prefer it to KDE. Back when I was running desktop Linux regularly (more than a decade ago), I actually preferred Enlightenment over anything else... but that's never going to be a set-it-and-forget-it option for people. The nice thing about Linux, though, is you can install and try all these different window managers out yourself, and even switch back and forth... as long as you've got the disk space to install all their dependencies.

QuickLook - Gnome apparently has a tool called Sushi which provides this functionality in Nautilus. Not something I've used on a server, though, and it didn't exist back when I was using Gnome with any frequency.

There's also the option of running specific Windows software that does what you want, via Wine. The Codeweavers folks have helped Wine take great strides with regards to Windows app compatibility.

Thanks for your lengthy reply.

My biggest gripe with Linux is the lack of stability in some of the core elements of the system. I know that it is 'boring' for a developer to simply maintain an existing code-base and far more exciting to try out something new, but for the user base this becomes challenging before long (looking at you, Apple). However, certain projects like LaTeX have shown that it can be done. I wish a lot more of the Linux guys would adopt this mind-set.

The Linux desktop: there is simply too much choice and not enough stability for my liking. Whenever something finally seems to deliver on the promise, the developers get bored and start a side-project. Also, maintaining dependencies between libraries is not as easy as 'having enough disk space'. I recently tried to install a current version of OclHashcat with AMD driver support on Ubuntu. I must confess, I failed miserably :(
For this reason I had given up on Linux and Windows and switched to the Mac.

I am looking for a fairly stable, low-maintenance compute environment. I think I would really like to try out a FreeBSD desktop. Unfortunately, its support for audio/video and 3rd-party software is even more limited.
 
Those aren't virtualisation products, they're emulation products. Bytecode emulation is significantly slower than virtualisation as it cannot be hardware assisted.
Well, there's a fine line between emulation and virtualisation. They are both, if you want to be pedantic.

Since Craig said all the demos were done on an ARM Mac, which included Parallels.

If this article is correct, then Parallels was running on ARM natively rather than through Rosetta. So what I am saying still stands: there will be some software that does that.
 
I use Parallels for Windows, but I'm not really worried about Apple going with ARM. They will still produce Intel MacBooks for at least 2 years, so if I buy a good MBP at the end of that window, it will last me 3-5 years, and who knows what will happen in 5-7 years with Parallels and Windows. Too early to panic, imo.
 
I'm glad I don't have to think about this for a while. Happily using my Boot Camp-enabled Mac for the foreseeable future.

Maybe I'll be able to buy the 16" with 5600M used at one point.
 
I really hope we see the rise of some form of x86 PCIe accelerator cards for Mac Pros for virtualisation purposes (maybe thunderbolt based versions for non Mac Pros) so that we can still continue to run x86 virtual machines at full speed and have the best of both worlds.

The reality for me is that while I do use an x86 machine for business critical work almost every day, the software I use is not exactly high performance (payroll/accounting software) and I likely would barely notice a difference in performance from running in a virtual machine on my 2012 x86 Mac Pro compared to one on a 2021 Arm Based Mac Pro.

The thing that would make more of a difference to me at this point is if Apple brought back things like Target Display Mode so that I could easily run a Windows PC through my Apple display. Without that feature, it blocks me out of being able to buy any of their iMacs or Apple displays (until a compatible KVM is released).
 
Everyone thinking that Parallels and VMWare can magically deliver x86 emulation by just modifying their existing virtualization products had better think again. It's a different, and much more difficult, problem. Their current codebase is probably 5% of what a final product would be.

It's like asking the copy boy to translate the copies to Chinese while he's at it.
 
Before panic sets in: first, there are six months until new Macs will be available. Companies selling VMs have six months to take action.

Most important: this applies to a VM that is itself written in Intel code. The VM uses Intel virtualisation instructions; the code inside the VM doesn't (unless it tries to create its own VM, which won't work). So somebody writing a VM has to translate it into ARM code, using ARM virtualisation instructions, which will obviously run fine, and then connect Rosetta so it translates the Intel code _inside_ the VM. That's not trivial, but there are six months to do it.

To elaborate: We have ARM MacOS with Rosetta built in, which can run X86 code. If someone built ARM Linux with Rosetta built in, that could also run x86 code. And then we could have an ARM VM running ARM Linux with Rosetta built in, and that could run x86 code. Now if you are the one building the ARM VM then you could allow access to the MacOS Rosetta instead of making a copy, which would avoid all copyright problems.
 
I don't believe Monday's WWDC Hype !

I suspect Apple's new custom Si is intended ONLY for new, lower-end versions of their MacBook Air & 13" MacBook Pro.

NOT a good fit for ANY of their other Macs (for multiple reasons) !

To increase market share, AAPL knows they need to offer cheaper Macs.

As such, it's fairly obvious they will pass the savings on to customers.

As such, the list prices could/should drop by $200 USD per Mac for those two new lower-end versions !

It's a (damn) good strategy !

Take your pick:

MacBook Air for $799 USD, OR 13" MacBook Pro for $1099 USD.

I think either one would be cool to have by year's end !

And please remember, NONE of the so-called Pro Stock Analysts who cover AAPL for a living have ANY Engineering OR Software Development experience ! ... all but a few are very easily Fooled by the Hype !

Apple pass on any cost savings?

As nice as that would be, can't see it ever happening.
 
I wonder, though, once most developers are using ARM for their apps, how long they will support Intel. Does anyone know how long companies supported PowerPC after that transition? Our office bought some 2019 iMacs and we are wondering about their longevity.
There's a difference: PowerPC and Intel were quite different. Big-endian vs. little-endian, alignment rules, etc. Maintaining two versions was work. Maintaining Intel and ARM is zero work. Consider that most iOS developers run their iOS code on an Intel machine quite regularly, when they use the iOS Simulator. Some even have an iMac Pro so they can run twenty or thirty copies of their app simultaneously for testing. Using Intel code.
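The byte-order point above can be made concrete with a short sketch (a generic illustration, not tied to any particular toolchain): x86 and 64-bit ARM both store integers little-endian, while PowerPC ran big-endian, which is why naively serialized data was a real porting hazard in that earlier transition:

```python
import struct

value = 0x01020304

little = struct.pack("<I", value)  # byte layout on x86 and (normally) ARM64
big = struct.pack(">I", value)     # byte layout on big-endian PowerPC

print(little.hex())  # 04030201
print(big.hex())     # 01020304

# Data naively written on a big-endian machine and read back on a
# little-endian one comes out scrambled -- the class of bug the
# PowerPC-to-Intel transition surfaced, and which Intel-to-ARM avoids.
assert struct.unpack("<I", big)[0] == 0x04030201
```

Since both current architectures agree on byte order, this whole category of subtle data-layout bugs simply doesn't apply to the Intel-to-ARM move.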
Edit: And I sure as hell am not taking any odds against them doing so. They'd have to be nuts to leave all that money on the table.
Especially if only ONE does it, because then all the virtualisation money is theirs. If VMWare decides not to create a VM, then Parallels can grab the whole market and charge good money. And vice versa obviously.
 
So why can't it run ARM version of Windows?

We don't know that it can't. Or, more to the point, we don't know that it won't be able to run it when it becomes an issue in 1-2 years' time. There's no fundamental reason why not, and there's no fundamental reason why it won't be able to "bootcamp" ARM Windows either, but we simply don't have enough info to know what the more technical issues might be.

You can certainly run Docker on ARM, but similarly to Docker on x86, it will use the ARM-Linux kernel, so your images will need to be based on ARM-Linux.

...or it could run the kernel under x86 emulation/translation. Apple said that it will include "new virtualization technologies"; the macOS Hypervisor framework isn't new, so it's possible that this could include some sort of Rosetta 2-based support for emulated x86 VMs. It won't be very fast, but if you're doing web/server development, all you need is fast enough.

...or (and this is probably the real point) it's 2020 and there's no reason for VMs to be hosted on your desktop machine any more.

1) Why is x86 Linux virtualization "key"? If you just need a development platform, what is wrong with ARM Linux? (assuming you are not developing x86 Linux native apps).

There's nothing wrong with ARM Linux; Linux software that doesn't already compile for ARM64, ARM32, x86, PPC etc. is the exception rather than the rule. Linux/Unix have long been built on source-level compatibility across multiple processor architectures. Even for "developing x86 Linux native apps", 90% of the work can be done on an ARM machine, and the lion's share can be done on macOS.

It's not as if developing on an x86 avoids the need for access to a variety of other hardware for testing.

Someone will have to build ARM versions of any upstream Docker (or similar) images that you use - but lots already exist and with the current interest in ARM servers from Amazon and others, that's already on the to-do list, and the (rough) Docker equivalent of "Universal binaries" are already a thing.
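As a concrete sketch of that "universal binary" analogue (the image name, tag, and `app.py` are placeholders, and this assumes the Docker `buildx` plugin is set up on the host):

```dockerfile
# One Dockerfile, built for both architectures; the base image is itself
# multi-arch, so the right variant is selected per platform at build time.
FROM python:3.11-slim
COPY app.py /app/app.py
CMD ["python", "/app/app.py"]

# On the host, buildx assembles a manifest list covering both:
#   docker buildx build --platform linux/amd64,linux/arm64 \
#       -t example/app:latest --push .
# An ARM Mac pulling example/app:latest then gets the arm64 variant
# automatically, just as an x86 box gets amd64.
```

The manifest list is what makes the tag architecture-neutral: clients resolve it to the matching image without the user doing anything.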

Seriously: the vast majority of development for iPhone/iPad/Watch/Android/Raspberry Pi/embedded systems is done on Macs/PCs with different processor architectures. Why do people think it's impossible the other way around? (Ans: Windows - it has always been tied to x86 by its need to run 30-year-old binaries).

2) A lot of development is moving to cloud platforms. I use AWS EC2 instances as my Linux sandbox. It only costs a couple of cents to run per hour, and performance is just as good as a local VM provided you have a good internet connection.

Bingo! Somebody gets it! ...and have you tried any development without a good Internet connection recently?

Meanwhile, if you really can't use the cloud, it's still 2020 and a box the size of a Mac Mini will give you all the x86 you need, and you can even VPN into it from Starbucks.

People are still acting as if Tim is going to break into their house tomorrow and steal their Intel Macs. Or as if there won't be any Intel Macs for sale in 2 months' time when they need to kit out a new employee. To be fair, Apple does have a history of leaving people hanging like that, but from what was said at WWDC they're taking this transition seriously, and people now have 2-3 years' notice to review and modernise their workflows.

Back in 2006, I found Parallels and Boot Camp with Windows and Linux immensely useful. That's really diminishing now: for website work most of us can now ignore Internet Explorer (Callooh! Callay! Oh, frabjous day!) and Firefox/Chromium are pretty consistent across platforms. In the last couple of years I've wasted significant time chasing down what turned out to be Parallels bugs affecting Firefox and Electron, plus there's the whole issue of testing things on a touchscreen. As for "traditional" Windows app development on x86 Macs: if I needed to do that again today, my first purchase would be a Windows PC. But on a 2-3 year timescale, Windows for ARM could (a) be a viable development platform and (b) something that has to be supported anyway. Modern Windows development (using .NET or whatever it's called now) is CPU-independent anyway.

Those aren't virtualisation products, they're emulation products. Bytecode emulation is significantly slower than virtualisation as it cannot be hardware assisted.

Not entirely true - Rosetta/Rosetta 2 works by - wherever possible - translating code to native ops in advance, which can be far more efficient than traditional emulation. It's possible that Apple have something up their sleeve that uses Rosetta technology to accelerate x86 "virtual machines".
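The distinction can be sketched with a toy example (entirely my own illustration; Rosetta's actual machinery is far more sophisticated). Interpreting a foreign "instruction stream" pays dispatch overhead on every instruction, every run, while translating it once up front yields native code with no per-instruction dispatch:

```python
# Toy "instruction set": each instruction is (op, operand); the machine
# has a single accumulator register. Purely illustrative.
PROGRAM = [("add", 5), ("mul", 3), ("add", 1)]

def interpret(program, acc=0):
    """Classic emulation: dispatch on every instruction, on every run."""
    for op, arg in program:
        if op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
    return acc

def translate(program):
    """Ahead-of-time translation: pay the dispatch cost once, producing a
    native Python function with no per-instruction dispatch at run time."""
    lines = ["def _translated(acc=0):"]
    for op, arg in program:
        lines.append("    acc %s= %d" % ("+" if op == "add" else "*", arg))
    lines.append("    return acc")
    namespace = {}
    exec("\n".join(lines), namespace)
    return namespace["_translated"]

native = translate(PROGRAM)
print(interpret(PROGRAM), native())  # both compute (0 + 5) * 3 + 1 = 16
```

The translated function gives the same answers as the interpreter for any starting accumulator value; it has simply paid the decoding cost once instead of on every execution, which is the essence of the ahead-of-time approach described above.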

All this non-news article is saying is that the existing x86 Parallels/VMWare/Docker binaries don't work with Rosetta (do bears relieve themselves in the woods?). Apple showed some form of Parallels and Docker running on the new Macs (...and I don't recall them saying whether they were native or x86, or saying anything about their performance).
 
Everyone thinking that Parallels and VMWare can magically deliver x86 emulation by just modifying their existing virtualization products had better think again. It's a different, and much more difficult, problem. Their current codebase is probably 5% of what a final product would be.
I thought about it again. Rosetta is there. On the Mac that the VM is running on. No need to duplicate it.
 
Everyone thinking that Parallels and VMWare can magically deliver x86 emulation by just modifying their existing virtualization products had better think again. It's a different, and much more difficult, problem. Their current codebase is probably 5% of what a final product would be.

It's like asking the copy boy to translate the copies to Chinese while he's at it.
Indeed, emulation is a different, more difficult problem... a problem that can be solved with talented engineers and hard work. Connectix did it for PowerPC some years ago. Magic is unlikely to play a role. It is doubtful that those vendors would release an x86 emulation product as merely another version iteration of their respective existing virtualization products, though.

Most of the x86 CISC instructions are pretty straightforward but some of the newer ones are a mess (Intel's desperate attempts to eke out a bit more performance from their uninspiring processors).

It's also possible that, over time, Apple's SoC could offer some degree of direct, onboard emulation support. Remember that the Apple-designed chips are not ARM, they merely incorporate a number of ARM cores -- and a whole lot more.
 
I expect Apple to take the lead and (in time) make it possible for Apple apps to be run on Windows. They are moving towards Swift for Windows.

In a few years we will probably be able to develop "universal" apps: develop anything on the Apple platform with Xcode in Swift/SwiftUI and compile for iOS, iPadOS, macOS or... Windows. That way they would make it attractive for developers to start at Apple and "code once" for many platforms, including Windows. Much less need for a VM then...
 
When Final Cut Pro starts running circles around anything on the Intel side, I’m guessing the Mac will become a favored platform for video editing.

But this won’t happen, not with any fully comparable version. For all the ARM talk, I’ve not seen any major software maker ship an ARM version that is remotely on par with the real thing. It’s mostly toy-like apps.

Make no mistake: the only reason Apple is doing this is an attempt to leverage iOS, nail down the App Store as the only way to install ARM apps, and desperately turn macOS into a more suitable cash cow like iOS. But the software has a LONG way to go before it’s even remotely on par. Years.
 
This will mean either the current intel Mac I have for work is my last work Mac, or we need to find a new CAD software that runs natively.

I wonder if anyone is going to match or exceed the power of Microstation / Powerdraft for 2D drafting and make it available on the Mac? I've been waiting for years and years for this but no one has come close (and please don't tell me to use Autocad which I loathe).
 
Well, I suppose I’ll buy an XPS or a Thinkpad X1 Extreme when my MBP 16 comes out of service.

That sucks. But there’s just no way any of the apps I need are going to be rewritten for ARM; most of them don’t even exist for macOS, but I get by with Boot Camp or VMware Fusion.

I know a lot of other fellow engineers of various types will feel my pain here.

Ah, ****. I’m sorry that the ARM transition is going to screw with your workflow. Here’s to hoping they figure something out before your MBP 16 needs replacing (which hopefully won’t be for several years, at the very least. That’s a beast of a machine😎)
 
This will mean either the current intel Mac I have for work is my last work Mac, or we need to find a new CAD software that runs natively.

I wonder if anyone is going to match or exceed the power of Microstation / Powerdraft for 2D drafting and make it available on the Mac? I've been waiting for years and years for this but no one has come close (and please don't tell me to use Autocad which I loathe).

What if a VM can still run, with the only restriction that it will be 'emulated'? If Apple Silicon is as fast as Apple claims, then the end user may not even notice a big difference. I'd wait to see how this whole migration turns out. It might not be as bad as some of us now expect it to become.
 