I think when considering an Apple computer you have to look at the whole package: the design of the machine, the speed of the processor, how cool and quiet it runs, battery life, the software that comes with it, the operating system, and the price.
You are going to lose out on some things. For example, a 24” M1 iMac or M1 Mac Mini doesn't stand up to the latest Intel desktop machines in terms of brute processor speed. But it's a vastly quieter, cooler, more elegant machine, while still being blazingly quick in most areas.
The M1 MacBook Air is excellent value and very fast for the price; most laptops in its performance bracket are around 2,000 euros here. And the M1 Pro MacBook Pros are cutting-edge laptop technology, bar none, as long as you don't care about gaming.
Gaming on macOS with these machines could be amazing too; time will tell. Developers seem to be getting interested, and market share will speak, but it will take time either way. They're still great machines for casual gaming or a handful of titles, without any fan noise.
This sounds like an emotional reaction rather than an intellectual one. What’s the problem with “emulation”? Entire industries run in virtual machines and emulated environments. The web is based on “emulation”. More so, x86 is based entirely on “emulation” - x86 binary is just a convention that CPUs translate into native microcode. At the end of the day, why do you care whether your x86 application is translated on the fly into some sort of internal code the CPU runs, or translated on the fly into ARM code the CPU runs? I would think the main concern is correctness rather than what happens under the hood.
I thought he was talking about running emulated x86 on a Windows-on-ARM VM, running on Apple Silicon.
Perhaps you mean Rosetta, which I understand is essentially a translation layer that runs (mostly) at installation time, with some on-demand recompilation to ARM64?
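For what it's worth, you can check whether a given process is actually going through that translation layer: macOS exposes a sysctl flag, sysctl.proc_translated, for exactly this. A minimal Swift sketch of the check:

```swift
import Darwin

// Returns true if the current process is running under Rosetta 2 translation.
// sysctl.proc_translated reports 1 when translated and 0 when native; the OID
// doesn't exist at all on older systems, which we treat as "not translated".
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return false // sysctl missing, e.g. an older Intel Mac
    }
    return translated == 1
}

print(isRunningUnderRosetta() ? "Running under Rosetta 2" : "Running natively")
```

Activity Monitor shows the same thing in its Kind column (Apple vs Intel), which is usually the quicker way to check.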
I meant other kinds of emulation (other types of machines).
Well said. I'm a software guy and don't care what the hardware really is -- as long as it runs what I want it to run, I'm happy.
Which is why you're insisting that you need to run your Windows apps on an AS Mac, right?
One word: Rosetta. The issue here is whether certain software runs well enough and, in some circumstances, whether it actually runs better on an Intel-based Mac than an Arm-based Mac. With this perspective, an Intel i9/i7 may serve you better than an M1. For example, ATP from Fidelity runs better on the Intel-based machines. There may be other examples.

Rosetta does not let you run a full x86 Windows VM. To my knowledge, doing that is not possible. You CAN run Arm Windows 11 in Parallels.
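Worth adding that the reason the Arm Windows/Linux route works so well is that it's virtualization rather than emulation: the guest's ARM code runs directly on the Apple Silicon CPU. Apple even ships a Virtualization framework for this (Parallels uses its own engine, but the principle is the same). A rough sketch, with hypothetical sizes, of describing such a guest -- a real VM also needs a boot loader, storage and other devices before validation will pass, plus the virtualization entitlement:

```swift
import Virtualization

// Rough sketch: describe an ARM guest that Apple Silicon can run natively.
let config = VZVirtualMachineConfiguration()
config.cpuCount = 4                          // hypothetical sizing
config.memorySize = 4 * 1024 * 1024 * 1024   // 4 GB, hypothetical

do {
    try config.validate()                    // flags anything still missing
    let vm = VZVirtualMachine(configuration: config)
    print("Configuration accepted: \(vm)")
} catch {
    print("Configuration incomplete: \(error)")
}
```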
Regarding the GPU, don't trust the numbers. My iMac 2017 5K had a Radeon Pro 580, which is supposed to be much faster than the M1, but my girlfriend's M1 MacBook Air did GPU-accelerated tasks like photo editing faster than my iMac.

Yes, you can't rely on just one benchmark. OTOH, the 580 is not a particularly fast GPU; the Vega and RDNA GPUs in later iMacs are much faster.
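If you want to see what the machine itself reports instead of relying on spec-sheet numbers, Metal will tell you. A quick Swift sketch -- the unified-memory flag is part of why the M1 punches above its paper specs for things like photo editing, since data doesn't have to be copied between CPU and GPU memory:

```swift
import Metal

// Print every GPU macOS exposes, with properties that often matter more in
// practice than headline benchmark numbers.
for device in MTLCopyAllDevices() {
    print("GPU: \(device.name)")
    print("  Unified memory: \(device.hasUnifiedMemory)")
    print("  Recommended working set: \(device.recommendedMaxWorkingSetSize / 1_048_576) MB")
}
```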
Firstly, virtualization is completely different from emulation. Virtualization allows (more or less) direct access to the hardware, while emulation makes you dependent on an emulator (there's a toy sketch of that overhead after this post). And that's exactly what you do not want if you can avoid it.
You are right, emulated environments are very much a thing, but they are only a thing where they are absolutely required (e.g. for super outdated and critical software that would be too expensive and too much of a hassle to rewrite), and they are always a massive pain in the ass for everyone working with them. It's not about the performance hit, it's about reliability and maintainability.
What happens if Apple eventually scraps Rosetta while you are still very much reliant on it? Apple scrapped the original Rosetta in 2011 btw, 5 years after its introduction.
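To make the virtualization-vs-emulation point concrete: an emulator has to fetch, decode and dispatch every guest instruction in software (or translate it first), whereas a virtualized guest of the same architecture just runs on the CPU. A deliberately toy Swift sketch of that interpreter-style overhead -- the opcodes and the "machine" here are entirely made up for illustration:

```swift
// Toy "guest program" in a made-up instruction set, purely illustrative.
enum Op {
    case loadImmediate(reg: Int, value: Int)
    case add(dst: Int, src: Int)
    case halt
}

// An emulator loops over guest instructions and decodes each one in software;
// that dispatch work is overhead a virtualized guest never pays.
func emulate(_ program: [Op]) -> [Int] {
    var registers = [Int](repeating: 0, count: 4)
    var pc = 0
    while pc < program.count {
        switch program[pc] {
        case let .loadImmediate(reg, value):
            registers[reg] = value
        case let .add(dst, src):
            registers[dst] += registers[src]
        case .halt:
            return registers
        }
        pc += 1
    }
    return registers
}

let program: [Op] = [.loadImmediate(reg: 0, value: 2),
                     .loadImmediate(reg: 1, value: 40),
                     .add(dst: 0, src: 1),
                     .halt]
print(emulate(program))  // [42, 40, 0, 0] -- the same math a native add does in one instruction
```

Rosetta avoids most of that per-instruction cost by translating ahead of time, but it is still a layer you end up depending on, which is exactly the maintainability worry.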
He's not speaking from experience. Highly unlikely any Fortune 500 company would risk using emulation for anything production or mission critical. Rather, it's the mom-and-pop shop that would resort to emulation, and it'll be for things that don't risk liability, like retiring 1980s Apple II hardware and resorting to emulation for making piano music scrolls.

You're right, I've never worked for a Fortune 500 company -- never said I did, but I haven't worked for mom-and-pop companies either. I'm talking from the vantage point of small to mid-level manufacturing, you know, the people who don't have a Fortune 500's budget and who run on small margins in the U.S.! I laughed because of the absurdity of me working for a Fortune 500. I've never wanted to!
Based on your needs you are probably at the high end of what the M1 is optimized for, but the next jump up to the MBP with M1 Pro may be a more expensive jump. Since Apple's products are in flux right now, if you need to buy now I would go with the M1 13” MBP, or even the M1 Air depending on your budget. Consider this a short-term decision for the next couple of years. By then all of the products will have transitioned and moved on to their next revision of the M-series chips, software will be fully adapted to AS, and you will have more experience with the new platform and can make a more informed choice. Getting the M1 will give you a good tool at a reasonable price. In a couple of years, you can trade it in on something that meets your future needs.
I'm aiming for those models, but I can hold off until the next generation. I grabbed my Mac mini at the right time, but the tech side of me wants to play with these new chips. My target machines for my upgrade will be the MBA, MBP 13-inch or iMac.

It's hard to know what Apple will do with the lower-end AS as far as monitors go. The M1s are currently set up with display controllers for two screens. For the Air and iMac that is one internal and one external; for the Mini, both are external. There does not seem to be any architectural limitation, more of a product segmentation decision. They may decide to expand that in future versions, or they may decide that the number of monitors is sufficient for purely consumer models and anyone with a need for more monitors will need a "pro" model. Hopefully we will see what they are thinking when the M2 models come out later this year or early next.
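If you want to check what your particular machine is driving right now (as opposed to the controller limit), AppKit will list the active displays. A small Swift sketch:

```swift
import AppKit

// List the displays macOS is currently driving. This reports connected,
// active screens only, not the hardware limit of the display controllers.
for (index, screen) in NSScreen.screens.enumerated() {
    let size = screen.frame.size
    print("Display \(index): \(Int(size.width)) x \(Int(size.height)) points, scale \(screen.backingScaleFactor)x")
}
```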
Unless the high-end mini can handle four monitors in the future?
Just kidding. I would imagine future Macs will be able to handle three screens minimum.
Every time we have some kind of 'what should I buy' thread, these "But I have to run Windows" people come out, even in threads like this where the OP wasn't asking about Windows compatibility at all. Frankly, it's a little tiresome. If you need to run x86 Windows, the days of doing that on a Mac are numbered. Sure, you can buy an Intel Mac and do it there (or run Boot Camp), but then you're buying a less capable Mac and hurting your macOS and Mac apps experience so you can run Windows.

Looking at the hardware as a tool: if you HAVE to run x86 Windows AND you want the best Macs, you want Apple Silicon Macs and a Windows box. If you can compromise your Mac experience, buy an Intel Mac for now, which pushes the decision out a few years. But the writing is on the wall... At some point Macs will all be Apple Silicon and you will be choosing to use vastly inferior hardware (an older Intel Mac) to maintain the ability to run x86 Windows.
That makes no sense if one is primarily a user of Mac software. Think about it. It's 2026. You can buy a Mac with the M6, which is 5x as fast as the M1. But you're going to stick with your old 2019 Intel Mac... because of Windows? Where's the logic in that? Aside from jumping up and down saying "But I want to run it on Mac!!!!" what would be the argument against buying a Windows machine to run Windows apps?
To be honest, even as a Windows network administrator, the requirement to run x86 Windows VMs, even for me, is going away. Why?
Web tools
Cross-platform applications
Spinning up a VM for testing in Azure is trivial
Spinning up a desktop, if required, in Azure is trivial

This ^. A lot of enterprise software is now implemented as web apps that can run on just about any client browser, or with cross-platform frameworks. It's really only specialist industrial, engineering and scientific software that depends on powerful Windows client machines (plus lots of games, of course!).
Yes, some people probably need Windows. If that's the case you're probably better off with a Windows machine. If you really hate running Windows on the bare metal that badly, consider it a motivator to adjust your workflow to use/demand native Mac tools. Everyone running Windows in a VM and not telling the vendor to port their stuff to native Mac applications is just furthering the problem. Email or log a ticket with the vendor requesting a Mac version. Sure, it won't happen overnight, but if you just run it in Windows without ever requesting a native version, it will never happen. Ever.
Or just suck it up.
We have a few small web apps, but really, the gawd-awful data entry capabilities of browsers just don't do it. Maybe when we have something approaching a fat client in capability, everyone will move to it, even me <g>, but for now, no way, no how. It's a dying fad.
I used to run local Windows & Linux VMs all the time. I now just run them on cloud services (Azure or AWS mostly). I like to keep a Windows machine around "just in case" but have very little cause to use it. Some of my hobby astronomical software is Windows-only, and it needs to interface with equipment, so a cloud instance isn't possible.
Thanks for the reply. I'm curious about what is lacking in the data-entry capabilities of web-apps for your usage?
Large spreadsheets (e.g. Excel online) are less functional than the fat clients currently, but I expect this to improve.
One of the big bonuses of the M1 is also being able to run iOS apps on the Mac. It's really hard to use anything other than the Apple ecosystem at this point because everything just works so ridiculously well together. I've all but given up on PC gaming.