Wow~~~, that is exciting. I'm using a 27-inch 2019 iMac now, and my last MacBook Air dates back to 2011 (i7, still working all right). It will be time to upgrade my MacBook Air! ;)
 
Wow, I am so impressed by these benchmarks! That's a MacBook AIR that's benchmarking slightly faster than my fully upgraded 12-core 2009 Mac Pro 4,1/5,1 (so very nearly 42 GHz across 12 cores). Of course, it's still somewhat likely that doing something like a HandBrake conversion (once HandBrake is optimized for ARM) will overheat even the M1 MacBook Air and throttle like heck, but if not? Wow. Just. Wow.

Imagine a world where, 5 years from now, we have Boot Camp again because Windows has largely migrated to ARM due to the performance offered? Imagine being able to throw down your MacBook, plug in your HMD, boot into Windows, and run the latest and greatest VR experiences rather than being stuck building a dedicated VR rig?

I won't drink the Kool-Aid until we really see how this all shakes out, but at least I'm reasonably hopeful now that this won't just be a gigantic fustercluck.

It's likely I'll have my hands on a bunch of MacBook Airs with the M1 through an educational channel and the CARES Act in the next 6 weeks or so. I'll be sure to test how badly they thermal-throttle (if any tools are available to do so at that point).
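For anyone planning similar tests before macOS-specific tooling catches up, a rough, portable way to spot thermal throttling is to time an identical CPU-bound workload back to back and watch whether the per-round time creeps up as heat builds. A minimal Python sketch (the workload size and round count are arbitrary placeholders, not a calibrated benchmark):

```python
import time

def busy_work(n=1_000_000):
    # Tight integer loop to keep one core pinned.
    s = 0
    for i in range(n):
        s += i * i
    return s

def measure_throttle(rounds=10, n=1_000_000):
    """Time identical workloads back to back.

    Steadily rising round times suggest the CPU is throttling;
    flat times suggest it is holding its clocks.
    """
    times = []
    for _ in range(rounds):
        t0 = time.perf_counter()
        busy_work(n)
        times.append(time.perf_counter() - t0)
    baseline = min(times[:3])          # early rounds, before heat builds up
    slowdown = max(times) / baseline   # worst round vs. early baseline
    return times, slowdown

if __name__ == "__main__":
    times, slowdown = measure_throttle()
    print(f"worst-case slowdown vs. early baseline: {slowdown:.2f}x")
```

On macOS, `sudo powermetrics` can confirm what the timings hint at, since it reports CPU frequency and power directly.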
 
The 2020 Intel MacBook Air was the machine they were referring to when they claimed the new M1 Air is 3x faster.

Pay attention :D
Which model MacBook Air? I'm curious because I have the same machine as you and I run VMs on it. It runs OK, but my test of an improvement is going to be Chrome + Google Meet video conferencing running in a Win10 VM in either Parallels or Fusion. Currently this works, but **** does it tax the machine. Ironically, my old 2013 MBA i7 was only slightly slower.
 
These numbers look great. Will we ever be able to run Windows using VMware on Apple chips? I use an older version of AutoCad for Windows.
Parallels have already made an announcement in this regard.


I use Parallels all day, every day on a current 2020 i7 MBA, so I'm very keen to see what performance gains will be achieved.

I also have and use VMware on the same machine, but I've found Parallels to be considerably faster at the things I need it to do. VMware is still the more flexible product in terms of its range of supported guest OSes.
 
RIP Intel/AMD.

GG WP Apple.

I don't think there are ANY x86 processors that match this single-core speed.
No. In real terms this isn't even competitive against hardware that is a generation behind.

Single-core blazing speed is dandy and a great bragging right, but at the end of the day its application is limited, as nearly every piece of software is optimized around being processed with multiple threads. Even longtime holdouts like video games are beginning to better utilize multiple cores and threads.

The only reason anyone would opt for this CPU against any current AMD or Intel offering is because you have no other option.
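For what it's worth, the multi-core claim is easy to illustrate: embarrassingly parallel work splits cleanly across cores, while the start-up overhead is exactly why single-core speed still matters for short, interactive tasks. A minimal sketch of a workload that does scale, using only Python's standard library (the chunked sum is just a stand-in for any CPU-bound task):

```python
from concurrent.futures import ProcessPoolExecutor
import math

def chunk_sum(bounds):
    # CPU-bound stand-in workload: sum of integer square roots over a range.
    lo, hi = bounds
    return sum(math.isqrt(i) for i in range(lo, hi))

def serial_sum(n):
    # Single-core version: one big chunk.
    return chunk_sum((0, n))

def parallel_sum(n, workers=4):
    # Multi-core version: split [0, n) into one contiguous chunk per worker.
    step = n // workers
    ranges = [(i * step, (i + 1) * step) for i in range(workers - 1)]
    ranges.append(((workers - 1) * step, n))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, ranges))

if __name__ == "__main__":
    n = 2_000_000
    assert serial_sum(n) == parallel_sum(n)
```

Whether the parallel version actually wins depends on chunk size versus process start-up cost, which is the other half of the single-core-vs-multi-core argument in this thread.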
 
No. In real terms this isn't even competitive against hardware that is a generation behind.

Single-core blazing speed is dandy and a great bragging right, but at the end of the day its application is limited, as nearly every piece of software is optimized around being processed with multiple threads. Even longtime holdouts like video games are beginning to better utilize multiple cores and threads.

The only reason anyone would opt for this CPU against any current AMD or Intel offering is because you have no other option.
cc: Intel Marketing - for this past year prior to Zen3.
 
cc: Intel Marketing - for this past year prior to Zen3.

We learned years ago that in terms of absolute productive usage, core count, instructions per cycle, and memory are the defining factors. Single-core speed was important in the realm of gaming, where it was harder to optimize. Those days are passing.

It's baffling to me that a corporation like Apple would pass that off as faster, and that media outlets like this would eat it up. There is going to be a big issue when they try to pass that crap off on workstations where compiling and rendering times are mission-critical, and it runs people off again. It's a repeat of the PowerPC days.
 
We learned years ago that in terms of absolute productive usage, core count, instructions per cycle, and memory are the defining factors. Single-core speed was important in the realm of gaming, where it was harder to optimize. Those days are passing.

It's baffling to me that a corporation like Apple would pass that off as faster, and that media outlets like this would eat it up. There is going to be a big issue when they try to pass that crap off on workstations where compiling and rendering times are mission-critical, and it runs people off again. It's a repeat of the PowerPC days.
I could see your point of view if this were Apple's 95-watt desktop CPU in the iMac, but this is the 10-watt MBA version, for Pete's sake. This is remarkable, with a big IF: the benchmarks have to pan out in real-world tasks like batch-editing 500 45-megapixel RAW files or 4K video editing. I would love an MBA form factor that gives my much heavier MBP 16 a run for its money in those tasks!
 
I could see your point of view if this were Apple's 95-watt desktop CPU in the iMac, but this is the 10-watt MBA version, for Pete's sake. This is remarkable, with a big IF: the benchmarks have to pan out in real-world tasks like batch-editing 500 45-megapixel RAW files or 4K video editing. I would love an MBA form factor that gives my much heavier MBP 16 a run for its money in those tasks!
Sure, but they didn't even compare to a current-gen CPU. They are setting expectations with this benchmark and with "faster than i9" claims that they won't be able to satisfy.
 
I have a question for those who are deep into chipset design :)!

Apple was involved with Arm early on, so it's natural that they would choose to develop Arm designs for historical reasons if nothing else. But there is an alternative, RISC-V, and I wonder if there are any design advantages of one over the other? Is there anything in the instruction sets that allows for superior designs based on RISC-V vs. Arm?

Because if there is really no difference, as in "six of one, half a dozen of the other," then cool. But if RISC-V allows for superior development, then a company - perhaps Intel, AMD, NVIDIA, or company X - might plump for RISC-V-based designs and eventually catch up to and lap Apple's in-house ASi designs.

Again, I'm not deep into the chipset design world, so this is probably a silly question, but I'm still curious! TIA!
 
No. In real terms this isn't even competitive against hardware that is a generation behind.

Single-core blazing speed is dandy and a great bragging right, but at the end of the day its application is limited, as nearly every piece of software is optimized around being processed with multiple threads.

The only reason anyone would opt for this CPU against any current AMD or Intel offering is because you have no other option.
M1 slower than last year's Intel chips!? You sound so sure of yourself - can you show us some benchmark scores to back that up? Your claim that single-core scores are irrelevant since "every piece of software" is optimized for multi-core shows that you know nothing about software.

I could say that the only reason anyone would buy Intel/AMD over this is because they are insecure Apple-haters, but unlike you I don't know every computer user on the planet (or their needs), so I will refrain from making such broad generalizations.
It's baffling to me that a corporation like Apple would pass that off as faster, and that media outlets like this would eat it up. There is going to be a big issue when they try to pass that crap off on workstations where compiling and rendering times are mission-critical, and it runs people off again. It's a repeat of the PowerPC days.
It's baffling to me that some keyboard jockey would pass off his delusions as facts and expect us to eat it up. There's going to be a big issue when he tries to pass off more of this crap as "fact" on this forum.
 
Your average Joe is probably not particularly interested in whether a chip is theoretically "faster" or whatnot. He/she is interested in the practical outcomes. Who cares if ASi is "the fastest".

Here's what will move the needle for Joe: (1) battery life. I certainly will appreciate a laptop I can work on all day as long as I want, and then, when I'm not using it because I have to sleep, I plug it in and have a whole day's use again. In this scenario, 15-20 hours of ordinary use (surfing the net, watching videos, listening to music, email, word processing, etc.) is absolutely ideal.

Next, (2): is it "fast enough" for my use scenarios? If yes, then someone saying "chip X can do your task 0.0001 seconds faster" is meaningless. I just don't want spinning beach balls. I want instant launching of apps, instant on/off, and so on - never keep me waiting.

Next (3) - heat. I don't want my lap fried or my hand burned on the laptop. If it's cool under all circumstances, even if I'm sitting in the garden or on a hot beach in the summer - fantastic, that's all I need heatwise.

Next (4) - noise. I want it silent. Absolutely silent. I'm listening to music or I'm in a very quiet room - I don't want to hear my laptop chugging along.

And those are the key requirements that Joe wants from a processor. Whether it's "fastest" is irrelevant; as long as it delivers on these, we're all good. At some point, I suppose ALL laptops - including Windows ones - will reach that state, and then we (the average Joes) will stop talking about processors altogether. They'll ALL be equally good from the point of view of the 4 criteria I outlined above, as far as THE PROCESSOR is concerned. Then they'll have to differentiate on other things - screen, keyboard, etc. - but the processor will no longer be something talked about.

It is possible that the M1 has reached that stage for the Air. Clearly it has not reached that stage for all laptops - the Pro has to have active cooling after all - but it looks as if we've got it with the Air, or very close to it. Now obviously, this will not be true for pros who need massive 3D modelling and such computationally intense tasks, but for average Joe... it may just already be there! YMMV.
 
They are setting the expectations with this benchmark and faster then i9 claims that they won't be able to satisfy.
How could you possibly know this? Another baseless claim, based on information you could not possibly have. You have zero credibility left. If you don't know what that means, it's like "lives" in a video game: when you're out of lives, you have to quit the game and go away. Too bad this rule doesn't apply to this forum, or to real life.
 
If the thermal envelope is the primary determinant of the performance difference between the M1 MBA and the MBP, I'm wondering how long it'll be before we start to see third-party "active cooling" solutions for the new M1 MBA.

I can almost hear the claims of "Only Our Active Cooler increases your MBA's performance by 200%!"........ while DIY'ers start strapping ice-blocks to the underside of their MBA... :p
 
No. In real terms this isn't even competitive against hardware that is a generation behind. [...]

The only reason anyone would opt for this CPU against any current AMD or Intel offering is because you have no other option.
Silliness on stilts. The new processors are more than competitive with what they're replacing.

Every person who buys these will have other options, so we'll see if you're right if they sell zero units.
 
I’m not so sure that there is a limit on throughput; it’s just how Apple is choosing to differentiate its “low-end” products. The models these M1 devices replaced only ever had 16GB and 2 ports. These are for your average Joe who only browses the web, does some email, sorts their photos, and does some light video editing. Great for students, etc.

The higher-end models had 4 ports and a greater range of memory, as well as faster CPUs, which I expect to continue when that specific Apple Silicon is released. These are the models video editors, developers, and other “professionals” actually use.
Why make a Mac mini weaker than the unit it replaces? We can see why the price decreased. Despite the impressive nature of the M1 chip, my speculation remains: Apple will have a silicon chip for the pro line-up. My instinct is that it won’t be M1 but rather Mx, and that will be the chip in the MacBook Pros with more than 2 ports and maybe more RAM. Certainly a Mac Pro will need user-upgradeable RAM, 10Gb Ethernet, and other slots for expansion.
 
Apple's Final Cut could have been a great demo - it's a first-party app which Apple controls and could have had ready. I just think it was a huge missed opportunity to show the world it's a true leap forward.
I did wonder why, in the press event, DaVinci Resolve was shown twice and Final Cut Pro wasn’t shown at all, and whether it should be interpreted to mean Apple is stepping back from it.
 
I did wonder why, in the press event, DaVinci Resolve was shown twice and Final Cut Pro wasn’t shown at all, and whether it should be interpreted to mean Apple is stepping back from it.
I noticed that too; it seems unusual to call out a competitor’s product and not their own. I‘m not making any purchase decision until I see these machines running the software that I use or am interested in.
 
Man, I really, really cannot wait to see what the M1 will do in a desktop environment a la iMac, or even Mac Pro. The numbers this chip is putting out with such a low power draw are insane!

I wonder if Apple could possibly make a “lower end” fanless iMac and then the standard iMac with active cooling. Not all of us are going to be running full 4K/8K video editing programs all the time. With enough heat sinks and thermal paste it could be possible, right?

Anyway even if that doesn’t happen the iMac should be amazing and though I’m tempted to get a laptop again (love what I’m seeing from the new M1 Air) this is the one I’m most excited for.
 
The fact that the M1 requires Big Sur (or later) is a non-starter for me. I'm a former fanboy, but Apple continues to push me away - Big Sur is awful, IMHO. Looks bad, privacy issues, buggy - so I uninstalled it and went back to Catalina, which was far from easy: an erase and clean install. The more I think about it, the clearer it is that Apple is doing with the Mac line what they have done with the iPhone. I think they are trying to build in obsolescence on the hardware side such that you must buy new hardware every couple of years if you want the newest OS (really, iOS). They will do this, in part, by blurring the lines between iOS and macOS such that they will seem nearly the same. The iPad Pro and MacBook M1 with Big Sur will look very similar. Just my 2 cents.
 
It’s mostly an issue of drivers. The low level code you’d need to support eGPUs is not the sort of code you’d want to be running through a translation layer.

I predict that at some point in the future eGPUs will be supported, but certainly not before the high-end (and 16”) MBPs are released, and even then only for a small number of external graphics cards. It may take quite a while, though - eGPUs are not that popular, Apple still sells Intel machines for those who really need eGPUs, and the engineers have their hands full.
It doesn’t matter that eGPUs don’t sell in mass-market numbers.

The small number of people using them are influencers.
 
I did wonder why, in the press event, DaVinci Resolve was shown twice and Final Cut Pro wasn’t shown at all, and whether it should be interpreted to mean Apple is stepping back from it.
Resolve is cross-platform; FCP is Apple-only. I don't think there's anything to be read into this choice - Resolve is a better example to use for comparisons.
 
Man, I really, really cannot wait to see what the M1 will do in a desktop environment a la iMac, or even Mac Pro. The numbers this chip is putting out with such a low power draw are insane!

I wonder if Apple could possibly make a “lower end” fanless iMac and then the standard iMac with active cooling. Not all of us are going to be running full 4K/8K video editing programs all the time. With enough heat sinks and thermal paste it could be possible, right?

Anyway even if that doesn’t happen the iMac should be amazing and though I’m tempted to get a laptop again (love what I’m seeing from the new M1 Air) this is the one I’m most excited for.
Apple easily could have replaced the current low-end iMac with an M1 version. Since they didn't, I'll assume there won't be a low-end iMac (and no iMac Pro) next year.
 