In the latest test online, the M1 MacBook Pro takes 36 minutes to compile Mozilla Firefox. While that's not bad for a thin-and-light notebook (and even very good), it's nothing exceptional.
 
With Macs stubbornly stuck at 8% market share for the last two decades, Intel has nothing to worry about. And AMD has never been in a Mac and never will be, anyway. People are pretty much set in their ways: Camp Windows or Camp Mac.
I think that could turn on software. Macs have long been pretty, well-built hardware with a clean OS alternative. Now, with drastically improved performance at the same price point, they’re a particularly good value for someone looking for economical performance. Especially if native software support increases. They didn’t previously have this price/performance advantage.
 
A 25W CPU beats a 10W or 15W CPU in multi-core and still loses in single-core. That's not impressive.
Exactly!
Just to clarify, the article I linked is in reference to the denial wave alluded to in recent comments. This is one of those articles and I don’t agree with it at all: they say things like it “decimates” the M1 (as in, it’s 10 times faster), avoid mentioning power draw (as if it weren’t important for laptops, longevity, etc.), misleadingly treat it as an 8-core CPU, and so on.

And when I said I was giving the article a read, I meant the one linked by MacRumors, which I did (plus the CISC vs RISC one), and it’s great. Looks to me like RISC was the future many decades ago, yet for some reason it didn’t find its way into mainstream consumer computers (besides tablets and phones, that is).

But then Apple came along with such a hammer that the ARM CPU is just ONE part of the whole picture... if these benchmarks were to take advantage of the included coprocessors, it would smoke everything hands down. It would be like the difference between rendering a game's 3D with a CPU software rasterizer versus a dedicated GPU.
 
There is no doubt that Apple has taken a good lead over AMD and a huge lead over Intel. I will order a MacBook Air soon.

That being said, I feel people are missing one of the major points of the article. The M1 is super fast in some tasks, like video editing, because Apple has put special hardware for those tasks on the chip. Conversely, tasks for which they have not done so will likely still be fast, but you will not see the domination you see in video editing.
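As a rough sketch of what that "special hardware" means in practice (this is my own illustration, not something from the article; the resolution and codec are arbitrary examples), this is roughly how a macOS app asks VideoToolbox for the hardware video encoder that fast exports rely on:

```swift
import CoreMedia
import VideoToolbox

// Minimal sketch: request a hardware-accelerated H.264 encoder session.
// On the M1 this work lands on the dedicated media engine, which is why
// video export can be so much faster than a pure-CPU encode.
var session: VTCompressionSession?
let encoderSpec: [CFString: Any] = [
    kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder: true
]

let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920,                 // arbitrary example resolution
    height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: encoderSpec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,         // frames would be submitted with an output handler
    refcon: nil,
    compressionSessionOut: &session
)
// status == noErr means the system created an encoder (hardware, if available).
```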

As a concrete example, I am a competitive chess player and I have a chess engine that analyzes billions of chess positions. I expect the M1 to be good in that respect, but not better than a high-end Intel chip. I will know when I receive my laptop.
Do let us know how it goes! If that chess engine of yours uses Apple’s frameworks (like Core ML, FxPlug, etc.) and is built for ARM, chances are it will unleash all the power and coprocessors available on the M1 by default.
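As a rough sketch of what that can look like (hypothetical: "ChessEval.mlmodelc" is a made-up model name, but the configuration is the standard Core ML way to opt into all compute units):

```swift
import CoreML

// Hypothetical example: "ChessEval.mlmodelc" is a made-up compiled model.
// The point is the configuration: .all lets Core ML dispatch the work to the
// CPU, the GPU, or the Neural Engine, i.e. the coprocessors mentioned above.
let config = MLModelConfiguration()
config.computeUnits = .all

let modelURL = URL(fileURLWithPath: "ChessEval.mlmodelc")
let model = try MLModel(contentsOf: modelURL, configuration: config)
// Predictions made with `model` now run on whichever unit Core ML picks.
```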
 
You can now officially state that for all things multimedia, the Apple M1 and up will be the one to use. But again, price is the limiting factor here. PCs are just too awesome to give up; if you don't understand this then you are better off with an Apple system. PC folk don't like everything closed, and Apple is the epitome of a closed system.

I can't even buy an enclosure color of my choosing, and that's the most trivial selection a customer can have. I have plenty of Apple products and I like them for what they are, but I also have plenty of powerful PCs for gaming and encoding. Now, though, I will use the Apple processor for encoding work instead.
These are all very fair points and I agree with them. The single-vendor closed system makes Macs an automatic no-go for a huge number of industries.

As far as workstations and high-end gaming rigs are concerned, I don't see how Apple is going to make any sort of dent in the market unless there's a huge paradigm shift in industry, developer, publisher and consumer sentiment, as well as a return to a more open platform (embracing open-source development more and having more user-serviceable and customizable devices like during the early OS X days, which is counter to the current trajectory). That and an actual product roadmap as opposed to the "Oooh, what will they release next?" model.

One area where the M1 Macs WILL kick butt is with the notebook market. With laptops that can presumably run heavy workloads for much longer and faster than the equivalent Windows machines, Apple will probably gain a fair bit of market share here.

(Especially since typical PC advantages such as customizability and upgradability don't apply as much here)
 
"People who are really serious about software should make their own hardware." - Alan Kay

Steve Jobs would be beaming over the M1.

This, so much. M1 Macs are dead silent and rarely use the fan, if at all; the chip and software are tightly controlled and integrated with each other; and the laptops are thin with great battery life. It’s the epitome of what Jobs would’ve wanted from Mac hardware design.
 
It’s impressive that it’s not even a real comparison at this point. The M1 is insane. Anyone arguing that Intel is better is in denial.
A large number of PC and tech blogs are attacking Apple over this in so many ways.
One of the dumber attacks I've seen was from Tom's Hardware. They blamed an apparent shortage of Intel and AMD CPUs on Apple because Apple took up all the fab space and didn't leave enough for Intel and AMD.

The tech enthusiast industry is rushing to attack Apple over the M1. Their hatred for Apple blinds them to the fact that Apple is better now because Apple has innovated in this space for a decade, whereas Intel and AMD have not innovated anywhere near as much.

It's easier for them to attack who they hate than to admit who they love is not as good as they should be.
 
"People who are really serious about software should make their own hardware." - Alan Kay

Steve Jobs would be beaming over the M1.
Nintendo is in the same boat. It's exactly why they are still around today. Sure, Nintendo's IPs are amazing, but the hardware integration is also top notch. Playing Mario or Zelda or whatever on Nintendo-developed hardware just feels so fun.

I feel that Jobs knew this with the few platform changes Apple has made.
68K -> PPC -> Intel
Each time Apple wanted a better solution, and I'm sure that if Apple could have, they would have done it themselves. Now that Apple is finally in a position to do it themselves, yeah, Jobs would be really pleased with the M1.
Not only because of what the M1 is, but because Apple as a company is still able to move on to newer and better technology instead of living in outdated legacy land, like Intel and others currently are. Moving to better tech not because you have to, but because you want your own stuff to be even better, is a really Apple thing to do.
 
A large number of PC and tech blogs are attacking Apple over this in so many ways.
One of the dumber attacks I've seen was from Tom's Hardware. They blamed an apparent shortage of Intel and AMD CPUs on Apple because Apple took up all the fab space and didn't leave enough for Intel and AMD.

The tech enthusiast industry is rushing to attack Apple over the M1. Their hatred for Apple blinds them to the fact that Apple is better now because Apple has innovated in this space for a decade, whereas Intel and AMD have not innovated anywhere near as much.

It's easier for them to attack who they hate than to admit who they love is not as good as they should be.

Given that Intel uses its own fabs, how could Apple’s order with TSMC affect them?
And if AMD can’t get fab capacity it’s nobody’s fault but AMD. They used to own their own fabs, but now they rely on the kindness of strangers.
 
You say this
PC's are just too awesome to give up

Then you say
I also have plenty of powerful PC's for gaming and encoding, but now I will use apple processor for encoding work instead.

Looks like that supposedly awesome PC you have was just given up, replaced by the M1 for encoding work.

You also say
For gaming, Apple's platform is a joke. In fact a PS5/Xbox Series X will be an outstanding gaming machine at a fraction of the cost of any Apple computer and/or PC.
The Switch beats them all, in terms of quality of games and number of gaming consoles sold.

Actually, the M1 can game really well. Sure, it's not running bleeding-edge 2020 games, but for a low-end Mac the performance is amazing. Take a look for yourself.


 
It still isn't clear to me from either the MacRumors post or the original article by Erik Engheim on Medium (I read both) how the M1's unified memory is different from that on an Intel/AMD chip with integrated graphics (which I'll abbreviate as "IAC").

Yes, I understand from Engheim's article that the M1's CPU and GPU can simultaneously address the same memory. But is that not the case on a modern IAC?

For instance, are modern IACs configured such that, while the CPU and GPU share RAM, the RAM is partitioned so that the CPU and GPU don't simultaneously have access to the same RAM addresses, while they do have such access in the M1?

And/or is another difference that the M1 also unifies memory for the CPU and GPU at the cache level, while IACs do not? [From Anandtech: "The M1 also contains a large SLC cache which should be accessible by all IP blocks on the chip."]

Etc., etc.
 
All these issues listed have been known. What surprised us all is that Apple could, on its first attempt, make a processor that beats Intel and AMD by 60%! They also include encryption, video codecs, etc.
It's not Apple's first try though. The A Series has been doing really well for a while now.
 
It won't matter, because the PC will still remain dominant within 5 years, especially with the sub-5nm Ryzen processors forthcoming. It all comes down to price point. Sure, you may have a really ASIC-like Apple processor, but the reality is that if the prices are not comparable, it won't matter. BUT Apple will take the niche market share and make the most profit from it, just like they are doing with smartphones.

You can now officially state that for all things multimedia, the Apple M1 and up will be the one to use. But again, price is the limiting factor here. PCs are just too awesome to give up; if you don't understand this then you are better off with an Apple system. PC folk don't like everything closed, and Apple is the epitome of a closed system.

I can't even buy an enclosure color of my choosing, and that's the most trivial selection a customer can have. I have plenty of Apple products and I like them for what they are, but I also have plenty of powerful PCs for gaming and encoding. Now, though, I will use the Apple processor for encoding work instead.

For gaming, Apple's platform is a joke. In fact a PS5/Xbox Series X will be an outstanding gaming machine at a fraction of the cost of any Apple computer and/or PC.
How true! We have also learned that a closed system can be optimised in ways an open one cannot. The PC has compensated with the high power consumption needed to drive modular, generic CPU and GPU architectures.
 
All Intel Macs are now obsolete, yesterday's junk. Apple has a really nice business going on! That said, I love my Apple Silicon MacBook Air.
Obsolete? Hm, my Intel Mac still works quite fine despite the M1. My Mac also runs lots of mission-critical software natively...
 
It still isn't clear to me from either the MacRumors post or the original article by Erik Engheim on Medium (I read both) how the M1's unified memory is different from that on an Intel/AMD chip with integrated graphics (which I'll abbreviate as "IAC").

Yes, I understand from Engheim's article that the M1's CPU and GPU can simultaneously address the same memory. But is that not the case on a modern IAC?

For instance, are modern IACs configured such that, while the CPU and GPU share RAM, the RAM is partitioned so that the CPU and GPU don't simultaneously have access to the same RAM addresses, while they do have such access in the M1?

And/or is another difference that the M1 also unifies memory for the CPU and GPU at the cache level, while IACs do not? [From Anandtech: "The M1 also contains a large SLC cache which should be accessible by all IP blocks on the chip."]

Etc., etc.

The article is wrong in that “simultaneously” doesn’t really mean “simultaneously,” but anyway... (they can probably read simultaneously, but not write; it depends on how many read and write ports the RAM has).

Anyway, the difference is that when the GPU and CPU share the RAM in a CPU with normal integrated graphics, the GPU has its dedicated portion of RAM, but that same portion of RAM isn’t used by the CPU. The system treats the integrated GPU as if it were a discrete GPU with its own RAM. (This is an extreme oversimplification.) The system still shuffles data back and forth between the CPU’s part of RAM and the GPU’s part of RAM, and typically has to modify the values as they move in either direction.

As for SLC cache, that’s for flash, not for the memory subsystem.

There is an L2 cache, and L1 caches, though. Not clear how these are synchronized with the GPU. Typically each CPU core has its own L1, and they share the L2. This means if an L1 is dirty (because someone wrote to it), and if the change is not reflected in the L2 or in main memory, then the GPU wouldn’t see the change. There are various ways around this (including “write-through” caches that always write all the way through to main memory or at least to L2, buses that advertise what addresses are dirty, etc.). Not much is known about what Apple is doing here.
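As a rough illustration of the "unified" part (my own sketch, not something from the article): on Apple silicon a single shared Metal buffer is visible to both the CPU and the GPU, with no explicit upload or copy step in between.

```swift
import Metal

// Illustrative sketch: one allocation, visible to CPU and GPU alike.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

let count = 1024
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU writes directly into the buffer the GPU will read; there is no
// separate "GPU copy" to keep in sync, in contrast with the partitioned
// integrated-graphics arrangement described above.
let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { values[i] = Float(i) }
// A compute command encoder could now bind `buffer` and the GPU would see
// exactly these values.
```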
 
Apple can just make a slim 2-in-1 iPad/MacBook hybrid in one unified body (slim modern bezels, please) with an M1 processor, and it should run all iPad and macOS apps natively with full-screen and split-screen support... call it the MacPad Pro and I'm game... makes perfect sense... I don't know why Apple still doesn't do 2-in-1s... please don't give me the market-and-profit crap... just give the customers what they want for a change and bring the price point below 1 grand.
 
That's why some developers believed that the M1 won't be restricted to just the 13-inch MacBook Pro.

[attached screenshot]
 
There are articles like this surfacing around: https://wccftech.com/intel-and-amd-...apples-m1-in-cinebench-r23-benchmark-results/

They say things like the 11th-generation Intel or latest AMD chips “trash” the M1. Quite misleading, pitting real 8-core CPUs against the 4+4 that the M1 actually is. And by “trashing” they mean maybe 10% faster in some tests. However, for the layman, it will read just as that: the M1 is indeed trash ☹️

What is really solid is the comments section: quite a few people have been calling it out, amid the assorted “I don’t care about TDPs”, “PC master race”, etc. comments.

Anyways, interesting article, giving this rundown a read.
Note that the first benchmarks that saw M1 significantly behind Ryzen actually tested A12 vs Ryzen.
 
How can they call this RISC when it now has a more complex instruction set? Adding instructions for encryption, graphics, signal processing, etc. sounds more like the definition of CISC.
RISC = Regular Instruction Set Computer: all instructions are 4 bytes. And there are no instructions being added; there are complete processors being added.
 
The problem with RISC has never been the technology. Lots of RISC processors in the past have blown away their CISC competitors.

The problem has always been “does it seamlessly run Windows and existing Windows apps?”

BYOD, the mobile Arm hegemony, and Apple’s expertise at supporting multi-architecture code have finally broken the glass.
Apple does not need their hardware (M1, etc.) to run Windows in order to succeed. In the past? Yes, they did, but no more. They are big enough to plot their own path and give the finger to MS. If MS wants to make their offering for ARM CPUs run on the M1, then fine, but they (Microsoft) have an awful lot of work to do to catch up with where Apple is now.

A few years ago, many commentators were saying that Apple wanted to get rid of the Mac line altogether and that, because it was such an insignificant part of their revenue, it should go the way of products like the Xserve. What we have seen here is a leap forward and a huge commitment by Apple to the Mac brand.

There is already a project running (well, just starting) to make Linux run natively on M1 hardware. It remains to be seen whether MS has enough of an interest to do the same, or whether they will just let things carry on as now and hope more and more people get sucked into their subscription model forever.
If M1 (or a subsequent chip) does run Linux natively then I'll be right there with the early adopters.
 