They got complacent relying on enterprise sales.
Fat margins make most companies fat and lazy, willing to coast so as not to upset the status quo, the hookers, or the blow. But I digress.

I find it telling that neither Intel nor NVIDIA wound up in the last two iterations of the PlayStation or the Xbox. Neither has much penetration into the SoC market, especially mobile devices, and neither seems to care much. I wonder if that is a good long-term strategy. Maybe their enterprise sales make it a moot point, maybe not. Time and quarterly profits will make the determination.
 
“So, what's the likely catch? I'm willing to believe this processor is exceptionally powerful per watt, and give Apple credit where credit is clearly due, but there must be a tradeoff somewhere. Intel, AMD, even IBM or Qualcomm, know a lot about CPU design and have been fighting over the best engineers for decades”

1) For a start, Intel was once the big one thanks to its star chip designer, who has now been at Apple for 7-8 years. That's why Intel was what it was and is now what it is.
2) This is the result of controlling your own hardware and your own software. The M1 is just for Apple devices, and that's it.
No one else, for now, has their own software and their own chip the way Apple does.
Microsoft has Windows, but it has to collaborate with Qualcomm for its ARM Windows devices, and their emulation is a joke compared to Rosetta 2.
So this is the result when you poach the best chip designer around and build your own software and hardware that work together. For now, Apple has a monopoly on this desktop-class architecture. Let's hope others catch up soon for competition's sake, otherwise antitrust regulators will try to go after Apple.
No wonder Android was made.
Can't believe someone actually managed to wangle 'Apple has a monopoly' into this thread 🤣🤣 Jesus.
 
Are you sure about that? The RAM appears to be in the M1. What exactly are you saying is going off-chip?

Very simply, it's not likely that every single bit of code can be resident in the 16 GB currently on die.

So at a minimum you will need to be sourcing data from solid-state storage, which is much, much slower in every respect than the on-chip / on-die memory.

Basically, the M1 is a mobile-focused chip that has been beefed up by dumping a massive amount of memory right into the same silicon as the rest of the architecture. This obviously presents significant advantages, but it also requires a rethinking of development and data management.

I guarantee that the performance of apps under Rosetta (due to the nature of the architecture and design being emulated) will not be on par with, or even meet, the expectations set by these Geekbench scores.
 
Sitting here on a newly purchased 2019 MBP 16" and wondering if I should return it and get the MBP 13" now :confused::confused:
Probably not if you needed the 16" MBP in the first place. M1 is limited to 16 GB of RAM, which is not enough for pro work. I regularly have all 4 Thunderbolt ports in use, and there are only 2 in the 13" M1. While I have no doubt the integrated graphics are a significant step up from Intel's, there is nothing to indicate how it performs against a Radeon Pro 555X with 4GB. That's another big consideration as I'm usually driving external 4k displays.

There is a lot more about the 16" MBP that makes it the best laptop for pro-type work than just processor performance. That said, I am very much looking forward to the silicon they roll out for the 16".
 
Can someone explain the benefit of buying a MacBook Pro if the MacBook Air has the same chip, which means the same CPU/GPU etc., and is much lighter and nicer looking? Why would anyone pay more and buy a heavier MacBook Pro? Just for the extra battery life?

We'll have to see in real-life usage whether the fan in the MBP provides bigger sustained performance.
 
So, what's the likely catch? I'm willing to believe this processor is exceptionally powerful per watt, and give Apple credit where credit is clearly due, but there must be a tradeoff somewhere. Intel, AMD, even IBM or Qualcomm, know a lot about CPU design and have been fighting over the best engineers for decades.

It strikes me as unlikely that Apple has simply beaten all of them in all use cases, with less power, on their first desktop class CPU. It's not that I'm calling BS, just that engineering doesn't usually work that way; there's usually a tradeoff made somewhere.
Apple's advantage = less bureaucracy in the software and hardware development teams. Unified Goals.
Intel/AMD + M$ + et al = very compromised development path. Herding cats.
 
This is so fast it might actually make the 16GB of RAM tradeoff worth it. Holy crap! And in a MacBook Air that cost a third as much as my 2019 16" MBP.

Must stay strong and not upgrade yet. I don't actually need it and not all the software is going to run at full speed yet…
 
Apple comprises a very small market share for Intel, so this means nothing to just about the entire rest of the computing market, considering Apple won't be selling chips to anyone else.

The chip is nice, but ultimately no one outside of the Apple ecosystem is going to care, because it's not viable for them.
Sure, Apple leaving Intel is something Intel can weather in and of itself. Apple’s one of Intel’s largest single customers, but there are plenty more.

The issue for Intel is that — if this proves successful for Apple both in terms of the products they can offer and getting people to actually buy them — other manufacturers likely will be looking into developing their own silicon or otherwise shifting their silicon business elsewhere. It can have major ripple effects and send Intel into a tailspin.

If it works out, that is. If this is any indication, well…it probably will.
 
There is no doubt that AMD and Intel are going to try to emulate some of what Apple is doing with its silicon, in an effort to close the gap in performance per watt, but Apple will have protected its IP well enough to remain more than just a small hurdle for competitors to overcome. And as we all expect, the next level of chip for their desktops and higher-end MBPs, whatever it ends up being called, will likely leapfrog the M1's performance, let alone be far, far ahead of AMD and Intel.

Even though desktop chips don't have the same power constraints that mobile devices do, Apple will still focus on efficiency so that it can reduce heat and the potential for throttling, which continues to be an issue for the big chip makers.

The lessons Apple has learned from designing, making, and selling hundreds of millions of A-series chips are something few other manufacturers have. It's real-world testing and validation in numbers that have clearly made a significant impact on Apple's IP and will continue to for the foreseeable future. It also allowed them to think differently about laptop and desktop chips and not be hindered by the old way of doing things that unfortunately keeps their competitors somewhat stuck in the past, imho.
 
I'm even more excited now that I've found out we can use iPad Pro apps on the Mac. Affinity tweeted this not long ago.

[Attachment: screenshot of Affinity's tweet]

Now if I can just get LumaFusion on the M1, that would be awesome. I'm so tempted to sell my 2019 13" MacBook Pro and get either the new Air or the Pro with Apple silicon.
Affinity are no doubt referring to the Mac versions of their apps with this tweet.
 
I'll point out to the people saying that Intel/AMD will have to catch up in the next year or two: that's impossible with their business model. They're beholden to volume sales; they don't have a business model that would allow them to develop anything like the M1 without a MASSIVE purchase from a client. That's TENS of billions in tooling and R&D, and there simply isn't manufacturing capacity available to do such a thing unless they dropped their current business model entirely, which simply isn't going to happen.
 
My wife and I are picking up two of the MacBook Airs as replacements for our 2013 ones, so it was a slam dunk for us, given what we do on them: browsing, video, video/picture editing, GarageBand, writing, misc stuff. From the specs, it looks like the combination of speed, size, display, keyboard, and battery life is as good as we'll get for the money, regardless of the OS. Me, I went back to a Dell notebook a while back but will be selling that and coming back to the Mac side.

PS: we got the base config; we didn't think we'd need the memory bump on this system.
 
EVERY 13-inch MacBook Air or Pro owner who only has 8 GB thinks about it 100 times a day. And every serious user with 16 GB wishes they had 32 GB. Memory is the one thing you really need. Since Apple has made decent amounts prohibitively expensive, most people don't buy enough, and therefore this is really just a wrong statement (although of course you can think what you want).
Rubbish. My girlfriend has no idea and couldn't care less about what RAM is in her Mac. When I try to talk about it with her, she still couldn't care less.
 
Apple's advantage = less bureaucracy in the software and hardware development teams. Unified Goals.
Intel/AMD + M$ + et al = very compromised development path. Herding cats.
Apple is not better than any single one of those companies mentioned. The amount of feature creep, the lack of coordination across their different apps, and the staggering number of bugs in their rushed releases are the best proof of that.

Their hardware department, however, has outdone itself once more. It's the one area that really works like a charm at Apple.
 
Only someone dumb would try to justify comparing the M1 with a Ryzen 5600X because it was the CHEAPEST AMD new processor. The 5600X is a $300 DESKTOP chip that sits at the higher end of the mainstream tier.
The cost of a Ryzen 5600X, plus the electricity to power the extra wattage, plus active cooling, over, say, a four-year lifespan, is greater than the entire cost of a new Mac mini. Great comparison. Not.
 
Very simply, it's not likely that every single bit of code can be resident in the 16 GB currently on die.

So at a minimum you will need to be sourcing data from solid-state storage, which is much, much slower in every respect than the on-chip / on-die memory.

Basically, the M1 is a mobile-focused chip that has been beefed up by dumping a massive amount of memory right into the same silicon as the rest of the architecture. This obviously presents significant advantages, but it also requires a rethinking of development and data management.

I guarantee that the performance of apps under Rosetta (due to the nature of the architecture and design being emulated) will not be on par with, or even meet, the expectations set by these Geekbench scores.

It's not "on die"; it's in the package. And it's not on "the same silicon as the rest of the architecture"; the RAM is just moved into the SoC package instead of a separate package/DIMM.

So what you are referring to is virtual memory paging.
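
For anyone who wants to see that distinction on their own machine, here is a minimal Swift sketch (assuming macOS and the standard Darwin `sysctlbyname` interface; illustrative only, not a benchmark). It prints the physical unified memory the OS reports and how much has actually been paged out to the SSD, which is the swap traffic being described above.

```swift
import Foundation

// Illustrative only: report the unified memory in the package and current swap use.
let physicalBytes = ProcessInfo.processInfo.physicalMemory   // RAM inside the SoC package
print("Physical memory: \(physicalBytes / 1_073_741_824) GB")

// "vm.swapusage" shows how much the virtual memory system has paged out to the SSD.
var swap = xsw_usage()
var size = MemoryLayout<xsw_usage>.stride
if sysctlbyname("vm.swapusage", &swap, &size, nil, 0) == 0 {
    let mib = 1_048_576.0
    print(String(format: "Swap used: %.1f MiB of %.1f MiB",
                 Double(swap.xsu_used) / mib,
                 Double(swap.xsu_total) / mib))
}
```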
 
No catch. Reduced instruction set. That is the way the industry should have gone from the beginning. Instead they went the hardware way, and the Intel and AMD instruction sets stayed the same. Now they really look pathetic, and they should.

Reduced instruction sets required more intelligent software design. Why bother when you can have simple code executed on an expensive CISC processor? /s

In all fairness, the potential of RISC has yet to be fully exploited by any workloads outside of raw polygon calculations, and that's only happening because of the fractal nature of those calculations. Efficiently translating random/abstract data into multiple pathways of equivalent length, and then assembling the outputs into a container with predetermined attributes (as you'd expect), takes effort and architecture that bleeds into multidimensional processing.

In simpler terms: humans are doing as well as they can given the circumstances. LOL
 
So what, how many "hardcore" (I hate this word, I just can't think of a better description) gamers are there who want to play on Mac? Certainly not enough to port games to an entirely different architecture, even if the performance is on point.
It's not about hardcore gamers; it's about total revenue. iOS gamers are a multi-billion-dollar market. If a game dev can target the revenue of both iOS and desktop gaming customers with one code base, rather than just one of those markets, then given sufficient graphics performance they'll follow the money.
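
To make the "one code base" point concrete, here's a minimal SwiftUI sketch (all names here are hypothetical, not from any real game): the same source compiles for an iOS target and a macOS target, with any platform-specific bits isolated behind conditional compilation.

```swift
import SwiftUI

// One hypothetical code base that builds for both iOS and macOS targets.
@main
struct TinyGameApp: App {
    var body: some Scene {
        WindowGroup {
            GameBoardView()
        }
    }
}

struct GameBoardView: View {
    @State private var score = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Score: \(score)")
                .font(.largeTitle)
            Button("Tap / Click to score") { score += 1 }
        }
        .padding()
    }
}

#if os(macOS)
// Desktop-only touches (menu commands, minimum window size) would live behind
// checks like this one, while the shared view code above stays identical on iOS.
#endif
```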
 
I was hoping that someone could answer this question. If the Air's initial benchmarks are to be believed, and assuming the 13" MBP is going to be even faster, do most users really need to jump up to the 16 GB model? With the M1, do most users still need dedicated graphics, if Apple's claims are correct? Is the M1's GPU equivalent to the AMD Radeon Pro 5300M in the current 16-inch? No one seems to be talking about this. What can we compare the M1's graphics to?
 