The thing that the Apple fanboys don't understand is that there is NO competition right now for the new Apple Silicon Macs... but these systems aren't actually mind-blowing. This is something that other RISC/ARM players can do. They can easily do it.

Microsoft has wanted the market to shift to Arm for a decade! They couldn't do it themselves. Even Apple needed a really good reason to do it, and they are in a unique situation: by controlling the entire ecosystem they play in, they can make that change work where Microsoft failed.

But Microsoft is already ready for a RISC/ARM changeover. You better believe that about a year before the Xbox that follows the Series X comes out, Microsoft will be pushing ARM big time! That Xbox will be ARM or RISC.

People keep saying Intel this and Intel that... Intel is in trouble but not really from Apple... they are in trouble from AMD, NVIDIA, TSMC, Microsoft, Dell, Amazon, the list just keeps growing... Apple isn't on the radar.

What Apple did just do... is they finally kicked off the consumer PC move to RISC based processing. RISC was inevitable... it couldn't compete 20-30 years ago. But it was always going to eventually happen.

These processors are NOT special... they are NOT unique... there will be plenty of vendors making ARM processors... and Apple doesn't own ARM... they don't have a non-ARM, independent RISC architecture... they are licensing the tech from someone who will sell the license to ANYONE... and companies can quickly use that tech to custom-design their own RISC systems...

So...

This will cause Microsoft to finally get what they want which is a RISC based consumer market.
 
The thing that the Apple fanboys don't understand is that there is NO competition right now for the new Apple Silicon Macs... but these systems aren't actually mind-blowing. This is something that other RISC/ARM players can do. They can easily do it.

No they can't.

They don't own the software ecosystem and aren't interested in developing it themselves.

As has been discussed, ARM isn't really RISC anyway. And x64 is RISC internally anyway, too.

What Apple has achieved is a great chip but also a level of integration that nobody else has done. I do not agree that anyone else could do this. Maybe Microsoft but they are still way behind. And, of course, they have a much bigger legacy issue.
 
Tell us what integrated GPU built into a chip can come close to the M1.
Yes. It is the fastest iGPU, but it is also the most expensive iGPU. It offers great performance and power efficiency, but poor value for money!

MacBook Air (7 core GPU) @ $999
Vs
Asus Zenbook 14 with Intel Xe Graphics @ $899

The cheaper MacBook Air wins on graphics performance and is only slightly more expensive. Win for Apple.

MacBook Air (8 core GPU) @ $1249
Vs
Asus Zephyrus G14 with Nvidia GTX 1660Ti @ $1249

The more expensive MacBook Air gets GRAPHICALLY STOMPED at the same price point. Loss for Apple.

I own a MacBook Pro (Intel) and an Asus Zephyrus G14, and I think they are both great. However, I doubt very much that people who want a powerful GPU in a laptop are going to be happy with the M1. I am already regretting my GTX 1660 Ti purchase and should probably have bought the G14 with the RTX 2060 Super.
 
Sorry, but I find that hard to believe. And I really don't see why they would be incompatible. Maybe with the M1, but what about future chips? Does anything prevent Apple from designing an ARM CPU that would work with 3rd-party GPUs?
It's not like it would be impossible, just that Apple wants it this way.
 
The PS5 is 10.28 teraflops.

In short, the M1 is far, far away.

But amazing for a phone and/or tablet.
🤣🤣🤣 I got downvotes just for writing the truth.

It's shocking how much fanaticism people can hold.

Anything stamped with an Apple needs to be the best, otherwise they get angry, when it is more than clear that, on specifications, these are among the worst of the new-generation laptops.

But hey, it has an Apple on it...

If I want a tablet I get a tablet; if I want a computer I get a computer.

That's an overpriced mixture that only blind rich followers are going to be happy to pay for.

No one likes less for a higher price.

macOS is not so good as to justify that price gap. Okay, it has BSD roots and more support for commercial apps than Linux nowadays...

And?

macOS lacks a bunch of things.

If I want to be that limited, I'd just buy a cheap non-Apple ARM laptop and install Linux or Windows.

Having an Apple stamp doesn't solve anything. Yet some people would like to believe they are in front of a high-end next-gen computer.

* a macOS user for years, here
 
They are not compatible with Apple Silicon. All Macs with AS will have Apple GPUs.
No reason they couldn't be. Whether Apple will use third party GPUs going forward is an open question, especially given how much work and effort it seems they're pouring into their own GPUs, but the switch to ARM is not a barrier to them using AMD or NV GPUs if they want to.
 
What was the point I missed?
What's the significant GPU hardware functionality that you believe is missing from the M1 but is present in NV and AMD GPUs?

Again, you missed the point; not once did I say the M1 GPU was missing any hardware features.
 
Some of you here are seriously demanding. I for one am seriously impressed, if true, that the new MacBooks can match a GTX 1050 Ti.

I build PCs for fun. I have 6 boxes of varying sizes here, all except one with a discrete GPU, ranging from a GT 1030 to an RTX 2060 Super. The smallest is a tiny 3-liter case which I modded a GTX 1650 to fit into. I do have a GTX 1050 Ti running 24/7 in an HTPC (remember those??). Someone mentioned that Intel and AMD both have an integrated GPU that tops the GTX 1050 Ti. I would sure love to know which it is. Even the Ryzen 5 4650G has only half the performance of a GTX 1050 Ti.
Exactly. The critics don't know how impressive Apple's iGPU is compared to the iGPUs from Intel and AMD. Someone else mentioned an Nvidia MX450, which is a dedicated GPU, which probably won't be as fast as the M1 iGPU.
 
Running a processor faster isn't just cranking up the clock.
The processor was manufactured and designed with speed goals in mind.
Exceeding those normally isn't possible due to all kinds of issues related to transistor speed, wires on the die, etc.
That's a simplified explanation.
Okay techwhiz
 
Again, you missed the point; not once did I say the M1 GPU was missing any hardware features.
Is English not your native language? Because I don't know how a reasonable person interprets

"Quite impressive for a iGPU. Though we need to see if Apple can keep up with things like shaders, ray tracing, and other graphical niceties. 300 FPS is great, but not if that means the game has to miss out on tessellation."

except as a statement that the M1 GPU is MISSING tessellation, shaders, ray tracing, and "other graphical niceties"...

I don't want to fight you on this; I simply want to point out that you made an incorrect statement, which I corrected. I don't know why you are arguing this, especially if you agree that the original statement was incorrect.
 
No they can't.

They don't own the software ecosystem and aren't interested in developing it themselves.

As has been discussed, ARM isn't really RISC anyway. And x64 is RISC internally anyway, too.

What Apple has achieved is a great chip but also a level of integration that nobody else has done. I do not agree that anyone else could do this. Maybe Microsoft but they are still way behind. And, of course, they have a much bigger legacy issue.


no they can't what?

Apple's ecosystem is the reason why switching over to ARM quickly is feasible.

Microsoft beat them to the table with an ARM-based laptop that runs x86.

Microsoft did try and fail. Apple's achievement in this situation will pave the way for Microsoft to succeed.

Within 2 years I would expect there to be Windows-based ARM laptops that perform right alongside the Mac lineup, though most likely always a bit behind on the processor, probably by just a small amount.

Now, RISC vs. not RISC... if you are going to pick something apart, have a better argument. You are talking about semantics, and in this case your understanding is highly dated.

In the past, ARM has stood for both Acorn RISC Machine and, later, Advanced RISC Machines.

RISC is not a brand or a ruleset; it is just a loose term describing certain behaviors. ARM is RISC. ARM has a reduced instruction set compared to x86 and, more importantly, it is a load/store architecture: memory is accessed only through explicit load and store instructions. Because of those two properties it is indeed a RISC processor under the currently accepted definition. If you are trying to say it isn't RISC because it differs from what RISC meant 30 years ago, then you need to update your understanding.
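For anyone unfamiliar with the load/store property, here's a toy sketch in Python (purely illustrative, not any real ISA): a RISC-style machine moves data through registers with explicit loads and stores, while a CISC-style machine can fold the memory access into the arithmetic instruction itself.

```python
# Toy illustration of load/store vs. memory-operand styles.
# The "instructions" are just Python statements standing in for assembly.

def increment_risc(mem, addr):
    """RISC style: three instructions, memory touched only by LOAD/STORE."""
    r0 = mem[addr]      # LOAD  r0, [addr]
    r0 = r0 + 1         # ADD   r0, r0, #1   (operates on registers only)
    mem[addr] = r0      # STORE r0, [addr]
    return mem

def increment_cisc(mem, addr):
    """CISC style: one instruction with a memory operand does all three steps."""
    mem[addr] += 1      # ADD [addr], 1   (read-modify-write in one instruction)
    return mem

mem = [0, 41, 7]
increment_risc(mem, 1)
increment_cisc(mem, 1)
print(mem[1])  # 43
```

Same result either way; the difference is purely in how the instruction set lets you get there, which is why the RISC/CISC line is more about style than capability.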

If you are someone who owns a bunch of Apple stuff, then you should excuse yourself from the conversation, as your opinion will be one-sided.

I own a Mac Pro, several PCs, an iPad, and 2 Samsung phones... I have no love for Apple or Microsoft.

I'm just aware that any major player can make their own ARM chip. Apple has some enhancements above and beyond what's licensed from ARM, but the other players will get there.
 
no they can't what?

Apple's ecosystem is the reason why switching over to ARM quickly is feasible.

Microsoft beat them to the table with an ARM-based laptop that runs x86.

Microsoft did try and fail. Apple's achievement in this situation will pave the way for Microsoft to succeed.

Within 2 years I would expect there to be Windows-based ARM laptops that perform right alongside the Mac lineup, though most likely always a bit behind on the processor, probably by just a small amount.

Now, RISC vs. not RISC... if you are going to pick something apart, have a better argument. You are talking about semantics, and in this case your understanding is highly dated.

In the past, ARM has stood for both Acorn RISC Machine and, later, Advanced RISC Machines.

RISC is not a brand or a ruleset; it is just a loose term describing certain behaviors. ARM is RISC. ARM has a reduced instruction set compared to x86 and, more importantly, it is a load/store architecture: memory is accessed only through explicit load and store instructions. Because of those two properties it is indeed a RISC processor under the currently accepted definition. If you are trying to say it isn't RISC because it differs from what RISC meant 30 years ago, then you need to update your understanding.

If you are someone who owns a bunch of Apple stuff, then you should excuse yourself from the conversation, as your opinion will be one-sided.

I own a Mac Pro, several PCs, an iPad, and 2 Samsung phones... I have no love for Apple or Microsoft.

I'm just aware that any major player can make their own ARM chip. Apple has some enhancements above and beyond what's licensed from ARM, but the other players will get there.

You wrote that they can do the same thing referring to the chipmakers.

I wrote that they can't.

Because they don't control the software.

I suspect that Apple has done more than just build a chip.

I built my first computer in the 1970s.

Did porting on VAX, Alpha, MIPS, Solaris, etc.

We had a discussion on RISC, ARM and CISC. And people here convinced me that ARM is more CISC than RISC.

x64 has a decoder that breaks CISC instructions into RISC-like micro-ops for execution.

The other players won't get there. We can just see that from the phone and tablet markets.
 
Microsoft has wanted the market to shift to Arm for a decade! They couldn't do it themselves. Even Apple needed a really good reason to do it, and they are in a unique situation: by controlling the entire ecosystem they play in, they can make that change work where Microsoft failed.
Microsoft, as always, has great ideas but then screws up monumentally when it comes to execution.

The problem with their ARM transition is that it wasn't a transition in the sense of the word. Yes, they released windows for ARM, but when you are still selling it alongside intel devices, with no clear communication on what your future roadmap will be like, why would you expect developers to support your platform when you yourself don't even show any confidence in it?

Market share is Microsoft's strength, and its biggest weakness, because Microsoft has too many considerations to take into account whenever they want to do something.

See what Apple has done here? They have made it exceedingly clear to the entire industry that it's only a matter of time before all their Macs are running on Apple silicon, probably sooner rather than later. You can choose to update your apps now, you can choose to do so later, or you can choose not to do it at all (and risk losing market share to competing alternatives), but Apple will not stop or reverse course for anybody.

It's their way or the highway, and this is the lesson which Apple keeps teaching and others keep ignoring: if you want to bring about meaningful change in a market, you have to force change. Boldly and unapologetically.
 
Many things suggest that Apple plans to split the mini line into "basic mini" model (the one you see today) and a mini pro model with the same sort of IO as the current Intel model.

The mini pro is waiting on the next step (an M1X?) that will presumably have 8 large cores, support at least 32GiB DRAM, and support at least four USB/TB ports; ie something that will match the current (non-low-end) MBP, iMac, and intel "mini pro".
None of the future or hypothetical ideas about how they are approaching it changes the fact that the base model has gone from 4 down to 2 USB-C ports.
 
Is English not your native language? Because I don't know how a reasonable person interprets

"Quite impressive for a iGPU. Though we need to see if Apple can keep up with things like shaders, ray tracing, and other graphical niceties. 300 FPS is great, but not if that means the game has to miss out on tessellation."

except as a statement that the M1 GPU is MISSING tessellation, shaders, ray tracing, and "other graphical niceties"...

I don't want to fight you on this; I simply want to point out that you made an incorrect statement, which I corrected. I don't know why you are arguing this, especially if you agree that the original statement was incorrect.
M1 has hardware Ray Tracing?
 
M1 has hardware Ray Tracing?
It definitely has *some* functionality towards that. Exactly what counts as "hardware ray tracing" vs "does ray tracing well in a shader" starts to become more an argument about words than anything else.

But it's especially unclear because of the ongoing IP sharing between Apple and IMG. IMG's two big improvements this year were to their buffer compression (which is certainly implemented on the A14/M1) and "Level 4" ray tracing. Did early iterations of that stuff get to Apple and into the A14 in time? Will it not get to Apple until it's ready to ship in Series C GPUs? Who knows.


It's possible that A14/M1 only have Ray Tracing at the MPS level right now (like the A13). But it's also possible that "more" Ray Tracing hardware is there, but won't be visible until a new, more appropriate, set of APIs is written and shown at WWDC next year. Either way I think it's as certain as anything can be that the IMG Level 4 Ray Tracing will be in the A15.
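To make the "ray tracing in a shader" side of that distinction concrete: the core operation is just intersection math that any GPU's ALUs (the M1's included) can execute; dedicated ray-tracing hardware accelerates this same test plus the acceleration-structure traversal around it. A minimal sketch in Python rather than actual shader code, purely to show the arithmetic involved:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)   # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centered at z = 5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Run per pixel against millions of triangles instead of one sphere and you have a path tracer; the "hardware vs. shader" debate is about which parts of that loop get fixed-function silicon.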
 
Well, they are benchmarking against old-generation AMD/ATI cards.

Those cards were released in 2017.

Nowadays both AMD and Nvidia play muuuch higher.

Amazing chipsets for a mobile phone, no doubt.

Xbox Series S = 4 teraflops at 300 euros (and it includes other hardware apart from the graphics card)

Xbox Series X = 12 teraflops

Custom PC = 20 teraflops (and that isn't the highest you can get with a single video card)
They're comparing to everything. Flops isn't a great thing to compare by, though.
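For context on where those flops figures come from: the headline number is just theoretical peak FP32, i.e. ALU lanes × 2 ops per cycle (one fused multiply-add) × clock. A quick sketch, where the M1 lane count and clock are approximate community estimates rather than official Apple specs:

```python
def peak_tflops(alu_lanes, clock_ghz):
    """Theoretical peak FP32 TFLOPS: lanes x 2 ops/cycle (FMA) x clock (GHz)."""
    return alu_lanes * 2 * clock_ghz / 1000.0

# Approximate figures; the M1 numbers are community estimates.
m1_gpu   = peak_tflops(8 * 128, 1.278)   # ~8 cores x 128 lanes, ~1.3 GHz
series_x = peak_tflops(52 * 64, 1.825)   # 52 CUs x 64 lanes, 1.825 GHz

print(f"M1 GPU   ~ {m1_gpu:.1f} TFLOPS")    # ~2.6
print(f"Series X ~ {series_x:.1f} TFLOPS")  # ~12.1
```

So yes, a console part with several times the M1's power budget wins on peak flops; the number says nothing about performance per watt or real-world throughput, which is the point above.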
 
They're comparing to everything. Flops isn't a great thing to compare by, though.

It appears that AMD and nVidia make high-end cards first and then get around to low-end cards eventually. I was looking for a low-end card that could do 3x4k and the only one that made sense was the GTX 1050 Ti - it did what I wanted it to based on the Amazon page. It was difficult to find something that could do what I wanted without coming along with gaming performance characteristics that I didn't need nor want.
 
The GTX 1050 series was a budget card in 2016. I don’t remember it being recommended for gaming even at the time, unless budget was a serious constraint.

In all fairness, Apple are nearly always at the root of every spat. They simply don't know how to meet other companies halfway, and while that's worked out for them, it doesn't make it right to go blaming other companies for Apple's historically poor graphics offerings.
The 1050 Ti is a big step up from the 1050. Fun fact: it was considered the best performance/$ for machine learning for a while, factoring in the typical costs of other computer parts. That is, maybe a cheapo GPU is better per $, but you need a mobo, CPU, RAM, and case for every 4 or so that you buy.
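That amortization argument is easy to see with made-up numbers (every figure below is hypothetical, purely to illustrate the effect):

```python
def perf_per_dollar(gpu_perf, gpu_price, host_price=800.0, gpus_per_host=4):
    """Perf per dollar once a shared host (mobo/CPU/RAM/case) is amortized
    across the GPUs it can hold."""
    return gpu_perf / (gpu_price + host_price / gpus_per_host)

# Hypothetical cards: the cheap one wins on raw GPU-only perf/$ ...
cheap_raw, fast_raw = 100 / 150, 280 / 450
# ... but loses once each GPU carries a quarter of the host cost.
cheap = perf_per_dollar(gpu_perf=100, gpu_price=150)
fast  = perf_per_dollar(gpu_perf=280, gpu_price=450)

print(cheap_raw > fast_raw)  # True  (raw: the budget card looks better)
print(fast > cheap)          # True  (amortized: the faster card wins)
```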
 
It appears that AMD and nVidia make high-end cards first and then get around to low-end cards eventually. I was looking for a low-end card that could do 3x4k and the only one that made sense was the GTX 1050 Ti - it did what I wanted it to based on the Amazon page. It was difficult to find something that could do what I wanted without coming along with gaming performance characteristics that I didn't need nor want.
Yep, and it comes in a compact single-fan form factor. That's what I have.
 
The GTX 1050 Ti is NOT a budget card:

[Screenshot: GTX 1050 Ti pricing, 2020-11-18]
 
🤣🤣🤣 I got downvotes just for writing the truth.

It's shocking how much fanaticism people can hold.

Anything stamped with an Apple needs to be the best, otherwise they get angry, when it is more than clear that, on specifications, these are among the worst of the new-generation laptops.

But hey, it has an Apple on it...

If I want a tablet I get a tablet; if I want a computer I get a computer.

That's an overpriced mixture that only blind rich followers are going to be happy to pay for.

No one likes less for a higher price.

macOS is not so good as to justify that price gap. Okay, it has BSD roots and more support for commercial apps than Linux nowadays...

And?

macOS lacks a bunch of things.

If I want to be that limited, I'd just buy a cheap non-Apple ARM laptop and install Linux or Windows.

Having an Apple stamp doesn't solve anything. Yet some people would like to believe they are in front of a high-end next-gen computer.

* a macOS user for years, here
Your facts are what? That this sucks because it's slower than a dedicated desktop or semi-mobile GPU? There's a reason most people aren't using ThinkPads and ASUS gaming laptops, and it's not that they're all idiots. People care about fast integrated graphics, and AS integrated graphics appear to be insanely fast.

Apple Silicon Macs are the first Apple products I've cared about since 2013. Thankfully they're once again not just a phone company.
 
Samsung made large phones, high-resolution screens, OLEDs, wireless charging, reverse charging, multi-cameras, face unlocking, multitasking, pen support, under-screen fingerprint sensors, waterproofing, 5G, etc. before Apple... all of which Apple now "tends to follow".
Thank you for the correction. I wasn't even thinking about most of these, as for me a lot of those would have just been the general progression. But someone has to take the first step. It's really like three kittens (Google, Samsung, Apple) running around chasing each other. One company does something and everyone copies in their own way.
 