Something I'm puzzled about.
It was always the cheap, low-end machines that had graphics systems sharing graphics memory with main memory.
Of course, you then also have the problem that memory you want to hold programs in gets taken by the graphics chip.
We then moved on to higher-end graphics cards with their own super-fast dedicated memory, so they no longer had to compete with the processor.
The processor could get on with what it was good at, with its own memory, and the graphics card worked from its own dedicated memory too.
This allowed graphics performance to storm ahead.
Now we seem to be going back to shared memory.
Can anyone explain why this is not going backwards?
There simply is not enough information available yet to give a clear answer to your question, but I'll try to cover a bit:

First of all, that's not the timeline. If you go back far enough (while still staying in the era of 3D), integrated graphics acceleration didn't exist, only separate GPUs. Then integrated graphics started to become a thing on the low end and ultralight end of the spectrum. That latter part is extremely relevant to the M1.

The MacBook Air, for example, has never had a dedicated GPU with independent graphics memory. The last 13" MacBook Pro to have a dedicated GPU was, I believe, the 2010 model, a decade ago. The last Mac Mini that had a dedicated GPU was released in 2011.

The M1, at this point, is only present in those three product lines. So at this point the integrated M1 GPU has replaced only the integrated Intel GPU. Whether integrated graphics might have some advantages or if dedicated GPUs are better in every possible situation is a moot point at this stage, because Apple hasn't shipped anything with a dramatically different GPU/CPU architecture than what they have for nearly a decade.

You can't go backwards if it's the same as what you had before--the only question right now is whether the M1 can outpace the Intel integrated GPU in the products it is replacing an Intel CPU in. (I'm leaving out the eGPU support issue here, which is separate.)

Now, if you want to get into hypotheticals, the M2, or M1X, or P1, or whatever ends up in the 16" MacBook Pro and high-end big-screen iMac, or eventual pro products, may or may not come with a dedicated 3rd party GPU. If they have dedicated GPUs, again it's a moot point. If they don't, then we can get into whether this is a huge step back, whether it's a modest step back that only affects a tiny sliver of pro users, or if there are genuine advantages to it in at least some use cases.

One thing to note on hypothetical advantages or disadvantages of the M1 architecture: GPUs don't just generate 3D graphics anymore. They are often used as general-purpose coprocessors for certain operations they're very good at. For this sort of general purpose computing, you need to move data from the CPU to the GPU in order for the GPU to work on it, which induces overhead.

In a fully integrated system like the M1, if the architecture is designed to take advantage of it, that latency disappears, because everything is using the same RAM. So hypothetically speaking, there could be performance gains in some areas of GPU computing thanks to the shared RAM. How much will also depend in some part on just how fast the RAM integrated into the M1 is.

I don't know whether Intel integrated graphics currently take advantage of that capability or not on MacOS. I actually don't even know whether the M1 does for sure, but I've read that's the case.
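To make the shared-RAM point a bit more concrete, here is a minimal Metal sketch in Swift. It is only an illustration of what a zero-copy shared allocation looks like from an app's point of view (the buffer size is arbitrary and the actual compute pass is omitted), not a claim about how Apple's drivers work internally:

```swift
import Metal

// A minimal sketch, assuming an M1-style unified-memory system: with
// .storageModeShared the same allocation is visible to both CPU and GPU,
// so no explicit upload/blit copy is needed before a compute pass reads it.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

let input = [Float](repeating: 1.0, count: 1_024)
let length = input.count * MemoryLayout<Float>.stride

// One allocation, addressable by the CPU (via buffer.contents()) and the GPU.
let buffer = device.makeBuffer(bytes: input, length: length, options: .storageModeShared)!

// ... a compute command encoder would bind `buffer` and dispatch a kernel here.
// On a discrete GPU the usual pattern is a .storageModePrivate buffer plus a
// blit copy into VRAM, which is exactly the transfer overhead described above.
print("Created a \(buffer.length)-byte shared buffer on \(device.name)")
```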
 
I was planning to buy an Intel iMac 3.1 GHz if Apple Silicon didn't impress
Me too. Need to replace my ageing desktop - it doesn’t support Big Sur - and had psyched myself up for the £many of a good iMac, but the Mini is very tempting as a stop-gap.
 
It looks like blowing the doors off everyone is reason enough. This move crushes Microsoft and Google’s efforts to turn the clock back to 1998, when Apple ended up depending on future competitors supporting their platform to survive. Now they will own the stack for the fastest and most efficient computers on the market. Developers will come in droves, especially now that it supports iOS apps and the ability to translate them quickly.
I believe this is indeed their plan. Essentially they're leveraging their dominance in the mobile market and turning the tables on Windows. But just like Microsoft couldn't successfully leverage its dominance in the PC market to carve out a significant enough share of the mobile market with Windows Phone, Apple might not find too many converts relying solely on the strength of their in-house chips.

ASi really shines in terms of battery life, but other than that, do people who buy a MacBook Air really care about the single-core Geekbench score of the M1 chip? They probably care more about compatibility and price. And just like Windows 10 Mobile, how many people who owned a Lumia actually found it indispensable to be able to run their mobile apps on their desktops and laptops?

M1 solved the perennial problem of short laptop battery life, but it didn't solve the problem of price for Apple. Most Apple devices are still very expensive. Furthermore, the advent of ASi reduces interoperability between Mac and Windows machines. Even though only a small minority actually uses Boot Camp and VMs, just knowing that their machine can run Windows reduces the initial psychological hurdle of purchasing a Mac for most who are on the fence.

This is not to mention that the advantages of ASi, namely power efficiency, are of less importance on a desktop. The inability to support eGPUs, however, reduces the attractiveness of a Mac as a high-end desktop. It remains to be seen if Apple can compete with the best of them when it comes to graphics performance on an iMac Pro or Mac Pro-class desktop computer.
 
They just realized very early which way to go. If others (AMD, Intel) had seen the writing on the wall earlier, we would have had competition. This way, it will take years for them to catch up.
I wouldn't disregard AMD; they have been turning the tables on Intel for a while now. I would say it was their move last year that may have prompted Apple to do the same thing for their Mac line-up as for their iOS devices. AMD's new chips are a huge leap over Intel, and they are still on 7nm with 5nm coming early next year; plus both Apple's M1 and AMD's chips are made by the same company, TSMC. I think the real trick might lie in Apple's expertise in optimising the system for an ARM-based processor. AMD already seems ahead of the curve with huge multicore CPU performance for PCs and servers, which will take Apple a while to "catch up" on, but we're comparing apples and oranges here (excuse the pun), so it's hard to compare. Whatever way we look at it, it's great news for us all :) I do feel sorry for all those who bought an Intel Mac this year, however; that's got to have dropped their value through the floor.
 
Me too. Need to replace my ageing desktop - it doesn’t support Big Sur - and had psyched myself up for the £many of a good iMac, but the Mini is very tempting as a stop-gap.
Yeah, I'm pondering it. Is the M1 mini super value in the 16GB RAM / 512GB SSD version? It seems to me it's more value than the mini has ever been at £1099. Super speed. Reasonable graphics. Low noise?
 
Intel has the anchor of backwards compatibility with code going back to the 1978 8086 CPU. Then there are decades of cruft on top of an operating system called Windows, which has backwards compatibility with 16-bit Windows 3.1 code. All this adds up to massive efficiency issues.

Until Microsoft and Intel partner to work on a new, from-scratch 21st-century hardware/software architecture, you will see little progress.
This is actually done intentionally. There are a lot of government agencies throughout the world, not just in the U.S., that rely on this thing called "backwards compatibility" in the Windows operating system to provide essential services to their citizens. A "new, from-scratch 21st-century hardware/software architecture", while exciting in name, would probably entail a not-insignificant budget increase, loss of jobs (the IT staff who maintain these old machines, for example), and a disruption that has the potential to cost some elected officials their seats in the legislature.

Apple has the luxury of actually not having many of their desktops used in government, business, and engineering firms. They've also been losing ground to Microsoft, and later Google, in education since the 2010s. Any company that gets big enough in these aforementioned four sectors will have to contend with backwards compatibility.
 
But what are all the Apple-haters and Me-doubters going to complain about if that’s the case?

Oh, I know. They’ll fall back to “but it doesn’t virtualize x86.”

Anyway, that’s actually more of a Rosetta-speed hit than I expected, but we’ll see when we get real world data.
I mean, that IS why I'm going to be buying an Intel Mac shortly...

But yeah, for running native Mac applications, this would seem to be a good improvement.
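As an aside for anyone running their own benchmarks: macOS exposes a sysctl, sysctl.proc_translated, that reports whether the current process is being translated by Rosetta 2. A small Swift sketch (the function name is mine, and error handling is kept deliberately simple):

```swift
import Darwin

// Returns true if this process is running under Rosetta 2 translation,
// false if it is running natively, or nil if the key is unavailable
// (e.g. on systems without Rosetta, where sysctlbyname fails).
func isRunningUnderRosetta() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return nil
    }
    return translated == 1
}

print(isRunningUnderRosetta().map { $0 ? "translated by Rosetta 2" : "running natively" } ?? "unknown")
```

Handy for sanity-checking that a "universal" benchmark build really did launch its native arm64 slice rather than the x86 one.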
 
There are already a number of better solutions. Parallels and VMWare work great and Parallels is just about to get its Big Sur update.
Parallels and VMWare are virtualisation solutions. To run x86 Windows on an ARM Mac would require emulation. I believe the Parallels Big Sur update will allow you to virtualise ARM Linux VMs, and maybe Windows on ARM.

I hope I’m wrong and Parallels do release an x86 emulation solution.
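For context on why it's virtualisation rather than emulation: Big Sur ships a Virtualization framework of the kind third-party hypervisors can build on, and it only boots guests matching the host architecture, so on an M1 that means ARM Linux (or, in principle, ARM Windows), never x86. A rough, hypothetical sketch; the kernel path and sizes are placeholders:

```swift
import Virtualization  // macOS 11+

// Hypothetical minimal configuration: boots an ARM64 Linux kernel on an
// Apple Silicon host. There is no x86 emulation layer here; the guest
// architecture is the host architecture.
let bootLoader = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/arm64/vmlinuz"))

let config = VZVirtualMachineConfiguration()
config.bootLoader = bootLoader
config.cpuCount = 2
config.memorySize = 2 * 1024 * 1024 * 1024  // 2 GiB

do {
    try config.validate()
    let vm = VZVirtualMachine(configuration: config)
    _ = vm  // vm.start { result in ... } would boot the guest
} catch {
    print("Invalid VM configuration: \(error)")
}
```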
 
There simply is not enough information available yet to give a clear answer to your question, but I'll try to cover a bit:

First of all, that's not the timeline. If you go back far enough (while still staying in the era of 3D), integrated graphics acceleration didn't exist, only separate GPUs. Then integrated graphics started to become a thing on the low end and ultralight end of the spectrum. That latter part is extremely relevant to the M1.


Perhaps I'm going back a bit too far, as I was amazed, as was the whole industry, at the Voodoo 1 add-on PC card for 3D graphics.
I bought a Voodoo 1 and then, following that, a Voodoo 2. It was mind-blowing back then how the addition of a separate card dedicated to 3D graphics transformed what a PC could do with gaming.
Things just got better and better from then on.

Something I am wondering, though. If we take the latest RTX 3080 GPUs from Nvidia, they have 28 billion transistors just for that monster GPU, whilst the whole M1 chip has only around half that for everything, coming in at 16 billion.

Now I know numbers are not everything, but these products seem worlds apart.
It almost feels like you could throw an M1 chip inside a 3080 and do an all-in-one :)

Now that Nvidia is buying ARM for $40 billion...
And I'm sure no one would suggest Nvidia is stupid or doesn't have very skilled chip designers.
It does perhaps make you wonder what could happen if they decided to get to work on their own custom ARM CPU in combination with their graphics expertise?
 
I’m sorry, but I take issue with the dismissive nature of this comment. Virtualizing x86 IS a big deal. I have to deal with several proprietary Windows applications for work, and being able to run a VM with Windows is a pretty big deal for that and is the difference between whether or not I need a separate computer for work. Plus I like gaming on my MBP. The ability to virtualize Windows, or run Boot Camp, might literally be what tips me into buying a 16” last-gen Intel MBP over an upcoming M1-based one.

I’m hopeful for a third party solution though. Back in the PowerPC days, VirtualPC was originally an x86 emulator in addition to virtualization. Maybe Parallels will move to adding in x86 emulation to a future product.

Also, the fact that you are disappointed in these Rosetta benchmarks is kind of astounding to me. They are phenomenal by emulation standards.
Whilst it's completely natural to look at it through a personal lens, I think we can infer from Apple's actions that virtualising x86 is not a big deal. They clearly have the view that the benefits to be achieved by taking this path are critical to the ongoing success of the Mac platform, and if x86 virtualisation is a casualty of that then so be it. If x86 virtualisation was a big deal then they wouldn't be doing this.

x86 virtualisation (and Boot Camp) is surely a bit like eGPU support - only relevant to a tiny fraction of Mac users. Apple is a company known for its desire to focus on doing only a few things (and famously said "There are a thousand no's for every yes") - it's not difficult to imagine internal discussions about things like Boot Camp and eGPU, asking whether Apple should really still be doing these things and whether they are core to their mission and success.

It also wouldn't surprise me if Apple has an additional view that if this change forces a wider industry effort to remediate legacy x86 Windows apps then that's no bad thing too.

Sorry if this post comes across as a bit Apple-apologist; that's not my intent - just looking at it logically.
 
Personal attacks are not permitted on these forums. My reaction score is almost 20,000 on here, so I guess many people like my posts. I’m happy to address any on-topic objections you have to what I’ve written.
If I may:
While rjp1's accusation ("most toxic") is very much unwarranted, to me as a casual reader your first post in this initially exuberant thread did come off as introducing a bit of needless aggression.

I am aware that perception of tone and intent can be quite subjective when interpreting the written word. Also, I do not know what past arguments you have had on this topic, and maybe you have good reasons to hold grudges against certain people.

But to someone without that knowledge, preemptively lashing out against "Apple-haters and Me-doubters" may appear bitter and weirdly personal, and it risks painting those who remain a bit cautious about the new architecture with a broad brush. E.g. I am skeptical because I am always skeptical about marketing claims and benchmarks, and indeed x86 virtualization is something I use daily. But neither am I an Apple-hater, nor really a "You-doubter," given that before this thread I was not actually aware of any specific poster's opinion. Again, this may not have been your intention at all, but I think it is easy to get that impression, which may then color the perception of the subsequent discussion.
 
This might inadvertently hurt Mac sales in the short term. If the reviews are too glowing of AS then far more people are going to wait for AS iMacs and higher end products. They'd better get a move on...
I guess Apple are expecting that. In the US, the two higher-end Intel iMacs are currently shipping in 4-5 weeks, and in the UK they are shipping in about 4 weeks. The 13-inch and 16-inch Intel MBPs are taking 2-3 weeks. Could be down to Covid, or perhaps they are simply making them to order as they retool their production lines for the M variants.
 
Whilst it's completely natural to look at it through a personal lens, I think we can infer from Apple's actions that virtualising x86 is not a big deal. They clearly have the view that the benefits to be achieved by taking this path are critical to the ongoing success of the Mac platform, and if x86 virtualisation is a casualty of that then so be it. If x86 virtualisation was a big deal then they wouldn't be doing this.
When you look at the trajectory since the switch from PowerPC to Intel, Apple yearly Mac sales jumped for the first few years. But since I think 2012 or so, the sales have been flat year over year, about 18 million units a year. This means there was limited benefit to be had by the switch to Intel. Yes, it brought many people over that wanted X86, but it has been going nowhere since. There is no uphill trajectory in sales. There will not be more X86 people coming over to the Mac year over year. There is no marketing pitch to be had for Apple to promote Mac over PC. It's flat. Now I figure Apple thinks that the people they can reach in the new marketing pitch for the Apple M Macs will be greater than the people leaving Apple because it will no longer do X86. It's a bet. And it's one that might pay off, because the install base of the Mac has grown since the PowerPC days, and many of those new people will not hesitate to move to Apple M1 because they don't do virtualization or bootcamp.
 
This is actually done intentionally. There are a lot of government agencies throughout the world, not just in the U.S., that rely on this thing called "backwards compatibility" in the Windows operating system to provide essential services to their citizens. A "new, from-scratch 21st-century hardware/software architecture", while exciting in name, would probably entail a not-insignificant budget increase, loss of jobs (the IT staff who maintain these old machines, for example), and a disruption that has the potential to cost some elected officials their seats in the legislature.

Apple has the luxury of actually not having many of their desktops used in government, business, and engineering firms. They've also been losing ground to Microsoft, and later Google, in education since the 2010s. Any company that gets big enough in these aforementioned four sectors will have to contend with backwards compatibility.
All absolutely true, but those worlds are realising that kicking the can down the road indefinitely and perpetuating technical debt in this way is crippling their ability to move forward and be agile, and is exposing them to cybersecurity and other risks too. Many of them are moving workloads to the cloud (at varying paces, of course, depending on their competence and ability to invest) and see that as the vehicle for remediating those apps and getting them into an evergreen model (and SaaS-based where possible).

There are two ways this can go. Either Apple decides to start building-in better backwards compatibility, as you describe, and then becomes hamstrung in the way that Intel and Microsoft have been for many years, or they say "screw that" and tell their customers that if they want to be a user of Apple technology (and get the benefit of what appears to be the best performance per watt in the industry) then they are going to have to get with the programme and keep their apps and infrastructure current. My bet is on the latter.
 
After the transition of the complete Mac line-up to Apple silicon, Apple should release a gaming console based on Apple silicon.
High-end gaming on any Apple platform has been lacking, and Apple should be able to launch a competitive gaming console based on Apple silicon within the next 3 years.
 
After the transition of the complete Mac line-up to Apple silicon, Apple should release a gaming console based on Apple silicon.
High-end gaming on any Apple platform has been lacking, and Apple should be able to launch a competitive gaming console based on Apple silicon within the next 3 years.
Can't the AppleTV eventually take over that?
 
When you look at the trajectory since the switch from PowerPC to Intel, Apple yearly Mac sales jumped for the first few years. But since I think 2012 or so, the sales have been flat year over year, about 18 million units a year. This means there was limited benefit to be had by the switch to Intel. Yes, it brought many people over that wanted X86, but it has been going nowhere since. There is no uphill trajectory in sales. There will not be more X86 people coming over to the Mac year over year. There is no marketing pitch to be had for Apple to promote Mac over PC. It's flat. Now I figure Apple thinks that the people they can reach in the new marketing pitch for the Apple M Macs will be greater than the people leaving Apple because it will no longer do X86. It's a bet. And it's one that might pay off, because the install base of the Mac has grown since the PowerPC days, and many of those new people will not hesitate to move to Apple M1 because they don't do virtualization or bootcamp.
I'm not sure you can necessarily ascribe the trajectory of Mac sales figures so directly to their underlying platform choice - there are likely many other factors contributing to those too, including wider macroeconomic factors - but overall, yes, I agree. Apple needs to do something to trigger another wave of Mac sales growth and get out of this rut. When I watched the event one of my first thoughts was that this is going to have wider impact on the industry, as the performance characteristics (especially as Apple announce the higher-end options in due course) may be so far ahead of the competition that it drives greater adoption of Mac in market segments that are still largely Wintel-oriented.
 
If you need windows, then you will eventually likely need a windows machine. Luckily for Apple, only 1% of users use bootcamp, and something like 5% use VMs, so even if they lose those customers, they will more than make up for it with new buyers who want to run iOS software on their laptop or desktop.

I think it's far more than that, actually. A LOT of companies use Macs for development, and a lot of them are dependent on VMs and virtualization tools like Docker. Losing those customers is probably more significant than you think.
 
Of course it's possible. The company is just choosing not to go to the effort of doing it, choosing instead to do other things that it deems higher priority, and taking on business risk in the process of doing so. And at some point in the future something will happen to that inventory and database that stops the business from functioning and everyone will then run around saying things like "Dammit, why didn't we replace this years ago?" and "How could we/you let this happen?!?". Anybody that has been around enterprise IT for any length of time will have seen this scenario play out.
I guess I’m stuck with a crappy app developer.
 
The MacBook Air, for example, has never had a dedicated GPU with independent graphics memory.

Well, I don’t know if the memory was shared or not...

but the second-generation MacBook Air had a dedicated NVIDIA GPU in addition to the Intel integrated GPU... and it was great.

Good old times when Apple used Nvidia GPUs...
 