Apple is making their own GPU. They’ve been quite open about that, and discussed it briefly during one of the WWDC sessions.
Didn't hear that session. By "making their own GPU" did they mean they're making something to replace their AMD dGPUs, or merely their Intel integrated GPUs?
Well, they are definitely going to be discontinued, so get used to the idea of buying another laptop. Though I don't understand that argument, since you aren't going to find multi-platform compatibility that includes macOS if you buy a Dell or whatnot.
If the apps you want to run are a combination of "universal" (Windows + Mac) apps and Windows-only apps (which I think is the most typical situation), then to run both you only need multiplatform capability on the Mac; you don't need it with a PC. That's likely why the poster only mentioned needing multiplatform capability with the Mac.
 
Didn't hear that session. By "making their own GPU" did they mean they're making something to replace their AMD dGPUs, or merely their Intel integrated GPUs?
Nobody outside Apple knows. However, it already looks as though the graphics part of the A12Z runs circles around Intel’s current iGPUs, and Federighi said something to the effect of “wait until we’re actually trying” when speaking about actual production Macs.
So I’d say they’ve surely got “good enough” iGPUs today compared to the current products that have corresponding parts. How they solve the issue of replacing or adapting dGPUs is anyone’s guess at this point.
 
My interpretation from the mentioned State of the Union talk: Apple uses its own GPUs; no more Nvidia, AMD, or Intel. Think iPad Pro, scaled up.
 
Didn't hear that session. By "making their own GPU" did they mean they're making something to replace their AMD dGPUs, or merely their Intel integrated GPUs?

If the apps you want to run are a combination of "universal" (Windows + Mac) apps and Windows-only apps (which I think is the most typical situation), then to run both you only need multiplatform capability on the Mac; you don't need it with a PC. That's likely why the poster only mentioned needing multiplatform capability with the Mac.

Pretty sure that, at least for sealed boxes (unlike the Mac Pro), there will be one and only one GPU, though they didn’t say it. Multiple GPUs are a design kludge.

As for the latter point, *my* point is: why put the onus on the Mac? Why not bitch and moan that Dell doesn’t make a box that can run Windows + Mac? I guess if you have no Mac-specific apps you need to run, then fine, but people who spend the extra $$ for a Mac probably have some Mac-only apps they like to run, or at least prefer the Mac versions of those “universal” apps.
 
Pretty sure that, at least for sealed boxes (unlike the Mac Pro), there will be one and only one GPU, though they didn’t say it. Multiple GPUs are a design kludge.
Why a kludge? I thought it was a great idea -- a low-power GPU for when you're mobile, to conserve battery life, combined with a high-power GPU for when you're plugged in and need the performance. Conceptually, it's identical to having both low-power and high-power CPU cores, which Apple is rumored to be planning.

As for the latter point, *my* point is: why put the onus on the Mac? Why not bitch and moan that Dell doesn’t make a box that can run Windows + Mac? I guess if you have no Mac-specific apps you need to run, then fine, but people who spend the extra $$ for a Mac probably have some Mac-only apps they like to run, or at least prefer the Mac versions of those “universal” apps.

Here you're making a separate point. That wasn't what you said in response to the poster; there you were essentially arguing a PC also wouldn't allow him to run all the apps he needed, which is typically incorrect -- typically it would. If you read the posts of those who complain about losing the ability to run Windows, overwhelmingly they're people who have to run typical office tools that are readily available for both Mac and Windows (e.g., Office, Adobe, etc.), as well as specialty Windows-only software.

There are a lot of professionals whose companies will buy them only one computer, and whose jobs, in addition to needing the typical office tools, also require they be able to run custom software written specifically for that company; and that custom software is typically Windows-only.

Also, I don't think they're bitching and moaning; they're just stating their needs, and their needs are legitimate. I think the stronger argument for Apple dropping Windows (if such is unavoidable) is that some things may need to be sacrificed for the Mac to move forward, and provide greater functionality to a larger group of people, which is an argument you've also made. But I think that argument can be made without denigrating those who have these needs.

I myself have been in that group -- I've been given a choice of a Mac, Windows, or Linux for my research, and chose the Mac because I strongly prefer working in the Mac OS, and because it gives me access to a native Unix terminal for coding (which Windows doesn't). At the same time, there was some Windows-only software I needed to run for certain projects, and having Boot Camp available was highly functional for me.

More broadly, when you write "but people who spend the extra $$ for a Mac probably have some Mac-only apps they like to run, or at least prefer the Mac versions of those “universal” apps", I think you miss what's going on here -- you're focusing on the unusual cases, and missing the general one. Overwhelmingly (again, if you read their posts), people aren't spending the extra $$ for Macs because they like the Mac versions of apps. They're buying Macs because they prefer Mac OS over Windows -- they want to be in the Mac environment. Boot Camp/virtualization allows them to stay in the environment they like, yet still run the Windows apps they like (or that their job requires).

"why put the onus on Mac?...Why not bitch and moan that Dell doesn’t make a box that can run windows+mac." If you're talking about Dell not being able to run Windows and Mac OS, then here the onus is on Apple, because the only reason a Dell (or any PC) can't run Windows + Mac is that Apple won't allow it.

And if you're talking about being able to run Windows-only and Mac-only apps, logically, given the above (that they've bought the Mac b/c they like the OS), it would make no sense for them to complain that a Dell wouldn't have that capability, since that would do them no good -- they'd be stuck in Windows!

The only people who should be complaining about a Dell not being able to run both Windows-only and Mac-only apps are those who want to be in a Windows environment and need to run Mac-only apps -- a very different group from what you'd expect to encounter on MacRumors!
 
Why a kludge? I thought it was a great idea -- a low-power GPU for when you're mobile, to conserve battery life, combined with a high-power GPU for when you're plugged in and need the performance. Conceptually, it's identical to having both low-power and high-power CPU cores, which Apple is rumored to be planning.

A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.

Two GPUs require a lot of complication, and there have been numerous problems in Apple’s laptops caused by this over the years: things like color rendition not matching when it switches, power-induced failures, etc.
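For illustration only, here is a toy sketch in Python of the kind of switching policy a dual-GPU laptop needs. All names and conditions in it are made up, not Apple's actual driver logic; the point is that every handoff between GPUs is an extra state transition where things like color calibration and context migration can go wrong -- transitions a single scalable GPU never has to make.

```python
# Hypothetical sketch of automatic graphics switching in a dual-GPU
# laptop. The OS must pick a GPU for the current situation and migrate
# rendering state whenever the answer changes; each transition is a
# chance for mismatched color profiles, dropped contexts, etc.

def pick_gpu(on_battery, demanding_workload, external_display):
    """Decide which GPU should be driving rendering right now."""
    if external_display or (demanding_workload and not on_battery):
        return "discrete"
    return "integrated"

def switch(current, target):
    """Hand off rendering between GPUs; every step is a failure mode."""
    if current == target:
        return current  # no transition needed
    # In a real driver: copy framebuffers, remap display pipes,
    # reload color LUTs, power the old GPU down...
    return target

gpu = "integrated"  # boot on the low-power GPU
gpu = switch(gpu, pick_gpu(on_battery=False,
                           demanding_workload=True,
                           external_display=False))
print(gpu)  # discrete
```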
 
A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.

Two GPUs require a lot of complication, and there have been numerous problems in Apple’s laptops caused by this over the years: things like color rendition not matching when it switches, power-induced failures, etc.

The 16-inch MacBook Pro’s successor would still need a dGPU alongside its own SoC; otherwise, it will most likely be a step back in performance. I can see them still using AMD for dGPUs, at least until they can scale their own GPU up to match them.
 
The 16-inch MacBook Pro’s successor would still need a dGPU alongside its own SoC; otherwise, it will most likely be a step back in performance. I can see them still using AMD for dGPUs, at least until they can scale their own GPU up to match them.

Hard to know. There has been a massive secret project to design a GPU going on at Apple for years, and they licensed a very interesting technology about six months ago. I expect to be surprised. And I doubt there will be support for third-party GPUs in their MBPs.
 
A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.

Two GPUs require a lot of complication, and there have been numerous problems in Apple’s laptops caused by this over the years: things like color rendition not matching when it switches, power-induced failures, etc.
I see -- because a GPU has so many cores, it can be made scalable. By contrast, a single CPU core can't be made scalable, which is why, for a CPU, you need both low-power and high-power cores.

I'm inferring this scalability is not available in current AMD dGPUs, because if it were, then Apple wouldn't need to use a separate integrated GPU to save power in its current MBPs. Is that correct?

And if it isn't available in current AMD GPUs, then I further infer Apple is trying to offer something AMD can't (since, given the obvious benefits, if AMD could make their mobile GPUs power-scalable, they would). That would seem to indicate that producing GPUs that can scale from integrated-GPU power to mobile-dGPU power will represent a significant technological challenge (and, if successful, achievement) for Apple.
 
Hard to know. There has been a massive secret project to design a GPU going on at Apple for years, and they licensed a very interesting technology about six months ago. I expect to be surprised. And I doubt there will be support for third-party GPUs in their MBPs.

Interesting tech? From Imagination, or some other tech company I am not aware of? Apple has been licensing their tech since 2014. Apple did attempt to break away from them but probably realized it is impossible to find a workaround without violating their patents.
 
Interesting tech? From Imagination, or some other tech company I am not aware of? Apple has been licensing their tech since 2014. Apple did attempt to break away from them but probably realized it is impossible to find a workaround without violating their patents.

Imagination. They actually terminated the license, and Imagination never had a basis to sue (after all, Apple is likely infringing Nvidia and AMD patents too, and those two are likely infringing each other’s patents, etc.). The reason they signed a new license is, among other things, access to Imagination’s ray-tracing technology.
I see -- because a GPU has so many cores, it can be made scalable. By contrast, a single CPU core can't be made scalable, which is why, for a CPU, you need both low-power and high-power cores.

I'm inferring this scalability is not available in current AMD dGPUs, because if it were, then Apple wouldn't need to use a separate integrated GPU to save power in its current MBPs. Is that correct?

And if it isn't available in current AMD GPUs, then I further infer Apple is trying to offer something AMD can't (since, given the obvious benefits, if AMD could make their mobile GPUs power-scalable, they would). That would seem to indicate that producing GPUs that can scale from integrated-GPU power to mobile-dGPU power will represent a significant technological challenge (and, if successful, achievement) for Apple.

No, AMD doesn’t do it because even if they made it low power it would still be uncompetitive with integrated graphics. Apple can do something AMD cannot: Apple can build the discrete GPU right into its CPU package/module. (AMD can do that for its own processors if it wishes, but then it limits the market to people willing to buy that. I should note that this is different from chiplet technology, which has its own positives and negatives and differs in thermal dissipation.)
 
Apple can do something AMD cannot: Apple can build the discrete GPU right into its CPU package/module. (AMD can do that for its own processors if it wishes, but then it limits the market to people willing to buy that.)

But as you said before....
A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.

Two GPUs require a lot of complication, and there have been numerous problems in Apple’s laptops caused by this over the years: things like color rendition not matching when it switches, power-induced failures, etc.

Given this, why wouldn't laptop manufacturers who currently buy both AMD CPUs (which have integrated graphics) and dGPUs instead prefer a "far more elegant" solution that is also less prone to complications?
 
But as you said before....


Given this, why wouldn't laptop manufacturers who currently buy both AMD CPUs (which have integrated graphics) and dGPUs instead prefer a "far more elegant" solution that is also less prone to complications?

I am not sure I understand the question. Sure, an AMD CPU+GPU is more elegant than some sort of AMD CPU + some sort of integrated graphics + some sort of discrete graphics. Again, I may be misunderstanding your question.
 
I am not sure I understand the question. Sure, an AMD CPU+GPU is more elegant than some sort of AMD CPU + some sort of integrated graphics + some sort of discrete graphics. Again, I may be misunderstanding your question.

Let me give it another try!:

Your prediction was that Apple would, for all its sealed boxes, be using a scalable dGPU built into its CPU package/module: "Pretty sure that, at least for sealed boxes (unlike the Mac Pro), there will be one and only one GPU, though they didn’t say it." When I asked how, with a single GPU, they'd offer both low power on battery and high performance when plugged in, you explained they'd be using a scalable GPU: "A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery."

So this sounds like a great idea with obvious benefits and reduced downsides (eliminating the problems associated with switching). Which raises the obvious question: If it's so great, why doesn't AMD offer this? There are two possibilities:

1) AMD doesn't know how to make this (integrating a scalable GPU into its CPU package) (in a commercially acceptable way, i.e., in sufficient volume at an acceptable cost and with adequate reliability).
2) AMD does know how to make this, but they don't have sufficient customers for it (as you said, "it limits the market to people willing to buy that").

You said the answer wasn't no. 1 ("AMD can do that for its own processors if it wishes"), which leaves no. 2. But that doesn't make sense to me, because if it is so great, and given they have laptop manufacturers that currently purchase both their CPU (with integrated graphics) and dGPU, why wouldn't those customers want a significantly improved solution?

So, unless I'm missing something, I think either AMD hasn't figured out how to make this, or this approach isn't all honey and roses. Because if they could do it (again, in a commercially acceptable way), and it were all honey and roses, I don't know why their customers wouldn't flock to it.

I hope this explains it better.
 
Your prediction was that Apple would, for all its sealed boxes, be using a scalable dGPU built into its CPU package/module: "Pretty sure that, at least for sealed boxes (unlike the Mac Pro), there will be one and only one GPU, though they didn’t say it." When I asked how, with a single GPU, they'd offer both low power on battery and high performance when plugged in, you explained they'd be using a scalable GPU: "A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery."

So this sounds like a great idea with obvious benefits and reduced downsides (eliminating the problems associated with switching). Which raises the obvious question: If it's so great, why doesn't AMD offer this? There are two possibilities:

1) AMD doesn't know how to make this (in a commercially successful way, i.e., in sufficient volume at an acceptable cost and with adequate reliability).
2) AMD does know how to make this, but they don't have sufficient customers for it.

You said the answer wasn't no 1., suggesting it's no. 2. But that doesn't make sense to me, because if it is so great, and given they have laptop manufacturers that currently purchase both their CPU (with integrated graphics) and dGPU, why wouldn't those customers want a significantly improved solution?

So, unless I'm missing something, I think either AMD hasn't figured out how to make this, or this approach isn't all honey and roses. Because if they could do it, and it were all honey and roses, I don't know why their customers wouldn't flock to it.

I hope I've explained it better!


Thanks, that’s clearer. The reason I say “2” is that its customers (who aren’t you and me, but the computer companies) like to mix and match hardware. They want to upsell you to a discrete card, and then upsell you to a better discrete card, and maybe an NVIDIA card or maybe not. Do Dell’s customers want to be locked into a product that’s available in one configuration (because it’s in the chip package)? Maybe, maybe not. But Dell would prefer to be able to sell a range of options.

Whereas Apple doesn’t give a damn about that. They are happy to control the entire package. They’ll sell you one with 32 graphics cores, or you can upgrade to 48, but it will be the same chip (perhaps with some cores turned off by blowing fuses), and they won’t sweat it.
 
Thanks, that’s clearer. The reason I say “2” is that its customers (who aren’t you and me, but the computer companies) like to mix and match hardware. They want to upsell you to a discrete card, and then upsell you to a better discrete card, and maybe an NVIDIA card or maybe not. Do Dell’s customers want to be locked into a product that’s available in one configuration (because it’s in the chip package)? Maybe, maybe not. But Dell would prefer to be able to sell a range of options.

Whereas Apple doesn’t give a damn about that. They are happy to control the entire package. They’ll sell you one with 32 graphics cores, or you can upgrade to 48, but it will be the same chip (perhaps with some cores turned off by blowing fuses), and they won’t sweat it.

OK, that makes sense. Thanks for the added explanation.

I could then ask why AMD couldn't also offer the same mix-and-match options with an integrated solution (level 1 CPU with level 1, 2, or 3 scalable graphics, level 2 CPU with level 1, 2, or 3 scalable graphics; level 3 CPU with level 1, 2 or 3 scalable graphics). To which your answer might be that PC manufacturers like to offer more possible combos than Apple does, making it cost-prohibitive (wrt manufacturing efficiency and economies of scale) for AMD to cover a sufficiently wide range of combos—unless they could, like you say, just blow some fuses to adjust the graphics level.
 
OK, that makes sense. Thanks for the added explanation.

I could then ask why AMD couldn't also offer the same mix-and-match options with an integrated solution (level 1 CPU with level 1, 2, or 3 scalable graphics, level 2 CPU with level 1, 2, or 3 scalable graphics; level 3 CPU with level 1, 2 or 3 scalable graphics). To which your answer might be that PC manufacturers like to offer more possible combos than Apple does, making it cost-prohibitive (wrt manufacturing efficiency and economies of scale) for AMD to cover a sufficiently wide range of combos—unless they could, like you say, just blow some fuses to adjust the graphics level.
That, and computer manufacturers like second-sourcing and playing Nvidia against AMD.
 
And if it isn't available in current AMD GPUs, then I further infer Apple is trying to offer something AMD can't (since, given the obvious benefits, if AMD could make their mobile GPUs power-scalable, they would). That would seem to indicate that producing GPUs that can scale from integrated-GPU power to mobile-dGPU power will represent a significant technological challenge (and, if successful, achievement) for Apple.

This is not a technological challenge but an economic question. Typically you would not want to make an integrated GPU too large, because you would need to sell the same die to customers who either need no integrated GPU or need much less performance than it would offer.
 
This is not a technological challenge but an economic question. Typically you would not want to make an integrated GPU too large, because you would need to sell the same die to customers who either need no integrated GPU or need much less performance than it would offer.
Also true. Though I guess if the integrated GPU is not actually integrated (i.e., it’s a “chiplet” or whatnot), it’s less of an issue. But the chiplet package is actually not that advanced thermally, and couldn’t handle the scaling up of power, so you’d end up with two different packages in any case.

It’s all just economics. But graphics are very scalable - that’s why adding a second video card is a thing. It scales pretty linearly: the more graphics cores, the smaller the amount of information each has to work with. And you can turn off cores and eliminate almost all of their power consumption. So an Apple GPU running at 30fps on battery could then turn on more cores and run at 120fps on mains power. Or whatever. Ray tracing may only work when plugged in, etc. But having all the cores be identical is a lot easier than having to simultaneously support an Intel (or Apple) “integrated” GPU and a second AMD/Nvidia discrete GPU.
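To make the "scales pretty linearly" point concrete, here is a toy model in Python. Every number in it (core counts, watts, frame rates) is invented for illustration, not a real Apple or AMD figure; it just shows how gating cores trades frame rate for power in a roughly linear way.

```python
# Toy model of a power-gated GPU: performance scales ~linearly with
# active cores, while gated-off cores draw almost no power.
# All constants below are illustrative, not real hardware figures.

def frame_rate(active_cores, total_cores=32, max_fps=120.0):
    """Frame rate, assuming near-linear scaling with active cores."""
    return max_fps * active_cores / total_cores

def power_draw(active_cores, watts_per_core=0.5, base_watts=2.0):
    """Package power: a fixed base cost plus a per-active-core cost."""
    return base_watts + watts_per_core * active_cores

# On battery: a quarter of the cores gives ~30 fps at low power.
print(frame_rate(8), power_draw(8))    # 30.0 fps at 6.0 W
# Plugged in: light everything up for the full 120 fps.
print(frame_rate(32), power_draw(32))  # 120.0 fps at 18.0 W
```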
 
If Apple does offer this on their laptops, and it operates well, that could have the effect of popularizing this technology. If this happens, I think there's a reasonable chance PC laptop manufacturers will wish to follow suit, and that AMD will thus eventually offer this as well (assuming the only barrier is economic). NVIDIA might also, using an ARM-based CPU (since they've already begun to collaborate on GPU-ARM integration for the server market).
 
I know. The stress level as a teacher is better (not necessarily less, but much more manageable).

Ironically, I was quite adept at finding even the weirdest bugs back when I was "just" a developer; I think that maybe suited me better as a person. ;)

To keep this (slightly) OT: I'm looking forward to seeing what Apple can do with their own Apple Silicon; I think maybe we're in for a treat... I might even consider ditching my (newish) 2019 MBP if the new machines are as good as I expect them to be.

Intel has disappointed over and over again the last couple of years, and this anecdotal evidence of (very) poor quality control only amplifies that.
No, I know what you mean, and it's good you found something better suited to you!

I agree this will be an interesting next couple of years not just for Apple, but the industry as a whole. Computers are exciting again!
 
Let’s be real here: the only reason Apple is switching is that their devices will run cooler with less power usage while giving performance similar to Intel’s, and they will gain a lot by unifying the architecture with their iPhone cash cow. It also helps that they will save money by not paying Intel’s profit margin.

There is nothing wrong with Intel processors, and if there were, we would see other manufacturers abandoning them. If Intel were the problem, then why not use AMD? Or make your own x86? I want to see Apple Silicon ARM beat Intel’s most powerful competitive x86 chip on Windows/Linux; only then will I be convinced.

There is not much to innovate, really. Intel is still the most powerful chip for the most part; otherwise, why would anyone still get Intel over AMD? We are no longer in the 90s; CPU speeds are more than enough for 95% of use cases, and those who need more power have professionally designed systems to pump out as many MHz and gigaflops as they want.
 
There is nothing wrong with Intel processors, and if there were, we would see other manufacturers abandoning them. If Intel were the problem, then why not use AMD?

1) AMD does not compete in every market segment
2) AMD is only a short-term solution. They have no history of being ahead of Intel for more than a year at a time, once every 10-15 years.



Or make your own x86?

Only a handful of companies have access to the x86 license. Even fewer have access to the x86-64 license. So, short answer: you’d get sued.


There is not much to innovate, really. Intel is still the most powerful chip for the most part; otherwise, why would anyone still get Intel over AMD?

AMD must be the most powerful, otherwise why would anyone buy AMD instead of Intel?

See how that works?
 
Let’s be real here: the only reason Apple is switching is that their devices will run cooler with less power usage while giving performance similar to Intel’s
...And Apple will be subject only to their own roadmap:
- How many times have people on this forum complained about the lack of new computer models in the last few years, when the underlying problem was that there were no suitable CPUs to put in a new model?
- I suspect Apple's design department isn't very happy with the current state, where Intel promised a certain performance per watt for processor generation X and then couldn't live up to it, resulting in us consumers swearing at Apple for making things too thin to perform properly, when all they did was design cases based on Intel's promises.

I want to see Apple Silicon ARM beat Intel’s most powerful competitive x86 chip on Windows/Linux; only then will I be convinced.
Fair enough.

Intel is still the most powerful chip for the most part; otherwise, why would anyone still get Intel over AMD?
Because of corporate momentum. For example, it will be two years before we invest in the foundations of a completely new virtualization farm, and only then will we look at alternatives, since we're stuck with choices made long ago.
 