> Apple is using FPGAs to accelerate video processing and is loading the code depending on the codecs used. This speeds up calculations dramatically. It's selling this in a card for the Mac Pro.

Thanks!
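For anyone curious what "loading the code depending on the codec" means in practice, here's a minimal host-side sketch. Everything in it -- the Codec cases, the AcceleratorConfig type, the bitstream file names -- is invented for illustration; Apple's actual Afterburner interface is not public.

```swift
// Hypothetical sketch of codec-dependent accelerator configuration.
// A reconfigurable accelerator (e.g. an FPGA) has one hardware image
// ("bitstream") per codec; the host loads the matching one before
// handing off decode work. All names below are made up.
enum Codec {
    case proRes422, proRes4444, proResRAW
}

struct AcceleratorConfig {
    let bitstreamName: String  // which hardware image to load
}

func config(for codec: Codec) -> AcceleratorConfig {
    switch codec {
    case .proRes422:  return AcceleratorConfig(bitstreamName: "prores422_decode.bit")
    case .proRes4444: return AcceleratorConfig(bitstreamName: "prores4444_decode.bit")
    case .proResRAW:  return AcceleratorConfig(bitstreamName: "proresraw_decode.bit")
    }
}

// Stand-in for the (private) driver call that reprograms the hardware.
func loadBitstream(_ name: String) {
    print("Reprogramming accelerator with \(name)")
}

let clipCodec: Codec = .proResRAW
loadBitstream(config(for: clipCodec).bitstreamName)
// From here on, decode work for this codec runs on the accelerator.
```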
> Apple is making their own GPU. They've been quite open about that, and discussed it briefly during one of the WWDC sessions.

Didn't hear that session. By "making their own GPU," did they mean they're making something to replace their AMD dGPUs, or merely their Intel integrated GPUs?
> Well, they are definitely going to be discontinued, so get used to the idea of buying another laptop. Though I don't understand that argument, since you aren't going to find multi-platform compatibility that includes macOS if you buy a Dell or whatnot.

If the apps you want to run are a combination of "universal" (Windows + Mac) apps and Windows-only apps (which I think is the most typical situation), then to run both you only need multiplatform capability on the Mac; you don't need it with a PC. That's likely why the poster only mentioned needing multiplatform capability with the Mac.
> Didn't hear that session. By "making their own GPU," did they mean they're making something to replace their AMD dGPUs, or merely their Intel integrated GPUs?

Nobody outside Apple knows. However, it looks as though the graphics part of the A12Z already runs circles around Intel's current iGPUs, and Federighi said something to the effect of "wait until we're actually trying" when speaking about actual production Macs.
> Pretty sure that, at least for sealed boxes (unlike the Mac Pro), there will be one and only one GPU, though they didn't say it. Multiple GPUs is a design kludge.

Why a kludge? I thought it was a great idea -- a low-power GPU for when you're mobile, to conserve battery life, combined with a high-power GPU for when you're plugged in and need the performance. Conceptually, it's identical to having both low-power and high-power CPU cores, which Apple is rumored to be planning.
> If the apps you want to run are a combination of "universal" (Windows + Mac) apps and Windows-only apps (which I think is the most typical situation), then to run both you only need multiplatform capability on the Mac; you don't need it with a PC. That's likely why the poster only mentioned needing multiplatform capability with the Mac.

As for the latter point, *my* point is: why put the onus on the Mac? Why not bitch and moan that Dell doesn't make a box that can run Windows + Mac? I guess if you have no Mac-specific apps you need to run, then fine, but people who spend the extra $$ for a Mac probably have some Mac-only apps they like to run, or at least prefer the Mac versions of those "universal" apps.
> Why a kludge? I thought it was a great idea -- a low-power GPU for when you're mobile, to conserve battery life, combined with a high-power GPU for when you're plugged in and need the performance. Conceptually, it's identical to having both low-power and high-power CPU cores, which Apple is rumored to be planning.
A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.
Two GPUs require lots of complication. And there have been numerous problems in Apple's laptops caused by this over the years. Things like color rendition not matching when it switches, power-induced failures, etc.
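To make "turn off cores and pipelines when on battery" concrete, here's a toy sketch of one GPU scaling with the power source. The GPUState type, core counts, and clocks are all invented; no real driver works exactly like this.

```swift
// Toy model: one scalable GPU, throttled by power source instead of
// switching between an integrated and a discrete GPU.
enum PowerSource { case battery, mains }

struct GPUState {
    var activeCores: Int
    var clockMHz: Int
}

func gpuState(for power: PowerSource, totalCores: Int = 32) -> GPUState {
    switch power {
    case .battery:
        // Gate most cores and drop the clock to save energy.
        return GPUState(activeCores: totalCores / 4, clockMHz: 400)
    case .mains:
        // Everything on, at full clock.
        return GPUState(activeCores: totalCores, clockMHz: 1300)
    }
}

print(gpuState(for: .battery))  // GPUState(activeCores: 8, clockMHz: 400)
print(gpuState(for: .mains))    // GPUState(activeCores: 32, clockMHz: 1300)
```

The appeal is that it's one GPU, one driver, one set of rendering state -- none of the handoff problems described above.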
A 16-inch MacBook Pro successor would still need a dGPU alongside its own SoC; otherwise, it will most likely be a step back in performance. I can see them still using AMD for dGPUs, at least until they can scale up their own GPU to match.
Hard to know. There has been a secret massive project to design a GPU going on at Apple for years, and they licensed a very interesting technology about six months ago. I expect to be surprised. And I doubt there will be support for third-party GPUs in their MBPs.
Interesting tech? From Imagination, or some other tech company I am not aware of? Apple has been licensing their tech since 2014. Apple did attempt to break away from them but probably realized it is impossible to find a workaround without violating their patents.
> A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.
>
> Two GPUs require lots of complication. And there have been numerous problems in Apple's laptops caused by this over the years. Things like color rendition not matching when it switches, power-induced failures, etc.

I see -- because a GPU has so many cores, it can be made scalable. By contrast, a single CPU core can't be made scalable, which is why, for a CPU, you need both low-power and high-power cores.
I'm inferring this scalability is not available in current AMD dGPUs. Because if it were, then Apple wouldn't need to utilize a separate integrated GPU to save power in its current MBPs. Is that correct?
And if it isn't available in current AMD GPUs, then I further infer Apple is trying to offer something AMD can't (since, given the obvious benefits, if AMD could make their mobile GPUs power-scalable, they would). That would seem to indicate that producing GPUs that can scale from integrated-GPU power to mobile-dGPU power will represent a significant technological challenge (and, if successful, achievement) for Apple.
Apple can do something AMD cannot: Apple can build the discrete GPU right into its CPU package/module. (AMD can do that for its own processors if it wishes, but then it limits the market to people willing to buy that.)
But as you said before:

> A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery.
>
> Two GPUs require lots of complication. And there have been numerous problems in Apple's laptops caused by this over the years. Things like color rendition not matching when it switches, power-induced failures, etc.

Given this, why wouldn't laptop manufacturers who currently buy both AMD CPUs (which have integrated graphics) and dGPUs instead prefer a "far more elegant" solution that is also less prone to complications?
I am not sure I understand the question. Sure, an AMD CPU+GPU is more elegant than some sort of AMD CPU + some sort of integrated graphics + some sort of discrete graphics. Again, I may be misunderstanding your question.
Your prediction was that Apple would, for all its sealed boxes, be using a scalable dGPU built into its CPU package/module: "Pretty sure that, at least for sealed boxes (unlike the Mac Pro), there will be one and only one GPU, though they didn't say it." When I asked how, with a single GPU, they'd offer both low power on battery and high performance when plugged in, you explained they'd be using a scalable GPU: "A single GPU with scalable performance/power is far more elegant. Turn off cores and pipelines when on battery."
So this sounds like a great idea with obvious benefits and reduced downsides (eliminating the problems associated with switching). Which raises the obvious question: If it's so great, why doesn't AMD offer this? There are two possibilities:
1) AMD doesn't know how to make this (in a commercially successful way, i.e., in sufficient volume at an acceptable cost and with adequate reliability).
2) AMD does know how to make this, but they don't have sufficient customers for it.
You said the answer wasn't no. 1, suggesting it's no. 2. But that doesn't make sense to me, because if it is so great, and given they have laptop manufacturers that currently purchase both their CPUs (with integrated graphics) and dGPUs, why wouldn't those customers want a significantly improved solution?
So, unless I'm missing something, I think either AMD hasn't figured out how to make this, or this approach isn't all honey and roses. Because if they could do it, and it were all honey and roses, I don't know why their customers wouldn't flock to it.
I hope I've explained it better!
Thanks, that's clearer. The reason I say "2" is that its customers (who aren't you and me, but the computer companies) like to mix and match hardware. They want to upsell you to a discrete card, and then upsell you to a better discrete card, and maybe an NVIDIA card, or maybe not. Do Dell's customers want to be locked into a product that's available in only one configuration (because it's in the chip package)? Maybe, maybe not. But Dell would prefer to be able to sell a range of options.
Whereas Apple doesn't give a damn about that. They are happy to control the entire package. They'll sell you one with 32 graphics cores, or you can upgrade to 48, but it will be the same chip (perhaps with some cores turned off by blowing fuses), and they won't sweat it.
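A quick toy illustration of that "same chip, fused-off cores" binning idea -- the numbers and the GPUDie type are hypothetical, not anything Apple has announced:

```swift
// One physical die, several shipped configurations ("bins").
struct GPUDie {
    let physicalCores = 48   // every die is manufactured identical
    var fusedOffCores: Int   // set at the factory by blowing fuses

    var enabledCores: Int { physicalCores - fusedOffCores }
}

let baseModel    = GPUDie(fusedOffCores: 16)  // sold as a "32-core GPU"
let upgradeModel = GPUDie(fusedOffCores: 0)   // sold as a "48-core GPU"
print(baseModel.enabledCores, upgradeModel.enabledCores)  // 32 48
```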
OK, that makes sense. Thanks for the added explanation.

I could then ask why AMD couldn't also offer the same mix-and-match options with an integrated solution (level 1 CPU with level 1, 2, or 3 scalable graphics; level 2 CPU with level 1, 2, or 3 scalable graphics; level 3 CPU with level 1, 2, or 3 scalable graphics). To which your answer might be that PC manufacturers like to offer more possible combos than Apple does, making it cost-prohibitive (w.r.t. manufacturing efficiency and economies of scale) for AMD to cover a sufficiently wide range of combos -- unless they could, like you say, just blow some fuses to adjust the graphics level.

That, and computer manufacturers like second-sourcing and playing NVIDIA against AMD.
> And if it isn't available in current AMD GPUs, then I further infer Apple is trying to offer something AMD can't (since, given the obvious benefits, if AMD could make their mobile GPUs power-scalable, they would). That would seem to indicate that producing GPUs that can scale from integrated-GPU power to mobile-dGPU power will represent a significant technological challenge (and, if successful, achievement) for Apple.
> This is not a technological challenge but an economic question only. Typically you would not want to make an integrated GPU too large, because you would need to sell the same die to customers who either need no integrated GPU at all or need much less performance than it would offer.

Also true. Though I guess if the integrated GPU is not actually integrated (i.e., it's a "chiplet" or whatnot), it's less of an issue. But the chiplet package is actually not that advanced thermally, and couldn't handle the scaling up of power, so you'd end up with two different packages in any case.
> I know. The stress level as a teacher is better (not necessarily less, but much more manageable).

No, I know what you mean, and it's good you found something better suited to you!
Ironically, I was quite adept at finding even the weirdest bugs back when I was "just" a developer; I think that maybe suited me better as a person.
To keep this (slightly) OT: I'm looking forward to seeing what Apple can do with their own Apple Silicon; I think maybe we're in for a treat... I might even consider ditching my (newish) 2019 MBP if the new machines are as good as I expect them to be.
Intel has disappointed over and over again the last couple of years, and this anecdotal evidence of (very) poor quality control only amplifies that.
There is nothing wrong with Intel processors; if there were, we would see other manufacturers abandoning them. If Intel were the problem, then why not use AMD?
Or make your own x86?
There is not much to innovate, really. Intel still makes the most powerful chips for the most part; otherwise, why would anyone still get Intel over AMD?
> Let's be real here: the only reason Apple is switching is because their devices will run cooler with less power usage while giving performance similar to Intel's.

...And Apple will be subject to their own roadmap only.
> I want to see Apple's ARM silicon beat Intel's most powerful competitive x86 chip on Windows/Linux; only then will I be convinced.

Fair enough.
> Intel still makes the most powerful chips for the most part; otherwise, why would anyone still get Intel over AMD?

Because of corporate momentum. For example, it will be two years before we invest in the foundations of a completely new virtualization farm, and only then will we look at alternatives, since we're stuck with choices made long ago.