
That agreement expired in 2017, but even if it continued today it would be $233M from licensing out of $10+B in revenue...

It's not that Nvidia doesn't license anything; most companies do. It's just not their primary business, whereas it is Arm's primary business.

I suppose "diversification" could be an answer to my question, but I'm sure it's going to be an expensive buy right now-- I'd expect there to be some synergy to make it worth the price.
 
Why wouldn't Intel buy ARM?

There is probably a law against it. And if there isn't, why doesn't Nvidia buy AMD instead? It seems more beneficial to them: they would knock out a competitor in GPUs and gain CPU goodness.
 
How is a graphics card company worth more than Intel, which has powered practically every computer in the world since the personal computer became a thing?

Also surprising that no one wants a piece of ARM; there are so many potential buyers: TSMC, Google, Microsoft, AMD, Qualcomm, even Sony and Samsung. And why is Softbank selling just as Apple is finally adopting ARM? If anything, that means ARM should be on the rise.

I have a feeling they all know something we don't, and something tells me it has to do with the open-source RISC-V CPU, but I am only speculating.

Apple doesn't care about this. I used to work for ARM, and one thing people joke about there is how Apple just licenses the instruction set; everything about how that is implemented is custom.

For the rest of the industry, this sucks big-time. Nvidia is a very aggressive company. We were all wondering what Jensen had up his sleeve (he always has something, and it always turns out to be enough to save Nvidia for the next decade-plus).

I heard the instruction set is just 150 or so instructions. Why doesn't Apple make its own instruction set?
 
I don't think this will affect Apple very much, but this is certainly a good sign for Nintendo, which is the driving force of Tegra chip sales via the Switch. This type of investment will solidify Nvidia's further development of the line.

You're forgetting that Tegra chips have been used in Audi A8/S8, A4/S4, and A3 cars, as well as VW's Passat and GTI lineups. I'd presume those sales are a much bigger driving force globally.

This will heavily bolster their partnership with Tesla as well. I'm not sure whether Tesla designs the CPU while Nvidia builds it, whether Nvidia only helped with the design, or whether Tesla makes its AI chip entirely on its own.
 
Pretty good chance this will deepen, not lessen, the 'dust up' between Apple and Nvidia. Nvidia GPUs would be about as unlikely to show up on macOS as ARM's GPUs.

Apple has a license for the parts of the ARM IP they use now. If Nvidia gets heavy-handed in cranking up licensing costs and/or tries to act like it has deep leverage over Apple, Nvidia will just dig a deeper hole.

Yet another substantive income stream means Nvidia "needs" the small fragment of dGPU business left on macOS even less. That doesn't put Apple in a better leverage position either.

Apple is out to remove as many discrete GPUs as it can from most of the Mac lineup. Metal is top priority, and anyone who can't get on board with that is 'out' (as long as it is "CUDA first and Metal second", not going to get in). Metal will just be even more deeply entrenched on macOS once the Apple GPU is the largest-volume GPU on Macs (versus the Intel iGPU now).


Even if Nvidia runs Arm as a "hands off", wholly owned subsidiary that doesn't get looped into the Nvidia-Apple 'dust up', that won't help resolve the situation either.
Well, whatever the reason is for Apple and Nvidia's feud, it seems pretty petty from the eyes of the consumer. We just want good Apple products. And Nvidia makes better GPUs than AMD... So it feels pretty slimy as a consumer that Apple is holding back on us. Can you imagine having dual RTX 2080ti cards in the new Mac Pro? That would be the dream! Alas, Apple doesn't want us to have that much power, apparently.
 
(as long as it is "CUDA first and Metal second", not going to get in). Metal will just be even more deeply entrenched on macOS once the Apple GPU is the largest-volume GPU on Macs (versus the Intel iGPU now).
On the other hand, Apple's Metal-only strategy is quite strange. Apple doesn't sell any silicon to other companies and IMHO doesn't plan to do so.
But GPU-accelerated image processing and AI play a huge role in the computing industry today, and that role is growing. Whether you have a phone, a drone or a car, every single piece of hardware equipped with a camera needs GPU acceleration.
And no device except Apple devices uses the Metal framework or any Apple silicon GPU, so you cannot develop for them on Apple hardware: there is no CUDA on Apple's island, no Vulkan, and in the future not even OpenGL.

If you want to do these kinds of things as a developer, you have to use Linux or Windows.
 
On the other hand, Apple's Metal-only strategy is quite strange. Apple doesn't sell any silicon to other companies and IMHO doesn't plan to do so.
But GPU-accelerated image processing and AI play a huge role in the computing industry today, and that role is growing. Whether you have a phone, a drone or a car, every single piece of hardware equipped with a camera needs GPU acceleration.
And no device except Apple devices uses the Metal framework or any Apple silicon GPU, so you cannot develop for them on Apple hardware: there is no CUDA on Apple's island, no Vulkan, and in the future not even OpenGL.

If you want to do these kinds of things as a developer, you have to use Linux or Windows.

Is this why Apple released those Metal Windows drivers recently?
 
Is this why Apple released those Metal Windows drivers recently?
Just to clarify: they delivered tools, not drivers.
This only means that you can now use Windows to build Metal game assets.

Many game development studios have established game or graphics asset production pipelines that use the Microsoft Windows infrastructure. One of the key final steps in the asset creation process is compiling the graphics and compute shaders for inclusion in the game.
https://www.macrumors.com/2020/07/11/metal-developer-tools-windows/

The Metal Developer Tools for Windows enables Metal Shading Language (MSL) compilation on Windows into Metal Library Objects targeting Apple platforms.
https://developer.apple.com/metal/
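That "compiling shaders for inclusion in the game" step is a two-stage offline build. A minimal sketch, assuming the macOS command names (`metal`, `metallib`); the Windows tools expose equivalent executables, and the file names here are purely illustrative:

```shell
# Stage 1: compile Metal Shading Language source to AIR
# (Apple's intermediate representation).
xcrun -sdk macosx metal -c Shaders.metal -o Shaders.air

# Stage 2: package one or more AIR files into a Metal library
# object that ships with the game and is loaded at runtime.
xcrun -sdk macosx metallib Shaders.air -o Shaders.metallib
```

On Windows there is no `xcrun` wrapper; the Metal Developer Tools install the compiler executables directly, but the source-to-library flow is the same.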
 
TSMC works closely with ARM to optimise new processes; in return, ARM gets to tune its IP 'cookbook'.

Rather than tune an ARM Mali GPU on a new process, ARM could be tuning an ARM Cortex with CUDA extensions to directly call the Nvidia GPU IP on the same silicon.

Nvidia’s offerings could be on the market earlier than the current production cycle.

Correct me if I'm wrong, but I don't think ARM has a relationship with TSMC.

With IP blocks that incorporate ARM Cortex + Nvidia GPU + Mellanox network-on-chip, and handpicked Nvidia language extensions that reduce latency and increase CUDA performance, Nvidia could eke out performance per watt beyond current offerings.

If ARM IP includes Nvidia GPUs, Nvidia could well become the dominant GPU on third-party SoCs.
If ARM IP includes a Mellanox NoC, there could be some interesting ramifications for HPC.

AJ

Couldn't Nvidia choose to do this anyway, even if ARM were a separate entity?
 
I think people are somewhat off base on this news. NVIDIA will have little to no interest on the consumer end of this acquisition.

NVIDIA is already deep into using custom ARM SoCs in high-end enterprise and AI applications. They are the premier partner in the move toward self-driving cars, for example.

The acquisition is probably somewhere between allowing them to build far more tightly integrated SoCs and seeing the growing demand (and possibly future regulation) around reducing power draw of servers. ARM serves both these purposes.
 
I don't think you can really say "there's nothing wrong with the architecture except for all the cruft". Backwards compatibility is the x86 ISA.

If the implication is that Intel chose not to sell into the highest-growth market ever because they just don't like money, then the problems at Intel are worse than I thought, and AMD missed a huge opportunity.

View attachment 939354

Intel didn't want a piece of that, or couldn't get a piece of that?

The "highest growth" link above is an article from August 14, 2012, and what's the source of your graph above? Is it only focusing on smartphone sales in the USA? I'm sure Nokia alone sold more than 50K smartphones globally back in 2012.
 
I think people are somewhat off base on this news. NVIDIA will have little to no interest on the consumer end of this acquisition.

NVIDIA is already deep into using custom ARM SoCs in high-end enterprise and AI applications. They are the premier partner in the move toward self-driving cars, for example.


If any one of the major Arm implementors buys Arm, that is a potential outcome. Even more so if that implementor is only interested in a relatively narrow subset of what Arm's IP covers: there is a very good chance that the requirements of that subset will take precedence, in terms of investment and resources, over the rest of the coverage.


Apple buying Arm would be even worse than Nvidia. But both are 'bad' buyers in terms of the long term prospects of a broader Arm ecosystem.

A better solution would be if a diverse set of 5-8 implementors collectively bought Arm. Run Arm as a privately held company that is independent from the group, with no "under the table discount" deals to the owners to kill off the smaller players that didn't buy in. Run Arm well for 3-8 years, then IPO a bigger company and get their money back. Apple, Amazon, Microsoft, Qualcomm, Samsung, and Google would be a decent balance of interests. The Saudis (or some other big money player siloed in a non-growth industry) would be some others. And/or some of the big private equity folks: KKR, etc. Any balanced group that is not myopically focused on a narrow, short-term outcome would be best.

The root cause "problem" here is Softbank; not Arm.

If Apple and six or eight other players all threw $4-6B into the pot, Softbank could exit and get back what they probably overpaid. The only reason Softbank is selling Arm is that they are eyeball-deep in trouble and need to "make money fast".

A diverse mix of buyers would largely get rid of the government regulators who might try to squash the deal. Nvidia is highly likely to kill off Arm's GPU implementation, and that is going to draw a dust-up. Every one of the big implementors has a similar drawback.


The acquisition is probably somewhere between allowing them to build far more tightly integrated SoCs and seeing the growing demand (and possibly future regulation) around reducing power draw of servers. ARM serves both these purposes.

If nobody else steps up, Nvidia could make this work. They shouldn't be going for extremely tight integration that no one else can get to (either short term or long term); buying Arm and operating it as a detached subsidiary would be better. Buying it as an "embrace, extend, extinguish" tool to dig a deeper moat around Nvidia's own products and software stacks would be bad for the rest of the Arm ecosystem.
 
But why own it? Everyone else is fine licensing from them. Arm is an IP company. I could imagine it if someone saw an advantage in bundling IP, but Nvidia is not an IP company. What's the advantage to Nvidia in owning Arm?

Because they want to add their own stuff to the Arm mix to sell more chips to mobile manufacturers.
 
On the other hand, Apple's Metal-only strategy is quite strange. Apple doesn't sell any silicon to other companies and IMHO doesn't plan to do so.

Apple wants to control all the parts of its own destiny. They also want to use Metal as a way to differentiate their smartphone and tablet hardware from competitors.
 
....
Also surprising that no one wants a piece of ARM; there are so many potential buyers: TSMC, Google, Microsoft, AMD, Qualcomm, even Sony and Samsung. And why is Softbank selling just as Apple is finally adopting ARM? If anything, that means ARM should be on the rise.

Last part of that addressed first: Softbank is selling not because Arm isn't a long-term profitable business. They are selling because Softbank has made tens of billions of dollars of bad, underperforming investments. Giant holes in the ground like WeWork (which was doomed before the pandemic; how it is even still around now, with the huge shift to remote work, is a bigger head-scratcher). Softbank needs to sell something that is valuable to raise cash.

The "bash big tech" movement of late is likely keeping some players on the sidelines. Single-handedly buying up Arm is going to draw heavy regulatory fire onto any one of the super-deep-pocketed players, and grumbles from the other players. Collectively buying would probably draw "collusion" complaints too.

There is a mix of the above two as well, in that nobody probably wants to get into a bidding war over Arm. A bidding war is actually what Softbank wants (and their investment bankers, who probably get a percentage of the deal in payment): for someone to grossly overpay for Arm. The overpayment isn't going into Arm's resources and future prospects; it is going purely into Softbank's pockets, which have a hole in them. Anyone who overpays will also have to turn around and put long-term investment resources into Arm for any hope of getting a return on the amount overpaid. The fact that you need to put money into Arm to get future returns will probably keep away the "vulture capital" private equity folks. The price Softbank wants is high enough that it would be tough to buy Arm as a cash cow and suck it dry before killing the company.

If everyone stays on the sidelines and Softbank is desperate enough, Nvidia could get Arm for a price where it wouldn't cost Nvidia too much to kill off the other parts of the Arm ecosystem over an extended period. That 'loss' would be the price of digging a deeper moat around the rest of Nvidia's offerings, so they could make it up over time as long as they keep enough other customers inside the moat.


I have a feeling they all know something we don't, and something tells me it has to do with the open-source RISC-V CPU, but I am only speculating.

Probably only indirectly. The large Arm implementors probably aren't too pressed about someone buying Arm and doing something "dumb" to the broad ecosystem, because they are all confident there are other options to jump to later. Regulators will probably be all over this like stink on poo-poo if the buyer moves too fast to kill off the positions of competitors (those other players on the bidding sideline). So if the buyer kills the ecosystem off slowly, they can make the leap to something else over time.

However, if the buyer looks to keep a broad Arm ecosystem, then it isn't about RISC-V at all. Arm has a much broader ecosystem than any of its rivals. If the new buyer comes in and invests in keeping Arm broad (and therefore in getting a return on what they paid for Arm), then there is nothing huge the large implementors need to do other than keep sending in their Arm royalty checks. Lots of implementors sending regular checks keeps Arm healthy even if Arm's owner's address changes: largely the same developers at Arm getting paid the same money as before. Nothing to get overly excited about.

I heard the instruction set is just 150 or so instructions. Why doesn't Apple make its own instruction set?

A proprietary instruction set means that Apple would have to pay for 100% of the compiler and developer tool development, plus a large amount of developer education about their proprietary instruction set, and carry those education costs over the long term.

As long as the vast majority of the instruction set stays the same, all the work on better optimizations for that shared subset, paid for by someone else, is something Apple gets to leverage as well.

It is also the case that if Apple misses something important, they are "out of the loop" on it and will face a recovery period until they finish playing catch-up. Having a broad set of users helps keep groupthink and myopic tendencies from biting them in the butt; having a different instruction set just to be different doesn't really buy much. [ If Apple jumped off of Arm because the new owner was off in the 'weeds', they'd probably jump to another shared ISA: MIPS, RISC-V, something with a group behind it so they could spread out the shared R&D. ]


The instruction set isn't the major differentiator; the implementations are. Even if the new Arm owner goes off into the 'weeds', Apple could just deep-fork the Arm instruction set where it is now. Imagination Tech hasn't gone under, so other Arm implementors could get around taking Arm's GPU (e.g., if Nvidia kills off the current one and tries to shove the Nvidia one down folks' throats at pricing/costs that aren't as competitive).
 
Apple wants to control all the parts of its own destiny. They also want to use Metal as a way to differentiate their hardware from competitors.

Metal runs on Intel and AMD GPUs. It is not a single-hardware-implementation differentiator.

Metal is more so "glue" between iOS, macOS, and the other Apple OS variants. It is also more power-efficient (which is why it is driven more from the iOS side).

It will likely start to drift more toward the Apple GPU when Apple flips over to putting an Apple GPU into every Mac, but for the last several years macOS had an influence of making Metal broader rather than narrower.

Metal was also a way of avoiding the fragmentation that had plagued the standardization process for a while: folks trying to "embrace, extend, extinguish" or just outright block OpenCL; "OpenGL next" drifting until AMD contributed Mantle to be the core of the standard that became Vulkan. If Apple had sat around waiting for those folks to get their act together, they may have missed out on the moves they made. That just isn't control for control's sake. That is, in part, control to fill a vacuum.

The problem with Metal is more Apple's spin that it actually covers what OpenCL and OpenGL did. It doesn't substantively cover those. On the OpenGL side it pushes too much complexity into apps that don't necessarily need it; every app isn't some high-end 3D engine, and some apps just need decent graphics performance (and sure, there could be multiple portable frameworks to port from OpenGL, but that is work for lots of current apps). Shading languages really aren't the totality of the GPGPU computational space. The "Metal does it all" spin is in part "moat digging" by Apple, because it pushes some folks into pounding round pegs into square holes.
 
The problem with Metal is more Apple's spin that it actually covers what OpenCL and OpenGL did. It doesn't substantively cover those. On the OpenGL side it pushes too much complexity into apps that don't necessarily need it.

I can see that being annoying now for Mac developers in the Intel/AMD era who are still leveraging legacy OpenCL/OpenGL code in their apps, but once everything has been re-created as an Apple Silicon and Metal-native app, will it really matter?
 
Apple still needs a compiler, so unless they're willing to contribute work on a custom instruction set to Clang I don't see them switching.

Apple started Clang. They can afford to contribute another ISA to Clang.

A proprietary instruction set means that Apple would have to pay for 100% of all the compiler and developer tool development.

First: so what? A compiler team isn't that large.

And second, it doesn't have to mean that at all. They could (and likely would) publish enough implementation details that anyone can write a compiler.

But, really, what they'll do is a mixed approach. We know from this slide:

Jane Manchun Wong on Twitter: Apple is patching these Open Source projects  to support Apple Silicon Macs #WWDC20…


They'll jumpstart porting to their ISA by doing the work for some popular projects themselves, thus leaving everyone with enough real-world sample code to make it feasible.
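The patches involved in that kind of porting are often tiny: typically just teaching a build script or configure check to recognize the new architecture string. A hypothetical sketch of the sort of change that shows up (`arm64` is what `uname -m` reports on Apple Silicon; the specifics vary per project):

```shell
# Map the machine string to a normalized architecture name, the way
# many build scripts get patched when a new target ISA appears.
case "$(uname -m)" in
  arm64|aarch64) ARCH="arm64"  ;;  # Apple Silicon and other AArch64
  x86_64|amd64)  ARCH="x86_64" ;;  # Intel Macs and most PCs
  *)             ARCH="unknown" ;; # anything this script wasn't taught
esac
echo "configuring for $ARCH"
```

The point isn't the complexity of any single patch; it's that thousands of projects each need this kind of small change, and seeding the popular ones gives everyone else a template.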
 
Well, whatever the reason is for Apple and Nvidia's feud, it seems pretty petty from the eyes of the consumer. We just want good Apple products. And Nvidia makes better GPUs than AMD... So it feels pretty slimy as a consumer that Apple is holding back on us.

Apple is holding back? Nvidia is holding back too. Apple not signing a driver likely means there is a problem with the driver that needs to be corrected to Apple's specs before they will sign it. Nvidia pointing the finger at Apple and cajoling their "fanboy" army into hurling all the blame at Apple just digs a deeper hole. That isn't helping to solve the problem at all.

As long as a large faction keeps blaming only Apple for this, it is extremely unlikely to ever get solved. Most likely both sides need to make some adjustments. This impasse isn't strategic for either side, so neither party is pressed to solve it. Many of the external parties jumping on the blame-game bandwagon won't move it either; they aren't doing anything substantively different from what the first parties are doing, and if those are going "nowhere", so are these other groups. Same path to "nowhere".

Consumer-wise, Metal covers a much larger group of folks who buy Apple products than CUDA does, by several orders of magnitude. When Nvidia pisses on Metal, they are pissing on consumers. So it isn't really a consumer thing for Nvidia; otherwise Nvidia would have done what they needed to do to get their driver components signed.

Fact is that Imagination Tech, AMD, and Intel all had no problems getting their graphics driver components signed over a long period of time. (Imagination Tech is out, swapped for Apple-implemented hardware, but that still leaves two other vendors getting it done.) Nvidia doesn't get it done, but somehow that one out of three means it is all Apple's fault. If Apple were the only GPU implementor getting its drivers signed, Apple might be a more likely sole root cause. They aren't.

Can you imagine having dual RTX 2080ti cards in the new Mac Pro? That would be the dream! Alas, Apple doesn't want us to have that much power, apparently.

Benchmarks out of context don't mean much to Nvidia either. Otherwise they would have uncorked the problem on their end.
 
On the other hand, Apple's Metal-only strategy is quite strange. Apple doesn't sell any silicon to other companies and IMHO doesn't plan to do so.


When the Mac first launched, Steve Jobs had them leave the control keys off the keyboard: he didn't want a large number of command-line programs ported over to the Mac with a minimal (or no) GUI slapped on top.

For Apple this is not too strange. There were saner heads around Apple, and Jobs wasn't yet so deeply entrenched in his visionary superpowers around the time of the Mac launch, so that keyboard decision was undone pretty quickly. At this point, though, the 'massive herd' of applications that would come in is iOS-influenced stuff. That being stuff Apple already makes a percentage on is probably clouding Apple's vision (with dollar signs). If it means nuking apps that Apple doesn't make money on in favor of apps that have a higher return for Apple, this is not so strange if you're willing to swap customer X for customer Y.

Finally, you also have to look at how Apple approached OpenGL (and to a large extent OpenCL) on the Mac. Apple wrote the "top" half of OpenGL that directly faced the applications, and the GPU vendors wrote a common, 'simpler' target bottom half underneath that. This gave macOS a more uniform OpenGL implementation. (A double-edged sword, because to an extent it was also a 'lowest common denominator' OpenGL foundation; if you were looking for the latest, bleeding-edge OpenGL implementation, it was not going to be on macOS.) iOS took an even smaller subset of OpenGL (OpenGL ES) with a similar chop on a smaller range of GPU vendors (just one: PowerVR). The common thread, though, is that the GPU hardware implementor's job is to do a subset of the graphics driver stack, not the whole thing.

Metal just moves to an even smaller 'kitchen' with multiple chefs: Apple is still doing the top, app-facing part, but the whole stack is shorter.


But GPU-accelerated image processing and AI play a huge role in the computing industry today, and that role is growing. Whether you have a phone, a drone or a car, every single piece of hardware equipped with a camera needs GPU acceleration.

If the graphics stack were 90+% image processing, that might be the point. Over the full app ecosystem, it isn't.

The root-cause problem here is more in the standards and how they evolved and got buy-in. OpenCL had a fragmented adoption and commitment rate. Google pointed at something else for a long while. Microsoft never fully supported OpenGL or OpenCL. Nvidia was heavily pushing CUDA to take advantage of the GPGPU inflection point. AMD was stumbling in different directions at once. Etc.

[image: chart of OpenCL version support across GPU vendors]

The gap between Nvidia's OpenCL 1.1 and 1.2 support is relatively huge, and they never got anywhere near 2.0. That is extremely indicative of an "embrace, extend, extinguish" approach: do an initial version of an open standard to get the 'heat' off, then muck around with it, then do something to kill it off ("use CUDA, it makes steady progress; that OpenCL implementation is snail-slow to evolve").

Apple had a hand in the fragmentation too. They also didn't put their best effort into evolving OpenCL (it was hardly a "world class" implementation over most of its actively supported lifetime). Apple could have afforded to work on both OpenCL and Metal.

OpenGL has had similar issues in the past (OpenGL 2.x hit a point of low consensus). Windows/Microsoft disinterest had a somewhat double-edged outcome, in that the stack was left entirely up to third-party GPU drivers.

"OpenGL Next" (what was supposed to be the "clean up and move on") floundered for a long while before AMD contributed Mantle as something concrete for the committee to debate.

In short, the "open" graphics standards process has been a bit too much like herding cats. Apple is one of the wandering cats, but some of the other major players wandering just as much reinforces the fragmentation effects. If everyone else were complying and Apple were more of an outlier, that would help bring Apple into more open-standards compliance. When Microsoft and Nvidia are both hard-charging to slap proprietary solutions onto technological inflection points, Apple has enough confidence (a huge user base) and resources these days to drift in that direction also.

And no device except Apple devices uses the Metal framework or any Apple silicon GPU, so you cannot develop for them on Apple hardware: there is no CUDA on Apple's island, no Vulkan, and in the future not even OpenGL.

As long as Nvidia plays the "Metal has to lose for CUDA to win" game, it isn't going to be on Apple's continent. (It really isn't some small dinky island, with 1+ billion users.)


If you want to do these kinds of things as a developer, you have to use Linux or Windows.

CUDA is yet another proprietary moat, so claiming you're not on an island there is also a bit of hand-waving. There are no phones that CUDA is going to get you on. TV streaming boxes? Nope. The vast majority of currently operating cars? Nope.

Apple not doing more to create a way to optionally "plug in" Vulkan and OpenCL 1.2 (the baseline of 3.0) is probably a long-term mistake. If Apple is in the process of kicking the GPU drivers out of the kernel (into the in-between, protected mode with MMU I/O protections for targeted shared space), then perhaps that window will open back up after they finish. If this is a permanent, long-term thing, then Apple is probably making an overly greedy short-term decision (far more like that keyboard with the missing control keys: not everybody needs them, but some folks do). Vulkan and OpenCL will probably build more traction going forward; it's a tough slog, though.
 