[quote]
On the other hand, Apple's Metal-only strategy is quite strange. Apple doesn't sell any silicon to other companies and IMHO doesn't plan to do so.
[/quote]
When the Mac first launched, Steve Jobs had the team leave the control keys off the keyboard. He didn't want a large number of command-line programs ported over to the Mac with a minimal GUI (or no GUI) slapped on top.
For Apple, this is not too strange. There were saner heads around Apple at the time of the Mac launch, and Jobs wasn't yet so deeply entrenched in his visionary superpowers, so that keyboard decision was undone pretty quickly. At this point, though, the 'massive herd' of applications that would come in is iOS-influenced stuff. That it is stuff Apple already takes a percentage on is probably clouding Apple's vision (with dollar signs). Nuking apps that Apple doesn't make money on in favor of apps with a higher return for Apple isn't so strange if you're willing to swap customer X for customer Y.
Finally, you also have to look at how Apple approached OpenGL (and to a large extent OpenCL) on the Mac. Apple wrote the "top" half of OpenGL that directly faced applications, and the GPU vendors each wrote to a common, 'simpler' bottom-half target underneath it. This gave macOS a more uniform OpenGL implementation. (A double-edged sword, because it was also, to an extent, a 'lowest common denominator' OpenGL foundation. If you were looking for the latest, bleeding-edge OpenGL implementation, it was not going to be on macOS.) iOS took an even smaller subset of OpenGL (OpenGL ES) and applied a similar split to a smaller range of GPU vendors (just one: PowerVR). The common thread, though, is that the GPU hardware implementor's job is to do a subset of the graphics driver stack, not the whole thing.
Metal just moves to an even smaller 'kitchen', still with multiple chefs. Apple is still doing the top, app-facing part, but the whole stack is shorter.
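To make the division of labor concrete, here's a rough sketch of that split. All names are invented for illustration; Apple's actual driver interface is private, so treat this as the shape of the arrangement, not a real API:

[code]
import Foundation

// Hypothetical sketch of the macOS OpenGL-era split (all names invented).
// Apple writes the app-facing "top half" once; each GPU vendor implements
// only the smaller, common "bottom half" target underneath it.

protocol GPUVendorBottomHalf {
    // Vendors translate a common intermediate form into their own ISA...
    func compile(_ ir: Data) throws -> Data
    // ...and hand finished command streams to the hardware.
    func submit(_ commands: Data)
}

// Apple's side: one uniform implementation of the API surface, state
// tracking, and validation, shared across every vendor underneath.
struct AppleTopHalf {
    let bottom: GPUVendorBottomHalf

    func draw(shaderIR: Data, commands: Data) throws {
        let machineCode = try bottom.compile(shaderIR)
        bottom.submit(machineCode + commands)
    }
}
[/code]

With Metal the same two roles remain; the 'bottom half' is just thinner and closer to the hardware.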
[quote]
But GPU-accelerated image processing and AI play a huge role in the computing industry today, and that role is growing. No matter if you have a phone, a drone, or a car, every single piece of hardware that is equipped with a camera needs GPU acceleration.
[/quote]
If image processing were 90+% of what the graphics stack did, that might be the point. Across the full app ecosystem, it isn't.
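For a sense of what that image-processing slice looks like in Apple's world: a GPU compute dispatch in Metal is only a couple dozen lines. A minimal sketch, assuming the app's default library contains a compute kernel named "grayscale" (that kernel name and the buffer layout are made up for this example; force-unwraps used for brevity):

[code]
import Metal

// Minimal Metal compute dispatch, the shape of camera-style image work.
// Assumes a kernel function "grayscale" exists in the default .metallib.
let device   = MTLCreateSystemDefaultDevice()!
let queue    = device.makeCommandQueue()!
let library  = device.makeDefaultLibrary()!
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "grayscale")!)

// One RGBA float pixel buffer for a 1080p frame, shared with the GPU.
var pixels = [Float](repeating: 0.5, count: 1920 * 1080 * 4)
let buffer = device.makeBuffer(bytes: &pixels,
                               length: pixels.count * MemoryLayout<Float>.stride,
                               options: [])!

let commands = queue.makeCommandBuffer()!
let encoder  = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
// One GPU thread per pixel; 64-wide threadgroups are a safe default.
encoder.dispatchThreads(MTLSize(width: 1920 * 1080, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
[/code]

That covers the camera/AI case; the rest of the app ecosystem leans on very different parts of the stack.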
The root-cause problem here is more in the standards themselves and how they evolved and got buy-in. OpenCL had a fragmented adoption and commitment rate. Google pointed at something else (RenderScript) for a long while. Microsoft never fully supported OpenGL or OpenCL. Nvidia was heavily pushing CUDA to take advantage of the GPGPU inflection point. AMD was stumbling in several different directions at once. Etc.
www.anandtech.com
The gap between Nvidia's OpenCL 1.1 and 1.2 support is relatively huge, and they never got anywhere near 2.0. That is extremely indicative of an "embrace, extend, extinguish" approach. Do an initial version of an open standard to get the 'heat' off. Then muck around with it. Then do something to kill it off ("use CUDA, it makes steady progress; that OpenCL implementation is snail-slow evolution").
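That stall was directly visible to developers. A sketch of the usual feature-detection dance, written here against Apple's (now deprecated) OpenCL framework on macOS; the version string it prints is exactly where a stack reveals whether it is a 1.1, 1.2, or 2.x implementation:

[code]
import OpenCL  // deprecated on macOS since 10.14; used here for illustration

// Ask the platform which OpenCL version it actually implements. On Nvidia
// stacks this reported 1.1 for years, eventually 1.2, and never near 2.0.
var platform: cl_platform_id?
guard clGetPlatformIDs(1, &platform, nil) == CL_SUCCESS else {
    fatalError("no OpenCL platform available")
}
var version = [CChar](repeating: 0, count: 256)
clGetPlatformInfo(platform, cl_platform_info(CL_PLATFORM_VERSION),
                  version.count, &version, nil)
print(String(cString: version))  // e.g. "OpenCL 1.2 (...)"
[/code]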
Apple had a hand in the fragmentation too. They also didn't put their best effort into evolving OpenCL (it was hardly a "world class" implementation over most of its actively supported lifetime). Apple could have afforded to beat on both (OpenCL and Metal).
OpenGL has had similar issues in the past (OpenGL 2.x? hit a point of low consensus). Microsoft's/Windows' disinterest had a bit of a double-edged outcome there, in that the stack was left entirely up to 3rd-party GPU drivers.
"OpenGL Next" (what was suppose to be the "clean up and move on") floundered for a long while before AMD contributed Mantle to be concrete for the committee to debate about.
In short, the "open" graphics standards have been a bit too much like herding cats. Apple is one of the wandering cats, but some of the other major players wandering just as much reinforces the fragmentation effects. If everyone else were complying and Apple were more of an outlier, that would help bring Apple into more open-standards compliance. When Microsoft and Nvidia are both hard-charging to slap proprietary solutions on technological inflection points, Apple has enough confidence (a huge user base) and resources these days to drift in that direction also.
[quote]
And no device except Apple devices uses the Metal framework or any Apple silicon GPU, therefore you cannot develop on Apple hardware since there is no CUDA on Apple's island, no Vulkan and in the future not even OpenGL.
[/quote]
As long as Nvidia plays the "Metal has to lose for CUDA to win" game, it isn't going to be on the Apple continent. (It really isn't a small dinky island with 1+ billion users.)
[quote]
If you want to do these kinds of things as a developer, you have to use Linux or Windows.
[/quote]
CUDA is yet another proprietary moat. So saying you're not on an island there also involves a bit of hand-waving. There are no phones that CUDA is going to get you onto. TV streaming boxes? Nope. The vast majority of currently operating cars? Nope.
Apple not doing more to create a way to optionally "plug in" Vulkan and OpenCL 1.2 (the baseline of 3.0) is probably a long-term mistake. If Apple is in the process of kicking the GPU drivers out of the kernel (into the in-between, protected mode with MMU I/O protections for a targeted shared space), then perhaps that window will open back up after they finish. If this turns out to be a permanent, long-term thing, then Apple is probably making an overly greedy short-term decision. (Far more like that keyboard with the missing control keys: not everybody needs them, but some folks do.) Vulkan and OpenCL will probably build more traction going forward; it is a tough slog though.