I'm watching the Blender contribution most closely -- at this point last year it looked like their Cycles-X renderer rewrite was going to be effectively locked into NVidia if you wanted hardware acceleration, because only NVidia was providing any engineering support. They were already looking for an alternative to OpenGL because too many OpenGL drivers had been left to drift, and Blender was tired of building workarounds for known driver bugs that were reported but never fixed. Blender 3.0 has just come out and is Apple Silicon native, but without Metal hardware acceleration for PBR. Thanks to Apple's support, 3.1 will likely add it, and the first pull requests have come in.

I think Apple may have decided that it was worth an FTE to keep Blender from becoming NVidia-only for GPU acceleration.

One other OSS project I follow pretty closely is the game engine Godot, which over the past couple of years has rewritten its rendering engine to use Vulkan, partly because of Apple's deprecation of OpenGL but also because the rendering engine was due for a comprehensive rewrite. Godot 4 is in pre-alpha, and I honestly don't expect an alpha before next June-ish, but I'm looking forward to it. I wouldn't mind if Apple loaned them some engineering when it comes time to optimize the Godot-Vulkan-MoltenVK pipeline.
 
Wow open source! Yet I can't run my own software on my own devices without Apple approval and payment.
 
Apple has brought Tensorflow to the M1, using Metal. I'm not sure you realize how closely Metal maps to the GPU hardware architecture -- and how much the GPU hardware *is* Metal. Porting Vulkan or any other compute layer on top would introduce a serious performance hit.

If Apple continues to make hardware that's well beyond what others are making in a performance/watt scenario, more things will come to it.

The Metal Shading Language is based on C++14 and it's pretty straightforward. Things had to be ported to OpenCL, and OpenCL code got ported to Vulkan. Change happens all the time.
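To make that concrete, here's a minimal sketch of an MSL compute kernel -- a hypothetical SAXPY, with illustrative kernel name and buffer bindings rather than anything from a real project -- showing that apart from the address-space qualifiers and binding attributes, the body reads like ordinary C++:

```metal
#include <metal_stdlib>
using namespace metal;

// Hypothetical example: y = a*x + y over a 1-D grid.
// Kernel name and buffer indices are illustrative only.
kernel void saxpy(device const float* x [[buffer(0)]],
                  device float*       y [[buffer(1)]],
                  constant float&     a [[buffer(2)]],
                  uint id [[thread_position_in_grid]])
{
    y[id] = a * x[id] + y[id];
}
```

On the host side you'd compile this into a Metal library and dispatch it with a compute command encoder; the point is just how little distance there is between this and plain C++.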

People tend to focus on their own niche, and I'm as guilty as the rest, but I'm not seeing the relevance of bringing up a machine learning library when bemoaning the future lack of OpenGL on a series of low-end (i.e., not a supercomputer) workstation-type machines. There's a lot more to "scientific computing" than ML. Lots of pre- and post-processing tools were written against OpenGL. You know what the "G" stands for, right?
 
And until it is, you’re going to be stuck hacking old Fortran code from a grad student who left the lab in 1989.

I learned Fortran in high school, around 1977. It’s fair to say that anybody still finding it mission critical — for anything other than maybe communicating with a spacecraft launched in 1978 — has boxed himself into a pretty serious corner.

Now, I understand how it happens, how grant structures don’t generally offer any opportunity to update software, and how grant officers are likely to shrug with “they already have software to ‘open their mail,’ so no funding to modernize it” and such.

It’s been my pleasure to work with a lot of researchers in a national lab. They know their stuff, but they don’t know software engineering, and they don’t know the price they’re paying -- or, rather, that their long-suffering grad students are paying -- for carrying effectively unmaintainable legacy code.
Fortran is one of the most widely used languages in scientific computing (computational fluid dynamics, computational astrophysics -- a number of fields, although C++ is slowly gaining) even today; the current standard is Fortran 2018. Let me guess: CS background? CS and CE are important, but the focus of most research using computers isn't development itself, it's using development to get results.

"It’s fair to say that anybody still finding it mission critical — for anything other than maybe communicating with a spacecraft launched in 1978 — has boxed himself into a pretty serious corner."

Flown on an aircraft lately? I'd bet that quite a few Fortran codes were involved in its design and certification, from CFD to FEA.

Regarding OGL: what I'd prefer to see is a statement other than "deprecated and can go away at any time." I'm fine with the currently supported version, as I'm not writing or compiling game code. However, while OGL is still supported in Big Sur, for how long, and will a Mesa-based replacement be available? These aren't concerns on my work machines, as none of them are or will be Macs (Linux) -- just home research...
 