Simply put, porting an older codebase designed for 32-bit Windows to be 64-bit clean on macOS/Linux is rarely as easy as you think. The old WWDC video of Steve Jobs saying it's as easy as ticking a checkbox in Xcode is pure fantasy.
True, but for all practical purposes sizeof(int)==4 across all Mac platforms. Not that it’s relevant to the topic really. That’s also why I said that I consider any programmer that would make implicit assumptions about ABI incompetent.
I really want to reiterate the following from MasConejos (slight corrections in bold):
It's not as simple as selecting the 64-bit target and recompiling. (I'm simplifying here...) The code will have many memory-mapped data structures where a POINTER/LONG/ULONG is expected to be 32-bits. When POINTER/LONG/ULONG are suddenly 64-bits, all of the addresses and offsets are no longer correct as they were set with the expectation that a POINTER/LONG/ULONG was 4 bytes. </snip>
While 'int' does not change in size on POSIX or Windows when moving from 32-bit to 64-bit, all pointer types change in size (e.g. sizeof(void*) == 4 to sizeof(void*) == 8). Moreover, on Windows the 'long' & 'unsigned long' types are 4 bytes on both 32-bit & 64-bit, but on POSIX systems (like macOS) 'long' & 'unsigned long' are 8 bytes in 64-bit configurations. The differences in long/ulong are especially irritating unless you find a way to mass-correct them, because they tend to be spread throughout the code.
There are more subtleties to the difference between the 32-bit and 64-bit ABIs & APIs than you might imagine, especially across different platforms.
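To make that concrete, here's a minimal sketch (the struct names are made up for illustration) of the kind of code that bites: on 64-bit Windows (LLP64) 'long' stays 4 bytes, while on 64-bit macOS/Linux (LP64) it grows to 8, so any memory-mapped or serialised layout built on long/ulong silently changes size:

[CODE]
#include <cstdint>
#include <cstdio>

// Layout as the old 32-bit Windows code imagined it: 8 bytes total,
// because 'unsigned long' was assumed to always be 4 bytes.
struct LegacyHeader {
    unsigned long magic;   // 4 bytes on Win32/Win64 (LLP64), 8 bytes on 64-bit POSIX (LP64)
    unsigned long offset;  // ditto - offsets computed against this layout go wrong
};

// Fixed-width replacement that keeps the same layout on every platform.
struct PortableHeader {
    std::uint32_t magic;
    std::uint32_t offset;
};

int main() {
    // 64-bit macOS/Linux prints 16 for LegacyHeader; 32- and 64-bit Windows print 8.
    std::printf("sizeof(long)           = %zu\n", sizeof(long));
    std::printf("sizeof(void*)          = %zu\n", sizeof(void*));
    std::printf("sizeof(LegacyHeader)   = %zu\n", sizeof(LegacyHeader));
    std::printf("sizeof(PortableHeader) = %zu\n", sizeof(PortableHeader));
}
[/CODE]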
The fact that any games ever since have been developed for Intel Macs in 32-bit is basically incomprehensible to me. Especially since most of the Windows originals are available as 64-bit versions. Game developers are to blame entirely. All the productivity software I own made that transition so long ago that I'd forgotten about it.
Windows games didn't consistently make the switch to 64-bit until the PS4/Xbox One era, where the console version was already 64-bit. Any title that had to support Windows XP was limited to 32-bit because almost no one used the 64-bit version of XP - OEMs would always install the 32-bit version for compatibility reasons. This didn't really get resolved until Windows Vista & Windows 7.
All it takes is a dependency on some old version of Havok, PhysX, Unreal, whatever. Then you first need to move to a newer library, which in turn will introduce lots of breakage.
If you have the source code for the library then you can usually cope, with the above caveats that you'll have work to do. If you don't have the source code then it becomes much harder and may be impossible. Upgrading to a newer engine version of UE3 or Unity for example will typically be *much* harder than upgrading a physics or video library simply because more can change.
OpenGL was up to date on the Mac until several years ago, but it was and is severely lacking and in dire need of a replacement.
I disagree with this and it also misses a broader and more important point.
Apple didn't really keep pace with the OpenGL specification in the lead-up to the PPC->Intel transition, and from that point on they were routinely a year or so behind in advertising support for new specifications - at least in part due to Apple's timing of major OS releases. Sometimes it would take another major release before those features actually worked well enough to ship games.
Worse, the OpenGL standard itself never caught up with Direct3D after D3D 8, which introduced the first shaders & became the industry standard for 3D graphics in games. The original ARB board collapsed in the early '00s and the Khronos board that formed to take over couldn't make the comprehensive overhaul needed to compete with Direct3D - which is why we ended up with Metal & Vulkan.
You see, this is something we disagree on. In my opinion, an experienced graphics programmer is proficient across all the modern APIs, because they are based on the same principles. And Metal is so simple that an experienced OpenGL programmer will need just a couple of hours to pick up the API. Vulkan and DX12 are more difficult to get into since the API is less friendly, but still, the core concepts are the same.
If wishing only made it so. As someone who works in games development I've rarely seen any of this to be true.
Metal, Vulkan & DX12 do all share some similar principles but their differences are non-trivial and take quite some time to really, truly understand. The implications aren't always clear from the documentation - you have to use the APIs in anger to be able to debug & profile them and see how the differences affect the underlying hardware.
Plus most games developers learn one API, which due to market dominance is almost always D3D, and then whine/whinge/moan like crazy about the differences if they are ever asked to even *consider* the design or capabilities of one of the others. Sometimes I wonder if this is actually one of the main reasons why original developers are loath to support anything beyond the big three platforms (Windows, Xbox, PlayStation), which either use D3D or support something sufficiently low-level that you can implement a near-identical approximation.
Sure, tessellation/geometry shaders use a different model. Still, don’t see much of an issue here for a well-designed engine.
The absence of geometry shaders is a pain in Metal, but not insurmountable provided the use cases are stupidly simple - which they are in UE4. The differences in tessellation however *are* insurmountable in some circumstances - no amount of engineering can overcome the fundamental limitations of Metal's approach vs. the industry standard D3D/VK/GL implementation. There is simply no way to seamlessly support Metal tessellation with DrawIndirect which isn't a problem with UE4 but will be for most other modern titles.
Shaders are a bigger issue but then again, current cross-compilers make it manageable.
The differences in shader language design and capability are by far the biggest problem. I've spent so much of my time at Feral & Epic working on HLSL->GLSL/MetalSL translation. The imperfections and performance penalties, both subtle and gross, have been an unending struggle, and this is the cause of the vast majority of the performance delta between Windows and macOS.
UE4's hlslcc library and the Khronos/LunarG/MoltenVK SPIRV-Cross library really show the problems pretty well, and they can do an OK job - but neither is as good as writing all your shaders by hand for each target platform - and that's impossible to do for a real AAA game that has to launch across multiple platforms. As a practical reality you've got to pick one shading language to work in and accept that translations to the others will be imperfect.
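For anyone curious what the automated path looks like, here's a minimal sketch using SPIRV-Cross's C++ API (the include path and the MSL version I target here are assumptions - they depend on how you build and integrate the library). You feed it SPIR-V produced by glslang or DXC and get MSL source back:

[CODE]
#include <spirv_cross/spirv_msl.hpp> // SPIRV-Cross' MSL backend (path may differ)
#include <cstdint>
#include <string>
#include <vector>

// Translate a compiled SPIR-V module (e.g. HLSL/GLSL -> SPIR-V via DXC or
// glslang) into Metal Shading Language source. This is the automated path -
// the output is legal MSL, but rarely as tight as a hand-written shader.
std::string spirvToMsl(std::vector<std::uint32_t> spirvWords)
{
    spirv_cross::CompilerMSL compiler(std::move(spirvWords));

    spirv_cross::CompilerMSL::Options options;
    options.set_msl_version(2, 1);     // target MSL 2.1 - an illustrative choice
    compiler.set_msl_options(options);

    // compile() performs the cross-compilation and returns the MSL source text.
    return compiler.compile();
}
[/CODE]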
And as to an accepted and well-documented standard... very few OpenGL programmers would have bothered to read the spec, and even if they did, it wouldn't help much. The core problem with OpenGL is not its overly complicated spec or strange API model, but the fact that it doesn't map well to the hardware anymore and hasn't done so in years (yes, it's much better now with 4.6). So writing OpenGL-based software in practice means testing across all platforms and drivers, finding confusing slowdowns, trying to figure out idiosyncratic driver behavior and coding around it. With OpenGL, it's very easy to shoot yourself in the foot even if what you are doing is based on the spec. With a modern API, the core promises are already part of the API itself.
Frankly, the only player who really still cares for OpenGL is NVIDIA since a) they have the money and massive driver teams to fine-tune their drivers for individual software and b) it allows them to push CUDA down people’s throats as the “more efficient” API.
We're broadly in agreement here. The problem with OpenGL was that it was a cruel and unforgiving mistress. It had rubbish debugging tools & validation support until 2011/2012, poor quality driver implementations, an API full of legacy decisions that made it hideously inefficient and a terrible implementation of shader support.
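For context, usable validation only arrived with ARB_debug_output/KHR_debug (GL 4.3-era, hence the 2011/2012 date). Here's a minimal sketch of hooking it up, assuming a 4.3-class context and a loader like glad - and note that Apple's implementation stopped at 4.1 core profile, so you never got this on macOS:

[CODE]
#include <glad/glad.h>   // or whichever loader you use
#include <cstdio>

// Callback the driver invokes with validation/performance messages.
static void APIENTRY onGlDebugMessage(GLenum source, GLenum type, GLuint id,
                                      GLenum severity, GLsizei /*length*/,
                                      const GLchar* message, const void* /*userParam*/)
{
    // Real code would filter by severity/type; just dump everything here.
    std::fprintf(stderr, "GL debug [src=0x%x type=0x%x id=%u sev=0x%x]: %s\n",
                 source, type, id, severity, message);
}

// Call once after the context and the loader have been initialised.
void installGlDebugOutput()
{
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); // errors reported on the offending call
    glDebugMessageCallback(onGlDebugMessage, nullptr);
}
[/CODE]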
And it's really easy to port games developed for Vulkan to macOS with MoltenVK (https://moltengl.com/moltenvk/).
While I also think this is a vast overstatement, I've had a chance to revise my opinion on MoltenVK. It will need a lot of work to be able to support a modern engine like UE4 (it lacks a lot of functionality I've implemented in UE4's MetalRHI backend), but as and when that work is done it should be pretty good. The shader translation component especially has come along very nicely over the last year or so. Provided the implementation is fleshed out, gets used in anger and is optimised, it should eventually be pretty close to - or perhaps even as good as - a native Vulkan driver, with the notable exceptions of Geometry & Tessellation support, which will remain problematic. This of course assumes that Apple keep updating Metal in line with Vulkan.