
Irishman
macrumors 68040, Original poster
Nov 2, 2006
So, last year brought disruptive, transformative changes for Metal on iOS and macOS. So far, based on the WWDC keynote, this year looks to be one of refinement and expansion.

Some things that Apple shared during the keynote were:

- Metal 2 (with performance gains of up to 10x over Metal 1)
- ARKit
- VR support via Steam and the HTC Vive
- Plug-and-play support for eGPU solutions (for Metal, OpenGL, and OpenCL)

Also, we can peek into the full schedule for the conference (either via the WWDC app on iOS or the developer page here https://developer.apple.com/wwdc/schedule/#/ ), so we can see what's coming for the week ahead.

(All times are PDT)

Tuesday, June 6, from 1:50pm to 2:50pm: "Introducing Metal 2" session.

Tuesday, June 6, at 3:00pm: "Metal 2 Lab".

Tuesday, June 6, from 5:10pm to 6:10pm: "Introducing ARKit: Augmented Reality for iOS" session.

Wednesday, June 7, from 10:00am to 10:40am: "VR with Metal 2" session.

Wednesday, June 7, from 1:00pm to 3:10pm: "ARKit Lab".

Wednesday, June 7, from 3:10pm to 6:00pm: "VR with Metal 2 Lab".

Wednesday, June 7, from 6:30pm to 7:45pm: "AR/VR Get-Together".

Wednesday, June 7, from 6:30pm to 7:45pm: "Games Get-Together".

Thursday, June 8, from 12:00pm to 3:00pm: "ARKit Lab".

Thursday, June 8, from 1:50pm to 2:50pm: "Advances in Core Image: Filters, Metal, Vision, and More" session.

Thursday, June 8, from 3:10pm to 3:50pm: "Metal 2 Optimization and Debugging" session.

Friday, June 9, from 9:00am to 12:00pm: "Metal 2 Lab".


I'm hoping that we can use this thread to keep track of any useful news coming out of these sessions and labs.

Thoughts so far?
 
Watching the first Metal session... "Direct To Display" looks interesting. It circumvents all the compositing and grants direct access to the display for a Metal fullscreen app. It looks like the good old fullscreen mode on older OSes, though I suppose it's much more flexible and allows switching between apps easily.
 
Watching the first Metal session... "Direct To Display" looks interesting. It circumvents all the compositing and grants direct access to the display for a Metal fullscreen app. It looks like the good old fullscreen mode on older OSes, though I suppose it's much more flexible and allows switching between apps easily.
This is a really important feature to keep display lag down on HMDs. The less processing on the completed framebuffer the computer has to do, the faster it can be pushed to the Vive.
 
Yes, the VR session was quite interesting.
The Metal debugging and optimisation tools look really powerful. I wonder if it's now easier to use Metal than to reuse an existing OpenGL code base when porting a game engine to macOS (mostly because of the sorry state of OpenGL on macOS).
But the big deal with respect to performance appears to be the "argument buffers" brought by Metal 2. I wonder if they directly correspond to a DX12 or Vulkan feature, or if they include things that are unique to Metal.

As for eGPU support, Apple makes it sound as if it were enabled by Metal 2. But in reality it's a system-wide thing. The only difference from Sierra is that no special script is required to get an eGPU working. You can have any OpenGL or OpenCL app accelerated by the eGPU so long as that app is on the monitor connected to it. What Metal 2 adds is improved compatibility, so that an app can move gracefully to a monitor connected to a different GPU, and continue to run when the eGPU is added, removed, etc. It also looks like it is possible (though not recommended) to have the eGPU accelerate an app that is not on the monitor it is connected to. So eventually, it may be possible to use an eGPU with the integrated display, with a performance hit.
All of this requires changes in the application code, though; it doesn't come "for free". I wonder how it's done on Windows.
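For reference, here's a minimal sketch of what the hot-plug handling looks like with the device-observer API added in macOS 10.13. The names follow the public Metal API, but I haven't run this against real eGPU hardware, so treat it as a sketch rather than a recipe:

```swift
import Metal

// Register for GPU add/removal notifications (macOS 10.13+).
// The handler is called when an eGPU is plugged in or about to be pulled.
let (devices, observer) = MTLCopyAllDevicesWithObserver { device, notification in
    switch notification {
    case .wasAdded:
        print("GPU added: \(device.name)")          // e.g. eGPU plugged in
    case .removalRequested:
        print("Removal requested: \(device.name)")  // finish work, release resources
    case .wasRemoved:
        print("GPU removed: \(device.name)")
    default:
        break
    }
}
print("GPUs at launch: \(devices.map { $0.name })")

// When the app no longer cares about hot-plug events:
MTLRemoveDeviceObserver(observer)
```

The point being that the system tells you about device changes, but moving your resources and pipelines to the new device is still your job, which matches the "doesn't come for free" observation above.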
 
Yeah, I watched the intro to Metal 2 and Metal for VR, and as per the usual Apple fare, the APIs seem pretty clean. Although I don't work at the OS level, it seems pretty easy for devs to get the correct display for the GPU. I'm not really sure why Apple felt it was necessary to bump the version of Metal to 2.0 for these features; then again, versioning is always more about marketing than technical advancement. Maybe Apple felt it was better to make a clean break with their initial 1.0 in El Capitan (10.11) versus the API now functioning in 10.12.5. Regardless, I'm happy that they are trying to consolidate more of the API across iOS and macOS.

It really seems like Apple is committed to GPUs, and so far they seem receptive to developer feedback (at least in the short term, again). They seem to be shepherding the API pretty well, which they absolutely need to do with a proprietary API. I mean, letting users use eGPUs and letting apps run outside of the compositing engine is a big win for VR and very "un-Applely". This was the first keynote in a couple of years where I felt Apple was on the right track again. Their iPads are really starting to look like competent computing for the masses and some content creators, while they can push macOS to be the higher-end platform for content creation and development.

The only thing that really threw me for a loop was Apple's push for macOS to transition to 64-bit-only apps! That basically kills a huge number of legacy apps; Windows would never do something like that. I don't know how I feel about it. I mean, there isn't any reason to develop new 32-bit macOS apps, but potentially removing support for all existing 32-bit apps is quite a bold move.
 
From another thread :)

I'm personally wondering if argument buffers have equivalents in DX12 and Vulkan (and if they correspond to the reusable command buffers that people were asking for)

Argument buffers correspond to what DX12/Vulkan call descriptors/descriptor tables/sets etc. But in Metal 2 they seem much easier to set up and use, and also much more flexible. You can also manipulate the contents of argument buffers in shaders, which allows some crazy stuff, as well as arbitrarily nesting pointers to other argument buffers. If I understand it correctly, this is very similar to Mantle.
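To illustrate (names here are made up, and this is only a sketch of the Metal Shading Language 2 syntax shown in the sessions): an argument buffer is declared as a plain struct of resources, and it can hold pointers to other argument buffers:

```metal
#include <metal_stdlib>
using namespace metal;

// An argument buffer is just a struct of resources...
struct Material {
    texture2d<float> albedo [[id(0)]];
    sampler          smp    [[id(1)]];
    float4           tint   [[id(2)]];
};

// ...and it can nest pointers to other argument buffers.
struct Instance {
    constant Material *material [[id(0)]];
    float4x4           model    [[id(1)]];
};

struct VertexOut {
    float4 position [[position]];
    float2 uv;
};

// The whole material indirection costs one buffer binding on the CPU side.
fragment float4 shade(VertexOut in [[stage_in]],
                      constant Instance &inst [[buffer(0)]])
{
    float4 c = inst.material->albedo.sample(inst.material->smp, in.uv);
    return c * inst.material->tint;
}
```

Compare that to binding each texture, sampler, and constant individually per draw call; the descriptor-set-like grouping is what makes the CPU side cheap.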


Reusable command buffers: no. But frankly, I don't see much benefit in them. With argument buffers and indirect draw calls you can encode complex scenes in only a few calls anyway, so the performance win is going to be very small unless you are doing a lot of small static batches (and then there is probably a better way of doing what you are doing).

and if important features that some said were missing (sparse data, transform feedback, shader atomics on texture objects...) are now in Metal 2.
I could find the answers myself, but the Metal documentation is all Greek to me. :confused:

Sparse data: no. Maybe iOS hardware doesn't do it? Anyway, from what I read here and there, sparse data is very expensive to use in practice due to OS-imposed restrictions, which make it not that useful in the real world.

Texture atomic load/store: not arbitrarily, but Metal 2 has raster order groups, which ensure ordering for overlapping fragments. I guess this should cover most cases where you need this? And isn't atomic texture access in other APIs extremely limited to begin with?
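As a sketch of what raster order groups look like in shader code (illustrative names, untested): tagging a resource with [[raster_order_group(n)]] makes the GPU serialize overlapping fragments' access to it, giving you an ordered read-modify-write without explicit atomics:

```metal
#include <metal_stdlib>
using namespace metal;

struct VertexOut {
    float4 position [[position]];
};

// Overlapping fragments access `accum` in primitive submission order,
// enabling e.g. programmable blending or order-dependent accumulation.
fragment void accumulate(VertexOut in [[stage_in]],
                         texture2d<float, access::read_write> accum
                             [[texture(0), raster_order_group(0)]])
{
    uint2 pos = uint2(in.position.xy);
    float4 prev = accum.read(pos);
    accum.write(prev + float4(0.1), pos);
}
```

Without the raster order group, the read-modify-write on overlapping fragments would be a race; with it, the hardware orders the accesses for you.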

Transform feedback has been available from the start: you just write to a buffer in your vertex shader. This is much more elegant than creating an additional pipeline configuration type.
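In other words, the Metal equivalent of transform feedback is just a vertex function that writes into a device buffer (sketch with made-up names; rasterization would be disabled for this pass):

```metal
#include <metal_stdlib>
using namespace metal;

struct SkinnedVertex {
    float3 position;
    float3 normal;
};

// A void vertex function: its only effect is writing transformed
// vertices to `vout`, to be consumed by a later render pass.
vertex void skin_to_buffer(uint vid                         [[vertex_id]],
                           const device SkinnedVertex *vin  [[buffer(0)]],
                           const device float4x4 *bones     [[buffer(1)]],
                           device SkinnedVertex *vout       [[buffer(2)]])
{
    float4 p = bones[0] * float4(vin[vid].position, 1.0);
    vout[vid].position = p.xyz;
    vout[vid].normal   = vin[vid].normal;
}
```

No extra pipeline object or capture state is needed; the output buffer is just bound like any other buffer.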

For me, the most exciting additions for macOS are heaps and argument buffers. With those, one can get performance parity with DX12/Vulkan in most cases, and boy is Metal easier to work with. And the flexibility of argument buffers is huge.
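For context, a heap lets you make one big allocation up front and then create resources out of it cheaply. A minimal sketch (untested, using the macOS 10.13 API):

```swift
import Metal

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("no Metal device")
}

// One backing allocation for many transient resources.
let heapDesc = MTLHeapDescriptor()
heapDesc.size = 64 * 1024 * 1024   // 64 MB backing store
heapDesc.storageMode = .private
let heap = device.makeHeap(descriptor: heapDesc)!

// Creating a texture from the heap is a cheap suballocation,
// not a fresh device allocation.
let texDesc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba8Unorm, width: 1024, height: 1024, mipmapped: false)
texDesc.storageMode = .private     // must match the heap's storage mode
let renderTarget = heap.makeTexture(descriptor: texDesc)!
```

This is roughly the same placed-resource idea as in DX12/Vulkan, which is where the performance-parity claim comes from.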

With Metal 2, the reasons why Apple didn't jump on the Vulkan bandwagon become clearer. As I was speculating years ago, they wanted an API that could replace OpenGL and do it in a friendly way. Metal is just a pleasure to work with; it's very clear and straightforward. There is still a lot of hand-holding and high-level stuff (so more overhead compared to other low-level APIs), but that won't matter for most applications. Vulkan does give you much more control, but working with it is really not fun. I'd say that making a well-performing app with Metal is easier than with Vulkan, simply because with the latter you have more tools to shoot yourself in the foot.
I'm not really sure why Apple felt like it was necessary to bump the version of Metal to 2.0 for these features.

Apple seems to use their own weird semantic versioning where major features bump a whole version. Like, we are now on Swift 4, which I'd still call something like Swift 0.5 ;)

The changes to Metal are quite substantial, though. They catapult it from "yet another stubborn thing Apple does" to a very formidable GPU API. It might still lack features its competitors have, but it's incredibly well thought out and offers an unmatched cost/benefit ratio.
 
Thanks leman. :)
letting users use eGPUs and letting apps run outside of the compositing engine is a big win for VR and very "un-Applely".
I wonder how much of an impact this feature has. It reduces latency, but shouldn't impact frame rate much.
The good news is that this new "direct" mode doesn't require any code. It automatically applies if the system detects that no compositing is necessary.
 
It reduces latency, but shouldn't impact frame rate much.
The good news is that this new "direct" mode doesn't require any code. It automatically applies if the system detects that no compositing is necessary.

My guess is that the compositing engine is a decent chunk of frame time? The other problem is that the compositing engine may not have consistent frame timing, which is of huge importance in VR. With VR or other high-frame-rate situations you can't just let your frame times be dictated by the OS; frame-time spikes create user discomfort. My guess is this direct mode has a much more consistent frame-time graph, since you're bypassing the OS for the most part.
 
My guess is that the compositing engine is a decent chunk of frame-time?
I don't think so. They showed that the VR compositor takes ~1ms, and I suppose it does more complex things than the Window server (which should be more optimised, since it's been refined by Apple over the years). So desktop compositing probably takes much less than 1ms, out of 16ms for a 60Hz display.
What I'm not sure about is latency. On Metal system trace captures, we typically see a frame that has finished rendering during a v-blank interval, but it "hits the glass" at interval n+2.
Perhaps the direct-to-display thing reduces it to n+1 (it cannot be n, because of V-sync).
Also, Metal now powering the Window Server could help with latency (?).
 
https://developer.apple.com/metal/

A library for thoughts.

I like this one:
GPU-driven Pipelines
Metal 2 includes features that empower the GPU to take control over key aspects of the graphics and compute pipelines. Essential tasks, such as the assigning of resources to shaders, can now be efficiently driven by the GPU instead of waiting on the CPU. Important rendering details, such as the order that low-level drawing is performed, can be specified on-the-fly by the GPU, opening up new efficiencies in advanced rendering engines.
Hardware scheduling implemented in software at its best. It allows the GPU to adapt to the situation and lower latency where it is most needed (VR, especially on external GPUs!).

Apple is building by far the best API in the industry. It still lacks a few features, but the progress is evident.
 
If a game uses Metal do they have to patch in support for Metal 2 or will it just update and work seamlessly?
 
Federighi also mentioned more optimised drivers, which may help current apps. I also believe that "direct to display" should be automatically adopted by fullscreen apps. But it should be about the only new feature that may not require changes in application code.
 