This is pretty cool, but it's not quite there yet, though it's probably fine for casual use. How effective you find it depends on how closely you're looking, your level of photographic knowledge, and how much you care about the details.
There are definitely visible artifacts around the edges of the figurine, where the blur engine struggled to work out where the subject ends and how to blur naturally.
In the Sony shot (or one from any traditional camera), you can see that trademark round bokeh shape in brighter areas and specular highlights, whereas the iPhone kind of "mushes" those into soft blobs.
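To see why the two look different: a real lens renders an out-of-focus point of light as the shape of its aperture, roughly a hard-edged disc, while a naive software blur applies a Gaussian kernel, which smears the highlight out instead of turning it into a bright circle. Here's a minimal sketch of the difference, assuming a grayscale NumPy image (the function names and sizes here are just for illustration):

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def disc_kernel(radius):
    """Circular aperture kernel: this is what gives real bokeh its round shape."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    k = (x**2 + y**2 <= radius**2).astype(float)
    return k / k.sum()

# A single specular highlight on a dark background
img = np.zeros((64, 64))
img[32, 32] = 1.0

lens_style = convolve(img, disc_kernel(8))     # crisp, bright disc, like the Sony
gaussian_mush = gaussian_filter(img, sigma=4)  # soft smear with no defined shape
```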
Finally, there is no foreground blur: anything closer to the lens than the main subject should also be out of focus, but the iPhone leaves it sharp.
However, it's not hard to imagine that in future iPhones, the 9 layers of depth mapping will turn into 20 (or whatever), which would probably allow for more natural transitions and foreground blur, since there are more layers to spread across the scene. And obviously the software can get better at blurring around edges, and could even add specular highlight artifacts. Sharper, brighter, higher-resolution lenses should also give the algorithm more data to work with.
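For the curious, here's roughly how a layered depth blur could work: quantize the depth map into N bands, blur each band by an amount proportional to its distance from the focal plane, and composite. This is a simplified sketch of the general technique, not Apple's actual pipeline; the layer count, focal depth, and blur scaling below are made up:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def layered_blur(img, depth, n_layers=9, focal_depth=0.2, max_sigma=8.0):
    """Blur each depth band by its distance from the focal plane, then composite.

    img:   HxW grayscale image, floats in [0, 1]
    depth: HxW depth map, floats in [0, 1] (0 = nearest to the lens)
    """
    # Quantize continuous depth into n_layers discrete bands
    bands = np.clip((depth * n_layers).astype(int), 0, n_layers - 1)
    out = np.zeros_like(img)
    for i in range(n_layers):
        band_depth = (i + 0.5) / n_layers
        # Blur grows with distance from the focal plane, in front of it
        # or behind it, so foreground layers get blurred too
        sigma = max_sigma * abs(band_depth - focal_depth)
        blurred = gaussian_filter(img, sigma=sigma) if sigma > 0 else img
        out[bands == i] = blurred[bands == i]
    return out
```

Two things fall out of this structure: layers nearer the camera than the focal plane get blurred as well, which is exactly the foreground blur the current implementation lacks, and more layers means smaller jumps in blur between neighboring bands, hence smoother transitions.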
This is a really impressive start, but the feature should get a lot better in iPhone 8-9.