> Not only do I know how it works, but I'm also a photographer (who has edited using similar techniques manually), work as a developer, and read a lot about computational photography, so I probably understand how it works on a deeper level than most people.

> Are you familiar with how Night Mode works? It takes several exposures and merges them together, resulting in a brighter image in which anything moving ends up blurred. Hence my skepticism that this was Night Mode, since the moving subject is not blurred.
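The merge the parent comment describes can be sketched in a few lines of numpy (a toy illustration of naive frame averaging, not anyone's actual pipeline): averaging the exposures cuts noise in the static background, but anything that moved during the burst smears across every position it occupied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 8 short, noisy exposures of a dim, mostly static scene.
frames = []
for i in range(8):
    frame = 0.2 + rng.normal(0.0, 0.05, (64, 64))   # dim background + sensor noise
    frame[30:34, 10 + i : 14 + i] = 0.9             # bright subject moving 1 px/frame
    frames.append(frame)

stacked = np.mean(frames, axis=0)   # naive merge of the exposures

# Noise in the static region drops by roughly sqrt(8)...
noise_single = float(np.std(frames[0][:20, :20]))
noise_stacked = float(np.std(stacked[:20, :20]))

# ...but the subject is ghosted: no column was bright in all 8 frames,
# so its peak brightness gets smeared well below the original 0.9.
ghost_peak = float(stacked[31, 10:22].max())
```

This is exactly the failure mode the skepticism is about: a naive merge cannot keep a moving subject sharp.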
Apparently you don't know how it works. I don't usually call people out like this, but you called me out on this without even knowing the facts yourself. I'll allow Apple to explain:
Night mode comes on automatically when needed — say, in a candlelit restaurant. When you tap the shutter, the camera takes multiple images while optical image stabilization steadies the lens.
Then the camera software goes to work. It aligns images to correct for movement. It discards the sections with too much blur and fuses sharper ones. It adjusts contrast so everything stays in balance. It fine-tunes colors so they look natural. Then it intelligently de-noises and enhances details to produce the final image.
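The discard-and-fuse step Apple describes can be sketched in numpy. This is a toy illustration under the assumption that frames are already aligned; the Laplacian sharpness metric and the keep ratio are my own stand-ins, not Apple's actual method:

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Crude sharpness score: mean squared Laplacian response."""
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return float(np.mean(lap ** 2))

def discard_and_fuse(frames: list, keep_ratio: float = 0.5) -> np.ndarray:
    """Drop the blurriest frames, average the sharper survivors."""
    order = np.argsort([sharpness(f) for f in frames])[::-1]  # sharpest first
    keep = max(1, int(len(frames) * keep_ratio))
    return np.mean([frames[i] for i in order[:keep]], axis=0)
```

A real pipeline would score per-tile rather than per-frame and weight rather than hard-discard, but the shape of the idea is the same: blur never makes it into the fused result.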
Emphasis is mine. They use ML for everything. I'm sure that in the parts that have movement, which they discard, they are applying ML to fill in the details and de-noise them so those regions don't look too strange from being boosted. Basically, the neural network has enough good data from the parts that aren't moving that it can bring up the parts that are moving (which don't have enough usable exposure info for noise stacking or recovering shadow detail) so they look natural alongside the rest of the scene. I wouldn't be surprised if moving objects don't look quite as good up close when you pixel-peep, though. On Instagram? Perfect.
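My guess at that fill-in step, sketched with a simple box blur standing in for the learned de-noiser (this is purely my own illustration of the idea, not Apple's actual method): pixels where the frames agree get the full benefit of stacking, while pixels flagged as moving fall back to a single denoised reference frame so they blend with the rest of the scene.

```python
import numpy as np

def merge_with_motion_fallback(reference, frames, thresh=0.1):
    """Stack frames per-pixel where they agree with the reference;
    where they disagree (likely motion), fall back to a smoothed
    reference so those pixels are denoised by filtering instead."""
    stack_sum = np.zeros_like(reference)
    stack_cnt = np.zeros_like(reference)
    for f in frames:
        agree = np.abs(f - reference) < thresh      # crude motion mask
        stack_sum += np.where(agree, f, 0.0)
        stack_cnt += agree
    stacked = np.where(stack_cnt > 0,
                       stack_sum / np.maximum(stack_cnt, 1), reference)
    # Fallback "de-noiser" for motion pixels: a 3x3 box blur of the
    # reference, standing in for whatever learned model Apple uses.
    blurred = sum(np.roll(np.roll(reference, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    motion = stack_cnt < len(frames) / 2
    return np.where(motion, blurred, stacked)
```

In a real pipeline the motion mask, fusion weights, and de-noiser would all be far more sophisticated (and learned), but it shows how static regions can earn full stacking while moving regions get patched to match.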
People truly underestimate the power of computational photography. It's a game changer in the fullest sense of the phrase. They are doing things on the fly that might take an expert anywhere from many minutes to hours of careful preparation and editing to get right, and they do it in a fraction of a second. It's nuts. I can't believe how many people I saw who were upset that Apple wasn't adding this to the XS. It's a lot of processing power! Good thing the iPhone 11 Pro has much longer battery life and a faster CPU and neural engine.