But we aren't talking about a mirror. We are talking about a super small area of focus on someone's face.
The sunglasses themselves are reflecting the light, like a mirror. As an example: if you're nearsighted, you can't use a reflection to cheat your vision into seeing something farther away clearly; that's just not how it works. However, there are a few ways this image can go in terms of giving off shallow depth of field (needed for bokeh). Realistically
I'm not an expert, but I don't think the image in the sunglasses should be blurred... if he had a picture on his t-shirt, would the person in that picture be sharp but the background in that picture blurred?
It depends on the focal length of the camera, which on the iPhone is fairly wide, so it's harder to achieve bokeh naturally. That's why Apple adds it artificially with Portrait mode, and various apps have been doing the same for 8 or so years now. I think you're confusing things a bit regarding the t-shirt, though. A picture printed on his shirt is a fixed distance from the lens at all times, so no, a portion of it shouldn't be blurred just because it depicts a 'background'. The sunglasses are different: as I'll demonstrate and explain with some links below, the scene reflected in them really can show up blurred depending on how the photo was taken. Objects in the reflection are optically at their real distance, because the light still has to travel from them to the glasses and then to the camera; a mirror doesn't move them closer. What you're describing would be like saying a nearsighted person can see distant things clearly by looking at them in a mirror, which isn't the case at all.
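To put some rough numbers behind the "a mirror doesn't move things closer" point, here's a quick throwaway Python calc. It's just the thin-lens equation with made-up example distances (50mm lens, glasses at 1 m, something 20 m behind the camera seen in the reflection), treating the sunglass lens as a flat mirror, so take it as an illustration rather than anything exact:

    # Quick sketch of why a reflection can't share focus with the mirror surface.
    # Thin-lens equation: 1/f = 1/s_obj + 1/s_img, so each object distance demands
    # its own lens-to-sensor distance. A reflected object acts like it sits at
    # (camera-to-glasses) + (glasses-to-object), not on the glasses surface.
    # All numbers are made-up examples, just to show the effect.

    def image_distance_mm(focal_mm, object_mm):
        """Lens-to-sensor distance that puts an object at object_mm in focus."""
        return 1.0 / (1.0 / focal_mm - 1.0 / object_mm)

    F = 50.0                            # assumed 50mm lens
    glasses_mm = 1000.0                 # glasses ~1 m from the camera (assumed)
    reflected_mm = glasses_mm + 20000.0 # something ~20 m behind the camera, seen via the glasses

    print("sensor position for glasses surface :", round(image_distance_mm(F, glasses_mm), 2), "mm")
    print("sensor position for reflected object:", round(image_distance_mm(F, reflected_mm), 2), "mm")
    # ~52.63 mm vs ~50.12 mm: pick one and the other is defocused, which is why a
    # mirror can't make a distant thing behave like it's close (the nearsightedness point).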
In this picture:
https://www.flickr.com/gp/72313243@N05/Q50t3Q if I focus on the objects in the reflection, the background beyond the sunglasses is highly blurred, the sunglasses themselves are less sharp but you can still see them fairly clearly, and the closer objects in the reflection are fairly sharp (a rough, quick picture I just took); the farther away the reflected objects are, the less clear they get. For reference, the corner was about 15 feet behind me, the buildings were about 100 feet, the sunglasses were 3 feet away, and the background trees were about 150 feet away.
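If you want the math behind that falloff, here's a rough sketch using the distances above and the standard defocus blur-spot approximation, assuming the focus landed on the reflected corner and treating the sunglass lenses as flat mirrors (real ones are convex, which pulls the reflected image a bit closer, so take the exact numbers loosely):

    # Rough defocus sketch for the first photo: 50mm lens at f/1.8, focused on the
    # reflected corner. A reflection's effective distance = camera-to-glasses plus
    # glasses-to-object (flat-mirror approximation). Distances are the rough
    # estimates from the text, converted feet -> mm.
    FT = 304.8          # mm per foot
    F, N = 50.0, 1.8    # focal length (mm), f-number

    def blur_spot_mm(focus_mm, subject_mm):
        """Approximate blur-circle diameter on the sensor for a defocused subject."""
        aperture = F / N
        return aperture * (F / (focus_mm - F)) * abs(subject_mm - focus_mm) / subject_mm

    glasses   = 3 * FT                    # sunglasses, direct view
    corner    = glasses + (3 + 15) * FT   # corner ~15 ft behind me, seen via the reflection
    buildings = glasses + (3 + 100) * FT  # buildings ~100 ft behind me, seen via the reflection
    trees     = 150 * FT                  # real background trees, direct view

    for name, d in [("sunglasses surface", glasses), ("reflected corner", corner),
                    ("reflected buildings", buildings), ("background trees", trees)]:
        print(f"{name:20s} ~{blur_spot_mm(corner, d):.2f} mm blur on sensor")
    # The reflected corner comes out at ~0 mm (it's the focus point); everything
    # much nearer or farther picks up a visible blur spot, which is the falloff
    # you can see in the photo.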
Changing to this picture:
https://www.flickr.com/gp/72313243@N05/8074H4 where the focus is on the sunglasses themselves and nothing else has changed, the objects in the reflection are significantly less clear. It's simply a matter of physics. Both photos were shot with a DSLR and a 50mm lens at f/1.8, which is obviously a very different setup from the iPhone, but the point was to give a real-world reference for what the iPhone and Apple are trying to mimic.
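And for a ballpark comparison of why the iPhone can't do this optically: same subject and background distances, one pass with the 50mm f/1.8 on full frame and one with rough iPhone-class numbers (around a 4mm lens at f/1.8 on a ~4.8mm-wide sensor, which is my assumption for the main camera of that era, not an official spec):

    # Why the iPhone has to fake the blur: same scene, but the physical blur it
    # can produce is tiny compared to a 50mm f/1.8 on full frame.
    # The iPhone numbers below are ballpark assumptions, not official specs.
    FT = 304.8

    def blur_fraction(focal, fnum, sensor_w, focus_mm, subject_mm):
        """Blur-circle diameter as a fraction of sensor width (defocus approximation)."""
        blur = (focal / fnum) * (focal / (focus_mm - focal)) * abs(subject_mm - focus_mm) / subject_mm
        return blur / sensor_w

    subject = 3 * FT      # face/sunglasses ~3 ft away (in focus)
    backgrd = 150 * FT    # trees ~150 ft away

    dslr   = blur_fraction(50.0, 1.8, 36.0, subject, backgrd)  # 50mm f/1.8, full frame
    iphone = blur_fraction(4.0, 1.8, 4.8, subject, backgrd)    # assumed iPhone-class camera

    print(f"50mm f/1.8 full frame: background blur ~{dslr:.1%} of frame width")
    print(f"iPhone-class camera  : background blur ~{iphone:.1%} of frame width")
    # Roughly 4% vs 0.2% of the frame: the phone keeps nearly everything sharp,
    # which is why Portrait mode has to add the blur in software.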
For one more case, look at this picture of a truck:
https://www.flickr.com/gp/72313243@N05/o17jJ7. Notice that the truck is the subject and the point of focus (like the guy in the photo), its shiny surface does not reflect a clear image of the surrounding area, and the background of the shot is out of focus as well. That's how light works. Now, I'm not arguing that the reflection in the guy's sunglasses can't be clear; you can actually get that to happen in the real world with a DSLR. But it's just too clear, and that adds to the artificial look in my opinion, because there's no falloff: it's a straight capture of what the iPhone always sees with its focal length (which keeps a huge range of depth in focus). It would be extremely hard to produce an algorithm that adds that little bit of extra realism, of course, so it's not something I'm upset about, just something I can easily pick out when I'm told this is supposed to be a natural-looking photo.