Having two lenses at different X/Y positions could help with AR, autofocus, depth perception, etc.
This is my suspicion as well. It makes the most sense, and it's honestly something I'm surprised Apple didn't do initially. Diagonal lenses would have made more sense from the beginning, IMHO.

I agree with you. Software-based depth mapping would make sense, since Apple is showing signs of leaning on AR more and more each year. Doing depth mapping purely in hardware would add cost and make it harder to get this feature into the hands of the masses, but this lens change would help gather a lot of depth/perspective metadata.
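To illustrate why two lenses at different positions help with depth: in a simple pinhole stereo model, a point's apparent shift (disparity) between the two images, together with the lens separation (baseline) and focal length, gives its depth as z = f·B/d. Here's a toy sketch of that relationship; the numbers are made up for illustration and are not real iPhone specs.

```python
# Toy stereo-depth illustration (pinhole model, hypothetical numbers):
# two lenses a baseline B apart see the same point at slightly different
# pixel positions; that disparity d, with focal length f (in pixels),
# yields depth z = f * B / d.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres from pixel disparity under the pinhole stereo model."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Nearer objects shift more between the two views (larger disparity);
# a wider lens separation magnifies disparity, improving depth resolution.
near = depth_from_disparity(focal_px=1500.0, baseline_m=0.012, disparity_px=30.0)
far = depth_from_disparity(focal_px=1500.0, baseline_m=0.012, disparity_px=3.0)
print(near, far)  # the 30 px disparity point is much closer than the 3 px one
```

Note the design trade-off this implies: moving the lenses diagonally increases the effective baseline in both axes, which is exactly the kind of geometry that helps software depth estimation without extra depth hardware.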
Now if we could just get the camera gurus at Apple to kill the vertical-video nonsense by always shooting 16:9, falling back to feeding 4:3 to the encoder by default while the device is held upright. 9:16 is an abomination that only Apple can staunch at this point. (Thankfully Twitter's vertical-video wasteland of an app, Periscope, has been killed.)