The technology behind the dual-lens system opens up real possibilities for mobile photography, including a new generation of depth-sensing apps that could enable more realistic augmented reality. So far, though, Apple has embraced only the zoom features and is barely scratching the surface of depth sensing with the Portrait mode this dual-lens setup enables.
Professional photographers: would a larger, higher-resolution sensor (24 megapixels?) to make up for the lack of "optical zoom" have been a better use of cost and space? The depth-sensing possibilities the dual-lens system provides will be impressive down the road, once they're opened up to developers. For now, though, I'd argue the merits of a single larger sensor, which would have delivered better low-light photography, against the dual-lens system Apple ultimately shipped. Apple implemented it with good reason, but I can't help wondering what a single sensor twice as large could have accomplished.
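For a rough sense of what "twice as large" buys, here's a back-of-envelope sketch (my own numbers, not anything Apple has published): at the same f-number and exposure time, the total light a sensor gathers scales with its area, so doubling the area is worth about one stop.

\[
\Delta \mathrm{EV} = \log_2\!\left(\frac{A_{\text{large}}}{A_{\text{small}}}\right) = \log_2 2 = 1 \text{ stop}
\]

One extra stop of captured light roughly translates to a \(\sqrt{2}\) improvement in shot-noise signal-to-noise ratio at the same output size, which is noticeable in dim scenes even if it isn't transformative.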