Unfortunately, instead of significantly increasing the sensor size and the number of pixels, it seems Apple wants to go the Oppo Find 7(a) way: using highly questionable post-processing like this to emulate more pixels.
This, as the Oppo Find 7 and 7a have shown, doesn't really work in practice. Just check out the test images at
http://www.gsmarena.com/oppo_find_7a-review-1073p8.php or at
http://www.allaboutsymbian.com/features/item/19736_Camera_head_to_head_Lumia_1020.php . The former in particular shows that the artificial 50 Mpixel images are actually worse than the native 13 Mpixel ones.
While I acknowledge Apple may still come up with an algorithm vastly superior to Oppo's, I'm still pretty skeptical. Let's not forget that, upon announcement, Apple heavily advertised the software-only image stabilization post-processing in the iPhone 5s, which, as independent testers like DPReview have shown, simply isn't as effective as hardware solutions. (See the last section at
http://connect.dpreview.com/post/7518611407/apple-iphone5s-smartphone-camera-review?page=4 )
All in all, don't expect miracles from Apple. Truly detailed images are possible in only one way: by putting a 40+ Mpixel, large (at least 1/1.5", as in the Nokia Lumia 1020) sensor in the iPhone. That, however, would add significant thickness to the iPhone (bumping it to at least 10-11mm) and, consequently, will never happen in the "let's-get-our-phones-as-thin-as-possible" Apple world. Software / post-processing gimmicks will not work.
----------
This is basically an Oppo Find 7 with OIS
Just don't forget that the Oppo Find 7 only delivers in theory. In practice (see the reviews I've linked above) its 50 Mpixel mode is useless and in no way recommended.
----------
No, it isn't. Not even Apple can beat the laws of physics. And, knowing how bad their heavily advertised multi-frame "stabilization" is (see the DPReview link above), I'm pretty sure they won't be able to come up with something really cool in the future either.
It simply isn't possible to properly interpolate ("guess") missing pixels. There's only one way of providing significantly more detailed images: by using significantly higher-Megapixel sensors (and quality lenses, of course); that is, going the Nokia 808 / 1020 way. But that'd mean significantly thicker and heavier phones. Software post-processing (which is basically what this is all about) won't help.
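To illustrate the point, here's a quick toy of my own in Python/numpy (nothing to do with Apple's or Oppo's actual pipelines): subsample an image, interpolate it back up to full size, and the fine detail the sensor never captured simply isn't there anymore.

    import numpy as np
    from scipy.ndimage import zoom

    rng = np.random.default_rng(0)

    # Stand-in for a scene full of fine, pixel-level texture.
    scene = rng.random((400, 400))

    # A lower-resolution sensor only samples every other pixel.
    low_res = scene[::2, ::2]

    # Interpolate back up to full size (cubic interpolation, order=3).
    upscaled = zoom(low_res, 2, order=3)

    # Right number of pixels, but the detail that was never captured
    # is not recovered: the per-pixel error stays large.
    print("mean error:", round(float(np.abs(upscaled - scene).mean()), 3))

Interpolation gives you more pixels, not more information.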
----------
What makes Apple's tech different from what's already implemented in Nokia and Sony phones? This sounds exactly like what they do.
Nope, this ("let's interpolate the input of missing pixels by interpolating, based on the stabilizer's data") has nothing to do with
- simply stabilizing optically (Lumia 1020, 920, 925, the 2013 HTC One, LG G2)
- delivering a handset with a truly 40+ Mpixel sensor (Lumia 1020, Nokia 808).
Currently, only the Oppo Find 7 and 7a do what this is all about - and it just doesn't deliver. Again, this kind of interpolation rarely works, and in Oppo's case it doesn't work at all.
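For the curious, here's roughly the general idea as I understand it (my own naive Python sketch, not Oppo's or Apple's actual algorithm): several low-res frames, each offset by about half a pixel according to the stabilizer/gyro data, get dropped onto a 2x finer grid.

    import numpy as np

    def combine_frames(frames_with_shifts, h, w):
        # frames_with_shifts: list of (frame, (dy, dx)); each frame is h x w,
        # (dy, dx) is its measured offset in high-res pixels (0 or 1 on a 2x grid).
        acc = np.zeros((2 * h, 2 * w))
        hits = np.zeros((2 * h, 2 * w))
        for frame, (dy, dx) in frames_with_shifts:
            acc[dy::2, dx::2] += frame
            hits[dy::2, dx::2] += 1
        # Positions no frame landed on have to be guessed (interpolated) -
        # and that, plus any mis-measured shift or subject motion, is
        # exactly where these modes fall apart in practice.
        covered = hits > 0
        acc[covered] /= hits[covered]
        return acc, covered

    # Toy run with four ideal half-pixel shifts; real hand shake is never this tidy.
    h = w = 4
    rng = np.random.default_rng(1)
    frames = [(rng.random((h, w)), (dy, dx)) for dy in (0, 1) for dx in (0, 1)]
    hi_res, covered = combine_frames(frames, h, w)
    print(covered.all())  # True only because the shifts were ideal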
----------
I guess Apple has invented photography. Good job.
Probably the trailing /sarcasm is missing from your comment?

----------
please link us to specific cameras that had this exact feature (composite photo-stitching using OIS and the onboard processor).
Oppo Find 7a. Under-delivering.
----------
When Apple implemented fingerprint tech in the iPhone (not the first mobile phone with this tech) and Samsung then came out with it, Apple fanboys all called Samsung a copycat. Let's see if those same people call Apple a copycat when they put OIS in the iPhone (Nokia had OIS in a mobile phone first). Double standard much?
You're wrong - Nokia "only" implemented (proper) OIS in many of their phones. "Simple" OIS has nothing to do with pixel interpolation, which is what this article (and the Oppo Find 7(a)'s useless 50 Mpixel interpolated mode) is all about.
----------
This doesn't work for anything that moves. Just like image stabilization.
Exactly. This is one of the major problems with the tech.
That said, the interpolation in the Oppo Find 7a doesn't work well with static objects either.
----------
Honestly, does anybody think that resolution is the problem with cell phone cameras?
The higher, the better, assuming the individual pixels don't get too small (keeping them large enough requires bumping up the sensor size, hence the huge 1/1.2" sensor in the 808). Ever seen the actual, pixel-level detail of a Nokia 808 image?
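Some rough back-of-the-envelope numbers (my own approximations using nominal "type" sensor sizes, not exact die measurements) on why more megapixels demand a bigger sensor if you don't want tiny, noisy pixels:

    import math

    def pixel_pitch_um(width_mm, megapixels, aspect=4 / 3):
        # Assume square pixels on a 4:3 grid.
        pixels_across = math.sqrt(megapixels * 1e6 * aspect)
        return width_mm * 1000 / pixels_across

    # Nokia 808: ~1/1.2" type sensor (~10.7 mm wide), ~41 Mpixel
    print(round(pixel_pitch_um(10.7, 41), 2))  # ~1.45 um (the 808's published pitch is ~1.4 um)

    # Typical ~1/3" type phone sensor (~4.8 mm wide), 8 Mpixel
    print(round(pixel_pitch_um(4.8, 8), 2))    # ~1.47 um

    # The same small 1/3" sensor forced to 41 Mpixel:
    print(round(pixel_pitch_um(4.8, 41), 2))   # ~0.65 um - far too small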
Taking multiple pictures in the same amount of time reduces the exposure time per picture (just like shrinking pixels) and therefore increases noise.
Wrong. Ever heard of temporal noise suppression? It has been very widely used in many Sony and Canon cameras for 4-5 years (in JPEG mode only, and only with static objects).
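If you want to convince yourself, the underlying statistics are trivial to check (a generic numpy toy of mine, not any specific Sony/Canon implementation): averaging N frames of a static scene cuts the random noise by roughly the square root of N.

    import numpy as np

    rng = np.random.default_rng(0)
    scene = np.full((100, 100), 0.5)   # perfectly static scene
    sigma = 0.1                        # per-frame random noise

    single = scene + rng.normal(0, sigma, scene.shape)
    stack = [scene + rng.normal(0, sigma, scene.shape) for _ in range(8)]
    averaged = np.mean(stack, axis=0)

    print(round(float(np.std(single - scene)), 3))    # ~0.1
    print(round(float(np.std(averaged - scene)), 3))  # ~0.035, i.e. 0.1 / sqrt(8)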
----------
What kind of person comes to MacRumors to post that?
A real "Apple enthusiast" maybe?
