HDR doesn’t necessarily require multiple images in the traditional sense, because iPhones have an electronic, not a mechanical, shutter. So instead of taking multiple separate images, the pixel data can be read out multiple times during one image. It’s one of the advantages that electronic shutters have (of course they also have disadvantages, like the jello effect, but that’s another story).
So, for example, if your exposure is 1/125th of a second, you read out the pixel data at 1/250th, and 1/175th, and 1/125th (or whatever the algorithm is) instead of taking two or three separate photos. There's no blur problem, since instead of spreading three photos across roughly 1/30th of a second, you fit all three "exposures" into a single 1/125th-second one.
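For anyone curious, here's a minimal sketch of how merging those readouts could work in principle. This is hypothetical Python/numpy, not Apple's actual pipeline (which isn't public): each readout is normalized by its accumulation time, and saturated pixels are down-weighted so shadows come from the longest readout and highlights from the shortest.

```python
import numpy as np

def merge_readouts(readouts, times, full_well=1.0):
    """Merge non-destructive sensor readouts, taken at increasing
    accumulation times during ONE exposure, into a single HDR frame.

    readouts: list of 2-D arrays of accumulated signal at each readout
    times:    accumulation time (seconds) at each readout
    Hypothetical sketch only; the real algorithm is not public.
    """
    num = np.zeros_like(readouts[0], dtype=np.float64)
    den = np.zeros_like(readouts[0], dtype=np.float64)
    for frame, t in zip(readouts, times):
        # Normalize each readout to a common radiance scale.
        radiance = frame / t
        # Longer readouts get more weight (less noise), but pixels
        # near saturation carry no information and are zeroed out.
        weight = np.where(frame < 0.95 * full_well, t, 0.0)
        num += weight * radiance
        den += weight
    return num / np.maximum(den, 1e-12)

# Example: readouts at 1/250, 1/175, and 1/125 of a second,
# matching the exposure ladder described above.
times = [1 / 250, 1 / 175, 1 / 125]
```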
As I've conceded from the start, "I know not whereof I speak" on sensor shift, pixel/subpixel shift, obtaining photos with more megapixels than the sensor itself has, etc. I'm just a very curious person, so kindly bear with me.
Referring to your last post: if HDR photos are achieved not by capturing multiple images in succession (which takes more time, however brief), and "pixel data can be read out multiple times during one image," then can multiple sensor-shift (pixel or subpixel shift) readings be taken at the same instant, with absolutely zero latency between readings? Might this yield higher-megapixel photos where motion blur and camera shake are not issues for picture quality (beyond the expected, sometimes even desired, motion blur that can occur on any camera)?
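For reference, cameras that already ship pixel-shift modes (Pentax, Sony, Panasonic) physically move the sensor by half a pixel between captures and interleave the frames. Here's a minimal sketch of that interleaving step, assuming hypothetical, perfectly aligned frames with zero motion between readouts, which is exactly the condition the question above is about:

```python
import numpy as np

def interleave_pixel_shift(frames):
    """Interleave four half-pixel-shifted frames into one image with
    2x the linear resolution (4x the pixel count).

    frames: four HxW arrays captured with sensor offsets of
            (0, 0), (0, 0.5), (0.5, 0), (0.5, 0.5) pixels.
    Assumes zero subject motion and zero camera shake between
    readouts; any latency between captures breaks this assumption.
    """
    h, w = frames[0].shape
    out = np.empty((2 * h, 2 * w), dtype=frames[0].dtype)
    out[0::2, 0::2] = frames[0]  # no shift
    out[0::2, 1::2] = frames[1]  # half pixel right
    out[1::2, 0::2] = frames[2]  # half pixel down
    out[1::2, 1::2] = frames[3]  # half pixel down and right
    return out
```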
BTW, the MacRumors "Sensor Shift" article itself concedes its sourcing: "according to a paywalled report today from hit-or-miss Taiwanese industry publication DigiTimes." DigiTimes has gotten it wrong on multiple occasions, as with the DigiTimes-sourced September 5, 2017 MacRumors article, "Apple Takes Early Step Towards iPhones With 'Above 12-Megapixel' Rear Cameras." Maybe that report just has yet to come to pass, or maybe DigiTimes got it wrong (had bad information).
Bottom line, though: I'd really like it if Apple would "graduate" the iPhone from 12MP sensors, which it has used in every iPhone generation since the 6s, nearly five years ago(!)
My suspicion is that Apple has put in YEARS of work on the multi-element lens system, the hardware digital processing and synthesis that happens behind the scenes, the GPU/SoC layer, and the algorithmic software, AI, and ML that all serve to amaze, all engineered specifically for 12MP sensors. Bumping up to a 16MP or even 14MP sensor might be disruptive and require a total do-over of all this hardware and software engineering: years of what you could describe as "beyond the sensor" technologies, which, admittedly, have achieved stunning results with each new generation of iPhone, every one of which still has 12MP sensors. (I hope my suspicion is wrong.)
(Off topic: I read a white paper about advances in CMOS sensors, newer-generation sensors, "non-conventional" pixel/subpixel arrays, and orthogonal and geometric rotations during "capture" that allow video cameras to do many things, including "seeing" or discovering microscopic particulates and toxins that not only bear telltale shapes but also move in patterns associated only with those respective particulates and toxins. In the not-too-distant future, will we be able to aim our iPhone's camera at food to get a reading on its purity? Test air quality in the home or workplace?)
A 16MP or even 14MP sensor would make me "happy," because digital zooming with a 12MP sensor can lead to terrible results that Apple should be embarrassed by. Cropping photos or video can also lead to terrible results. The worst example, though, is trying to create a portrait of one person from a group shot; Apple doesn't "show off" these kinds of edited iPhone photos at its Events. (See the quick math below.)
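The quick math shows why: pixel count falls with the square of the zoom factor, so a 2x digital zoom on a 12MP sensor leaves only 3MP. A back-of-the-envelope calculation:

```python
def effective_megapixels(sensor_mp, zoom):
    """Pixels remaining after a digital (crop) zoom.
    A linear zoom factor cuts both dimensions, so pixel
    count falls with the square of the zoom."""
    return sensor_mp / zoom ** 2

# 12MP sensor (roughly 4032 x 3024):
print(effective_megapixels(12, 2))  # 3.0 MP at 2x zoom
print(effective_megapixels(12, 3))  # ~1.3 MP at 3x zoom
# Cropping one person out of a group shot can easily
# be a 4x-5x crop, leaving well under 1 MP:
print(effective_megapixels(12, 5))  # 0.48 MP
```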
I'm using FiLMiC Pro, which has a digital zoom slider that turns red when slid past a certain point to indicate unacceptable image degradation. It doesn't "stay in the green" for long at all; its color warnings recommend only the most minimal digital zooming.
The iPhone takes absolutely stunning photos when no digital zoom has been applied, and Apple shows off these stunning photos at Apple Events, always at full 12MP resolution. The iPhone 11's Deep Fusion photos are breathtaking, but they're still 12MP photos (AFAIK).
I was recently sent some iPhone 11 video of an elementary school children's stage performance. The parents had "good seats," but they insisted on using digital zoom to a fare-thee-well to "live crop" the video to their own child only. I was presented with lots of blocky, blurry, shaky, grainy footage of their child, and I couldn't help but be reminded of early flip-phone video.
Let's see a higher-than-12MP sensor in an iPhone SOON!