Does anybody know how exactly LIDAR improves low-light photos? Previously, improvements in raw camera specs made low-light photography better on these phones year to year; now we have LIDAR. Can anyone explain?
LIDAR gives the phone the distance from the phone to everything else... and since it works using infrared, it works perfectly well in the dark. It actually creates a "distance map": the distance from the phone for each "pixel" (one infrared ray shot from the phone). The distance map can help photography in a number of ways:
1. Identify the subject (it can "see" humans)
2. The distance can instantly give the camera the correct focal distance to use - so the camera doesn't have to figure it out
3. The depth map can be used to simulate depth-of-field effects (i.e. Bokeh - the blurry backgrounds on portrait photos).
Since all of this works just as well at night as in daylight, it gives the 12 a huge leg up on low-light/night photography.
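To make point 3 concrete, here's a toy sketch (not Apple's actual pipeline, which is far more sophisticated) of how a per-pixel depth map can drive a simulated depth-of-field effect: pixels near the chosen focal distance stay sharp, everything else gets blurred. The function name, the simple 3x3 box blur, and the `tolerance` parameter are all my own illustrative choices.

```python
import numpy as np

def simulate_bokeh(image, depth, focus_dist, tolerance=0.5):
    """Toy portrait-mode effect driven by a depth map.

    image: (H, W) grayscale array; depth: (H, W) distances in meters.
    Pixels within `tolerance` meters of focus_dist stay sharp;
    all other pixels are replaced with a 3x3 box blur.
    """
    # 3x3 box blur: average each pixel's neighborhood, using edge
    # padding so border pixels still have 9 samples.
    padded = np.pad(image.astype(float), 1, mode="edge")
    h, w = image.shape
    blurred = sum(
        padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    # The depth map decides, per pixel, which version to keep.
    in_focus = np.abs(depth - focus_dist) <= tolerance
    return np.where(in_focus, image.astype(float), blurred)
```

The key idea is the last two lines: without a depth map the phone would have to guess which pixels are "subject" and which are "background"; with LIDAR it just compares each pixel's measured distance to the focal plane, and that comparison works identically in the dark.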
I feel like I may be in the minority, but I think the Pro’s Pacific Blue color is much more appealing than the standard iPhone 12 blue.
Agreed! I'll be getting a PB 12 Pro Max
Just not sure what to do. I'm not really impressed with the ultra-wide camera, since it looks fisheye to me, so I don't use it much. I'm not a professional at all, but I do take lots of pics and vids, all with my iPhone.
Don't forget that the 12 (all of them) have MUCH better ultra-wide photos with much less distortion than before....
But that's only on the ultra-wide camera.
Frankly, and assuming sensor-OIS is substantially better than lens-OIS, I'd rather have that on the wide and telephoto sensors, as camera shake is more detrimental at longer focal lengths.
Not the ultra-wide... just the "regular" wide.