Are you sure about that?
I only ask because another poster was adamant about the calculations being done on the Qualcomm chip.
I will admit I was very skeptical about their claims.
Google's camera app works without any internet connection no problem.
You can install it on a phone as an APK without any initial internet connection and it just works.
That's just your opinion which you tried to justify with incorrect claims about Google's HDR+ tech.
It's just a more advanced HDR technique like Google has been doing for many years.
Well, Google can do it with a Snapdragon 820 or 835 no problem. What matters in the end is the improvement in photo quality.
Apple is simply depriving A11 iPhone users of this Smart HDR feature in order to distance their new phones in terms of camera capabilities. And that's the only reason.
After some research on my end, to ensure I was giving the correct information - you are indeed correct that the Google Camera app does NOT require internet access and does NOT upload pictures to the cloud. However, the way its HDR+ feature works is drastically different from the Smart HDR on the iPhone XS and even the auto HDR on the X. The HDR+ feature, when enabled, constantly takes underexposed images and combines them. More traditional HDR takes under-, over-, and properly exposed images and combines those. The HDR+ mode has some major benefits - especially on slower hardware - in that the underexposed images are taken before the shutter is ever pushed; thus, when you push the shutter button, there is no shutter lag, as the pictures have already been taken. In addition, because underexposed images are captured faster (shorter exposure times), a lot of underexposed images can be taken before the shutter is ever pushed - giving the algorithm that combines the photos a lot of information to work with. However, it still doesn't achieve the full effect of normal HDR, in that no over- or even properly exposed images are taken to bring out the details in the shadows. The shadow details are extrapolated somewhat from the plethora of underexposed images taken, but a lot of detail can still be missing. The Google Camera app that can be sideloaded onto non-Pixel phones utilizes this HDR+ mode to enhance the photos (even from subpar cameras).
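To make the HDR+ idea a bit more concrete, here's a rough Python/numpy sketch of my own - not Google's actual pipeline, which also does tile-based alignment and far smarter tone mapping - showing why averaging a burst of underexposed frames and then brightening the result recovers shadow detail without blowing out highlights:

[CODE]
# Rough sketch (my own, not Google's actual pipeline) of the HDR+ idea:
# merge a burst of deliberately underexposed frames to cut noise, then
# brighten the result to pull detail out of the shadows.
import numpy as np

def hdrplus_style_merge(burst, gain=4.0):
    """burst: list of HxWx3 float arrays in [0, 1], all underexposed.
    The real pipeline aligns and merges per tile; a plain average already
    shows the point - noise drops by roughly sqrt(number of frames)."""
    merged = np.mean(np.stack(burst, axis=0), axis=0)
    # "Digital gain" to lift the shadows, then a crude gamma curve as a
    # stand-in for proper tone mapping.
    brightened = np.clip(merged * gain, 0.0, 1.0)
    return brightened ** (1 / 2.2)

# Hypothetical usage: ten short-exposure frames straight off the sensor.
frames = [np.random.rand(480, 640, 3) * 0.2 for _ in range(10)]
photo = hdrplus_style_merge(frames)
[/CODE]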
However, the Pixel 2 Google Camera app includes an additional HDR mode - HDR enhanced. This is your more typical HDR mode, which takes under-, over-, and properly exposed pictures and combines them. On the Pixel 2, these images aren't taken until the shutter button is pushed. The Pixel 2 includes an enhanced ISP (Google calls it the Pixel Visual Core) that allows these pictures to be taken with near-zero shutter lag.
Stepping back a bit, the HDR+ mode is primarily what Google is focusing on - because even on slower hardware, underexposed images can be taken quickly, and the phone can still deliver the effect of zero shutter lag. On slower hardware there may not be any shutter lag, but recombining the pictures can take a second or two (which occurs behind the scenes).
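The "no shutter lag" part is easier to see in code. Here's a minimal sketch of the zero-shutter-lag buffer idea (the class and method names are my own invention, not any real camera API): frames keep landing in a small ring buffer while the viewfinder is open, so pressing the shutter only kicks off the merge.

[CODE]
# Minimal sketch of the zero-shutter-lag trick described above (names are
# my own invention, not any real camera API).
from collections import deque

class ZslBuffer:
    def __init__(self, depth=8):
        # Fixed-size ring buffer; the oldest frames fall off automatically.
        self.frames = deque(maxlen=depth)

    def on_viewfinder_frame(self, frame):
        # Called continuously while the camera is open, long before the
        # user ever presses the shutter.
        self.frames.append(frame)

    def on_shutter_pressed(self, merge_fn):
        # Nothing new needs to be captured - the frames are already here.
        # Only the merge remains, which may take a second or two on slow
        # hardware (and happens behind the scenes).
        return merge_fn(list(self.frames))
[/CODE]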
Now that we have all of that out of the way, let's go back to the iPhone. On iPhones prior to the X, there was an HDR mode that had to be enabled manually, or you could set it to auto and let the Camera app determine the best time to utilize HDR. This HDR mode was traditional HDR - one overexposed, one underexposed, and one properly exposed image. The reason HDR wasn't on all the time was that in low-light situations, HDR greatly increases shutter lag (from the overexposed image) and can actually result in lower quality photos than just a standard exposure. With both the iPhone X AND 8, Apple turned HDR on by default (and also turned the "Keep Normal Photo" option off by default - "Keep Normal Photo" meant you would end up with two pics in your camera roll: the HDR image and the standard single-exposure image). However, this HDR still relied on multiple images being taken one after the other, very quickly, AFTER the shutter button is pushed. Obviously, Apple deemed that the hardware in the X and 8 - both the camera module itself as well as the beefed-up A11 processor (i.e. the Neural Engine) - was fast enough to make HDR effective in all situations, including low light.
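For contrast, here's that traditional bracketed HDR as a sketch (again my own illustration, in the spirit of classic exposure fusion, not Apple's actual code): the three exposures are only captured after the shutter press, then blended by how well exposed each pixel is.

[CODE]
# Sketch of traditional bracketed HDR (not Apple's actual implementation):
# one under-, one properly, and one over-exposed frame, captured only
# AFTER the shutter press, blended with simple "well-exposedness" weights.
import numpy as np

def well_exposedness(img, sigma=0.2):
    # Pixels near mid-grey get the most weight; crushed shadows in the
    # underexposed frame and blown highlights in the overexposed frame
    # contribute little.
    return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))

def bracketed_hdr(under, normal, over):
    frames = [under, normal, over]        # each an HxWx3 float array in [0, 1]
    weights = [well_exposedness(f) for f in frames]
    total = np.sum(weights, axis=0) + 1e-6
    return sum(w * f for w, f in zip(weights, frames)) / total
[/CODE]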
Onto the iPhone XS and XS Max - standard HDR is no more. Instead, there is Smart HDR. Smart HDR is somewhat similar to Google's HDR+ (constantly taking underexposed images) in that it occurs before the shutter is even pushed. However, unlike HDR+, Smart HDR is constantly taking underexposed AND overexposed AND properly exposed images - once the shutter is pushed, the best parts of the various images are combined. The impact of Smart HDR is twofold. First, unlike traditional HDR (and similar to HDR+), you see the results of Smart HDR even before the image is taken; with standard HDR, you won't see the results until after the images are combined (after the shutter is pushed). Second, because Smart HDR is always running, the Camera app has a huge sample size from which to create the best looking image. Now, because the images being taken aren't just underexposed, the hardware in the phone needs to be able to take overexposed images even in low light (which is usually accomplished by increasing the ISO - which increases noise - which then requires noise reduction algorithms to clean up the noise). And since Smart HDR is done dynamically - as these photos are being taken behind the scenes, they are also being analyzed to determine the best parts of each exposure and combined immediately, allowing you to see the final output even before the shutter is pushed - the hardware needs to be powerful enough to accomplish all of this with no lag.
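And here's a speculative sketch of what that description implies (Apple hasn't published the algorithm, so this is purely illustrative): a buffer of alternating exposures is filled continuously, and on shutter press the best-exposed source is picked per pixel.

[CODE]
# Speculative sketch of the Smart HDR idea as described above - Apple has
# not published the algorithm, so this is purely illustrative.
import numpy as np

def pick_best_parts(bracketed_frames):
    """bracketed_frames: list of HxWx3 floats in [0, 1] at different
    exposures, captured continuously BEFORE the shutter press.
    For each pixel, take the frame whose brightness is closest to
    mid-grey (i.e. the "best part" of that exposure)."""
    stack = np.stack(bracketed_frames, axis=0)           # N x H x W x 3
    luminance = stack.mean(axis=-1)                      # N x H x W
    best = np.argmin(np.abs(luminance - 0.5), axis=0)    # per-pixel winner
    rows, cols = np.meshgrid(np.arange(best.shape[0]),
                             np.arange(best.shape[1]), indexing="ij")
    return stack[best, rows, cols]                       # H x W x 3 result

# The expensive part on a real phone is that this selection (plus noise
# reduction on the noisy high-ISO overexposed frames) has to run
# continuously to feed the live preview - which is where the A12's
# faster Neural Engine and ISP come in.
[/CODE]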
So, what does this all mean?
First, Google Camera’s HDR+ feature can work on older and slower hardware precisely because it only takes underexposed images. Even with this limitation, combining multiple underexposed images can yield better results than a single properly exposed image.
Prior to the iPhone XS, Apple’s HDR implementation was traditional HDR - on the iPhone 8 and X, Apple deemed the hardware fast enough to have HDR turned on all the time, but it was still simply taking and combining multiple exposures AFTER the shutter was pressed.
The iPhone XS’ Smart HDR feature is unlike either Google’s HDR+ feature or Apple’s standard HDR feature from past phones, in that it takes multiple exposures (not just underexposed) and combines them constantly, before the shutter is pressed. In addition, these exposures aren't simply combined haphazardly - each exposure is analyzed to determine its best parts, and only those best parts are combined into the final photo. This takes some pretty beefy hardware to accomplish.
It is certainly possible that Apple could develop an HDR+-like feature that could be enabled on the iPhone X (or even earlier), but for whatever reason, Apple has decided not to pursue this course (naysayers will of course point to planned obsolescence…). However, and most importantly, based on how Smart HDR functions, it simply wouldn’t work as well (or at all) on slower hardware.
As such, and I will repeat myself again here, Smart HDR definitely relies on the enhancements Apple has made to the Neural Engine. Could they backport the feature to the iPhone X? Maybe. But it is unlikely that the feature would work as well on anything but the A12.
Will this actually change your mind? Probably not - but I’ve at least done the research that you, and many others, were unwilling (or simply didn’t want) to do.
I stand corrected re: uploading to the cloud - the Google Camera app does NOT upload to the cloud or require an internet connection. However, it functions in a very different manner from Apple's Smart HDR (you can read all about that above).