You can't say that the Pixel 2 has a more complex algorithm than the iPhone XS; we really don't know. With that being said, the Pixel 2 actually included a special ISP specifically for the camera, so whatever speed advantage the A11 has over the Snapdragon 835 doesn't affect the Pixel 2's camera, as a result of that ISP.

That is incorrect. I'm using the Pixel 2's camera software on a phone with a Snapdragon 625 (so a low-end SoC) and I can get the same HDR+ Enhanced style images that the Pixel 2 can take. The difference is that it takes a couple of seconds to process an image versus a much faster SoC.
Most of the Pixel 2's tricks are software based and they can be replicated on other phones without much effort.
Also, the Pixel Visual Core wasn't even activated until the release of the Android 8.1 update, and even now it's mostly used by third-party apps like Snapchat, Instagram, and WhatsApp, or third-party camera apps.

So yes, people who say that Smart HDR should work on older iPhones are absolutely correct.
Google's HDR+ Enhanced works on phones with a Snapdragon 625. What does that tell you?
Except that's not really true, is it?

Cross referencing their phones here https://en.m.wikipedia.org/wiki/Products_of_Xiaomi#Mi_Series with Android release dates here https://en.m.wikipedia.org/wiki/Android_(operating_system)#Platform_usage shows the vast majority didn’t get supported for anywhere near 3 years of updates, let alone 5 years.
He was talking about MIUI.
 
Google's HDR+ Enhanced works on phones with a Snapdragon 625. What does that tell you?

It tells me a lot about Google's HDR+ software, and nothing about Apple's. I'm not saying it can't run on a lower-end phone. I'm just saying you can't use Google as proof that it's possible. Just because everyone in the business does something a particular way, doesn't mean Apple follows suit. Look at FaceID. It's just unlocking your phone with your face like many Android products do. And yet they did it in a completely different manner than anyone else out there.

Maybe they're not offering the new features on older phones to generate more sales. Or maybe they do it in a way that really does require the new phone. Maybe the experience would be poor on an X, maybe not.

You want my guess? And it really is a total guess. I say the new software WOULD run on last year's phones. But I think it would run slower than they would like. Which means some people might miss that perfect shot because their phone is now slower at taking pictures. Then they would bitch that Apple intentionally slowed things down to force them to buy the new phone (just search the forums for Planned Obsolescence). I think it would be a bad PR move for Apple. Sure, they could make it an optional thing, but that's not how Apple typically operates. I don't think they like giving the user the ability to worsen the out-of-the-box performance of their device. Again, just a guess.

No matter who is right, getting upset at Apple for doing business the same way they've always done business seems a little silly. We might not like how they do things. But at this point, it really shouldn't be a surprise that each year the new phones are going to get features that the older phones don't get. I'm not saying I like it. But I do expect them to do it again next year, and the year after and the year after that.
 
Just because everyone in the business does something a particular way, doesn't mean Apple follows suit.
Well, Apple did follow suit with Smart HDR, didn't they?
I get that everybody is biased, but there is simply no reason for Smart HDR not to work on an iPhone with an A11 SoC.

I say the new software WOULD run on last year's phones. But I think it would run slower than they would like. Which means some people might miss that perfect shot because their phone is now slower at taking pictures.

You say that because you have absolutely no idea how Google's HDR+ works on slower phones.
You're at the guessing level, and that sums up your entire post.
 
You say that because you have absolutely no idea how Google's HDR+ works on slower phones.
You're at the guessing level, and that sums up your entire post.

And you know how Apple's works? I'm talking about having seen the source code yourself, or at least having seen some architectural designs or something along those lines. Actual knowledge, not opinion. You're guessing as much as anyone else here. At least I was upfront about my guesses.
 
And you know how Apple's works? I'm talking about having seen the source code yourself, or at least having seen some architectural designs or something along those lines. Actual knowledge, not opinion. You're guessing as much as anyone else here. At least I was upfront about my guesses.
I don't need access to any source code. This is getting ridiculous.
Are you saying that Google's similar solution has some magic behind it that makes it work on slower phones? I'm just looking at two software solutions, so what are you looking at? I don't get it.
You're guessing as much as anyone else here. At least I was upfront about my guesses.

You are not upfront about anything; you are just trying to contradict me with incorrect interpretations of things you know nothing about.

For example
I say the new software WOULD run on last year's phones. But I think it would run slower than they would like. Which means some people might miss that perfect shot because their phone is now slower at taking pictures.

That is not the case with HDR+ on slower phones. You can take as many successive pictures as you want; it just takes more time to save those pictures, that's it (and this shouldn't be a problem anyway with the A11 and Apple's fast NVMe storage), so nobody is going to miss any important moment.
Now, I doubt Apple's similar software solution can't work like this (because that would mean it's inferior) on the iPhone X, which is much faster than an S835 phone, not to mention slower phones running the S625, 630, 636 or 660.
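A rough sketch of what I mean (my own toy illustration, not Google's actual code; the 2-second delay and frame counts are made-up numbers): the shutter just drops each buffered burst onto a queue, and a background worker does the slow merge, so you can keep shooting while older shots are still processing.

Code:
import queue, threading, time

burst_queue = queue.Queue()

def merge_worker():
    while True:
        frames = burst_queue.get()
        time.sleep(2)                                    # stand-in for a ~2 s HDR+ merge on a slow SoC
        print(f"saved photo merged from {len(frames)} frames")
        burst_queue.task_done()

threading.Thread(target=merge_worker, daemon=True).start()

for shot in range(5):                                    # five quick shutter presses in a row
    burst_queue.put([f"frame{i}" for i in range(8)])     # buffered frames; put() returns instantly
    print(f"shot {shot} taken, shutter free again")

burst_queue.join()                                       # merges finish later; no shot was missed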
 
A series 4 Apple Watch should help.

Actually, that's next up. I lost a Series 3 and a 10.5" iPad Pro in a fire (garage, not house) back in July. When we get the insurance money, they're both getting replaced.
I don't need access to any source code. This is getting ridiculous.
Are you saying that Google's similar solution has some magic behind it that makes it work on slower phones? I'm just looking at two software solutions, so what are you looking at? I don't get it.

Go back and read what I said: "Just because everyone in the business does something a particular way, doesn't mean Apple follows suit. Look at FaceID. It's just unlocking your phone with your face like many Android products do. And yet they did it in a completely different manner than anyone else out there."

Just because they're both doing HDR doesn't mean Apple's solution is software only. That might not be how Google did it. But that doesn't mean Apple isn't taking advantage of the 9x increase in the ML part of the CPU.
 
That is incorrect. I'm using the Pixel 2's camera software on a phone with a Snapdragon 625 (so a low-end SoC) and I can get the same HDR+ Enhanced style images that the Pixel 2 can take. The difference is that it takes a couple of seconds to process an image versus a much faster SoC.
Most of the Pixel 2's tricks are software based and they can be replicated on other phones without much effort.
Also, the Pixel Visual Core wasn't even activated until the release of the Android 8.1 update, and even now it's mostly used by third-party apps like Snapchat, Instagram, and WhatsApp, or third-party camera apps.

So yes, people who say that Smart HDR should work on older iPhones are absolutely correct.
Google's HDR+ Enhanced works on phones with a Snapdragon 625. What does that tell you?
He was talking about MIUI.

That's because the Google Camera software connects to Google's cloud servers to do the heavy lifting. The complex calculations aren't actually occurring on the phone's hardware.

So, again, as I repeat ad nauseam - the Smart HDR function and depth effect capabilities on the iPhone XS are very much tied to the more powerful Neural Engine and ISP, and, as such, those effects would not be possible, or at least not possible in any efficient manner, on the iPhone X.
 
Actually, that's next up. I lost a Series 3 and a 10.5" iPad Pro in a fire (garage, not house) back in July. When we get the insurance money, they're both getting replaced.

Go back and read what I said: "Just because everyone in the business does something a particular way, doesn't mean Apple follows suit. Look at FaceID. It's just unlocking your phone with your face like many Android products do. And yet they did it in a completely different manner than anyone else out there."

Just because they're both doing HDR doesn't mean Apple's solution is software only. That might not be how Google did it. But that doesn't mean Apple isn't taking advantage of the 9x increase in the ML part of the CPU.


Sorry to hear about the fire, that’s a terrible experience.

I hope you enjoy the S4 when the time comes. I upgraded from a Series 2, but posters are claiming to notice a difference from the Series 3 as well; it must be related to going 64-bit.
 
That's because the Google Camera software connects to Google's cloud servers to do the heavy lifting. The complex calculations aren't actually occurring on the phone's hardware.

So, again, as I repeat ad nauseam - the Smart HDR function and depth effect capabilities on the iPhone XS are very much tied to the more powerful Neural Engine and ISP, and, as such, those effects would not be possible, or at least not possible in any efficient manner, on the iPhone X.

Are you sure about that?

I only ask because another poster was adamant about the calculations being done on the Qualcomm chip.

I will admit I was very skeptical about their claims.
 
Go back and read what I said: "Just because everyone in the business does something a particular way, doesn't mean Apple follows suit. Look at FaceID. It's just unlocking your phone with your face like many Android products do. And yet they did it in a completely different manner than anyone else out there."

Just because they're both doing HDR doesn't mean Apple's solution is software only. That might not be how Google did it. But that doesn't mean Apple isn't taking advantage of the 9x increase in the ML part of the CPU.
You do understand that if Apple's solution needs far more powerful hardware, and specific hardware support, to work, it's actually inferior to Google's solution.
It's like comparing two antivirus programs that have similar detection efficiency, but one needs twice as many CPU cores to work the way the other does.
My premise is that Apple's HDR solution can work as efficiently as Google's. Is that wrong or what?
 
That's because the Google Camera software connects to Google's cloud servers to do the heavy lifting. The complex calculations aren't actually occurring on the phone's hardware.

Google's camera app works without any internet connection, no problem.
You can install it on a phone as an APK without any initial internet connection and it just works.


So, again, as I repeat ad nauseam - the Smart HDR function and depth effect capabilities on the iPhone XS are very much tied to the more powerful Neural Engine and ISP,

That's just your opinion, which you tried to justify with incorrect claims about Google's HDR+ tech.
Smart HDR is just a more advanced HDR technique, like what Google has been doing for many years.


and, as such, those effects would not be possible, or at least not possible in any efficient manner, on the iPhone X.
Well, Google can do it with a Snapdragon 820 or 835, no problem. What matters in the end is the improvement in photo quality.
Apple is simply depriving A11 iPhone users of the Smart HDR feature in order to distance its new phones in terms of camera capabilities. And that's the only reason.
 
You do understand that if Apple's solution needs far more powerful hardware, and specific hardware support, to work, it's actually inferior to Google's solution.

If it's using the hardware to do stuff that could be emulated in software at the same quality and speed, then I totally agree.

My premise is that Apple's HDR solution can work as efficiently as Google's. Is that wrong or what?

I don't know any details of either company's solution. It seems like a fair thing to say, but I don't have enough details to say for sure one way or the other. Maybe they're screwing us (probably), or maybe not. Maybe those 5 trillion operations per second are getting put to good use, or maybe this is a full-blown software solution.

At this point no one is right or wrong. It's all guessing. Just remember, for most companies, unlocking your phone with your face is a software solution. Clearly Apple has involved hardware in the process. So I wouldn't draw too many conclusions based on what the competition is doing. Apple marches to their own beat.

What I do know is I like the photos I've been getting out of the new camera.
 
Are you sure about that?


I only ask because another poster was adamant about the calculations being done on the Qualcomm chip.


I will admit I was very skeptical about their claims.

Google's camera app works without any internet connection, no problem.
You can install it on a phone as an APK without any initial internet connection and it just works.

That's just your opinion, which you tried to justify with incorrect claims about Google's HDR+ tech.
Smart HDR is just a more advanced HDR technique, like what Google has been doing for many years.

Well, Google can do it with a Snapdragon 820 or 835, no problem. What matters in the end is the improvement in photo quality.
Apple is simply depriving A11 iPhone users of the Smart HDR feature in order to distance its new phones in terms of camera capabilities. And that's the only reason.

After some research on my end, to ensure I was giving the correct information - you are indeed correct that the Google Camera app does NOT require internet access and does NOT upload pictures to the cloud. However, the way its HDR+ feature works is drastically different from the Smart HDR on the iPhone XS and even the auto HDR on the X. The HDR+ feature, when enabled, constantly takes under-exposed images and combines those images. More traditional HDR takes under-, over-, and properly exposed images and combines those. The HDR+ mode has some major benefits - especially on slower hardware - in that the underexposed images are taken before the shutter is ever pushed - thus when you push the shutter button, there is no shutter lag, as the pictures have already been taken. In addition, because underexposed images are literally taken faster (faster shutter), a lot of underexposed images can be captured before the shutter is ever pushed - giving the algorithm that combines the photos a lot of information to work with. However, it still doesn't achieve the full effect of normal HDR, in that no over- or even properly exposed images are taken to bring out the details in the shadows. The shadow details are extrapolated somewhat from the plethora of underexposed images taken, but a lot of detail can still be missing. The Google Camera app that can be sideloaded onto non-Pixel phones utilizes this HDR+ mode to enhance the photos (even from subpar cameras).
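To make that concrete, here's a toy sketch of the "stack underexposed frames, then lift the shadows" idea (in Python, my own simplification - it skips the tile alignment and robust merging the real HDR+ pipeline does, and the gain and tone-curve numbers are made up):

Code:
import numpy as np

def hdr_plus_like_merge(burst, gain=4.0):
    """burst: list of HxWx3 float arrays in [0, 1], all deliberately underexposed."""
    merged = np.mean(burst, axis=0)                  # stacking: noise drops roughly with sqrt(len(burst))
    brightened = np.clip(merged * gain, 0.0, 1.0)    # digitally push the short exposures back up
    return brightened / (brightened + 0.25)          # crude global tone curve to lift shadows

# usage: frames buffered before the shutter press, merged after it
burst = [np.random.rand(4, 4, 3) * 0.2 for _ in range(8)]
print(hdr_plus_like_merge(burst).shape)              # (4, 4, 3)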

However, the Pixel 2 Google Camera app includes an additional HDR mode - HDR+ enhanced. This is your more typical HDR mode, which takes under-, over-, and properly exposed pictures and combines them. On the Pixel 2, these images aren't taken until the shutter button is pushed. The Pixel 2 includes an enhanced ISP (Google calls it the Pixel Visual Core) that allows these pictures to be taken with near-zero shutter lag.

Stepping back a bit, the HDR+ mode is primarily what Google is focusing on - because even on slower hardware, underexposed images can be taken quickly, and can bring the effect of zero shutter lag. On slower hardware though, there may not be a shutter lag, but the picture recombining can take a sec or two (which occurs behind the scenes).

Now that we have all of that out of the way, let's go back to the iPhone. On iPhones prior to the X, there is an HDR mode that had to either be enabled manually, or you could set it to auto and allow the Camera app to determine the best time to utilize HDR. This HDR mode was traditional HDR - one over-, one under-, and one properly exposed image. The reason HDR wasn't on all the time was that in low-light situations, HDR greatly increases shutter lag (from the overexposed image) and can actually result in lower quality photos than just a standard exposure. With both the iPhone X AND 8, Apple turned HDR on by default (and also turned the "Keep Normal Photo" option off by default - "Keep Normal Photo" meant you would end up with two pics in your camera roll, one the HDR image and the other just the standard single-exposure image). However, this HDR still relied on multiple images being taken one after the other very quickly AFTER the shutter button is pushed. Obviously, Apple deemed that the hardware in the X and 8 - both the camera module itself as well as the beefed-up A11 processor (i.e. the Neural Engine) - was fast enough to make HDR effective in all situations, including low light.

Onto the iPhone XS and XS Max - standard HDR is no more. Instead, there is Smart HDR. Smart HDR is somewhat similar to Google's HDR+ (constantly taking underexposed images) in that it occurs before the shutter is even pushed. However, unlike HDR+, Smart HDR is taking underexposed AND overexposed AND properly exposed images constantly - once the shutter is pushed, the best parts of various images are then combined. The impact of Smart HDR is twofold - unlike traditional HDR (and similar to HDR+), you see the results of Smart HDR even before the image is taken. With standard HDR, you won't see the results until after the images are combined (after the shutter is pushed). In addition, because Smart HDR is always running, the Camera app has a huge sample size to create the best looking image. Now, because the images being taken aren't just underexposed, the hardware in the phone needs to be able to take overexposed images even in low light (which is usually accomplished by increasing the ISO - which will increase noise - which then requires noise reduction algorithms to clean up the noise). Since Smart HDR is done dynamically - as these photos are being taken behind the scenes, they are also being analyzed to determine the best parts of each exposure, and they are being combined immediately, allowing you to see the final output even before the shutter is pushed - the hardware needs to be powerful enough to accomplish all of this with no lag.
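To make the "best parts of various images" step concrete, here's a hedged sketch of one way it could work - an exposure-fusion-style per-pixel weighting. Apple hasn't published how Smart HDR actually picks regions, so the weighting function here is purely an assumption for illustration:

Code:
import numpy as np

def fuse_exposures(frames):
    """frames: list of HxWx3 float arrays in [0, 1], taken at different exposures."""
    weights = []
    for f in frames:
        # favor pixels that are neither crushed blacks nor blown highlights
        w = np.exp(-((f.mean(axis=2) - 0.5) ** 2) / (2 * 0.2 ** 2))
        weights.append(w)
    weights = np.stack(weights)
    weights /= weights.sum(axis=0, keepdims=True)    # per-pixel normalization across exposures
    return sum(w[..., None] * f for w, f in zip(weights, frames))

# usage: one under-, one mid-, one fully exposed frame
frames = [np.random.rand(4, 4, 3) * s for s in (0.3, 0.6, 1.0)]
print(fuse_exposures(frames).shape)                  # (4, 4, 3)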

So, what does this all mean?

First, Google Camera’s HDR+ feature can work on older and slower hardware precisely because it only takes underexposed images. Even with this limitation, combining multiple underexposed images can yield better results than a single properly exposed image.
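That last claim is easy to sanity-check numerically. Here's a tiny illustration with synthetic numbers of my own (not real sensor data): averaging N equally noisy frames cuts the noise by roughly the square root of N.

Code:
import numpy as np

rng = np.random.default_rng(0)
true_scene = np.full((100, 100), 0.2)                          # a dim, underexposed signal level

single = true_scene + rng.normal(0, 0.05, true_scene.shape)    # one noisy short exposure
stack = np.mean([true_scene + rng.normal(0, 0.05, true_scene.shape)
                 for _ in range(9)], axis=0)                   # a 9-frame burst, averaged

print(np.std(single - true_scene))    # ~0.05
print(np.std(stack - true_scene))     # ~0.017, i.e. roughly sqrt(9) = 3x less noise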

Prior to the iPhone XS, Apple’s HDR implementation was a traditional HDR - on the iPhone 8 and X, Apple deemed the hardware fast enough to have HDR turned on all the time, but it was still simply taking and combining multiple exposures AFTER the shutter was pressed.

The iPhone XS’ Smart HDR feature is unlike either Google’s HDR+ feature, or Apple’s standard HDR feature from past phones, in that it is taking multiple exposures (not just underexposed) and combining them constantly, before the shutter is pressed. In addition, these exposures aren't simply being combined haphazardly - each exposure is analyzed to determine its best part, and only those best parts are combined into the final photo. This takes some pretty beefy hardware to accomplish.

It is certainly possible that Apple could develop an HDR+ like feature that could be enabled on the iPhone X (or even earlier), but for whatever reason, Apple has decided not to pursue this course (naysayers will of course point to planned obsolescence…). However, and most importantly, based on how Smart HDR functions, it simply wouldn't work as well (or at all) on slower hardware.

As such, and I will repeat myself again here, Smart HDR definitely relies on the enhancements Apple has made to the Neural Engine. Could they back port the feature to the iPhone X - maybe? But it is unlikely that the feature would work as well on anything but the A12.

Will this actually change your mind? Probably not - but I've at least done the research you, and many others, were unwilling (or simply didn't want) to do.
Are you sure about that?

I only ask because another poster was adamant about the calculations being done on the Qualcomm chip.

I will admit I was very skeptical about their claims.

I stand corrected re: uploading to the cloud - the Google Camera app does NOT upload to the cloud or require an internet connection. However, it functions in a very different manner than Apple's Smart HDR (you can read all about that above).
 

Thank you for the analysis.
 
After some research on my end, to ensure I was giving the correct information - you are indeed correct that the Google Camera app does NOT require internet access and does NOT upload pictures to the cloud. However,
However? LoL

After you clearly got things wrong a couple of times, you finally decided to inform yourself. At least there is progress.

Anyway, HDR+ Enhanced also works on slower phones and doesn't require support from any Pixel Visual Core. It's just software, like HDR+ is just software. And it's perfectly usable even on phones with Snapdragon 625 SoCs, producing clearly better results, especially in low light, in comparison to HDR+.

[Attached comparison images]


Apple's Smart HDR is also just software, and it's impossible to think it wouldn't work on phones with an A11 chip. Yeah, it might have slightly more impact on battery, but taking better pictures is more important than saving a couple of percent of battery.

Also we have this article:
https://9to5mac.com/2018/10/01/halide-camera-iphone-xs-explainer/
It doesn't sound that great in comparison to Google's HDR solution.
 
However? LoL

After you clearly got things wrong a couple of times, you finally decided to inform yourself. At least there is progress.

Anyway, HDR+ Enhanced also works on slower phones and doesn't require support from any Pixel Visual Core. It's just software, like HDR+ is just software. And it's perfectly usable even on phones with Snapdragon 625 SoCs, producing clearly better results, especially in low light, in comparison to HDR+.

[Attached comparison images]

Apple's Smart HDR is also just software, and it's impossible to think it wouldn't work on phones with an A11 chip. Yeah, it might have slightly more impact on battery, but taking better pictures is more important than saving a couple of percent of battery.

Also we have this article:
https://9to5mac.com/2018/10/01/halide-camera-iphone-xs-explainer/
It doesn't sound that great in comparison to Google's HDR solution.


The other poster did go out of his way to explain as best he could.

As far as we know, Apple's implementation does use the A12, or so we were told during the keynote.
 
The other poster did go out of his way to explain as best he could.

As far as we know, Apple's implementation does use the A12, or so we were told during the keynote.
LoL, he only tried very hard to imply a few things that are not supported by the information available now.
This, after two attempts at incorrectly portraying the way HDR+ Enhanced works.
The claim that Smart HDR can work well only on the A12 is simply absurd.
 
LoL, he only tried very hard to imply a few things that are not supported by the information available now.
This, after two attempts at incorrectly portraying the way HDR+ Enhanced works.
The claim that Smart HDR can work well only on the A12 is simply absurd.

If you know more, go ahead.
But it did look like he tried.

From what I understand, there are two different implementations of HDR, Apple's and Google's, and they are not the same.
 
If you know more, go ahead.
But it did look like he tried.

From what I understand, there are two different implementations of HDR, Apple's and Google's, and they are not the same.
Yeah, he tried, LoL. I'm reading the same info and I don't understand why he sees things like that: Smart HDR needs the A12 specifically or the experience will be subpar.
The A11 is definitely fast enough that taking a few more photos at different exposures than usual and stitching them together shouldn't be a problem.
Like I've said, HDR+ Enhanced (which I specifically mentioned in my original post) works on slower phones, no problem.
 
Yeah, he tried, LoL. I'm reading the same info and I don't understand why he sees things like that: Smart HDR needs the A12 specifically or the experience will be subpar.
The A11 is definitely fast enough that taking a few more photos at different exposures than usual and stitching them together shouldn't be a problem.
Like I've said, HDR+ Enhanced (which I specifically mentioned in my original post) works on slower phones, no problem.

No clue about HDR+ or Google's implementation.

The A12 does have an 8-core NPU vs. the 2-core one on the A11. That could be it.
 
However? LoL

After you clearly got things wrong a couple of times, you finally decided to inform yourself. At least there is progress.

Anyway, HDR+ Enhanced also works on slower phones and doesn't require support from any Pixel Visual Core. It's just software, like HDR+ is just software. And it's perfectly usable even on phones with Snapdragon 625 SoCs, producing clearly better results, especially in low light, in comparison to HDR+.

[Attached comparison images]

Apple's Smart HDR is also just software, and it's impossible to think it wouldn't work on phones with an A11 chip. Yeah, it might have slightly more impact on battery, but taking better pictures is more important than saving a couple of percent of battery.

Also we have this article:
https://9to5mac.com/2018/10/01/halide-camera-iphone-xs-explainer/
It doesn't sound that great in comparison to Google's HDR solution.

*Sigh*

As a mature adult, I'm more than happy and willing to admit when I was wrong. If you want to LOL at that, that speaks volumes about you as a person.

In any case, I'm done arguing with you. Honestly, at this point, I couldn't care less what you think. If you want to believe Smart HDR on the XS is all software and no hardware, so be it.
 
Yeah, he tried, LoL. I'm reading the same info and I don't understand why he sees things like that: Smart HDR needs the A12 specifically or the experience will be subpar.
The A11 is definitely fast enough that taking a few more photos at different exposures than usual and stitching them together shouldn't be a problem.
Like I've said, HDR+ Enhanced (which I specifically mentioned in my original post) works on slower phones, no problem.

Read through this thread - https://forum.xda-developers.com/pixel-2-xl/help/hdr-vs-hdr-enhanced-t3698330. HDR+ Enhanced ≠ Smart HDR.
Educate yourself. And maybe grow up a little?
 