Can Google make this same thing work on iOS, or would it require different hardware or drivers?

I'd be willing to pay up to ~$100 for an app capable of doing this in iOS. This is the kind of feature that makes upgrading to a new phone meaningful... but you need at least a dozen of them for me to buy your $1000+ iPhone.
 
It's funny how photography changed from "what the sensor sees" to "what the image processor decides to show" :)

Apple's fake blur was the first step. Now Google's AI decides what color those dark pixels should have :)
 
I'm more curious how they managed to oversaturate the photo without introducing significant noise or blur.

Am I seeing the same photos as the rest of you, or are my definitions of noise and blur off base? Because I don't find those photos to be clean or clear.
 
When Google showed the iPhone XS vs. Pixel 3 comparison, the butthurt fanboys cried that Google was faking the XS photo. Now it's proven true: Google really can capture detail in pitch-black situations. Give credit where it's due. Google has leaped way ahead of the competition in photography, again!
Now I hope Apple can copy this and the Call Screen feature, as I miss them on my iPhone.
 
That picture of the fire extinguisher... why would you take a picture of that in the dark? The second photo looks like someone just turned on the lights. Maybe it's not a good example of what this is doing. Seeing a photo that's pitch black... to me, that's not low light. I think they need better examples to show this off.
 
Am I seeing the same photos as the rest of you, or are my definitions of noise and blur off base? Because I don't find those photos to be clean or clear.

I wouldn't call these "amazing" by any stretch of the imagination. But the quality of these for a smartphone at high ISO is quite impressive.

Would I compare it to DSLR quality under the same conditions? Absolutely not!
 
That picture of the fire extinguisher... why would you take a picture of that in the dark? The second photo looks like someone just turned on the lights. Maybe it's not a good example of what this is doing. Seeing a photo that's pitch black... to me, that's not low light. I think they need better examples to show this off.

Why would you take a picture of a fire extinguisher in the dark? Are you serious?! To show how the Pixel works with low light photos!!!
 
How do quality photos compare to unnecessary and unreasonable features like you mentioned? Absolutely horrible analogy.

I need to be able to take less-muddled night photos just as much as I need my phone to walk my dog. Make better sense? My *opinion* is that these types of photos are a once-a-year (or once-in-half-a-decade) type of need for the vast majority of users. We're well into the phase where innovation means extremely niche, low-need design exercises. As a guy who wants an option for a new-from-Apple $400-or-less iPhone, cramming marginal-use tech like this into future phones is eye-rolling, just like I wouldn't pay up for dog-walking ability from my phone or expect it to be able to jump-start my car. I'm not taking anything away from those for whom the ability to take better dark/muddled night photos is something they've wanted their entire life. Just don't charge me and the majority of users for this "delight feature" of truly marginal use.

Many people have a social life, especially outside of working hours.

Who has been waiting decades to take less-muddled, less-grainy night photos?

I'll amend: this feature should be great for CIA/KGB agents and vampires. Just don't charge me more for incremental design contest "features" like this. :)
 
I don't care about any of this. My XS Max is producing gorgeous pics in daylight/low light. Software is software, and there is always room for improvement. Who does it first does not matter. Apple will eventually catch up. What matters to me is the Apple ecosystem (as long as it is holding up). To each his own...
 
So Apple could technically do the same, right?
Unknown. Presumably Google is uploading people's photos to its cloud and applying AI there before showing the updated image, versus Apple's privacy-first approach where things are done locally, on device. So anything is possible, but part of the reason Google's engineers can do such amazing things is their choice of features over privacy. I'm not saying one direction is right or wrong, but both companies are locked into their paths. A normal high-end camera can do the same thing on a tripod with a long exposure, even in near pitch black. On phones this hasn't been possible, because phones are handheld and a long exposure would be very blurry. So my guess is they are doing a long exposure on the phone, and then the AI figures out exactly how the phone is moving and undoes that in software to make it sharp.
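To make that guess concrete: what I'm describing is basically the classic align-and-stack trick. Something like this Python/OpenCV sketch, where the file names, frame count, and brightness gain are all made up for illustration (this is obviously not Google's actual pipeline):

import cv2
import numpy as np

def align_and_merge(frames):
    # Align every frame to the first one, then average the stack.
    base_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    h, w = base_gray.shape
    acc = frames[0].astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        warp = np.eye(2, 3, dtype=np.float32)
        # Estimate the handshake between this frame and the base...
        _, warp = cv2.findTransformECC(base_gray, gray, warp,
                                       cv2.MOTION_TRANSLATION, criteria)
        # ...and undo it before stacking.
        aligned = cv2.warpAffine(frame, warp, (w, h),
                                 flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        acc += aligned.astype(np.float32)
    merged = acc / len(frames)
    # Averaging N frames cuts random noise by roughly sqrt(N),
    # which leaves headroom to brighten the merged result.
    return np.clip(merged * 4.0, 0, 255).astype(np.uint8)

frames = [cv2.imread(f"burst_{i}.jpg") for i in range(8)]  # hypothetical burst
cv2.imwrite("merged.jpg", align_and_merge(frames))

The hard part, which I'd guess is where Google's AI comes in, is doing this robustly when things in the scene move, not just the camera.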
 
I'm curious about how this works. Is it machine learning, compositing multiple shots, a combination, or something else?

I made a post slightly above this where I applied my (amateur) knowledge of photography.

I believe this is being done using an extremely high ISO sensor setting combined with software ("AI," if you want to call it that) noise reduction.

While the photos are good, I can spot "smoothing" in several areas of the photo, the kind you get when you take a noisy photo and apply denoising to it. I've produced similar effects before using a high ISO, then post-processing in Lightroom with a massive amount of noise reduction.
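You can reproduce that waxy look with any ordinary denoiser. A rough stand-in (hypothetical file name, and OpenCV's non-local means in place of whatever Google actually uses):

import cv2

noisy = cv2.imread("high_iso_shot.jpg")  # hypothetical noisy capture
# Strong luma (h) and chroma (hColor) denoising; the higher these go,
# the more fine texture gets smeared into that "smoothed" look.
clean = cv2.fastNlMeansDenoisingColored(noisy, None, h=15, hColor=15,
                                        templateWindowSize=7,
                                        searchWindowSize=21)
cv2.imwrite("denoised.jpg", clean)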
 
Sorry, but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low-light shots on my iPhone XS without flash, because I never use flash, and they're great.

I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.
I have an app called Cortex Cam on my iPhone, and it does the same thing the night mode does on the Pixel. So no need for the Pixel; just an app will do.
 
I don't care about any of this. My XS Max is producing gorgeous pics in daylight/low light. Software is software, and there is always room for improvement. Who does it first does not matter. Apple will eventually catch up. What matters to me is the Apple ecosystem (as long as it is holding up). To each his own...
I absolutely love Apple, but I certainly care about this, since as a photographer this is such a major leap in handheld long exposure with the assistance of AI. It's a massive step, and it doesn't make Apple bad or Google good; it just is, and it's great for everyone, since all this competition is driving the industry forward.
 
Can Google make this same thing work on iOS, or would it require different hardware or drivers?

I'd be willing to pay up to ~$100 for an app capable of doing this in iOS. This is the kind of feature that makes upgrading to a new phone meaningful... but you need at least a dozen of them for me to buy your $1000+ iPhone.
I don't see why it would be tied to hardware. Photoshop and any photo-manipulation program with levels and/or curves adjustments can do this easily. Making it look good depends on the skill of the user, though. Of course, a JPEG will have a lot of missing detail, so it won't be as good as drawing out details from a RAW file.
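For illustration, a curves-style shadow lift is just a lookup table applied to every pixel, like dragging the midpoint in Photoshop's Levels dialog (file name and gamma value here are made up):

import cv2
import numpy as np

img = cv2.imread("dark_shot.jpg")  # hypothetical underexposed JPEG
gamma = 0.4  # values below 1.0 lift the shadows hard
lut = (((np.arange(256) / 255.0) ** gamma) * 255.0).astype(np.uint8)
brightened = cv2.LUT(img, lut)
# On an 8-bit JPEG this mostly amplifies noise and banding in the
# shadows, i.e. the detail a RAW file would have preserved.
cv2.imwrite("brightened.jpg", brightened)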
 
That picture of the fire extinguisher... why would you take a picture of that in the dark? The second photo looks like someone just turned on the lights. Maybe it's not a good example of what this is doing. Seeing a photo that's pitch black... to me, that's not low light. I think they need better examples to show this off.

I'd like to hear examples of when this feature would be truly useful.

Vampires, CIA/KGB agents, Apple marketing interns, MacRumors article writers, and raccoons need not respond; we know you could use this feature. As for the rest of the world...?
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.

You might as well stick to film. Almost any professional digital photographer post-processes their photos and digitally changes the image to "look better." What's the difference between that and machine-learning recoloring?
 
Now if Google brings its Camera app to the iPhone... that would be the best of both worlds!!!!
After all, its magic is in the software and not the camera hardware itself, so in theory it can be ported to the iPhone, or in fact any other Android phone as well.
 
Best part about this is that it's coming to the Pixel 1 and 2 as well. Super excited to start using it on my 2 XL. I hope Apple adopts something similar. More importantly, I hope it's a feature they introduce to phones going back to the iPhone 8.

Since Google can do this on the Pixel 1 (and given how anemic its CPU is compared to the A series), Apple should probably be able to do it all the way back to the 6s. Get to it, Apple engineers; I would love to have this rolled out next summer.
 
I have an app called Cortex Cam on my iPhone, and it does the same thing the night mode does on the Pixel. So no need for the Pixel; just an app will do.
I've tried the Cortex app too, and while the idea is the same, using multiple longer exposures to a degree, the results are literally night-and-day different. Take a picture of a near-pitch-black scene outside with your phone, then with the Cortex app: it helps ever so slightly, but not much compared to what The Verge is showing here, where near pitch black looks like daylight.
 
I don't see why it would be tied to hardware.

If my guess above is correct that this is ISO based, then it's somewhat limited by hardware. While most cameras do let you change ISO via software (or dials, etc.), there's an upper limit for each sensor based on its design. Not all cameras are capable of the same ISO settings.

For example, the Canon 40D has a max ISO of 3200, but the 60D, thanks to newer sensor advancements, is capable of 12800.

So if this is ISO based, it will depend on what Apple's sensor tech is capable of.
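Quick back-of-the-envelope on those numbers: every doubling of ISO is one stop, so 3200 to 12800 is two extra stops, which means a 4x shorter shutter speed at the same brightness.

import math

# ISO ceilings quoted above: Canon 40D (3200) vs. 60D (12800).
stops = math.log2(12800 / 3200)
print(f"extra stops: {stops:.0f}")            # -> 2
print(f"shutter speedup: {2 ** stops:.0f}x")  # -> 4x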
 