


At a media event in New York City earlier this month, Google previewed a new low-light camera feature called "Night Sight" that uses machine learning to choose the right colors based on the content of the image. The result is much brighter photos in low-light conditions, without having to use flash.

Google showed a side-by-side comparison of two unedited photos shot in low light with an iPhone XS and its latest Pixel 3 smartphone with Night Sight, and the photo shot on the latter device is much brighter.

[Image: iPhone XS vs. Pixel 3 Night Sight side-by-side comparison]

Google said Night Sight will be available next month for its Pixel smartphones, but an XDA Developers forum member managed to get the feature to work ahead of time, and The Verge's Vlad Savov tested out the pre-release software on a Pixel 3 XL. The results, pictured below, are simply remarkable.

[Image: comparison 1, without Night Sight / with Night Sight]

[Image: comparison 2, without Night Sight / with Night Sight]

[Image: comparison 3, without Night Sight / with Night Sight]

Google and Apple are both heavily invested in computational photography. On the latest iPhones, for example, Smart HDR results in photos with more highlight and shadow detail, while Depth Control significantly improves Portrait Mode. But, Night Sight takes low-light smartphone photography to a whole new level.

Article Link: Google's Upcoming 'Night Sight' Mode for Pixel Phones Captures Remarkable Low-Light Photos
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for AI to determine anything in those black areas of the without pics. Why not show the real before pics?
 
As much as I love iPhone and iOS, I gotta admit this is a pretty awesome feature. There are people now who choose their next smartphone based on how good the camera is.
 
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for AI to determine anything in those black areas of the without pics. Why not show the real before pics?

Read more carefully.

These are not shots from Google. These are photos taken with the beta camera app by The Verge.
 
cool. just edit your exposure settings and BAM, "night shot". doubt you'd want to brag about any of these night shot photos, the quality looks terrible
 
Sorry but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low light shots on my iPhone XS without flash because I never use flash, and they're great.

I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.
Google didn’t post these images. A tech reporter on The Verge did. And they aren’t comparisons between the iPhone XS and the Pixel 3. They’re comparisons of the Google Pixel 3 XL with and without Night Sight.
 
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for AI to determine anything in those black areas of the without pics. Why not show the real before pics?

The results are so magical that it's understandable that a lot of people will have doubts. You'll see results like this all over the internet from people trying the leaked camera app on their own Pixel.

 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.
Exactly! Better to apply the filter later if you want it.
 
So Apple could technically do the same, right?

You can already do this, just not incorporated directly into the camera app. Edit the photos, or "there's an app for that": other apps boost the light dramatically. This is a trick of surrealism vs. realism. Nothing in the Pixel's hardware specs limits this to the Pixel alone.

Kudos to Google if they figured out people want this in a stock app. But the collective oohs and aahs from the Roboboys can stop.
 
I don't have a Pickle 3 and didn't watch the event. I assumed el Goog turned it on by default, since their camera app leans towards the sparser side of things.
Couldn't Google just give you both the original lighting and the altered image and let you choose, similar to HDR on iPhones? No one said this is overriding your intended low-light shots.
 
As much as I love iPhone and iOS, I gotta admit this is a pretty awesome feature. There are people now who choose their next smartphone based on how good the camera is.

Except you can do this on virtually any camera; Google just moved the feature into the stock app.
 
I call ********.

That shot on the XS photo? The buildings in the back are really that bright, with no light bleeding into the foreground? Really? That shot looks like it was taken in Central Park on the rocks, and that area isn't lit like that.

I'm sure Google's optics are great. Their phones have had arguably the best processing for a while. But to claim that the XS shot was unedited, or that it's that poor in comparison? Nah.
 
There's a lot of salt in this thread for cool new "optional" tech added to cell phone cameras. Great that you don't ever take pics in the dark, but just maybe there's someone who was looking for just this thing.
 
Amplifying image signals from a tiny sensor produces tons of noise, and it's a constant war of removing that noise while keeping the meaningful data. I won't expect miracles, as the laws of physics still apply. It's like the infinite zoom-and-enhance in one of those TV shows.
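That trade-off is easy to see in a toy simulation. The sketch below (plain Python, with made-up signal and noise figures) compares brightening a single noisy frame against averaging several frames and then brightening, which is roughly the idea behind multi-frame modes like Night Sight: averaging N frames cuts random noise by about √N, while a plain gain boost amplifies noise along with the signal.

```python
import random
import statistics

random.seed(42)

SIGNAL = 10.0   # true brightness of one pixel in a dim scene (arbitrary units)
NOISE = 6.0     # per-frame sensor noise (shot + read), hypothetical figure
GAIN = 8        # digital brightening applied afterwards
FRAMES = 8      # frames merged per "night" shot

def one_frame():
    """One short, noisy exposure of a single pixel."""
    return random.gauss(SIGNAL, NOISE)

# Option A: brighten a single frame -- the noise is amplified with the signal.
single = [one_frame() * GAIN for _ in range(20_000)]

# Option B: average several frames, then brighten -- random noise partly cancels.
stacked = [sum(one_frame() for _ in range(FRAMES)) / FRAMES * GAIN
           for _ in range(20_000)]

ratio = statistics.stdev(single) / statistics.stdev(stacked)
print(f"noise reduction from stacking: {ratio:.2f}x")  # roughly sqrt(8) = 2.83
```

Both options end up equally bright, but the stacked one is much cleaner, which is why these modes merge a burst of frames instead of just cranking the ISO.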
 
Great-looking photos there. iPhones are lacking in this department; hopefully this will motivate Apple to catch up. When the scene is well lit, iPhones take great photos, but in low-light situations some other phones are way ahead (I've tried it and seen it first hand). I loved playing around with night shots (and long exposures) in the past, and it would be great to have that option in the near future.
 
I am not pro-Apple at all, I always come here to trash Apple, but these images are fake as hell. Not even an iPhone 4 was that bad in these low-light conditions. Just look at the sky. That is a picture from a digital camera from the late '90s. It is IMPOSSIBLE that the iPhone XS could take that picture like that. I have an iPhone 5 and an iPhone X, and neither of them ever produced such bad quality.
[Image: Screen Shot 2018-10-26 at 12.36.45 PM]
 
After testing this with a Pixel 3 next to an iPhone X I have to say this is a huge step up in phone camera technology! It's really remarkable what they were able to do. It's not a gimmick at all.
 
Sorry but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low light shots on my iPhone XS without flash because I never use flash, and they're great.

I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.

It's just a longer exposure, it's not some magnificent technology or magic. I'd imagine the effort to do the same thing on an iPhone is minimal, they'd just need to make the shutter stay open and let more light in. Again, not magic.
 
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for AI to determine anything in those black areas of the without pics. Why not show the real before pics?

That's because the before picture is a different photo. This doesn't magically take an all-dark exposure and lighten it; it's a different camera mode that you have to hold in place for longer so it can piece together a bunch of frames across a range of exposure lengths. It looks like you need around five seconds of holding the camera steady, meaning taking photos of people is going to be difficult.
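A toy sketch of that "piece together a bunch of frames" step, in plain Python under a big simplification: the scene is a 1-D strip of pixels, and handshake is just a small sideways shift per frame. Each frame is lined up against the first by brute force, then the aligned frames are averaged. All the names and numbers here are made up for illustration; real pipelines align 2-D tiles and handle motion far more carefully.

```python
import random

random.seed(1)

# A 1-D "scene": a dark region next to a brighter one (true pixel values).
scene = [0.0] * 20 + [10.0] * 20
N = len(scene)

def shoot(shift, noise=1.0):
    """One noisy frame, displaced by `shift` pixels to mimic handshake."""
    return [scene[(i - shift) % N] + random.gauss(0, noise) for i in range(N)]

def best_shift(ref, frame, max_shift=4):
    """Brute-force the shift that best lines `frame` up with `ref`."""
    def err(s):
        return sum((ref[i] - frame[(i + s) % N]) ** 2 for i in range(N))
    return min(range(-max_shift, max_shift + 1), key=err)

# Eight handheld frames, each with its own small random shake.
frames = [shoot(shift=random.randint(-2, 2)) for _ in range(8)]
ref = frames[0]

# Undo each frame's shake, then average: noise drops, the edge stays sharp.
aligned = [[f[(i + best_shift(ref, f)) % N] for i in range(N)] for f in frames]
merged = [sum(col) / len(col) for col in zip(*aligned)]
```

Averaging without the alignment step would smear the edge across several pixels, which is why these modes need you to hold the phone steady (or lean on good stabilization) for the few seconds the frames are being gathered.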
 
Yep. And it won't be available on my $1150 XS Max. I'll have to buy the new version, even though it could easily be done on the XS Max.

It's the Apple way.
Lol, sounds about right! iPhone XIS Max exclusive next year.
And they'll also be introducing it to 100 million users instantaneously. What good are AI advancements when Google can't even get this into the hands of more than 3 million users at once?
Lol, now you know it will be the new 2019 iPhone exclusive. Something along the lines of "introducing our new and improved camera with iNight."
 
Ever do mechanical repairs on engines or appliances and snap photos of the internals for reference or inspection? Typical HDR isn’t always sufficient, and flash causes a jumble of parts to cast harsh shadows that make the scene confusing.

I agree! I was shortsightedly focusing on an apparent need to take fun photos in complete darkness.
 