Download Google Photos, shoot a low-light pic, hit edit picture, hit auto. Looks pretty good, depending on the picture.

Seems interesting. I wonder if Google will release some sort of app for the iPhone that does this.
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly, but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for the AI to determine anything in those black areas of the "without" pics. Why not show the real before pics?
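One way to sanity-check the "zero data" claim is to pull the published before shot and measure how much of the frame is actually clipped to pure black. If large areas sit at exactly 0, there is nothing for any algorithm to recover; if they hover just above 0, there is real signal to amplify. A minimal sketch in Python, assuming the image has been saved locally as before.jpg (a placeholder filename):

```python
# Measure how much of a low-light "before" shot is truly clipped to black.
# "before.jpg" is a placeholder filename for the downloaded comparison image.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("before.jpg").convert("L"))  # 8-bit grayscale

clipped = np.mean(img == 0)    # fraction of pixels at exactly 0 (unrecoverable)
near_black = np.mean(img < 8)  # fraction of deep-shadow pixels (amplifiable)
print(f"clipped to zero: {clipped:.1%}, below 8/255: {near_black:.1%}")
```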
At a media event in New York City earlier this month, Google previewed a new low-light camera feature called "Night Sight" that uses machine learning to choose the right colors based on the content of the image. The result is much brighter photos in low-light conditions, without having to use flash.
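Google hasn't published the details of the pipeline, but the broad recipe behind low-light modes like this is well known: capture a burst of short exposures, merge them to average out noise, then brighten and color-correct the result. Here is a minimal sketch of that idea, with the learned color model stubbed out as fixed per-channel gains; the function names and gain values are illustrative assumptions, not Google's code:

```python
# Illustrative burst merge-and-brighten, the general idea behind low-light
# camera modes; NOT Google's actual Night Sight pipeline. The per-channel
# white balance here is a fixed guess where Night Sight uses a learned model.
import numpy as np

def merge_burst(frames):
    """Average a burst of noisy low-light frames (H x W x 3, uint8).
    Averaging N aligned frames cuts random noise by roughly sqrt(N)."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

def brighten(image, gain=4.0, wb=(1.1, 1.0, 1.3)):
    """Apply an exposure gain and a per-channel white-balance correction."""
    out = image * gain * np.asarray(wb, dtype=np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

# result = brighten(merge_burst(frames))  # frames: list of uint8 RGB arrays
```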
Google showed a side-by-side comparison of two unedited photos shot in low light with an iPhone XS and its latest Pixel 3 smartphone with Night Sight, and the photo shot on the latter device is much brighter.
Google said Night Sight will be available next month for its Pixel smartphones, but an XDA Developers forum member managed to get the feature to work ahead of time, and The Verge's Vlad Savov tested out the pre-release software on a Pixel 3 XL. The results, pictured below, are simply remarkable.
[Image: Without Night Sight]
[Image: With Night Sight]
[Image: Without Night Sight]
[Image: With Night Sight]
[Image: Without Night Sight]
[Image: With Night Sight]
Google and Apple are both heavily invested in computational photography. On the latest iPhones, for example, Smart HDR results in photos with more highlight and shadow detail, while Depth Control significantly improves Portrait Mode. But Night Sight takes low-light smartphone photography to a whole new level.
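For a concrete sense of what "more highlight and shadow detail" means, classic exposure fusion (the Mertens method) blends a bracketed set of shots, which is the general family of techniques Smart HDR belongs to. A sketch using OpenCV's built-in implementation; the filenames are placeholders, and Apple's actual pipeline is of course proprietary:

```python
# Exposure fusion of an under/normal/over-exposed bracket using OpenCV's
# Mertens merge. Illustrates the general idea behind HDR-style detail
# recovery; this is not Apple's Smart HDR. Filenames are placeholders.
import cv2
import numpy as np

frames = [cv2.imread(p) for p in ("under.jpg", "mid.jpg", "over.jpg")]
fused = cv2.createMergeMertens().process(frames)  # float image, roughly [0, 1]
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```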
Article Link: Google's Upcoming 'Night Sight' Mode for Pixel Phones Captures Remarkable Low-Light Photos
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly, but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for the AI to determine anything in those black areas of the "without" pics. Why not show the real before pics?
Google didn't post these images. A tech reporter at The Verge did. And they aren't comparisons between the iPhone XS and the Pixel 3; they're comparisons of the Google Pixel 3 XL with and without Night Sight.

Sorry, but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low-light shots on my iPhone XS without flash, because I never use flash, and they're great.
I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly, but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for the AI to determine anything in those black areas of the "without" pics. Why not show the real before pics?
Exactly! Better to apply the filter later if you want it.

I... I don't think I can get behind this. If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing than one that applies a filter to turn night into day. This is essentially the same kind of thing Prisma does, turning a photo into "art." It's no longer reality.
So Apple could technically do the same, right?
Couldn't Google just give you both the original lighting and the altered image and let you choose which one, similar to HDR on iPhones? No one said this is overriding your intended low-light shots.

I don't have a Pickle 3 and didn't watch the event. I assumed el Goog turned it on by default, since their camera app leans towards the sparser side of things.
As much as I love the iPhone and iOS, I gotta admit this is a pretty awesome feature. There are plenty of people now choosing their next smartphone based on how good the camera is.
Uhh - these images aren't from Google. These are examples from The Verge.
I'd like to hear examples of when this feature would be truly useful.
Lol, sounds about right! iPhone XIS Max exclusive next year.

Yep. And it won't be available on my $1,150 XS Max. I'll have to buy the new version, even though it could easily be done on the XS Max.
It's the Apple way.
Lol, now you know it will be the new 2019 iPhone exclusive. Something along the lines of "introducing our new and improved camera with iNight."

And they'll also be introducing it to 100 million users instantaneously. What good are AI advancements when Google can't even get this into the hands of more than 3 million users at once?
Ever do mechanical repairs on engines or appliances and snap photos of the internals for reference or inspection? Typical HDR isn’t always sufficient, and flash causes a jumble of parts to cast harsh shadows that make the scene confusing.