I give Google props, and I think this competition is good for Apple. But, to me, it's creating light where there actually isn't any. And how many people take pictures in a pitch-black environment? Isn't this why we have a flash? I also have an app that can take similar pictures (Nightcap).
One of the reasons this feature was introduced is the poor shots you get with the flash on. As for your comment about people who take pictures in a pitch-black environment, I'm sure many have had situations where they wanted to take a picture of something but didn't because it was too dark or too late at night to get anything meaningful.
 
Exactly! I mean if this is fake then all photos are fake because they are merely mathematical representations of what actually occurred.

yep, that is right.

by the way, lots of people upthread are tripping over the idea that high ISO creates noise in photographs. high ISO doesn't actually produce noise, but it does reduce the dynamic range of the sensor. the noise in low-light or short-shutter-speed images is actually due to the properties of light itself, photon emission being a statistical process at its core. the problem is called 'photon shot noise' and is essentially an unavoidable property of our universe.

https://en.wikipedia.org/wiki/Shot_noise

the more target photons you are able to capture with a sensor, the more accurate your knowledge of the actual intensity of the target. the fewer photons you capture, the higher the uncertainty (noise); the signal-to-noise ratio grows roughly as the square root of the photon count.

once the number of target photons gets small enough, other sensor issues start coming into play - noise created by heat (Johnson noise: https://en.wikipedia.org/wiki/Johnson–Nyquist_noise) and other artifacts intrinsic to the sensor (bias patterns, lens shading) contribute to degradation of the image.

anyway, a high ISO setting allows you to actually sense this smaller number of incident photons, which would otherwise create voltages too low to be detected at lower gain settings.
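
for anyone who wants to see that square-root relationship for themselves, here's a quick Python sketch (made-up photon counts, nobody's actual sensor pipeline):

```python
import numpy as np

# photon arrivals are Poisson distributed, so the standard deviation of the
# count is sqrt(mean) -- the SNR therefore grows as the square root of the count
rng = np.random.default_rng(0)

for mean_photons in (10, 100, 10_000):
    # simulate many pixels all staring at the same uniform target
    counts = rng.poisson(lam=mean_photons, size=100_000)
    snr = counts.mean() / counts.std()
    print(f"{mean_photons:>6} photons/pixel -> SNR ~ {snr:.1f} (sqrt(N) = {np.sqrt(mean_photons):.1f})")
```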
 
For all the Apple apologists saying the XS takes just as good pictures in dark environments: here is a video comparison of the XS and the Pixel 3 with Night Sight.

If you still say you would prefer the XS pictures then I really don't know what else to say.

 
No, but you’re right, the phone does need to be stabilized for best results. I did brace the phone as I shot. Did those photos from The Verge get shot on a tripod?
Nope. You don't have to brace or anything. Just take the shot like you would during the day.
 
Nope. You don't have to brace or anything. Just take the shot like you would during the day.
Just read the article. He said he didn’t use a tripod but didn’t mention anything about bracing his holding hand(s) against anything. His gushing frankly sounds a little silly, although the results are impressive, and this phrase, “Although it’s not one single long exposure, Google’s night mode still gathers light over a period of a few seconds,” makes no sense to me at all.
 
If it were the other way around and Apple had introduced this, you would all be singing their praises.

Don't hate, appreciate.

This will only make Apple step up their game which will only benefit all of us in the end.
 
Just read the article. He said he didn’t use a tripod but didn’t mention anything about bracing his holding hand(s) against anything. His gushing frankly sounds a little silly, although the results are impressive, and this phrase, “Although it’s not one single long exposure, Google’s night mode still gathers light over a period of a few seconds,” makes no sense to me at all.
I posted a video comparison earlier of the Pixel and the XS. Take a look and you'll see why he is gushing.

You don't have to brace or anything. Just snap your photos. Apparently the Pixel takes multiple pics and combines them to produce the shot, so technically it is not one long exposure.
 
GREAT response. This has broadened my mindset: perhaps, yes, photos taken in slightly darker-than-normal areas would benefit from being less grainy. I was keying in on the idea of needing to take photos in the dark, which seemed to be quite a stretch.

Great response. Thanks.
It really depends on your lifestyle. For me, low-light photography is a must because the venues I tend to visit are dimly lit and restrict the use of flash photography. As a result, the equipment I select for photography reflects this need, just as the equipment you select will reflect yours. And while I will probably never use my phone as my main camera, it is nice to have the additional option.
 
Very impressive, but at some point we need to stop calling these photos. They are going through so much post-processing and modification that they no longer bear any resemblance to traditional photos that capture what we see with our eyes. From the photo-mode smoothing to the Google feature, it's a post-photo world.
 
Seriously loving this. And not just for the UFO photos that can actually be taken now: rather than a blurry blob, you'll see a crisp saucer! I seriously love this feature for things like astrophotography and other night-sky shots.
 
Very impressive, but at some point we need to stop calling these photos. They are going through so much post-processing and modification that they no longer bear any resemblance to traditional photos that capture what we see with our eyes. From the photo-mode smoothing to the Google feature, it's a post-photo world.

Interesting... when did photos stop being photos? What's your definition of traditional?
 
The colors won't change. It draws out the color that's already there; it doesn't create new colors.

It's something RAW shooters have been doing for years: shoot underexposed and then draw out the details in post-production. We do it in tricky lighting, i.e. a spot where normal exposure would completely blow out one section of the picture. There is no way to retrieve data when a pixel is 255 255 255 RGB (completely white). The "black" in the RAW isn't all 0 0 0 RGB (completely black), so some detail can be recovered.

So Google now does this in the phone. Nice trick, but I prefer doing it during post-production.
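
To make that concrete, here's a toy numpy sketch (illustrative values only, not anyone's actual RAW pipeline) of why a blown highlight is gone for good while an underexposed shadow can be pushed back up:

```python
import numpy as np

# true scene luminance, where 1.0 = sensor full well (pure white)
scene = np.array([0.02, 0.05, 0.4, 0.9, 1.5])

# overexpose: the two brightest values both clip to 1.0, so the data is destroyed
overexposed = np.clip(scene * 2.0, 0.0, 1.0)
print(overexposed / 2.0)   # [0.02 0.05 0.4 0.5 0.5] -- highlights irrecoverable

# underexpose: everything stays below clipping, so the detail survives
underexposed = np.clip(scene * 0.25, 0.0, 1.0)
print(underexposed * 4.0)  # [0.02 0.05 0.4 0.9 1.5] -- "draw out the details" in post
# (a real sensor would add noise to those pushed shadows, but the data is there)
```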
Thanks for the explanation. I figured it was some sort of RAW file manipulation; it appears they are able to extract a lot of information from it. The sensor must have some pretty decent low light performance.
 
His gushing frankly sounds a little silly, although the results are impressive, and this phrase, “Although it’s not one single long exposure, Google’s night mode still gathers light over a period of a few seconds,” makes no sense to me at all.
It is probably similar to my Huawei P20 Pro. In Night mode, the camera takes a number of images at increasing ISO settings and produces a composite image. It takes a few seconds to complete, but a bracing hand isn't required.
In daytime scenes with a very bright sky and a dull foreground, I get a better image using Night mode, as the foreground would otherwise be too dark. Like the vast majority of phone users, I don't want the hassle of post-processing to correct this. I am happy to point and shoot.
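
I don't know what Huawei actually does under the hood, but a composite along those lines can be sketched in a few lines of numpy: weight each frame's pixels by how well exposed they are, then blend (a bare-bones version of classic exposure fusion).

```python
import numpy as np

def fuse(frames):
    """Blend frames of the same scene shot at different exposures (grayscale, 0..1)."""
    stack = np.stack(frames)                        # shape (n_frames, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)  # favor well-exposed (mid-tone) pixels
    weights /= weights.sum(axis=0)                  # normalize across frames, per pixel
    return (weights * stack).sum(axis=0)

# the short exposure holds the bright sky, the long one holds the dull foreground
short_exp = np.array([[0.45, 0.05], [0.50, 0.10]])
long_exp  = np.array([[0.98, 0.55], [0.95, 0.60]])
print(fuse([short_exp, long_exp]))  # each pixel leans toward its better-exposed frame
```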
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prisma does, turning a photo into "art." It's no longer reality.

For some reason, the phrase "give credit where credit is due" comes to mind when I read this :rolleyes:
 
Thanks for the explanation. I figured it was some sort of RAW file manipulation; it appears they are able to extract a lot of information from it. The sensor must have some pretty decent low light performance.
I’d assume that a part of it is machine learning as well. It likely works similarly to the Pixel 2’s Portrait Mode, where the picture is taken but additional changes are applied before the image is saved.

I don’t think this is anything that Apple would do, but there are APIs available that could enable it (Halide and Focos use these APIs to enable features Apple doesn’t offer in the standard Camera app). This seems like a good area for a surveillance or nature-watching app to explore: you attach your phone to a tree in your campsite and see how many creatures wander through. Well, that’s assuming you wouldn’t get a blur with moving objects.
 
This is the only feature I have seen that makes me envious of Pixel owners. I wonder if it could be made to work with images that have already been taken. Or is it only possible with sensor data? It would be a great addition to the editing options in Google Photos.
I believe they are copying an iPhone app that has been available since the iPhone 6.
 
Apple getting embarrassed by Google with the camera now, SAD!!

Amazing how people take one feature and suddenly everything about the camera is subpar! Let's keep reality in check, eh? This is low-light performance only. Google has done some very good work, which is very impressive. As long as the feature can be turned off, that is; turning night into day is a bit odd, I'd say.
 
This is great for landscape photography, but it will be useless for capturing people in the moment without them being "smeary." It seems like an easy software implementation to just tell the camera to keep the shutter open for 5 seconds, just like any other mirrorless or DSLR camera.

Now if this works with "instant shutter" then I'll be blown away.
Pretty sure that's not how it works. A shutter time of even 1 second will just be a world of blur. I've used HDR+ before, and even on old phones like the Nexus 5, where it takes something like 3-5 seconds just to snap an HDR+ photo and people can be walking by and blocking your shot, the camera seems to filter those moving subjects out quite well. I suspect the phone is taking a lot of underexposed photos and then doing a lot of heavy processing. Comparing this to a DSLR with a 5-second exposure is totally inaccurate. If it were so easy to take 5-second photos handheld, then tripods would not exist.
 
I suspect the phone is taking a lot of underexposed photos and then doing a lot of heavy processing.

there might be heavy processing involved, but the core algorithm for combining a bunch of short exposures is just to average the corresponding pixels from each image together. of course, before you do that, you have to align the images first. for that you probably need an FFT, but again, very well-understood (and not new) math.
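
here's a bare-bones numpy sketch of that idea: integer pixel shifts estimated with FFT phase correlation, then a straight average. (a real pipeline would also handle sub-pixel motion, rotation, and rejection of moving subjects.)

```python
import numpy as np

def estimate_shift(ref, img):
    """return the integer (dy, dx) shift that re-aligns img with ref (phase correlation)"""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2: dy -= h   # unwrap: peaks past the midpoint are negative shifts
    if dx > w // 2: dx -= w
    return dy, dx

def stack_average(frames):
    """align every frame to the first, then average to beat down the shot noise"""
    ref = frames[0]
    aligned = [np.roll(f, estimate_shift(ref, f), axis=(0, 1)) for f in frames]
    return np.mean(aligned, axis=0)
```

average sixteen aligned frames and the shot noise drops by a factor of four (the square root of the frame count), which is where the 'night into day' effect comes from.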
 
Very impressive but at some point we need to stop calling these photos. These are going through so much post-processing and modification that they no longer bear resemblance to traditional photos that capture what we are seeing with our eyes. From the photo mode smoothing to the google feature - its a post-photo world.

I think your point is valid if the photos are used as evidence to prove something. Otherwise, photography is an art medium.
 
This is why Apple needs competition. This (if it's true and it works) is a perfect example. I love it when others do something great. Let's hope it really is that good :)

It is really that good. Check my photos with the Pixel 3 night mode (unofficial software release).
[Photos: IMG_20181027_012544-01.jpeg, IMG_20181026_191234-01.jpeg, IMG_20181026_191112-01-01.jpeg]

Not sure why the pictures posted multiple times. Anyway, here is another comparison: the first shot is normal and the second is night mode. #teampixel
 

Attachments: IMG_20181026_002314.jpg (1 MB)