Or, rather than this, you could drop two grand on a Sony A7S (original or II), a mirrorless camera with INSANE low-light capabilities, especially in movie mode. Check out some of the videos of the aurora borealis on YouTube, or the video of a rural street at night.

So I suspect we'll see more of this as time goes on but it is pretty amazing.

Now I'm waiting for decent UFO pics/footage.
 
Sorry, those filtered photos are neither better nor more realistic. I never understood why non-Apple people are on an Apple site.

I am not an 'Apple' person; I do not really like Apple, but that is my opinion. I have owned Apple products in the past. I still come to Apple sites because I like to see what they come out with, and I like that there is competition; in my opinion, that is what advances things. I want Apple, Samsung, Google and any others to keep pushing the envelope and improving. I don't care who is first; competition makes them all better. That is why I still come here to see what they come up with. I just look past the trolls on both sides.
 
To address a couple of your points:

"I can just drag exposure up on the photos which is what software does."

Incorrect. Once a photo has been taken, it is impossible to invent and fill in photographic data that the sensor never detected (that's also why photographers capture in RAW mode, as there's more data overall). There is only so much you can "brighten" a photo this way, as it also has the effect of blowing out the whites/highlights. To get the uniformity of picture Google is achieving, you would need more advanced logic that does the "lightening" either by region of the photo or down to a pixel-by-pixel level. If you were to take your run-of-the-mill everyday photo and just crank the slider, you would end up with a washed-out photo with greys and blobs of white. If the photo was sufficiently dark that no data was detected at the sensor, then you'd just get a grey mess.
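To make that concrete, here's a toy sketch in Python (NumPy assumed; the pixel values are invented for illustration) of what a naive global "exposure slider" does to a frame that contains highlights:

[CODE]
import numpy as np

# Toy 8-bit "photo": two dim midtone pixels next to two distinct highlight pixels.
photo = np.array([30, 30, 200, 250], dtype=np.uint8)

# Naive "drag the exposure up": one global gain applied to every pixel.
gain = 4.0
brightened = np.clip(photo.astype(np.float32) * gain, 0, 255).astype(np.uint8)

print(brightened)  # [120 120 255 255]
# The 200 and 250 pixels both clip to pure white and become indistinguishable:
# that highlight detail is gone for good. And a pixel that recorded 0 stays 0
# at any gain, so truly unexposed areas give you amplified noise, not detail.
[/CODE]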

"people cried when Apple made peoples skin a little nicer"

No, people are crying because Apple has dictated that all selfies be taken in this "beauty" mode, with overly aggressive skin smoothing and zero capability of turning it off. The night mode shown here is entirely optional and can be enabled/disabled by changing modes. Do not conflate the issues. The issue with Apple isn't the "beauty mode"; it's the mandatory beauty mode, because "Apple knows best".

"Do they say what iso/shutter speeds these are shot at? The difference is too big, I can do the same on the iPhone if I wanted to."

Like all claims. Prove it.

1.

That is correct, and Google's camera is pretty much doing that if it's using software. The software probably also slows the shutter down a lot, which works great in a demo, but in real life it's very hard to hold a phone stable or to have a subject sit still.

2.

It was from HDR, actually: the merging of multiple exposures to increase exposure in shadow areas, which reduced contrast and caused blemishes. A similar technique is being used here in Google's attempt to brighten dark images, I imagine.

3.

Sorry, I wasn't clear: you just need to use a camera app that allows manual shutter control, e.g. Camera+. But I found, and I think most users would find, that it's not that practical, for the reasons mentioned above. It's fine for inanimate objects. I can take one to "prove" it, but I think you know enough about cameras to know what I'm saying.
Yeah sure you can. Could you post some examples?
The funny part is you don't even understand how Night Sight works.

"Google’s night mode still gathers light over a period of a few seconds" you can see it in the image he posts outside where things are just hanging there, the car is totally blurred as it passes the bus.

I would post an example, but just look at most night photos and you can see this when cars are passing. The default camera on the iPhone doesn't force a slow shutter, as in most situations your friends would be blurred. You can grab an app that lets you manually set the shutter speed, but most users wouldn't bother.
 
1.


"Google’s night mode still gathers light over a period of a few seconds" you can see it in the image he posts outside where things are just hanging there, the car is totally blurred as it passes the bus.

Again, it's not one single long exposure; it's a multitude of pictures shot at different exposures over a period of 2-4 seconds.
And of course cars and fast-moving objects are blurred in low-light pictures; it's extremely difficult to freeze motion in such conditions, even with a professional camera. Are you complaining now that Google's Night Sight mode doesn't totally bend the laws of physics?


I would post an example, but just look at most night photos and you can see this when cars are passing. The default camera on the iPhone doesn't force a slow shutter, as in most situations your friends would be blurred. You can grab an app that lets you manually set the shutter speed, but most users wouldn't bother.

You are all talk.
Night Sight is not "long exposure"; like I've said, you don't even understand how the feature works. You don't understand how long exposure works, either. Even if you were to put the iPhone on a tripod and take a picture with a few seconds of exposure time, most of the time when highlights are involved a single image wouldn't be enough to get a result similar to Night Sight.
Also, Night Sight clearly works with people (staying still for 2-3 seconds is not impossible), as there are multiple examples posted online already.
And the default camera on the iPhone lowers the exposure as much as it can for handheld low-light pictures; Night Sight breaks past this kind of limitation. There is a lot of information about this feature online; I don't understand why so many users here insist on not trying to understand how it works.
 
[Attached images: IMG_20181025_112223.jpg, Screen Shot 107.png]


ISO 7000?

I suppose that's impressive, but it sort of reminds me of this party trick.

 
Again, it's not one single long exposure; it's a multitude of pictures shot at different exposures over a period of 2-4 seconds.
And of course cars and fast-moving objects are blurred in low-light pictures; it's extremely difficult to freeze motion in such conditions, even with a professional camera. Are you complaining now that Google's Night Sight mode doesn't totally bend the laws of physics?

You seem to be an expert; can you help me understand how taking multiple exposures in the dark could help a dark photo? The shutter needs to be open longer to allow more light in, so taking a series of fast-shutter images really does nothing. So either it's a long exposure, as mentioned, which isn't anything new, or it's an existing shot with the exposure increased in software.
You are all talk.
Night Sight is not "long exposure"; like I've said, you don't even understand how the feature works. You don't understand how long exposure works, either. Even if you were to put the iPhone on a tripod and take a picture with a few seconds of exposure time, most of the time when highlights are involved a single image wouldn't be enough to get a result similar to Night Sight.
Also, Night Sight clearly works with people (staying still for 2-3 seconds is not impossible), as there are multiple examples posted online already.
And the default camera on the iPhone lowers the exposure as much as it can for handheld low-light pictures; Night Sight breaks past this kind of limitation. There is a lot of information about this feature online; I don't understand why so many users here insist on not trying to understand how it works.

If it makes you happy I will take one tonight on the iPhone to test. I was genuinely curious why people are so amazed at this, that's all.
 
If you have the f-stop (1.7), the exposure time (1/3 s), and the ISO (7269), you can calculate the ambient light in the scene.

However, the resulting picture is unpublishable.
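For the curious, here's a minimal sketch of that calculation in Python, assuming the standard exposure-value definition EV = log2(N²/t), re-referenced to ISO 100:

[CODE]
import math

def scene_ev100(f_number, shutter_s, iso):
    """ISO-100-referenced exposure value implied by a metered exposure."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100.0)

# The settings quoted above: f/1.7, 1/3 s, ISO 7269.
print(scene_ev100(1.7, 1 / 3, 7269))  # ~ -3.1 EV: roughly moonlit-scene light levels
[/CODE]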

My Nikon D7000 does this at ISO 8000.

[Attached image: high iso.jpg]


ISO 8000, f/3, 1/320 s, aperture priority. White balance adjusted, otherwise straight out of camera.

I don't really like the Instagram look, so I think of it as unpublishable. But it won't embarrass me.

To get photos that are publishable at that high of an ISO, I'd probably use a full frame camera.

The Pixel 3 shots are like a talking dog that can't do sums. Impressive as a gimmick. Useless if you're looking for an accountant.
 
You seem to be an expert; can you help me understand how taking multiple exposures in the dark could help a dark photo? The shutter needs to be open longer to allow more light in, so taking a series of fast-shutter images really does nothing. So either it's a long exposure, as mentioned, which isn't anything new, or it's an existing shot with the exposure increased in software.
Well, I genuinely am more of an expert than you, and it looks like you've proved my point: you simply refuse to educate yourself about this feature.
Stacking multiple images together to improve low-light results has been around for quite some time; it's not anything new, and it's not "long exposure".
In simple terms, what Night Sight does is take a series of images at multiple exposures and ISO values, choose a base frame that best represents the scene, and use software algorithms, machine learning, AI or whatever to merge useful portions of the other frames into it to enhance the final image, remove noise and grain, and so on. Is that good enough for you? I guess it won't be.
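Google hasn't published Night Sight's code, so treat the following as a rough hypothetical sketch of that stacking idea only; the real pipeline also aligns frames and rejects motion, which is omitted here:

[CODE]
import numpy as np

def merge_burst(frames):
    """Toy burst merge: pick a base frame, then blend in the stack's mean.

    frames: a list of equal-sized 2-D uint8 arrays (a burst of exposures).
    """
    stack = np.stack([f.astype(np.float32) for f in frames])

    # Crude "best represents the scene" heuristic: highest local contrast.
    sharpness = [float(np.var(np.diff(f, axis=0))) for f in stack]
    base = stack[int(np.argmax(sharpness))]

    # Averaging the burst suppresses random sensor noise.
    mean = stack.mean(axis=0)

    # Keep the base frame's structure, borrow the mean's cleanliness.
    return np.clip(0.5 * base + 0.5 * mean, 0, 255).astype(np.uint8)
[/CODE]

The point of the sketch is just that a burst contains more usable information than any single short exposure, which is why the result isn't a classic "long exposure".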

If it makes you happy I will take one tonight on the iPhone to test,
Yeah, prove me right: put the iPhone on a tripod and take a picture.

I was genuinely curious why people are so amazed at this, that's all.
Well, you don't understand how Night Sight works, that's all.
 
Yes, stacking of multiple exposures is the way to go for teasing useful signal out of a noise floor. This is how it's been done in astrophotography for a long time, and also in – gasp – smartphone apps like Hydra.

Google's great innovation here is stealing the right idea and building it into the stock camera app.
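The statistics behind that, for anyone curious: random sensor noise averages out across frames while the signal doesn't, so stacking N frames cuts the noise by roughly √N. A quick synthetic demonstration (all numbers invented):

[CODE]
import numpy as np

rng = np.random.default_rng(0)
signal = 10.0                                            # faint, constant scene level
frames = signal + rng.normal(0, 20, size=(16, 100000))   # 16 very noisy exposures

print(frames[0].std())            # ~20: in one frame the signal is buried in noise
print(frames.mean(axis=0).std())  # ~5: stacking 16 frames cuts noise by sqrt(16) = 4x
[/CODE]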
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.

Just having the choice to use something like this on the iPhone would be nice.
The Pixel 3 camera is incredible; Apple have some catching up to do.
 
Yes, stacking of multiple exposures is the way to go for teasing useful signal out of a noise floor. This is how it's been done in astrophotography for a long time, and also in – gasp – smartphone apps like Hydra.

Google's great innovation here is stealing the right idea and building it into the stock camera app.

Yeah just stealing ideas.
It's incredible the lengths some users here will go to in order to denigrate Google, no matter the subject.
One of the main contributors to Night Sight being as good as it is now (even before the official release by Google) is most likely Marc Levoy, a pioneer in computational imaging at Stanford who is now a Distinguished Engineer working on camera technology for Google, and who years ago wrote a paper called “Extreme imaging using cell phones”.
The results Night Sight produces suggest Google has been working on this tech for many years. The results the Hydra app produces suggest it just stole ideas, and it's nowhere near comparable with Night Sight.
 
Yes, stacking of multiple exposures is the way to go for teasing useful signal out of a noise floor. This is how it's been done in astrophotography for a long time, and also in – gasp – smartphone apps like Hydra.

Google's great innovation here is stealing the right idea and building it into the stock camera app.
And does it not occur to you that Google, with its potential billions for R&D, may be able to create a superior solution to a $5 app?
 
And does it not occur to you that Google, with its potential billions for R&D, may be able to create a superior solution to a $5 app?

One would certainly hope so! What I'm saying is, this technology has been developed for many years (much of it in the public domain), and is bread and butter in some fields (astro-imaging). Google (unlike the Hydra guys) managed to pull it into the spotlight and out of the specialist corner, but it isn't something new or unheard of.
 
If it’s all software, I wonder if this means they could theoretically release a version of this for the iPhone. I know they won’t, but I wonder if they could.
 