This is great for landscape photography, but it will be useless for capturing people in the moment without being "smeary." It seems like an easy software implementation: just tell the camera to keep the exposure open for five seconds, like any other mirrorless or DSLR camera.

Now if this works with "instant shutter" then I'll be blown away.

Not exactly. This is hand-held. They take many pictures with short exposure and "add" them together, discarding any blurry ones. No DSLR can do that.
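The stack-and-discard idea described above can be sketched in a few lines. This is a toy illustration only (Google's actual pipeline is not public, and real implementations also align frames before merging): take a burst of short exposures, drop the blurriest ones, and average the rest so the per-frame noise cancels out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated static low-light scene (linear values in [0, 0.2]).
scene = rng.random((32, 32)) * 0.2

def capture_frame(scene, noise_sigma=0.05):
    """One short exposure: the scene plus per-shot sensor noise."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

def sharpness(frame):
    """Variance of a simple Laplacian; low values suggest motion blur."""
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
           np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4 * frame)
    return lap.var()

# Capture a burst, drop the blurriest third, average what remains.
burst = [capture_frame(scene) for _ in range(15)]
burst.sort(key=sharpness, reverse=True)
stacked = np.mean(burst[:10], axis=0)

# Averaging N frames cuts random noise by roughly sqrt(N).
noise_single = np.abs(burst[0] - scene).mean()
noise_stacked = np.abs(stacked - scene).mean()
print(noise_stacked < noise_single)  # stacking reduces noise
```

The point is that each individual frame is short enough to avoid handshake blur, while the average has the noise level of a much longer exposure.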
 
But being worth $750B does? lol. Night Sight is awesome, but you can't get decent mics for recording video on a $900 phone? With all the software bugs on the Pixel, I hope this isn't yet another broken thing from Google that gets released.
Like what? That's just nonsense.
 
The hardware is essentially the same, so this can be done in software. Wait for iOS updates. I agree that it is good to have strong competition.
 
Sorry but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low light shots on my iPhone XS without flash because I never use flash, and they're great.

I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.
Self own of the year.
 
If this is all done in SW, there's no reason Apple couldn't add this as a user selectable feature in a SW update. I would also like Apple to let the user have more options in the camera app in general, like super long exposure, better time lapse control, etc.
 
In many ways, photography is like music. It can take years to learn to do it right, it's a craft with a bit of science thrown in. Now, much of it's automated and what one can do with it in just a few clicks is really just awesome in its own right. I feel like an old timer "well, back in my day!" and while I see both the beauty and need for this intelligence, I'm also a little sad at the loss of human art and style.
 
It looks like the iPhone camera was used with the flash turned on. If it were turned off, I suspect the images from the iPhone would come out with more detail.
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.
lol, then simply turn it off. It's not on for all shots; it's a "mode" you can enable, aimed at the majority of users who simply want to capture a moment in low light, not artists trying to create a certain look.

I don't think it's creating the image. It's more comparable to putting a higher-end camera on a tripod in near pitch black and setting a long exposure: the result can look like daylight, not because the camera is inventing anything but because more light reaches the sensor. This hasn't been possible on phones, partly because of the tiny sensor and lens, but most importantly because a long exposure needs a tripod, which most people don't have when using a phone. So it appears to take multiple longer handheld exposures, which would normally be very blurry, and then uses AI to merge them. Apps like Cortex Camera have done this to an extent, but those results are minor compared to these mind-blowing ones.

While I have no plans of ever switching to Android, for many reasons, this is great news for Pixel users and for photographers in general as others try to reverse engineer it.
 
In many ways, photography is like music. It can take years to learn to do it right, it's a craft with a bit of science thrown in. Now, much of it's automated and what one can do with it in just a few clicks is really just awesome in its own right. I feel like an old timer "well, back in my day!" and while I see both the beauty and need for this intelligence, I'm also a little sad at the loss of human art and style.

Photography is one of those interesting things where, if you're good at it, the hardware isn't really the most important part. Better hardware might help weaker photographers get some usable shots, but like most "art," it's ultimately a personal skill.

Where I think such a night mode is beneficial is not really on the "art" side of photography, but in day-to-day snapshots that capture a moment. Those who only care about that will be happy. Those who photograph for "art" purposes will probably find interesting ways of using this tech, but it's likely less important to them; it just gives them another tool in their arsenal.
 
If my guess above is correct that this is ISO based, then it's somewhat limited by hardware. While most cameras can change ISO via software (or dials, etc.), each sensor has an upper limit set by its design; not all cameras are capable of the same ISO settings.

For example, the Canon 40D has a max ISO of 3200, but the 60D, due to newer advancements, was capable of 12800.

So if this is ISO based, it will depend on what Apple's sensor tech is capable of.
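The stop arithmetic behind those ISO ceilings is straightforward: each doubling of ISO is one stop of extra sensitivity. A quick sketch (illustrative only):

```python
import math

def stops_between(iso_a, iso_b):
    """Number of stops gained going from iso_a to iso_b.

    Each doubling of ISO is one stop of extra sensitivity.
    """
    return math.log2(iso_b / iso_a)

# Canon 40D (max ISO 3200) vs 60D (max ISO 12800): two extra stops,
# i.e. the 60D can use a shutter speed 4x faster at the same aperture.
print(stops_between(3200, 12800))  # → 2.0
```

So a two-stop higher ISO ceiling means the newer sensor can hold the same exposure with a quarter of the light, which is exactly the kind of headroom a night mode needs.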
ISO doesn't affect the data from the sensor. When I shoot, I don't pay attention to the ISO at all; I set ISO 100 in Manual mode. In tricky lighting conditions, I shoot RAW, M mode, and under expose: I expose for the brightest part.

Anyhow, if the program directly accesses the camera sensor and manipulates the data from that, the results would be better. I'm guessing that's what the Pixel does. If it has to manipulate data from a processed JPEG, then a lot of data has already been lost. That's why some photographers swear by RAW (basically a data dump of what the sensor captured).

Edit: I reread what I wrote and found it illogical and confusing. I don't under expose on purpose, I expose for the brightest part of the picture. The result is an under exposed picture from which I can draw out details from the dark area. Blown out highlights cannot be recovered. Hopefully that makes more sense.
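The "blown highlights cannot be recovered" point can be shown with a toy model (ignoring noise and quantization, which is why underexposing isn't free in practice): clipped pixels all collapse to the same value, while an underexposed capture keeps the relative differences and can be scaled back up.

```python
import numpy as np

# A 1-D "scene" in linear light; the bright end exceeds the sensor's range.
scene = np.array([0.1, 0.4, 0.9, 1.6, 2.5])

# Overexposed capture: highlights clip at 1.0 and their detail is gone.
overexposed = np.clip(scene, 0.0, 1.0)

# Underexposed capture (2 stops down): nothing clips, detail survives.
underexposed = np.clip(scene / 4.0, 0.0, 1.0)

# "Pulling up" the underexposed shot in post recovers the whole scene...
recovered = underexposed * 4.0
print(np.allclose(recovered, scene))     # True: detail intact

# ...but no scaling can un-clip the overexposed shot: 1.6 and 2.5
# both became 1.0, so they are now indistinguishable.
print(overexposed[3] == overexposed[4])  # True: highlight detail lost
```

In a real sensor the recovered shadows carry amplified noise, which is the trade the author is accepting by exposing for the highlights.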
 
Sorry but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low light shots on my iPhone XS without flash because I never use flash, and they're great.

I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.
So Google is faking these images, even when taken by somebody else? What are you talking about?
 
I wonder if Night Sight involves being VERY still (read: tripod) and VERY long exposure times?
My Panasonic M43 camera does the same. Takes multiple photos (3-5 I think) and bundles them together into a far brighter image. It's great for when I'm exploring a city on holiday and don't want to carry a tripod with me. The results are "good enough" for what I want, and that's memories not prints. Likewise if viewed on a phone, I'm sure they would be more than good enough.
 
ISO doesn't affect the data from the sensor. When I shoot, I don't pay attention to the ISO at all in Manual mode. In tricky lighting conditions, I shoot RAW, M mode, and under expose.

Anyhow, if the program directly accesses the camera sensor and manipulates the data from that, the results would be better. I'm guessing that's what the Pixel does. If it has to manipulate data from a processed JPEG, then a lot of data has already been lost. That's why some photographers swear by RAW (basically a data dump of what the sensor captured).


yes, RAW > JPG everyday :p

I would much rather do what you say: shoot at a lower ISO in RAW, and then use post-processing to increase the exposure.

However, ignoring ISO as a factor can be a detriment. There are certain lighting conditions and activities where ISO might be your only adjustable factor.

For example, shooting poorly lit hockey inside an arena. The action is fast, so you lose the ability to increase the exposure time. You're also often a little further away, so a longer lens might prevent you from using a lower f-stop without shrinking your depth of field too much. So you're stuck with ISO increases to amplify the signal from what little light the sensor does collect.

When I shoot, I believe in balancing the "holy trinity" of ISO, shutter speed, and aperture. But ISO is often the last of the three I like to adjust, due to the known noise issues with increasing it. Ideally, I start at 100 and only increase it should the photos not turn out.
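The hockey trade-off above is just exposure-value bookkeeping: raising ISO by two stops buys a shutter four times faster at the same aperture and scene brightness. A small sketch using the standard ISO-100-referenced exposure value (illustrative, not tied to any particular camera):

```python
import math

def ev100(aperture, shutter_s, iso):
    """Exposure value normalized to ISO 100.

    EV = log2(N^2 / t) - log2(ISO / 100); settings with equal EV
    meter the same scene brightness.
    """
    return math.log2(aperture**2 / shutter_s) - math.log2(iso / 100)

# Poorly lit arena: motion forces a fast shutter, the long lens fixes
# the aperture, so ISO is the only knob left.  Two extra stops of ISO
# buy a shutter 4x faster at the same scene brightness:
slow = ev100(2.8, 1/125, 800)    # blurry hockey players
fast = ev100(2.8, 1/500, 3200)   # frozen motion, noisier file
print(math.isclose(slow, fast))  # True: same exposure, traded for noise
```

This is why the ISO ceiling of the sensor matters: once shutter and aperture are pinned by the subject, ISO is the only remaining term in the equation.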
 
This is great for landscape photography, but it will be useless for capturing people in the moment without being "smeary." It seems like an easy software implementation: just tell the camera to keep the exposure open for five seconds, like any other mirrorless or DSLR camera.

Now if this works with "instant shutter" then I'll be blown away.
It's not easy. What Google does comes from taking long exposures and using AI to remove the blur, so there's no need for a tripod.
 
So Apple could technically do the same right?

Yes, but I don't think we will see it until the 2019 iPhones, as a new 'feature', lol. Competition is good for the consumer, and these smartphone cameras will only continue to get better and better.

On a side note, Pro Camera + has two low-light modes that work quite well, but I imagine Google's still has the upper hand.
 

Let’s see them blown up to at least 6x4. There is no way a camera can take photos in the dark and reproduce usable photos without either a tripod or a long exposure. It all looks well if you're looking at the photo on a phone, but any serious photographer (and I mean anyone beyond just a phone user) would find them very noisy (grainy). Even if you zoom in on the pictures taken on the iPhone, they're pretty poor quality really.

At a media event in New York City earlier this month, Google previewed a new low-light camera feature called "Night Sight" that uses machine learning to choose the right colors based on the content of the image. The result is much brighter photos in low-light conditions, without having to use flash.

Google showed a side-by-side comparison of two unedited photos shot in low light with an iPhone XS and its latest Pixel 3 smartphone with Night Sight, and the photo shot on the latter device is much brighter.

pixel-night-sight-800x318.jpg

Google said Night Sight will be available next month for its Pixel smartphones, but an XDA Developers forum member managed to get the feature to work ahead of time, and The Verge's Vlad Savov tested out the pre-release software on a Pixel 3 XL. The results, pictured below, are simply remarkable.

low-light-1.jpg

Without Night Sight

high-light-1.jpg

With Night Sight


low-light-2.jpg

Without Night Sight

high-light-2.jpg

With Night Sight


low-light-3.jpg

Without Night Sight

high-light-3.jpg

With Night Sight

Google and Apple are both heavily invested in computational photography. On the latest iPhones, for example, Smart HDR results in photos with more highlight and shadow detail, while Depth Control significantly improves Portrait Mode. But Night Sight takes low-light smartphone photography to a whole new level.

Article Link: Google's Upcoming 'Night Sight' Mode for Pixel Phones Captures Remarkable Low-Light Photos
 
I doubt this is a big step forward in technology. It may well be simply playing with shutter/aperture plus maybe a noise-control algorithm. I use the app NightCap on my iPhone 8 (and 6 before it) and get very good results, similar to this. Google's good move is probably less technical amazingness than simply incorporating this feature set into the native camera app. There's nothing revolutionary about the Pixel's hardware. And since a third-party low-light photo app took impressive low-light photos on a 6 with its f/2.2 camera, I have zero doubt that current models with their f/1.8 maximum aperture would be equally impressive if Apple chose to implement this. And they should, because it's a nice ability to have natively.
You used a tripod, right?
 
Kinda creepy that someone can take a picture of you in the dark and have it come out this clear without a flash to give them away... but it’s a cool tech.
 
When I shoot, I believe in the "holy trinity" of those three above. But ISO is often the last of the three I like to adjust due to the known issues with increasing ISO. Ideally, I start at 100 and only increase should the photos not turn out.
99.44% of the time I also follow this rule, because it's less work afterwards. The other 0.56% of the time, the lighting is gawd-awfully bad (impossible to expose properly). If it's landscape or still life, I shoot multiple shots 1 stop apart and merge them into one photo in post-production. For action, I set manual and hope for the best.
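The bracket-and-merge workflow described above can be sketched with a deliberately naive rule (real HDR merges use weighted blending and tone mapping; this is only the core idea): for each pixel, keep the brightest capture that didn't clip, then divide out the exposure factor to get back to scene-referred values.

```python
import numpy as np

# A 1-D scene in linear light; brackets one stop apart, each
# capture clipping at 1.0 (relative shutter factors 0.5x, 1x, 2x).
scene = np.array([0.05, 0.3, 0.8, 1.9])
exposures = [0.5, 1.0, 2.0]
brackets = [np.clip(scene * e, 0.0, 1.0) for e in exposures]

# Naive merge: for each pixel, take the brightest unclipped capture
# and normalize it back by its exposure factor.
merged = np.zeros_like(scene)
for capture, exposure in zip(brackets, exposures):
    usable = capture < 1.0                   # pixel did not clip
    merged = np.where(usable, capture / exposure, merged)

print(np.allclose(merged, scene))  # the full dynamic range is recovered
```

The long exposure supplies clean shadows, the short exposure supplies unclipped highlights, and the merge stitches the two, which is exactly why bracketing beats any single capture in impossible lighting.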
 