Are we already starting to forget the long list of hardware and software issues plaguing the Pixel phone?

Such as?

I mean the iPhone XS is "plagued" with issues as well: a charging issue (fixed), a reception issue (there's a lengthy thread on here about it), and messed-up selfie photos, to name the big ones.
 
And they'll also be introducing it to 100 million users instantaneously. What good are AI advancements when Google can't even get this into the hands of more than 3 million users at once?
You might need an iPhone 11 to get this feature next year, while even the Pixel 1 will get this new Night mode. It's crazy that the iPhone X, just a year old, doesn't have the latest camera features of the iPhone XS. Of course the "A12" excuse is doing the work, and yet Google is creating magic on even the Pixel 1 without any of this special hardware sauce.

Cool. Just edit your exposure settings and BAM, "night shot." I doubt you'd want to brag about any of these night-shot photos; the quality looks terrible.
It's so simple, yet Apple hasn't added the feature. Or is it magical once they do? BTW, I just tried what you said with my crappy LG G7, bumping the exposure to ISO 3200 and a 5-second shutter. It's an utter mess of noise and blur. It's nowhere close to the Night mode from Google and the Huawei P20.
 
By adjusting ISO and shutter speed. All Google did was implement it in the native camera app.
That isn't at all what this is. You're talking about one long exposure. This is many shots, composited computationally, with movement accounted for.
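To make "many shots, composited computationally" concrete, here's a minimal sketch of the general burst-stacking idea in Python with OpenCV. It's my own illustration, not Google's actual pipeline (which uses robust tile-based alignment, merging, and tone mapping), and the stack_burst function is invented for the example.

```python
# Minimal sketch of burst stacking: align each frame to the first via
# phase correlation, then average, so static detail reinforces while
# random sensor noise cancels out.
import cv2
import numpy as np

def stack_burst(frames):
    """frames: list of same-sized grayscale uint8 images from a burst."""
    base = frames[0].astype(np.float32)
    acc = base.copy()
    for frame in frames[1:]:
        f = frame.astype(np.float32)
        # Estimate the global (dx, dy) shift of this frame relative to the base.
        (dx, dy), _ = cv2.phaseCorrelate(base, f)
        # Shift the frame back onto the base before accumulating.
        m = np.float32([[1, 0, -dx], [0, 1, -dy]])
        acc += cv2.warpAffine(f, m, (f.shape[1], f.shape[0]))
    return np.clip(acc / len(frames), 0, 255).astype(np.uint8)
```

Averaging N aligned frames cuts random noise by roughly the square root of N, which is why the stacked result looks far cleaner than any single frame.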
Owning both phones, I actually prefer the low-light photos the Xs Max captures over those of the P3 in more cases. The P3 does a good job of boosting the lows; however, it overdoes it with noisy artifacts at times, especially in the dark sky above a skyline shot.
Those are some beautiful shots. I prefer the punch of the Pixel 3's - I see a lot more of the skyline - but I can see your position as well. I'm curious - if you edit the P3's to reduce saturation, how does it compare to the edited XS shot?
 
Except you can do this on virtually any camera; Google just moved the feature into the stock app.
How do I do this with my LG G7? It has the same Snapdragon 845 and 4 GB of memory.

It's just a longer exposure; it's not some magnificent technology or magic. I'd imagine the effort to do the same thing on an iPhone is minimal: they'd just need to make the shutter stay open and let more light in. Again, not magic.
I am so surprised Apple hasn't implemented this, since it's as simple as leaving the shutter open longer. How do you think it's not blurry?

This thread is funny. All of the Android faithful flock to an Apple site to get off on a feature that can be accomplished with a number of iOS apps. I guess you need to grasp for anything with the Pixel line today.
Can you tell me what these apps are? I want to try them out on my gf's iPhone 6 Plus. I know it's pretty old, but I would love to get that Night mode.
 
Can you tell me what these apps are? I want to try them out on my gf's iPhone 6 Plus. I know it's pretty old, but I would love to get that Night mode.


NightCap, Night Camera, Night Cam, and Night Eyes are the ones I saw; there are probably more out there.
 
Apple will have it next year. _And_ Apple will follow.
I really don't think so. Apple's APIs make this data available to application developers, and I believe they will keep the main camera app focused on trying to improve your run-of-the-mill snapshots. I mean, when you consider ALL the things the built-in app can't do now... that ARE possible because third parties are doing them... it just makes sense that Apple is focusing on the 80-90% of imaging situations and leaving the edge cases to the Halides and Focoses of the world.
 
Owning both phones, I actually prefer the low-light photos the Xs Max captures over those of the P3 in more cases. The P3 does a good job of boosting the lows; however, it overdoes it with noisy artifacts at times, especially in the dark sky above a skyline shot.


[Attached images 798727, 798725, 798726: Xs Max vs. Pixel 3 low-light skyline comparison shots]

I agree that at first glance (which is all most people will do) the Pixel 3 shots look amazing; however, when you start to pixel-peep (no pun intended) there is a LOT of noise in darker areas, and the middle-ground blending zones look blotchy compared to the Xs. I also dislike the oversaturated colors in the shots, which are not true to life. Standing on the roof looking at both photos, the Xs capture was far more realistic to what your eyes would see.

I would say the Xs camera system captures shots exactly between the Pixel 3's two photo modes. The 3 without Night Sight is a little dark; the Xs is a bit brighter natively and still produces a clean shot. With Night Sight (which the iPhone lacks), the Pixel 3's image processing is almost overdone with regard to shadow boost and saturation. It feels like Google is doing immediate post-processing to boost shadows and saturation without user intervention. When I boost shadows in Xs Max photos in post-processing, I like the results better than many Pixel 3 Night Sight photos. Sure, there is noise in the Xs shots as well, but not as much, and the highlights (as well as the shadows) are not overdone.

Since Night Sight does not work in video mode, I feel that the Xs has a far better camera system overall. The native photo and video captures from that camera in low light are good in both exposure and saturation. Combine that with far less choppy video and better stereo audio recording, and the Xs is a far better camera system for my needs.

Regardless of whether the Pixel or the iPhone is better, I just had to say this.

As someone who has been doing digital photography since around 2000 (I started with a Kodak DC280), my mind is COMPLETELY BLOWN by what phone cameras can do these days. My jaw drops every time I see what these tiny cameras produce. Never in my wildest dreams 18 years ago did I expect we'd be able to take photos like this without a really long exposure and a tripod.
 
Such as?

I mean the iPhone XS is "plagued" with issues as well: a charging issue (fixed), a reception issue (there's a lengthy thread on here about it), and messed-up selfie photos, to name the big ones.
Off the top of my head,

1) Photos taken sometimes not saving to gallery
2) Random notches appearing on the sides of the screen (you have to see it to believe it)
3) The back scuffs easily, and there are actually videos floating around on YouTube where users scrub the back with toothpaste and water in an attempt to get rid of the marks.
4) A hissing sound when using the microphone to record audio (which is also pretty crappy)
5) Memory issue where background apps are routinely being purged
6) Distortion with the speakers (especially at higher volumes)

Any one of these would be a multi-page thread castigating Apple. With Google, there seems to be largely indifference. It seems that every flaw with Apple, be it real or perceived, gets blown way out of proportion while its strengths are underplayed. Meanwhile, the Pixel phone is being graded on a curve where its one saving grace, the camera, gets a disproportionate amount of praise while every other problem is swept under the carpet.

Weird. Just weird.
 
You might need an iPhone 11 to get this feature next year, while even the Pixel 1 will get this new Night mode. It's crazy that the iPhone X, just a year old, doesn't have the latest camera features of the iPhone XS. Of course the "A12" excuse is doing the work, and yet Google is creating magic on even the Pixel 1 without any of this special hardware sauce.

It's so simple, yet Apple hasn't added the feature. Or is it magical once they do? BTW, I just tried what you said with my crappy LG G7, bumping the exposure to ISO 3200 and a 5-second shutter. It's an utter mess of noise and blur. It's nowhere close to the Night mode from Google and the Huawei P20.

Apple gets away with it because their fans accept it and allow it.
 
I want this on an iPhone. I think everybody would win if this kind of photographic genius were available on every phone. I just bought an iPhone XR, and I won't go Android. But I'd really like to get some decent brains behind the photos that get created.
 
Apple gets away with it because their fans accept it and allow it.
Accept what? The latest iPhones take amazing pictures, and for most people there really is no difference between Pixel pictures and iPhone pictures.

It’s only mad techies who go into full comparison mode
 
Accept what? The latest iPhones take amazing pictures, and for most people there really is no difference between Pixel pictures and iPhone pictures.

It’s only mad techies who go into full comparison mode
That's not what he was referring to. He was saying that if/when Apple comes out with something similar, say on the iPhone 11, they will say it can only work on the iPhone 11 because of the A13. Apple consumers accept this and upgrade. Google, on the other hand, offers the feature on all three generations of its phones.
 
Photos in pitch-black darkness. OK, kinda cool. I'm not anti-Android/Google nor pro-Apple by any stretch, but I'll be curious to hear of any outcry for Apple to incorporate a response. Will customers soon demand their phone be able to jump-start their car or walk the dog next?
What are you talking about? A camera is one of the biggest draws for phones.
The post is about low-light photography, not pitch-black darkness.

I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing than one that applies a filter to turn night into day. This is essentially the same kind of thing that Prisma does, turning a photo into "art." It's no longer reality.

In the case where there is not enough light to render anything other than a black or dim scene, what would you like it to render?
As with all software/hardware integration, you make a choice as to what you want to create.
You want more dynamic range, so you can see inside a room and also see what is outside through a window? Turn on HDR. That is not rendering what the camera sees; to get to what you see, some manipulation of the image has to be done, like taking two pictures and merging them. This doesn't mean that you have to end up with a "typical" overprocessed HDR image that defies reality.
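For a sense of what "taking two pictures and merging them" can look like in code, here's a minimal sketch using OpenCV's Mertens exposure fusion. It illustrates the general idea only, not any particular phone's pipeline; the fuse_exposures name and file paths are invented for the example.

```python
# Minimal sketch of exposure fusion: merge differently exposed shots of
# the same scene into one image with usable shadows and highlights.
import cv2

def fuse_exposures(paths):
    # e.g. ["under.jpg", "normal.jpg", "over.jpg"]: bracketed shots of a static scene
    images = [cv2.imread(p) for p in paths]
    merger = cv2.createMergeMertens()   # weights pixels by contrast, saturation, exposedness
    fused = merger.process(images)      # float result, roughly in [0, 1]
    return (fused * 255).clip(0, 255).astype("uint8")
```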

The same goes for Night Sight: think of taking a picture of your kids sleeping, where instead of a shot you cannot use, you get one that you can. Having the feature doesn't mean the scene has to look like daylight in that kids pic.
 
Either this is the biggest hoax everyone is in on, with those excellent Night Sight shots, or the images are real.

Images from the Google Pixel 3 would blow the iPhone out of the water, easily. Google is giving Apple a serious battle.

It took some time to register that these were really nighttime shots.
 
Off the top of my head,

1) Photos taken sometimes not saving to gallery
2) Random notches appearing on the sides of the screen (you have to see it to believe it)
3) The back scuffs easily, and there are actually videos floating around on YouTube where users scrub the back with toothpaste and water in an attempt to get rid of the marks.
4) A hissing sound when using the microphone to record audio (which is also pretty crappy)
5) Memory issue where background apps are routinely being purged
6) Distortion with the speakers (especially at higher volumes)

Any one of these would be a multi-page thread castigating Apple. With Google, there seems to be largely indifference. It seems that every flaw with Apple, be it real or perceived, gets blown way out of proportion while its strengths are underplayed. Meanwhile, the Pixel phone is being graded on a curve where its one saving grace, the camera, gets a disproportionate amount of praise while every other problem is swept under the carpet.

Weird. Just weird.
The fact that you know all about these issues as an Apple-only user would indicate that they are well known.
They will be discussed at length on Android Central.
Are you complaining that they are not discussed at length on this forum?
As for the camera, I would suggest that the Pixel cameras' praise is well merited and not at all "disproportionate."
 
The colors won't change. It draws out the color already there; it doesn't create new colors.

It's something RAW shooters have been doing for years: shoot underexposed and then draw out the details in post-production. We do it in tricky lighting, i.e., a spot where normal exposure would completely blow out one section of the picture. There is no way to retrieve data when a pixel is 255 255 255 RGB (completely white). The "black" in the RAW isn't all 0 0 0 RGB (completely black), so some details can be recovered.
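As a rough illustration of that "shoot dark, lift in post" workflow, here's a tiny sketch of pushing an underexposed linear image by a couple of stops. The push_exposure function and values are made up for the example; real RAW processing adds demosaicing, white balance, and noise reduction on top of this.

```python
# Minimal sketch of lifting an underexposed shot: apply gain in linear
# light, then gamma-encode for display. Pixels already clipped to pure
# white stay clipped, while near-black values can still reveal detail.
import numpy as np

def push_exposure(linear, stops=2.0):
    """linear: float array in [0, 1] of linear sensor values."""
    gained = linear * (2.0 ** stops)               # +2 stops = 4x brightness
    return np.clip(gained, 0.0, 1.0) ** (1 / 2.2)  # simple display gamma
```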

So Google does this now, in the phone. Nice trick, but I prefer doing it during post-production.

Close. I believe they are taking a series of normal-shutter, high-ISO pictures over a long period (5 seconds), each of which will be horribly underexposed, and aligning and stacking the exposures, as one could do in Photoshop fairly easily. Doing so will often give slightly off colors or object outlines, as there is a lot of noise being summed together (summing removes a lot of the noise but can still end up skewed color-wise if you don't have "enough" frames). The "trick" they are then applying is identifying items in the pictures to give them a "solid" color that Google thinks they should have, sometimes with unnaturally garish, hyper-vibrant results, like the fire extinguishers picture in the article.
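A quick numeric check of why summing noisy frames helps: averaging N frames of independent noise cuts the noise by roughly the square root of N. Here's a toy simulation of that (the scene value and noise level are invented for the demo, not taken from any phone):

```python
# Simulate 15 noisy exposures of a constant dim scene and compare the
# noise of a single frame to the noise of the stacked average.
import numpy as np

rng = np.random.default_rng(0)
frames = 0.1 + rng.normal(0, 0.05, size=(15, 100_000))  # 15 frames, noise sigma = 0.05
print(frames[0].std() / frames.mean(axis=0).std())       # ~sqrt(15) = 3.87
```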

The stacking you can do on your own. I'm not aware of any plugin that will do the color-correcting bit, though; that you would have to do by hand, and it is very tedious.

As with any stacking approach (aside from exposure stacking, you can also stack tones (HDR) or focal distances; a very similar idea, although those will not be doing a simple summation at each pixel), I suspect it doesn't handle movement in the scene over those five seconds very well. So a picture of someone who is not very adept at holding still will just be a mess. But capturing a scene without people, in low light, is a neat trick.
 