[quote]Sorry, those filtered photos are neither better nor more realistic. I never understood why non-Apple people are on the Apple site.[/quote]
As opposed to no picture?
Are we already starting to forget the long list of hardware and software issues plaguing the Pixel phone?
[quote]And they'll also be introducing it to 100 million users instantaneously. What good are AI advancements when Google can't even get this into the hands of more than 3 million users at once?[/quote]
You might need an iPhone 11 to get this feature next year. Even the Pixel 1 will get this new Night mode. It's crazy that the iPhone X from just a year ago doesn't get the latest camera features of the iPhone XS. Of course the "A12" excuse is doing the work, and yet Google is creating magic even on the Pixel 1, without any of this special hardware sauce.
[quote]cool. just edit your exposure settings and BAM, "night shot". doubt you'd want to brag about any of these night shot photos, the quality looks terrible[/quote]
It's so simple, yet Apple hasn't added the feature. Or is it magical once they do? BTW, I just tried what you said with my crappy LG G7, bumping the exposure to ISO 3200 and a 5-second shutter. It's an utter mess of noise and blurriness, nowhere close to the Night mode from Google or the Huawei P20.
[quote]By adjusting ISO and shutter speed. All Google did was implement it into the native camera app.[/quote]
That isn't at all what this is. You're talking about one long shot. This is many shots, composed together computationally and accounting for movement.
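The "many short shots beat one long shot" idea is easy to demonstrate with a toy simulation (all numbers here are made up for illustration; real Night Sight also aligns frames and rejects motion, which this sketch skips):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scene: a dim gray patch (true brightness 0.1 on a 0-1 scale).
true_scene = np.full((64, 64), 0.1)

def capture(scene, noise_sigma=0.05):
    """One short, noisy exposure: scene plus random sensor noise."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

single = capture(true_scene)  # one noisy frame
burst = np.mean([capture(true_scene) for _ in range(15)], axis=0)  # 15 frames averaged

# Averaging N frames cuts random noise by roughly sqrt(N),
# without the motion blur a single long exposure would pick up.
print(np.std(single - true_scene))  # noise of one frame
print(np.std(burst - true_scene))   # much lower noise after stacking
```

The burst's residual noise comes out several times smaller than the single frame's, which is the whole point of computational stacking.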
[quote]Owning both phones, I actually prefer the low light photos the Xs Max captures over that of the P3 in more cases. The P3 does a good job of boosting the lows, however it overdoes it with noisy artifacts at times, especially in dark sky above a skyline shot.[/quote]
Those are some beautiful shots. I prefer the punch of the Pixel 3's - I see a lot more of the skyline - but I can see your position as well. I'm curious - if you edit the P3's to reduce saturation, how does it compare to the edited XS shot?
[quote]except, you can do this on virtually any camera, Google just moved the feature to the stock app.[/quote]
How do I do this with my LG G7? It has the same Snapdragon 845 and 4 GB of memory.
[quote]It's just a longer exposure, it's not some magnificent technology or magic. I'd imagine the effort to do the same thing on an iPhone is minimal, they'd just need to make the shutter stay open and let more light in. Again, not magic.[/quote]
I am so surprised Apple hasn't implemented this when it's as simple as increasing the shutter time. How do you think it's not blurry?
[quote]This thread is funny. All of the Android faithful flock to an Apple site to get off on a feature that can be accomplished with a number of iOS apps. I guess you need to grasp for anything with the Pixel line today.[/quote]
Can you tell me what these apps are? I want to try them out on my gf's iPhone 6 Plus. I know it's pretty old, but I would love to get that Night mode.
[quote]NightCap, Night Camera, Night Cam, Night Eyes are the ones I saw, there are probably more out there.[/quote]
It doesn't occur to you that Google, with all their R&D budget, may do a better job than a few free or $2 apps?
[quote]Apple will have it next year. _and_ Apple will follow.[/quote]
I really don’t think so. Apple’s APIs make this data available to application developers, and I believe they will keep the main camera app focused on trying to improve your run-of-the-mill snapshots. I mean, when you consider ALL the things the built-in app can’t do now... that ARE possible because third parties are doing them... it just makes sense that Apple focuses on the 80-90% of imaging situations and leaves the edge cases to the Halides and Focoses of the world.
Owning both phones, I actually prefer the low light photos the Xs Max captures over that of the P3 in more cases. The P3 does a good job of boosting the lows, however it overdoes it with noisy artifacts at times, especially in dark sky above a skyline shot.
View attachment 798727
View attachment 798725
View attachment 798726
I agree that at first glance (which is all most people will do) the Pixel 3 shots look amazing; however, when you start to pixel peep (no pun intended) there is a LOT of noise in darker areas, and the middle-ground blending zones look blotchy compared to the Xs. I also dislike the oversaturated colors in the shot, which are not true to life. Standing on the roof looking at both photos, the Xs capture was far more realistic to what your eyes would see.
I would say the Xs camera system captures shots exactly between the Pixel 3's two photo modes. The 3 without Night Sight is a little dark; the Xs is a bit brighter natively and still produces a clean shot. With Night Sight (which the iPhone lacks), the Pixel 3 image processing is almost overdone with regard to shadow boost and saturation. It feels like Google is doing immediate post-processing to boost shadows and saturation without user intervention. When I boost shadows in Xs Max photos in post-processing, I like the results better than many Pixel 3 Night Sight photos. Sure, there is noise in the Xs shots as well, but not as much, and the highlights (as well as shadows) are not overdone.
Since Night Sight does not work in video mode, I feel that the Xs has a far better camera system overall. The native photo and video captures from that camera in low light are good in both exposure and saturation. Combine that with far less choppy video and better stereo audio recording, and it makes the Xs a far better camera system for my needs.
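The "boost shadows in post" workflow described above can be illustrated with a toy tone curve. The pixel values and the 0.5 gamma here are arbitrary, chosen only to show that a gamma below 1 lifts dark tones far more than bright ones, which is also why shadow noise becomes so visible after the lift:

```python
import numpy as np

# Toy pixel values in [0, 1], from deep shadow to near-white (made-up numbers).
img = np.array([0.02, 0.05, 0.10, 0.50, 0.90])

# A crude shadow lift: gamma < 1 brightens dark tones far more than bright ones.
lifted = img ** 0.5

# Per-pixel gain shows why shadow noise gets amplified along with the detail.
gain = lifted / img
print(lifted.round(2))  # → [0.14 0.22 0.32 0.71 0.95]
print(gain.round(1))    # the darkest pixel is boosted ~7x, the brightest ~1.1x
```

Whatever curve a camera app actually uses, the same asymmetry applies: whatever noise lives in the shadows gets multiplied right along with the signal.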
Similar camera tricks have been around for years.
In a simple "point and shoot" application?
Now there’s a company that actually knows how to innovate!
[quote]Such as?[/quote]
Off the top of my head,
I mean the iPhone XS is "plagued" with issues as well. Charging issue (fixed), reception issue (lengthy thread on here about it), messed up selfie photos to name the big ones.
[quote]Apple gets away with it because their fans accept it and allow it.[/quote]
Accept what? The latest iPhones take amazing pictures, and for most people there really is no difference between Pixel pictures and iPhone pictures.
[quote]Accept what? The latest iPhones take amazing pictures, and for most people there really is no difference between Pixel pictures and iPhone pictures.[/quote]
That's not what he was referring to. He was saying that if/when Apple comes out with something similar, say on the iPhone 11, they will say it can only work on the iPhone 11 because of the A13. Apple consumers accept this and upgrade. Google, on the other hand, offers the feature for all three generations of their phones.
It’s only mad techies who go into full comparison mode
[quote]Photos in pitch black darkness. OK. Kinda cool. Not anti-android/google nor pro-Apple by any stretch. But I'll be curious to hear of any outcry for Apple to incorporate a response. Will customers soon demand their phone be able to jumpstart their car or walk the dog next?[/quote]
What are you talking about? A camera is one of the biggest draws for phones.
I... I don't think I can get behind this.
If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.
[quote]Off the top of my head,
1) Photos taken sometimes not saving to the gallery
2) Random notches appearing on the sides of the screen (you have to see it to believe it)
3) The back scuffs easily, and there are actually videos floating around on YouTube where users are scrubbing the back with toothpaste and water in an attempt to get rid of it.
4) Hissing sound when using the microphone to record audio (which is also pretty crappy)
5) Memory issue where background apps are routinely being purged
6) Distortion with the speakers (especially at higher volumes)
Any one of these would be a multi-page thread castigating Apple. With Google, there seems to be largely indifference. It seems that every flaw with Apple, be it real or perceived, gets blown way out of proportion while its strengths are underplayed. Meanwhile, the Pixel phone is being graded on a curve where its one saving grace, the camera, gets a disproportionate amount of praise while every other problem is swept under the carpet.
Weird. Just weird.[/quote]
The fact that you know all about these issues as an Apple-only user would indicate that these issues are well known.
The colors won't change. It draws out the color that's already there; it doesn't create new colors.
It's something RAW shooters have been doing for years: shoot underexposed, then draw out the details in post production. We do it in tricky lighting, i.e., a spot where normal exposure would completely blow out one section of the picture. There is no way to retrieve data when it's 255 255 255 RGB (completely white). The "black" in the raw isn't all 0 0 0 RGB (completely black), so some detail can be recovered.
So Google does this now in the phone. Nice trick, but I prefer doing it during post production.
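The asymmetry described above - clipped whites are gone forever, dark raw values are still usable - can be shown with a toy sensor model (the luminance values and the 255 full-well limit are arbitrary illustration numbers, not any real camera's):

```python
import numpy as np

def sensor(scene_luminance, full_well=255.0):
    """Toy sensor: anything above the full-well limit clips to the same max value."""
    return np.minimum(scene_luminance, full_well)

# Two bright patches that differ in the real scene both record as 255:
bright = sensor(np.array([300.0, 600.0]))
# Two dim patches record distinct values, so their difference survives:
dark = sensor(np.array([3.0, 6.0]))

print(bright)       # both clipped to 255 - the distinction is unrecoverable
print(dark * 20.0)  # a simple gain in post still separates the two shadows
```

That's why the RAW trick works in one direction only: you can multiply small numbers back up, but two pixels that both hit the ceiling are identical no matter what you do afterward.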