Apple used to be 2-3 years ahead; now, in some cases, they're 2-5 years behind (inductive charging, OLED).
I'm more curious about how they managed to oversaturate the photo without introducing significant noise or blur.
Am I seeing the same photos as the rest of you, or are my definitions of noise and blur off base? Because I don't find those photos to be clean or clear.
That picture of the fire extinguisher... why would you take a picture of that in the dark? The second photo looks like someone just turned on the lights. Maybe it's not a good example of what this is doing. Seeing a photo that's pitch black... to me, that's not low light. I think they need better examples to show this off.
How do quality photos compare to unnecessary and unreasonable features like you mentioned? Absolutely horrible analogy.
Many people have a social life, especially outside of working hours.
Night Sight is closer to reality. Or did you think The Verge editor was stumbling all over his house and the street in pitch black?
"So Apple could technically do the same, right?"

Unknown. Presumably Google is uploading people's photos to their cloud and applying AI before showing the updated image, versus Apple's privacy-first approach where everything is done locally on the device. So anything is possible, but the reason Google's engineers can do such amazing things is their choice of features over privacy. I'm not saying one direction is right or wrong, but both companies are locked into their paths. A normal high-end camera can do the same thing if it's on a tripod and you set a long exposure, even in near pitch black. On phones that hasn't been possible, because phones are handheld and a long exposure would be very blurry. So my guess is they're taking a long exposure on the phone, and then the AI figures out exactly how the phone is moving and undoes that motion in software to make the result sharp.
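For the curious, here is a minimal sketch of that "long exposure plus motion compensation" guess, i.e. the general burst align-and-merge technique: capture several short handheld frames, estimate how the camera moved between them, undo that motion, and average the frames so the result can be brightened without amplifying noise. This is only an illustration under those assumptions (NumPy, a global-translation model of hand shake); it is not Google's actual Night Sight pipeline, and every name and parameter here is made up.

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) translation of img relative to ref using
    phase correlation (the peak of the normalized cross-power spectrum)."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    cross /= np.abs(cross) + 1e-8
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative (wrapped-around) shifts.
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

def merge_burst(frames, gamma=0.5):
    """frames: list of same-sized HxWx3 uint8 images from a burst of short exposures."""
    ref = frames[0].astype(np.float64)
    ref_gray = ref.mean(axis=2)
    acc = ref.copy()
    for frame in frames[1:]:
        f = frame.astype(np.float64)
        dy, dx = estimate_shift(ref_gray, f.mean(axis=2))
        # Shift the frame back onto the reference before merging (np.roll wraps
        # at the edges, which is tolerable for the small shifts of hand shake).
        acc += np.roll(f, shift=(dy, dx), axis=(0, 1))
    merged = acc / len(frames)                    # averaging N frames cuts noise by ~sqrt(N)
    merged = 255.0 * (merged / 255.0) ** gamma    # simple tone lift to brighten the shadows
    return np.clip(merged, 0, 255).astype(np.uint8)
```

Real burst pipelines typically replace the single global shift with per-tile alignment and robust, outlier-rejecting merging, but the averaging step is where the noise advantage comes from.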
I'm curious about how this works. Is it machine learning, compositing multiple shots, a combination, or something else?
"Sorry, but I have to call BS where I see it. Unless I'm missing something specific about the lighting in these images (please feel free to correct me if I am), they're ********. I've been taking plenty of low-light shots on my iPhone XS without flash, because I never use flash, and they're great."

I have an app called Cortex Cam on my iPhone, and it does the same thing the night mode does on the Pixel. So no need for the Pixel; an app will do.
I'm not saying Google hasn't made some kind of advancement, but it's hard to tell when they're obviously faking these images.
"I don't care about any of this. My XS Max is producing gorgeous pics in daylight/low light. Software is software and there is always room for improvement. Who does it first does not matter. Apple will eventually catch up. What matters to me is the Apple ecosystem (as long as it is holding up). To each his own..."

I absolutely love Apple, but I certainly care about this: as a photographer, this is a major leap in handheld long exposure with the assistance of AI. It's massive, and it doesn't make Apple bad or Google good; it just is, and it's great for everyone, since all this competition is driving the industry forward.
"Can Google make this same thing work on iOS, or would it require different hardware or drivers?"

I don't see why it would be tied to hardware. Photoshop and any photo manipulation program with level and/or curve adjustments can do this easily; making it look good depends on the skill of the user, though. Of course, a JPEG will have lost a lot of detail, so it won't be as good as drawing out details from a RAW file.
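For what it's worth, the levels adjustment mentioned above boils down to remapping the black and white points and bending the midtones with a gamma curve. A small NumPy sketch, with arbitrary example values (not anyone's actual settings):

```python
import numpy as np

def levels(img, black=10, white=180, gamma=0.6):
    """img: HxWx3 uint8. Stretch [black, white] to the full range, then lift midtones."""
    x = img.astype(np.float64)
    x = np.clip((x - black) / float(white - black), 0.0, 1.0)  # black/white points
    x = x ** gamma                                             # midtone (gamma) curve
    return (x * 255.0).astype(np.uint8)
```

The catch the comment points out is real: an 8-bit JPEG has already thrown away shadow detail, so lifting it this way amplifies noise and banding, whereas a RAW file leaves much more to recover.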
I'd be willing to pay up to ~$100 for an app capable of doing this in iOS. This is the kind of feature that makes upgrading to a new phone meaningful... but you need at least a dozen of them for me to buy your $1000+ iPhone.
"Why would you take a picture of a fire extinguisher in the dark? Are you serious?! To show how the Pixel works with low light photos!!!"

Come up with better examples/more realistic use cases, then.
I... I don't think I can get behind this.
If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.
Best part about this is that it's coming to the Pixel 1 and 2 as well. Super excited to start using it on my 2 XL. I hope Apple adopts something similar. More importantly, I hope it's a feature they introduce to phones going back to the iPhone 8.
"I have an app called Cortex Cam on my iPhone, and it does the same thing the night mode does on the Pixel. So no need for the Pixel; an app will do."

I've tried the Cortex app too, and while the idea is the same, using multiple longer exposures to a degree, the results are literally night and day. Take a picture of a near pitch-black scene outside with your phone and then with the Cortex app: it helps ever so slightly, but not much, compared to what The Verge is showing here, where near pitch black looks like daylight.