Kinda creepy that someone can take a picture of you in the dark and have it come out this clear without a flash to give them away... but it’s a cool tech.

It's new for a camera on a phone, but not new for more advanced cameras and larger-lens devices.
99.44% of the time, I also follow this rule, because it's less work afterwards. The other 0.56% of the time, the lighting is gawd-awfully bad (impossible to expose properly). If it's a landscape or still life, I shoot multiple shots with a 1-stop difference and merge them into one photo in post-production. For action, I set manual and hope for the best.

I've never done the merge thing before. Any guides to follow? I've definitely hit my camera's limits with light at times, where this would be beneficial, but without the knowledge I have avoided it.
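For the merge question: one common approach is exposure fusion, where each bracketed frame is weighted per pixel by how well-exposed that pixel is, then blended. Here is a minimal pure-Python sketch of the idea using hypothetical grayscale pixel rows (real tools work on full images with smoothed weight maps, but the weighting principle is the same):

```python
import math

# Toy exposure fusion: blend bracketed frames, weighting each pixel
# by how close it is to mid-gray (i.e., how well exposed it is).
# The pixel rows below are hypothetical illustration data.

def well_exposedness(v, mid=128.0, sigma=64.0):
    """Weight near 1.0 for mid-tones, near 0.0 for clipped pixels."""
    return math.exp(-((v - mid) ** 2) / (2 * sigma ** 2))

def fuse(frames):
    """Per-pixel weighted average across same-size grayscale frames."""
    fused = []
    for pixels in zip(*frames):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights) or 1e-9
        fused.append(sum(p * w for p, w in zip(pixels, weights)) / total)
    return fused

# Under-, normally, and over-exposed captures of the same 4-pixel row.
under  = [10,  60, 120, 200]
normal = [40, 120, 200, 255]
over   = [80, 200, 255, 255]

result = fuse([under, normal, over])
# Clipped 255 values get low weight, so highlight detail comes
# mostly from the darker frames.
```

The same weighting idea extends to color images by computing weights from luminance; the bracket-and-merge workflow described in the post above is what HDR merge tools automate.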

I'm a bit picky. I'm one of those bums who likes to take his time getting the shot right, rather than taking multiple shots and picking one later. (Guess that comes from growing up in the film age.)
 
This is great for landscape photography, but will be useless for capturing people in the moment without being "smeary." It seems like an easy software implementation: just tell the camera to keep the exposure open for 5 seconds, like any other mirrorless or DSLR camera.

Now if this works with "instant shutter" then I'll be blown away.
It's already been proven that it's not just a longer exposure time.
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.

100% agree with you
 
This is great for landscape photography, but will be useless for capturing people in the moment without being "smeary." It seems like an easy software implementation: just tell the camera to keep the exposure open for 5 seconds, like any other mirrorless or DSLR camera.

Now if this works with "instant shutter" then I'll be blown away.
You haven’t even tried it and you’re calling it useless?
The world is such a better place when you’re open minded. You should try it sometime.
 
OK. Kinda cool. Not anti-Android/Google nor pro-Apple by any stretch. But I'll be curious to hear of any outcry for Apple to incorporate a response. Will customers soon demand their phone be able to jump-start their car or walk the dog next?
Well, you can get a USB power brick that can jump start a car, so maybe that is the next step. As for walking the dog, there's an app (several really) for that.

As for low-light photography, this is actually an area where phones, and even many point-and-shoot digital cameras, have struggled for years. This brings a very usable option, especially in low-light situations where you can't use a flash for various reasons.

I installed the test app on my Pixel 2 XL, and it works very well. That brings up an interesting point: Google has already said that ALL of the Pixel phones, including the original, will get this new feature. That's something Apple wouldn't do with the iPhone, because they want you to buy a new phone to get the feature (the same applies to Samsung, etc.). Yes, sometimes they add minor new features, but this is a major feature being added.
 
" that uses machine learning to choose the right colors based on the content of the image"
Long story short: Fake!!!
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prism does, turning a photo into "art." It's no longer reality.
I’m pretty sure you can turn “Night Sight” off. Problem solved.
I’d much rather have the feature than not have it.
 
The colors won't change. It draws out the color that's already there rather than creating new colors.

It's something RAW shooters have been doing for years: shoot underexposed and then draw out the details in post-production. We do it in tricky lighting, i.e., a spot where normal exposure would completely blow out one section of the picture. There is no way to retrieve data when it's 255 255 255 RGB (completely white). The "black" in the RAW isn't all 0 0 0 RGB (completely black), so some details can be recovered.

So Google does this now in the phone. Nice trick, but I prefer doing it during post-production.
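The asymmetry described above (irrecoverable highlights vs. recoverable shadows) can be shown numerically. Here is a toy sketch with hypothetical 8-bit sensor values:

```python
# Toy illustration of highlight clipping vs. shadow recovery.
# Scene luminances below are hypothetical illustration data.

CLIP = 255  # 8-bit sensor ceiling

def capture(scene, exposure):
    """Simulate a sensor: scale by exposure, clip at the ceiling."""
    return [min(round(v * exposure), CLIP) for v in scene]

scene = [50, 100, 150, 200]          # true relative luminances

overexposed  = capture(scene, 2.0)   # the bright pixels clip to 255
underexposed = capture(scene, 0.5)   # dark, but nothing clips

# "Recover" in post by scaling back toward nominal exposure.
recovered_over  = [v / 2.0 for v in overexposed]
recovered_under = [v * 2.0 for v in underexposed]
```

Scaling the underexposed capture back up restores the original tonal separation exactly, while the two clipped highlight pixels remain indistinguishable no matter how they are rescaled.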
If you are doing this professionally, yes, I could see preferring to do it in post-production. But if you are just an average consumer taking photos on vacation or such, most aren't going to want to mess with it.
I’m pretty sure you can turn “Night Sight” off. Problem solved.
I’d much rather have the feature than not have it.
It most definitely can be. All of the photos in the article were taken with the same phone, with Night Sight on and off as applicable.
 
" that uses machine learning to choose the right colors based on the content of the image"
Long story short: Fake!!!
It uses a combination of a lot of information to figure it out. I am sure some of it is machine learning, perhaps even comparing it against photos taken at the same place in normal lighting conditions.
 
Probably done by taking multiple shots with a fast shutter speed and then averaging them together, re-aligning as necessary. With a slow shutter speed, the movement is integrated and details are lost. But with multiple fast-shutter shots, each image is sharp and can be intelligently aligned with the others...

Equivalent of a long exposure, but without the tripod.

ETA: It wouldn't help with subject motion, though, so maybe not.
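The burst-and-average idea sketched above can be illustrated directly: several short, noisy exposures of a static scene are averaged, and zero-mean noise cancels while the signal remains. A toy example with fixed, hypothetical noise offsets (a real pipeline along these lines would also align the frames first, which is the hard part for handheld shots):

```python
# Toy frame stacking: average several noisy captures of the same
# static scene. Noise offsets are fixed here for repeatability;
# each pixel's offsets sum to zero, modeling zero-mean sensor noise.

scene = [40, 80, 120, 160]  # true pixel values (hypothetical)

noise = [
    [ 6, -4,  2, -8],
    [-2,  8, -6,  4],
    [-4, -4,  4,  4],
]

# Each simulated frame is the scene plus one row of noise.
frames = [[s + n for s, n in zip(scene, row)] for row in noise]

# Average the frames pixel by pixel.
stacked = [sum(col) / len(frames) for col in zip(*frames)]
```

Each individual frame deviates from the true scene, but the stack recovers it; with real (random) noise the cancellation is statistical rather than exact, improving with the number of frames.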
 
" that uses machine learning to choose the right colors based on the content of the image"
Long story short: Fake!!!


Incorrect. Virtually all photographs taken today, regardless of the device taking them, have some level of built-in calculation applied to adapt the scene to a best approximation of reality. Doing this does not make the photo "fake."

Fake would be manipulating the scene to put something in, or take something out, that does not exist in the scene.

All cameras take a "best guess" on colour and saturation; that's essentially what this is doing here. It's also why you can take a Canon camera and a Nikon camera, set their photo settings identically, take the identical shot at the identical time, and get slightly different-looking photos. Again, neither of these is fake.

This is also not unique or new to the digital era. Back in the film days, the film you put into the camera would also change the shot slightly, based on the chemicals and materials used in the film. Kodak or Fujifilm stock in identical cameras might also produce slightly different-looking photos.

Again, NEITHER is fake.
 
https://www.xda-developers.com/google-pixel-3-pixel-3-xl-issues-problems-help-list/

Lots of bugs on the Pixel 3. I doubt XDA Developers and others are posting pages full of "nonsense."

Apple isn't the only company that has software issues to fix...
Yeah, which is why I don't buy a new phone the day it's released; I wait until it has been tested by others and fixes have been released, no matter the company. But in this case, I will keep my Pixel 2 XL for a while, since Google is adding most of the features I actually care about to it in the next month or so.
 