I give Google props, and I think this competition is good for Apple. But, to me, it's creating light where there actually isn't. And how many people take pictures in a pitch-black environment? Isn't this why we have a flash? I also have an app that can take similar pictures (Nightcap).
 
The Verge is an interesting beast. Since Vox took over, there's been an odd dynamic between it and Apple products.

Often, the body of a review will be scathing ("look at these problems"), but then the TL;DR is "BUY BUY BUY! 9/10!"

You'll get reviews where (historical example) they'll slam an Android phone for not having NFC, and then, when reviewing a non-NFC iPhone, leave it off the list of cons entirely.

There's very little consistency left there when it comes to balanced reporting. Some of their stuff is still pretty good, but then they'll send mixed messages. Another example: shipping their entire team down to the iPhone event, but deeming the Note 9 event "not worth our time."

It's almost a passive-aggressive "we don't actually LIKE Apple, but we kind of somehow have to keep supporting them" vibe.
Nearly everyone who posts at The Verge seems to think it has an Apple bias, but I'm not sure where they're getting that from. I do wish they'd get rid of the numerical score in reviews. It just turns the comments section into fighting over why one phone scored a 9 while another scored an 8.8 or whatever.
 
It's a cool feature, but some of those Pixel 3 pictures look totally stupid and unnatural.

Looking forward to Apple implementing this feature.
I agree; I think this competition is good for Apple, but the camera is creating light where there actually isn't. Idk.
 
Nearly everyone who posts at The Verge seems to think it has an Apple bias, but I'm not sure where they're getting that from. I do wish they'd get rid of the numerical score in reviews. It just turns the comments section into fighting over why one phone scored a 9 while another scored an 8.8 or whatever.

And I've also noticed that the numbers are very arbitrary and often don't reflect the content of the review itself. I think they literally just pick the numbers out of a hat. But yes, so many people who don't read the reviews read the numbers and then blow a gasket without ever actually reading the reviewer's comments/opinions.

Either way, in recent years I've lost a lot of faith in The Verge's writing and editorial choices.
 
Except you can do this on virtually any camera; Google just moved the feature into the stock app.
Wait, what? This is just an app feature, not a combination of software and hardware specific to the Pixel phone? There's no mention in the article of this being just app software.
 
It's a cool feature, but some of those Pixel 3 pictures look totally stupid and unnatural.
(...)

Keep in mind that it's still unreleased software, so maybe by the time it reaches the final version we'll see some overall improvements in color accuracy and more natural contrast/sharpness.
 
Nice try. Lowering the exposure all the way down in iOS and then doing a photo comparison against the Night Sight feature. Only fooling the fools.
 
Photos in pitch-black darkness. OK. Kinda cool. Not anti-Android/Google nor pro-Apple by any stretch. But I'll be curious to hear of any outcry for Apple to incorporate a response. Will customers soon demand that their phone be able to jump-start their car or walk the dog next?

I'd like my iPhone to walk my neighbor's dog. :apple:

People have been saying that for some time now, but let me know when smartphones can compete with the glass of an SLR: they can't. Sure, they're good enough to pass for on-demand needs, but they cannot replace an SLR.

Absolutely. Some of us still use film now and then just to reconnect. ;)
 
Nicely done, Google. Apple, your turn.

I'm not sure Apple is going to match anything like this in the next year or two. Limited use cases for the feature, but cool regardless.

I can see this kind of real-time image processing eventually leading to the demise of DSLRs, accelerating mirrorless camera adoption, and maybe even making large-body cameras a niche product.
 
It's important to realize that physics limits what you can do in low light. You can either lengthen the exposure or increase the ISO; lengthening the exposure risks more blur, and increasing the ISO creates more noise. Post-processing can clean it up to some degree. So Google must be doing one of those things. I suspect they're just doing post-processing automatically.
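To put toy numbers on that tradeoff (a sketch of my own, nothing to do with any phone's real pipeline): doubling the exposure genuinely improves signal-to-noise, while doubling the ISO just amplifies signal and noise together.

[CODE]
# Toy shot-noise model: photon arrivals are Poisson, so SNR grows with
# the square root of the light actually collected. ISO is a gain applied
# after collection, so it can't improve SNR. My sketch, arbitrary numbers.
import numpy as np

rng = np.random.default_rng(0)

def snr(mean_photons, iso_gain=1.0, trials=100_000):
    # Many readings of one pixel; SNR = mean / standard deviation.
    samples = iso_gain * rng.poisson(mean_photons, trials)
    return samples.mean() / samples.std()

print(snr(100))               # baseline exposure      -> SNR ~ 10
print(snr(200))               # 2x exposure time       -> SNR ~ 14 (sqrt(2) better)
print(snr(100, iso_gain=2))   # 2x ISO, same exposure  -> SNR ~ 10 (no better)
[/CODE]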
 

Yes, it's basically "elongated exposure" HDR. It's not just one long exposure but a select group of longer exposures, compiled and post-processed to produce the image. And it's kicking butt at it.
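If you want to play with the general idea, a bare-bones version of that align-and-merge looks something like this (my own sketch using OpenCV's ECC alignment; Google's actual merge is surely far more sophisticated):

[CODE]
# Bare-bones burst stacking: align N short handheld exposures to the
# first frame, then average. Noise drops roughly as sqrt(N) while the
# aligned scene content stays put. Toy sketch, not Google's code.
import cv2
import numpy as np

def stack_burst(frames):
    """frames: list of same-size float32 grayscale images in [0, 1]."""
    ref = frames[0]
    h, w = ref.shape
    acc = ref.astype(np.float64).copy()
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-6)
    for frame in frames[1:]:
        warp = np.eye(2, 3, dtype=np.float32)   # translation-only motion model
        _, warp = cv2.findTransformECC(ref, frame, warp,
                                       cv2.MOTION_TRANSLATION, criteria, None, 5)
        aligned = cv2.warpAffine(frame, warp, (w, h),
                                 flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        acc += aligned
    return (acc / len(frames)).astype(np.float32)
[/CODE]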
 
Multiple images with a fast shutter speed = one image with the same shutter speed. No... matter... how... well... aligned.
No, you're mistaken. It's called signal averaging. Look it up.

Opening the shutter N times for a time t lets in the same light as opening it once for a time N*t. And if you set the gain correctly, you can get the same image apart from camera shake. With intelligent alignment, you can get rid of camera shake in the first case, but not the second.
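That's easy to sanity-check numerically. In this toy Poisson model (my own sketch, arbitrary units), sixteen averaged short frames come out about as clean as a single exposure sixteen times as long:

[CODE]
# Signal averaging sanity check: N exposures of length t, averaged,
# approach the quality of one exposure of length N*t, because the
# independent noise cancels as sqrt(N). Toy numbers, my own sketch.
import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(0.0, 1.0, (64, 64))   # "true" scene brightness
rate, t, N = 1000.0, 0.02, 16             # photons/sec at brightness 1.0

def expose(duration):
    # One exposure: Poisson photon counts, normalized back to brightness.
    return rng.poisson(scene * rate * duration) / (rate * duration)

single_short = expose(t)
averaged     = np.mean([expose(t) for _ in range(N)], axis=0)
single_long  = expose(N * t)

for name, img in [("1 short frame", single_short),
                  ("16 averaged  ", averaged),
                  ("1 long frame ", single_long)]:
    rmse = float(np.sqrt(np.mean((img - scene) ** 2)))
    print(name, "RMS error vs. scene:", round(rmse, 4))
[/CODE]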
 
They aren't Google promo pictures. They were taken by The Verge.
Not the top photo comparison. I saw that photo at the Google event before anyone outside of Google touched it. But yes, the photos below (I recognize the HTML5 sliders The Verge uses) are not Google's, so you're right about those.
 
This is great for landscape photography but will be useless for capturing people in the moment without them being "smeary." It seems like an easy software implementation to just tell the camera to keep the exposure open for five seconds, like any other mirrorless or DSLR camera.

Now if this works with an "instant shutter," then I'll be blown away.

It's close to impossible to get a non-blurry shot with even moderate exposure times when the camera is handheld. Even half a second is far, far too long (IIRC, around a 30th of a second or less is about what's necessary, and even that can be difficult if you don't have a steady hand). Looking at the photos, I can guess what's going on; it's not really a new trick. The trick is doing it all at once, in real time, in the phone's camera software. It's the same thing you might do if you wanted to take candid photos at night or in a dark environment:

1) Set the shutter speed to a slightly long, but not overlong, duration, so it will work handheld or with some movement in the scene.

2) Set the aperture wide open to allow maximum light. That (probably) still won't be enough light and you will (probably) still get an underexposed image, but don't worry about that. In other words, take the pic underexposed with the knowledge that you'll be pushing the exposure later.

3) "Push" the "processing" until the scene is exposed approximately as you wish (or, in this case, the phone will surely estimate what the scene should look like). This will increase the appearance of grain.

4) If the grain is extreme, as it likely will be (although most modern digital cameras are WAY better than older ones in this regard), then blur it out. If it's not too bad, just leave it in; nothing wrong with some film grain! (A rough code sketch of steps 3 and 4 follows below.)

Now, if you hide that entire process behind a phone and software smart enough to do it all automagically when you select the mode, it looks like something brand new. It is a great idea to make a (slightly) complex process as easy as selecting a mode, though.
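For what it's worth, steps 3 and 4 boil down to something like this in code (a toy grayscale sketch of my own; the non-local-means denoiser just stands in for whatever a phone really does):

[CODE]
# Toy "shoot dark, push, denoise" pipeline for a single frame.
# Assumes a linear-light grayscale image; not any phone's real code.
import cv2
import numpy as np

def push_and_denoise(raw, stops=3.0, strength=7):
    """raw: float32 grayscale image in [0, 1], deliberately underexposed.
    stops: how far the shot was underexposed, i.e. how hard to push."""
    pushed = np.clip(raw * (2.0 ** stops), 0.0, 1.0)   # step 3: push exposure
    img8 = (pushed * 255).astype(np.uint8)
    # Step 4: smooth out the amplified grain (simple non-local means here).
    return cv2.fastNlMeansDenoising(img8, None, strength, 7, 21)
[/CODE]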
 
It’s amazing what they have been able to do...

Still pictures on smartphones these days are so elite.
 
I give Google props, and I think this competition is good for Apple. But, to me, it's creating light where there actually isn't. And how many people take pictures in a pitch-black environment? Isn't this why we have a flash? I also have an app that can take similar pictures (Nightcap).

Mobile phone cameras have a hard time in dim light. They cannot see in the dark as well as the human eye. Did you ever notice that a photo you took in low light looks nothing like what your eyes are seeing? Google is just trying to replicate what your eyes see. They aren't adding any light that isn't there; instead, they're trying to make the camera capture as much of the light as the naked eye does.

Night Sight is great branding. Will Apple simply name their version "Magic Sight" a year from now?

Revolutionary and groundbreaking iSight.
 
Keep in mind that it's still unreleased software, so maybe by the time it reaches the final version we'll see some overall improvements in color accuracy and more natural contrast/sharpness.
Maybe... it's basically an artificial image. It may look brighter than another image, but that doesn't mean the image is good or realistic.

If I use my DSLR to take a picture at night, I might want to preserve some of the natural lighting rather than just blowing everything out to make it look like day. What's the point of that? You can just use a flash if the subject is close enough.
 
Hilarious how many people think this is fake. What do you think PhD students and other computer science researchers do all day? They come up with new algorithms, or they study how to apply new ideas to old problems.

The emergence of dedicated computing hardware for deep neural networks, and of huge data sets to train them, has led to all kinds of interesting applications, and this is one of them.

Google is way ahead of everyone on this ML stuff because they bet on it early. It will take some time for Apple and others to catch up.
 

Exactly! I mean, if this is fake, then all photos are fake, because they are merely mathematical representations of what actually occurred.
 