This thread is funny. All of the Android faithful flock to an Apple site to get off on a feature that can be accomplished with a number of iOS apps. I guess you have to grasp at anything with the Pixel line today.
 
I found Smart HDR much more useful and interesting. This is pretty interesting, and since it is software we will see similar things in iOS or apps. Taking photos at night and turning them into day photos is pretty stupid anyway, but taking photos in low light and being able to see details is indeed a great thing.

See, you get it. This is what I meant earlier about a low light photo rendering something close to what my eyes are seeing, not turning 11:30pm into 11:30am.
 
This paper was published on 4 May 2018:

https://arxiv.org/pdf/1805.01934.pdf

Here is one of the authors' websites, with video:

http://cchen156.web.engr.illinois.edu/SID.html

It describes a technique for characterizing the low-light performance of a sensor using long-duration, properly exposed low-light images. They then take underexposed images of the same scene and feed the underexposed images to the input of a neural network, with the properly exposed images as the expected output.

The neural network then learns what the crappy, high-ISO, short-duration low-light images are supposed to represent. Once the network is trained, you can put in a never-seen-before crappy low-light image and get back what that image would look like if it were properly exposed.

Almost certainly Google's stuff is building off of this paper. Apple won't be far behind, especially since they have all this dedicated neural network capability in the XS.
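For the curious, here is a minimal PyTorch sketch of that training setup. To be clear, this is not the paper's actual code: the tiny conv net is a stand-in for their U-Net, and the random tensors are placeholders for paired short/long-exposure images scaled to [0, 1].

```python
# Minimal sketch of the training idea in the SID paper (arXiv:1805.01934):
# map underexposed frames to their properly exposed counterparts.
# The tiny conv net below is a toy stand-in for the paper's U-Net.
import torch
import torch.nn as nn

class TinyEnhancer(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

model = TinyEnhancer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # the paper trains with an L1 loss

# Placeholder batch: dark, noisy inputs and their long-exposure
# ground truth (real training uses pairs of raw sensor images).
short_exposure = torch.rand(4, 3, 128, 128) * 0.05
long_exposure = torch.rand(4, 3, 128, 128)

for step in range(100):
    pred = model(short_exposure)
    loss = loss_fn(pred, long_exposure)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Once trained, a never-seen-before dark frame goes in, and an
# estimate of the properly exposed frame comes out.
```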
 
Not sure what this "machine learning to choose the right colors based on the content of the image" is about, but just lifting shadows in Preview (which is very basic software) gives about the same result in terms of colours, but with a lot more noise (of course). So it seems Google is a LOT better at noise reduction (NR), but maybe a dedicated NR program like DxO PhotoLab's PRIME could give the same result?
[Attached image: low-light-3.jpg]
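To make the noise point concrete, here is a toy numpy illustration (made-up numbers, and not what Preview or DxO actually does internally): lifting shadows with a simple gamma curve boosts the noise right along with the signal, which is why the lift has to be paired with serious noise reduction.

```python
# Toy demo: a gamma-style shadow lift amplifies whatever noise is
# riding on the dark pixels. Numbers are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

signal = np.full((100, 100), 0.02)             # very dark scene
noise = rng.normal(0.0, 0.005, signal.shape)   # sensor read noise
dark = np.clip(signal + noise, 0.0, 1.0)

lifted = dark ** (1 / 3.0)  # aggressive shadow lift (gamma 3)

print("before lift - mean:", dark.mean(), "std:", dark.std())
print("after lift  - mean:", lifted.mean(), "std:", lifted.std())
# The std (visible noise) grows along with the brightness, so a
# denoiser has to clean up what the lift amplifies.
```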
 
I found Smart HDR much more useful and interesting. This is pretty interesting, and since it is software we will see similar things in iOS or apps. Taking photos at night and turning them into day photos is pretty stupid anyway, but taking photos in low light and being able to see details is indeed a great thing.
You are confusing two things. Google already has an HDR feature that is comparable to Apple's, and in some ways better; it's really good at preserving highlights.
 
I... I don't think I can get behind this.

If I'm shooting at night, it's for a reason. I'd much rather have a camera capable of rendering what I'm actually seeing versus one that can apply a filter to turn night into day. This is essentially the same kind of thing that Prisma does, turning a photo into "art." It's no longer reality.

This is closer to showing what the human eye actually sees. If anything, it is more like reality.
 
I wish the before shots were more realistic. I'm not accusing Google of faking results exactly, but they obviously darkened the blacks in the before shots to make the transformation more dramatic. There is zero data for AI to determine anything in those black areas of the "before" pics. Why not show the real before pics?
They aren't Google promo pictures. They were taken by The Verge.
 
But the Verge isn't Apple-loving... like, at all...
Then that's changed because they used to drool over Apple.

Why didn't Apple come up with this idea first?
They'll come out with it second and call it innovative.

This is the most ridiculous thing I have ever read. If Apple introduced this you would think it was amazing. And rightly so.
Exactly! This is one of the most biased and even blinded forums I've ever joined.

Apple is the most innovative company in history; they have a very long record of "world first..." or "best...".
But they are not gods, and it is nice that someone else can push good things too.
But Apple isn't the most innovative company in history. Not even close.
 
Let's be clear: this is machine learning (ML). Apple has invested in the processing of this kind of data with the "Bionic" part of the A11 and A12 chips (the whole 5-trillion-operations-per-second neural thingy).

So there's absolutely no excuse to save this feature for the next "device"; instead, this should/will be a matter of a software update.

If the multiple shots from the camera are available via the SDK, then someone with a data science and ML background could create an app to do just this.
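As a rough sketch of what such an app could do with those shots, here is the textbook frame-averaging trick in numpy. This is not Google's actual pipeline (which also aligns frames and rejects motion), just the core statistical idea:

```python
# Averaging N frames of a static scene cuts random noise by about
# sqrt(N). Alignment and motion rejection are deliberately skipped.
import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(0.0, 0.1, (100, 100))  # dark "ground truth" scene

# Sixteen noisy captures of the same scene (read noise only).
frames = [scene + rng.normal(0.0, 0.02, scene.shape) for _ in range(16)]
merged = np.mean(frames, axis=0)

print("single-frame error:", np.abs(frames[0] - scene).mean())
print("merged error:      ", np.abs(merged - scene).mean())
# 16 frames -> noise drops by roughly 4x before any ML runs at all.
```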

I've been in the Apple ecosystem for years, but you have to respect good innovation. Google has done an amazing job on this, thanks to their focus on ML and the sheer amount of data they have to learn from.

Does that mean I'm ready to switch? ...no. My last device, an iPhone 6, has lasted me 4 years (and would have made 5 had it not been for a shattered screen). To me, the commitment to being green, providing devices that aren't obsolete 2-3 years later, a clear upgrade path, security, and privacy will keep me here over software features.

I did think about switching this time around. You bet I did! However, similarly spec'd Android products are also pushing $1000 price points. Sure, Apple has a premium price point, and I might be drinking the Apple Kool-Aid, but I do believe they have our Earth's best interest at heart. I'm purchasing another iPhone XR, so I'll be here at least another 5-6 years.
 
And all the usual Apple faithful come flocking to dismiss a solid Android feature that Apple has not implemented.

The feature has been available for years, just not implemented in the stock app. It's a fine feature, cool and all, just not as revolutionary as you all are claiming. It's just funny to witness.
 
Then that's changed because they used to drool over Apple.

TheVerge is an interesting beast. Since Vox took over, there's been an interesting dynamic between it and Apple products.

Often the review itself will be scathing ("look at these problems"), but then the TL;DR is "BUY BUY BUY! 9/10!"

You'll get reviews where (historical example) they'll slam an Android phone for not having NFC, and then, when reviewing a non-NFC iPhone, completely ignore it on the list of cons.

There's very little consistency left there regarding balanced reporting. Some of their stuff is still pretty good, but then they'll give mixed messages: another example is shipping their entire team down to the iPhone event, but deeming the Note 9 event "not worth our time."

It's almost a passive-aggressive "We don't actually LIKE Apple, but we kind of somehow have to keep supporting them" vibe.
 
Since Google can do this on the Pixel 1 (and given how anemic its CPU is in comparison to the A series), Apple should probably be able to do it all the way back to the 6s. Get to it, Apple engineers; I'd love to have this rolled out next summer.

Apple will say it can only work on the 2019 iPhones because they are the only iPhones with the retina liquid magnetic camera.
 
The human eye has a dynamic range far greater than any modern camera sensor, especially the ones in smartphones. Granted, there is some individual variation in how well a person can see at night, some better, some worse, but in general the Pixel 3 is doing a rather good job of reproducing the dynamic range we would see with our own eyes. I've had to use long-exposure photography to show clients how well their prospective employees can see at night, using the exposure settings to dial in what they see in our reference field environment.

What Google is doing here is fantastic, and I really hope Apple will come in and offer this kind of multi-frame exposure algorithm in their camera natively soon.
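For anyone who wants numbers behind the "dial it in with exposure settings" point, here is some back-of-the-envelope exposure math. The formula is the standard ISO-100-referenced exposure value; the specific settings are hypothetical:

```python
# EV100 = log2(N^2 / t) - log2(ISO / 100). A lower EV100 means the
# settings gather enough light to correctly expose a darker scene.
import math

def ev100(aperture: float, shutter_s: float, iso: int = 100) -> float:
    return math.log2(aperture ** 2 / shutter_s) - math.log2(iso / 100)

# Hypothetical settings: hand-held phone shot vs. tripod long exposure.
handheld = ev100(1.8, 1 / 10, iso=800)  # ~2.0
tripod = ev100(1.8, 4.0, iso=100)       # ~-0.3

print(f"hand-held EV100: {handheld:.1f}")
print(f"tripod EV100:    {tripod:.1f}")
print(f"gap: {handheld - tripod:.1f} stops")
# That ~2.3-stop gap is what multi-frame merging and denoising
# have to make up computationally on a phone.
```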
 
Traditional camera companies like Canon will be completely obsolete within 5 years.

No they won't. Point-and-shoots have been killed off by smartphones, but smartphones will never be able to match mid-to-high-range DSLRs and mirrorless cameras, just from a pure physics standpoint.

Canon, Nikon, and Sony are all offering cameras past 40 megapixels, which is enabled largely through two advantages DSLRs/mirrorless cameras possess that smartphones never will. The first advantage is big image sensors with bigger pixels to capture more light, allowing resolution to increase without a penalty in noise. Yes, these low-light images are impressive for a smartphone, but they still pale in comparison to what, say, a Sony A7-series camera can do. The second big advantage is that big cameras have big lenses, which, aside from letting in more light and using more elements to achieve crisper images, allow more detail to pass through to properly utilise all those pixels. Notice that few phones go beyond 16 megapixels now. That's not a limit on the number of pixels we can fit onto the sensor, but a limit on how much light information (detail) can fit through the smaller optics on a phone.
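The sensor-area advantage is easy to put rough numbers on (standard full-frame dimensions; the phone figure is an approximate stand-in for a typical ~1/2.55-inch smartphone sensor):

```python
# Sensor area sets how much light can be collected per exposure.
import math

full_frame_mm2 = 36.0 * 24.0   # ~864 mm^2 (standard full-frame)
phone_mm2 = 5.6 * 4.2          # ~23.5 mm^2 (approximate phone sensor)

ratio = full_frame_mm2 / phone_mm2
print(f"area ratio: {ratio:.0f}x")            # ~37x more area
print(f"in stops:   {math.log2(ratio):.1f}")  # ~5 stops of light
```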

On top of all that, the flexibility you get from being able to swap lenses is indispensable, as is the massive range of optical zoom some lenses offer and the control you get with a variable aperture (yes, a few phones have a variable aperture, but not the massive range of f-stops that DSLR lenses offer).

The abridged version of all this is that no, phones will not ever replace DSLRs and mirrorless cameras, based purely on the physics of how light behaves.
 
But Apple isn't the most innovative company in history. Not even close.
Yeah, Apple just invented the current smartphone and iOS, which Android copied.

I'm not saying Apple is the most innovative company in the world, but they DID invent the modern smartphone, the tablet, and the software that runs them. No one is saying they made the first smartphone, but they were the first to get it right, and Android copied them.

The original iPhone's form factor and icon grid are still in use today.
 
Where has that giant shadow behind the car come from? It's being lit by a floodlight. Fake.
 
Then that's changed because they used to drool over Apple.

They've never drooled over Apple. Android fanboys fall over themselves when they see good review scores for iPhones and don't bother reading the text. The Verge stopped assigning scores, requiring people to read, understand nuance, and think critically... and now those same fanboys can't figure out what to do. They're lost.
 