Then that would be wanted flare. The question now is how Apple distinguishes wanted flare from unwanted.
Since Apple is aware of the artifacts its phones capture, it could look at the photo data, scan the image for bright spots (captured in HDR), then check the corresponding areas of the frame for those little green blobs and do a “heal” on them.
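
For illustration only, here’s a rough Python/OpenCV sketch of what that could look like. This is just my guess at the approach, not anything Apple has confirmed; `heal_green_ghost` and every threshold in it are made up for the example. It relies on the fact that flare ghosts typically land roughly mirrored through the optical center from the light source:

```python
import cv2
import numpy as np

def heal_green_ghost(img_bgr: np.ndarray) -> np.ndarray:
    """Hypothetical flare-ghost removal: find the brightest light source,
    predict the ghost mirrored through the image center, and inpaint it."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    # Brightest point stands in for the HDR highlight data the real
    # pipeline would have access to.
    _, _, _, (x, y) = cv2.minMaxLoc(cv2.GaussianBlur(gray, (11, 11), 0))
    h, w = gray.shape
    # Flare ghosts sit roughly diametrically opposite the source.
    gx, gy = w - 1 - x, h - 1 - y
    # Mask a small region around the predicted ghost location, keeping
    # only pixels where green clearly dominates (the "little green blob").
    mask = np.zeros((h, w), np.uint8)
    cv2.circle(mask, (gx, gy), 25, 255, -1)
    b, g, r = cv2.split(img_bgr.astype(int))
    greenish = ((g > r + 20) & (g > b + 20)).astype(np.uint8) * 255
    mask = cv2.bitwise_and(mask, greenish)
    # "Heal" the blob by inpainting it from the surrounding pixels.
    return cv2.inpaint(img_bgr, mask, 5, cv2.INPAINT_TELEA)
```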

However, I see no indication of this being done in the latest betas. I captured a few images with an obvious bright point, and the artifact is still there. Has anyone other than the original poster even seen this in use?

It’s to the point where someone could post “Apple <something obviously false, like ‘everyone who uses an iPhone eventually loses a pinky’>” and the pro/con back-and-forth would start even though nothing indicates the statement is true! Just include “appears” or “rumored” in your headline and you’re covered!

EDIT:
So, if you take a Live Photo with an obvious green hot spot floating around and THEN, after taking it, choose “Long Exposure” as the effect, that green spot goes away. So if “in certain conditions” means “when I choose Long Exposure,” then I think that’s expected. ;)
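
Apple hasn’t published how the Long Exposure effect works, but it behaves like a per-pixel blend of the aligned Live Photo frames, which would explain the vanishing spot: a blob that drifts between frames gets diluted into the background. A minimal sketch of that idea (`long_exposure` is a hypothetical helper, not an Apple API):

```python
import numpy as np

def long_exposure(frames: list[np.ndarray]) -> np.ndarray:
    """Blend Live Photo frames into one synthetic long exposure by
    averaging; a transient green spot contributes only 1/N per pixel."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)
```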
 