Now just photoshop an image of the moon taken from a telescope on it 😂

[attached photo] This was a photo from my iPhone 14 just a few days ago; I adjusted the brightness and saturation.
There are plenty of far-side photos you can find online. I'm not suggesting they send up a rocket! Their procedure involved blurring a photo on a computer and pointing the camera at the screen. So just substitute the photo.

My, that would be a pretty expensive experiment! They could have also photographed Jupiter through a telescope and seen if the picture is that of the Moon. In my opinion, the AI should handle both cases.
And now that he has been released, they took him back and made him an executive chairman. It’s like a mob family 🤣

What do you expect from a company whose CEO has been to prison for fraud.
Oh, what do you know? He actually says exactly what everyone (else) here has been saying: that the Samsung phones change the image by adding detail that isn't there. Detail that simply can't be genuine based on the lens and sensor in the phone. Detail (and focus and exposure correction) that have been created by Samsung's method. He even says Samsung has an entire document that explains what they're doing.

Oh what do you know? It's not actually "Fake". Sorry, Apple sheep!
I have found the 14PM is under-processing sunrises/sunsets in photos. It always seems to be muted compared to what is there. Videos (HDR) taken are a little better, but still underwhelming compared to what is taking place.

Let's not let iPhone photography off the hook here. They might not be inserting fake images, but the post-processing is so over the top now that I hesitate to even open my phone camera around sunset. "Golden hour" photos of people take on this horrible faked HDR look that is so processed it looks like an illustration more than a photo.
I get that there's a lot that has to happen to turn the output of a tiny photo sensor into a recognizable photo, but it's been getting very heavy-handed with HDR in the past few years.
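For anyone curious what the "a lot that has to happen" part looks like, here is a minimal sketch of one common step, multi-exposure merging, using OpenCV's Mertens exposure fusion. The bracketed file names are hypothetical, and phone pipelines layer tone mapping, local contrast, and semantic adjustments on top of a merge like this, which is where the over-processed look tends to creep in.

```python
# Minimal exposure-fusion sketch (Mertens method) with OpenCV.
import cv2

# Load an exposure bracket of the same scene (hypothetical file names).
exposures = [cv2.imread(f) for f in ("under.jpg", "mid.jpg", "over.jpg")]

# Mertens fusion blends the best-exposed regions of each frame.
fused = cv2.createMergeMertens().process(exposures)

# The result is float32 in roughly [0, 1]; rescale before saving.
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```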
Damn! Apple is faking sunset pictures?!

I have found the 14PM is under-processing sunrises/sunsets in photos. It always seems to be muted compared to what is there. Videos taken are a little better, but still underwhelming compared to what is taking place.
The orange was much more intense and vibrant than the photo shows.
I am an idiot, that's what's going on. 🤦‍♂️🤦‍♂️ The moon isn't in the same location in the sky every time. 🤦‍♂️ Because of that, it appears to rotate with its pole pointed at Earth. Dammit Jim, I'm a bean counter, not a stargazer.

Mousse says they’ve taken hundreds of pictures of the full moon, though. What is going on!
In Photos, tap Edit, choose the circle in the middle, and swipe to choose "Vivid" or "Vivid Warm". The data is there, and you can actually make it look more like what you remember with the other editing options.

I have found the 14PM is under-processing sunrises/sunsets in photos. It always seems to be muted compared to what is there. Videos (HDR) taken are a little better, but still underwhelming compared to what is taking place.
The orange was much more intense and vibrant than the photo shows.
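For readers who want to see why "the data is there": a "Vivid Warm"-style edit roughly amounts to a saturation boost plus a warmer white balance. Apple's actual filter pipeline is not public, so the Pillow sketch below and its scaling factors are purely illustrative, and the file name is hypothetical.

```python
# Rough approximation of a saturation + warmth edit with Pillow.
from PIL import Image, ImageEnhance

img = Image.open("sunset.jpg")  # hypothetical file name

# Boost saturation to bring the muted orange back.
vivid = ImageEnhance.Color(img).enhance(1.4)

# Warm the white balance by nudging red up and blue down.
r, g, b = vivid.split()
warm = Image.merge("RGB", (
    r.point(lambda v: min(255, int(v * 1.08))),
    g,
    b.point(lambda v: int(v * 0.94)),
))
warm.save("sunset_vivid_warm.jpg")
```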
I just tried the same thing and got a consistently different result...
This was from my iPhone 11 Pro Max when I held it up to my telescope eyepiece.

Now just photoshop an image of the moon taken from a telescope on it 😂
What is demonstrated in the Reddit discussion, most recently, is absolutely not possible via deconvolution. Any of the state-of-the-art deconvolution solutions available will not bring you anywhere close to that result. It is cleanly, and extensively, outside "realms of consideration."

I am familiar with various astrophotography software, in particular this one...
Lunar and solar deconvolution | Astro-photo, by André van der Hoeven (www.astro-photo.nl)
...and it works as demonstrated at that link...well within the, cough, "realms of consideration".
The point of my posting is to educate the readers of this thread who are either maybe not familiar with or maybe not even aware of what is possible with deconvolution (or, in another word, well, math).
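For anyone who wants to try the math themselves, here is a minimal deconvolution sketch using scikit-image's Richardson-Lucy implementation. The Gaussian PSF is an assumption (real lunar workflows estimate the PSF from the optics and seeing), and this is not the specific method used at the linked site; it just shows the basic idea.

```python
# Blur a known-sharp lunar image, then deconvolve it back.
import numpy as np
from scipy.signal import convolve2d
from skimage import data, restoration

def gaussian_psf(size=15, sigma=2.0):
    """Simple Gaussian point-spread function standing in for optical blur."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

# scikit-image ships a sample lunar-surface image, so this runs as-is.
sharp = data.moon() / 255.0
psf = gaussian_psf()
blurred = convolve2d(sharp, psf, mode="same", boundary="symm")

# Richardson-Lucy iterates toward the most likely un-blurred image.
# It recovers real structure the blur smeared out, but it cannot add
# craters the blurred data does not constrain.
restored = restoration.richardson_lucy(blurred, psf, 30)
```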
It probably goes without saying, but you absolutely can capture material detail in a moon photo using a telescope and a phone as a simple capture device. Could get much sharper and more detailed than this, even. The primary limitations these smartphones are up against when it comes to the moon, outside their sensors, are the optics themselves.

This was from my iPhone 11 Pro Max when I held it up to my telescope eyepiece.
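A quick back-of-the-envelope check on the "optics are the limit" point, using the Rayleigh criterion (θ ≈ 1.22 λ/D). The aperture values below are rough assumptions for a phone lens and a small telescope, not measurements of any particular model.

```python
# How many diffraction-limited resolution elements fit across the Moon?
import math

WAVELENGTH = 550e-9          # green light, meters
MOON_DIAMETER_ARCSEC = 1800  # apparent lunar diameter, roughly

def resolvable_elements(aperture_m: float) -> float:
    """Resolution elements across the Moon's disk for a given aperture."""
    theta_rad = 1.22 * WAVELENGTH / aperture_m
    theta_arcsec = math.degrees(theta_rad) * 3600
    return MOON_DIAMETER_ARCSEC / theta_arcsec

print(f"~5 mm phone lens:  {resolvable_elements(0.005):.0f} elements")
print(f"150 mm telescope:  {resolvable_elements(0.150):.0f} elements")
```

Under these assumptions a bare phone lens resolves only a few dozen elements across the lunar disk, while even a modest telescope resolves a couple thousand, which is why holding the phone up to an eyepiece captures so much more genuine detail.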
I find the hard part is locating the tiny exit pupil with the phone.

It probably goes without saying, but you absolutely can capture material detail in a moon photo using a telescope and a phone as a simple capture device. Could get much sharper and more detailed than this, even. The primary limitations these smartphones are up against when it comes to the moon, outside their sensors, are the optics themselves.
Umm... This is an article at MacRumors; I'm guessing most of the readers here are Apple fans.

Funny how Apple fans are so quick to jump all over this.