My, that would be a pretty expensive experiment! They could also have photographed Jupiter through a telescope and checked whether the resulting picture is that of the Moon. In my opinion, the AI should handle both cases.
There are plenty of far-side photos you can find online. I'm not suggesting they send up a rocket! Their procedure involved blurring a photo on a computer and pointing the camera at the screen. So just substitute the photo.
 
Oh, what do you know? It's not actually "fake". Sorry, Apple sheep!

Oh, what do you know? He actually says exactly what everyone (else) here has been saying. That the Samsung phones change the image by adding detail that isn't there. Detail that simply can't be genuine based on the lens and sensor in the phone. Detail (and focus and exposure correction) that have been created by Samsung's method. He even says Samsung has an entire document that explains what they're doing.

He tries to handwave it away by saying "all phone cams change what we're looking at anyway" but he never, ever says the Samsung photos haven't been faked.

🐑🐑🐑
 
Anyone still here? There is more:

Windows to Chinese characters:


Gray rectangles were placed over the moon and the sky, then photographed with the Samsung:

 
Let's not let iPhone photography off the hook here. They might not be inserting fake images, but the post-processing is so over the top now that I hesitate to even open my phone camera around sunset. "Golden hour" photos of people take on this horrible faked HDR look that is so processed it looks like an illustration more than a photo.

I get that there's a lot that has to happen to turn the output of a tiny photo sensor into a recognizable photo, but the HDR has been getting very heavy-handed in the past few years.
I have found the 14PM is under-processing sunrises/sunsets in photos. It always seems muted compared to what is there. Videos (HDR) are a little better, but still underwhelming compared to what is taking place.

The orange was much more intense and vibrant than the photo shows.
 

Attachments

  • IMG-3966 - Copy (2).jpg
I have found the 14PM is under-processing sunrises/sunsets in photos. It always seems muted compared to what is there. Videos are a little better, but still underwhelming compared to what is taking place.

The orange was much more intense and vibrant than the photo shows.
Damn! Apple is faking sunset pictures?!
 
My friend at work has the vanilla Galaxy S21, and I told him about my Note20 Ultra and the moon. He got curious about his phone, so we went outside and he raised the zoom from a low magnification until the moon got perfectly clear at exactly 30x. That is the S21's highest zoom capacity.

I later tested on Note20 Ultra at home and noticed that the moon looked blurry at 29x but once I hit 30x the screen suddenly showed the clear moon.

Try that between 29x and 30x on your Android phone, if it goes to 30x or higher, and you will see.

Why so special at 30x?
 
Mousse says they’ve taken hundreds of pictures of the full moon, though. What is going on!
I am an idiot, that's what's going on.🤦‍♂️🤦‍♂️ The moon isn't in the same location in the sky every time.🤦‍♂️ Because of that, it appears to rotate with its pole pointed at Earth. Dammit Jim, I'm a bean counter, not a stargazer.

Hundreds of shots is nothing for someone who takes multiple shots in one sitting. I ain't gonna spend 15 minutes setting up my rig and centering the moon for a single shot. The first half dozen are to get the exposure I want. The rest is experimenting with different shutter-to-aperture ratios.
 
I have found the 14PM is under-processing sunrises/sunsets in photos. It always seems muted compared to what is there. Videos (HDR) are a little better, but still underwhelming compared to what is taking place.

The orange was much more intense and vibrant than the photo shows.
In Photos, tap Edit, choose the circle in the middle, and swipe to select "Vivid" or "Vivid Warm". The data is there, and with the other editing options you can actually make it look more like what you remember.
 
I am familiar with various astrophotography software, in particular this one...

Lunar and solar deconvolution | Astro-photo

...and it works as demonstrated at that link...well within the, cough, "realms of consideration".

The point of my posting is to educate readers of this thread who may not be familiar with, or even aware of, what is possible with deconvolution (or, in a word, math).
What is demonstrated in the Reddit discussion, most recently, is absolutely not possible via deconvolution. Any of the state-of-the-art deconvolution solutions available will not bring you anywhere close to that result. It is cleanly, and extensively, outside "realms of consideration."
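For readers who have never seen deconvolution in action, here is a minimal, hypothetical Richardson-Lucy sketch in pure NumPy (1-D for brevity; the function name and toy signal are my own, not taken from the linked tool). It illustrates the point both posts are making: with a known point-spread function, iteration can sharpen detail the optics smeared together, but it only redistributes light that was actually recorded; it cannot conjure surface detail that never reached the sensor.

```python
# Minimal Richardson-Lucy deconvolution sketch in pure NumPy (1-D).
# Assumes the blur kernel (PSF) is known exactly and noise is negligible,
# the same conditions the astro tools above rely on.
import numpy as np

def richardson_lucy(blurred, psf, iterations=30):
    """Iteratively estimate the unblurred signal from `blurred` given `psf`."""
    estimate = np.full_like(blurred, 0.5, dtype=float)
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        # Predicted blurred signal from the current estimate.
        predicted = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(predicted, 1e-12)
        # Multiplicative update pulls the estimate toward consistency
        # with the observed data while staying non-negative.
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy example: two sharp "craters" smeared together by a box-shaped PSF.
truth = np.zeros(32)
truth[10] = 1.0
truth[14] = 1.0
psf = np.ones(5) / 5.0
blurred = np.convolve(truth, psf, mode="same")

restored = richardson_lucy(blurred, psf, iterations=200)
```

The restored signal ends up much closer to the two original spikes than the blurred one, which is exactly the "well within the realms of consideration" kind of sharpening; note that the algorithm needs the true PSF as input and never invents features, which is why it cannot explain textures appearing on a gray rectangle.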
 
This was from my iPhone 11 Pro Max when I held it up to my telescope eyepiece.
It probably goes without saying, but you absolutely can capture real detail in a moon photo using a telescope with a phone as a simple capture device. You could get much sharper and more detailed results than this, even. When it comes to the moon, the primary limitation these smartphones are up against, beyond their sensors, is the optics themselves.
 
It probably goes without saying, but you absolutely can capture real detail in a moon photo using a telescope with a phone as a simple capture device. You could get much sharper and more detailed results than this, even. When it comes to the moon, the primary limitation these smartphones are up against, beyond their sensors, is the optics themselves.
I find the hard part is locating the tiny exit pupil with the phone.
 
I think a better test is to take a moon photo from Saturn and retry it.
 