
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Samsung's "Space Zoom" feature has come under fire amid complaints that images of the moon are being artificially enhanced to an extreme extent.

samsung-s23-ultra-rear.jpg

Samsung introduced a 100x zoom feature with the Galaxy S20 Ultra in 2020, and it has become a mainstay on the company's recent flagship handsets. Since its debut, Samsung has touted its devices' ability to take impressive pictures of the moon. Unlike brands such as Huawei, which simply overlay a PNG of the moon onto such shots, Samsung says that no overlays or texture effects are applied.

Yet on Friday, a Samsung user on the subreddit r/Android shared a detailed post purporting to "prove" that Samsung's moon shots are "fake." Their methodology involved downloading a high-resolution image of the moon, downsizing it to just 170 by 170 pixels, clipping the highlights, and applying a gaussian blur to heavily obscure the moon's surface details. This low-resolution image was then displayed on a monitor and photographed from a distance with a Samsung Galaxy device. The resulting image has considerably more detail than its source.
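For readers who want to recreate the degraded source image, the procedure boils down to a few lines of image processing. The sketch below uses Pillow; the file names, highlight threshold, and blur radius are illustrative guesses rather than the Redditor's exact values.

```python
# Recreate the Reddit test's degradation step (file names and parameters are
# illustrative guesses, not the original poster's exact values).
from PIL import Image, ImageFilter

# Start from any high-resolution photo of the moon.
moon = Image.open("moon_highres.jpg").convert("L")

# Downsize to 170 by 170 pixels, discarding most of the surface detail.
small = moon.resize((170, 170), Image.LANCZOS)

# Flatten the highlights so the brightest areas lose their remaining texture.
clipped = small.point(lambda px: min(px, 215))

# Apply a gaussian blur to obscure what little detail is left.
blurred = clipped.filter(ImageFilter.GaussianBlur(radius=3))

# This is the image that was displayed on a monitor and photographed
# from across the room with a Galaxy device.
blurred.save("moon_degraded.png")
```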

samsung-moon.jpeg

Samsung devices seemingly achieve this effect by applying machine learning trained on a large number of moon images, making the photography effect purely computational. This has led to accusations that a texture is, functionally, still being applied to images of the moon and that the feature is a disingenuous representation of the camera hardware's actual capabilities, triggering heated debate online and even calling the iPhone's reliance on computational photography into question.
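To make the accusation concrete, the toy sketch below shows what "functionally applying a texture" could look like: grafting the high-frequency detail of a stored moon reference onto a blurry capture. This is emphatically not Samsung's actual pipeline, only an illustration of the claim; every file name and parameter is invented.

```python
# Toy illustration of the accusation -- NOT Samsung's actual pipeline.
# Grafts the fine detail of a stored moon reference onto a blurry capture.
import numpy as np
from PIL import Image, ImageFilter

# The washed-out frame the camera actually captured (hypothetical file).
captured = Image.open("blurry_capture.png").convert("L")

# A stored, highly detailed moon reference, resized to match the capture.
reference = Image.open("stored_moon_reference.png").convert("L").resize(captured.size)

cap = np.asarray(captured, dtype=np.float32)
ref = np.asarray(reference, dtype=np.float32)

# High-frequency "texture" of the reference: the reference minus a blurred copy of itself.
ref_lowpass = np.asarray(reference.filter(ImageFilter.GaussianBlur(4)), dtype=np.float32)
detail = ref - ref_lowpass

# Keep the capture's overall tones, add the reference's fine detail on top.
enhanced = np.clip(cap + detail, 0, 255).astype(np.uint8)
Image.fromarray(enhanced).save("suspiciously_detailed_moon.png")
```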

Article Link: Samsung 'Fake' Moon Shots Controversy Puts Computational Photography in the Spotlight
 
One of the things Samsung has claimed about its moon shot feature is that it uses multiple frames to create the final image, but since this experiment used a permanently blurred source image, the phone cannot have been combining multiple frames to reconstruct a sharper one. I'd be interested to hear how Samsung explains that discrepancy.
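A quick way to see the point: multi-frame fusion can average away sensor noise, but it cannot restore detail that was never in the scene. The sketch below simulates that with numpy and Pillow; the frame count, noise level, and file names are assumptions for illustration only.

```python
# Toy demonstration (assumed setup, not Samsung's algorithm): averaging many
# noisy exposures of a permanently blurred subject reduces noise, but cannot
# recover detail that was never in the scene.
import numpy as np
from PIL import Image

source = Image.open("moon_degraded.png").convert("L")   # the blurred source image
base = np.asarray(source, dtype=np.float32)

# Simulate 16 handheld exposures: the same blurred scene plus independent sensor noise.
rng = np.random.default_rng(0)
frames = [base + rng.normal(0, 12, base.shape) for _ in range(16)]

# Stacking: the result is cleaner than any single frame, but no sharper than
# the blurred source -- fusion averages out noise, it does not invent detail.
stacked = np.clip(np.mean(frames, axis=0), 0, 255).astype(np.uint8)
Image.fromarray(stacked).save("stacked.png")
```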
 
Jesus wept. 🤦🏻‍♂️

So now even our own photographs get faked?

I just do not understand the mentality behind this at all…

"Your 100X zoom 50,000,000,000 pixel phone camera is incapable of taking a truly good photo of the moon?

Never mind, here's a faked version for you to brag about on Insta!

'Cos you're a winner baby!"



SMFH.
 


Samsung's "Space Zoom" feature has come under fire amid complaints that images of the moon are being artificially enhanced to an extreme extent.

samsung-s23-ultra-rear.jpg

Samsung introduced a 100x zoom feature with the Galaxy S20 Ultra in 2020, becoming a mainstay on recent flagship handsets from the company. Since its debut, Samsung has touted its devices' ability to take impressive pictures of the moon. Unlike brands such as Huawei, which simply overlay a PNG of the moon on such images, Samsung says that no overlays or texture effects are applied.

Yet on Friday, a Samsung user on the subreddit r/Android shared a detailed post purporting to "prove" that Samsung's moon shots are "fake." Their methodology involved downloading a high-resolution image of the moon, downsizing it to just 170 by 170 pixels, clipping the highlights, and applying a gaussian blur to heavily obscure the moon's surface details. This low-resolution image was then displayed on a monitor and captured at a distance from a Samsung Galaxy device. The resulting image has considerably more detail than its source.

samsung-moon.jpeg

Samsung devices seemingly achieve this effect by applying machine learning trained on a large number of moon images, making the photography effect purely computational. This has led to accusations that a texture is functionally still being applied to images of the moon and that the feature is a disingenuous representation of the camera hardware's actual capabilities, triggering heated debate online, even bringing into question the iPhone's reliance on computational photography.

Article Link: Samsung 'Fake' Moon Shots Controversy Puts Computational Photography in the Spotlight
Samsung Ultras always get overhyped by both Samsung and YouTube techies, and then people buy them and the truth comes out 😏
 
I think we can expect more and more computational / AI-enhanced photography.

I just wish we got more control over whether these features are enabled, and to what extent.

As a photographer, I already find images from the iPhone quite unnatural looking / HDR-y. You can mitigate some of that by playing with Photographic Styles or using a third-party app, but I'd prefer something more straightforward.
 
It must do it in real time.
I used to have a Note 20 Ultra, and when zooming in on objects really far away, it could "focus" and show relatively clear images as you zoomed.
I was never able to get a clear moon photo, just a white blob. But when zooming in to take the actual picture, there would be brief moments where you could clearly see some surface detail.
 
I don't feel like reading the post now (at work), but did they compare the resulting image with the original to see how much of the extra detail was "real" and how much was hallucinated? They could also have run the same procedure with a photo of the far side of the moon (presumably not in the AI's training data) to see whether it would still produce more detail.
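For what it's worth, that comparison could be automated: score the phone's output against both the original high-resolution moon and the degraded image it actually photographed. A minimal sketch using scikit-image's SSIM metric; the file names are placeholders, and in practice the images would need to be aligned first.

```python
# Rough comparison sketch (placeholder file names; images should be aligned first).
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity as ssim

def load_gray(path, size=(512, 512)):
    """Load an image as grayscale, resized to a common size for comparison."""
    return np.asarray(Image.open(path).convert("L").resize(size), dtype=np.float64)

original  = load_gray("moon_highres.jpg")    # the real moon detail
degraded  = load_gray("moon_degraded.png")   # what was shown on the monitor
phone_out = load_gray("galaxy_output.jpg")   # what the phone produced

# If the phone's output matches the original moon far better than the degraded
# source it photographed, the extra detail came from the model, not the lens.
print("output vs degraded source:", ssim(phone_out, degraded, data_range=255.0))
print("output vs original moon:  ", ssim(phone_out, original, data_range=255.0))
```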
 
Using computation to enhance a camera's capacity to take better pictures has been around for a very long time. My first computationally enhanced film camera (meaning it had computer chips offering suggestions on settings based on lighting templates and "scenes") came out around 40 years ago. Digital sensors in actual digital cameras have, from the get-go, had multiple layers of computationally enhanced capabilities, with quantitatively increasing levels of enhancement. This latest "enhancement," though, is of a qualitatively different nature. It takes the idea of a "scene" (e.g., a moon shot) to an entirely different level, basically replacing much of what the camera captures in the moment with elements of photos stored on the camera or on the web. This is more like pointing your camera at an image, having the camera compare what it captures to all the similar images it already has (somewhere), and getting back the best version it can find. Not good. You might as well skip the camera altogether and say, "Hey Google, download to my phone the best blank, clear-sky shot of the moon you have."
 
Let's not let iPhone photography off the hook here. Apple might not be inserting fake images, but the post-processing is so over the top now that I hesitate to even open my phone camera around sunset. "Golden hour" photos of people take on this horrible fake-HDR look, so processed that it looks more like an illustration than a photo.

I get that a lot has to happen to turn the output of a tiny photo sensor into a recognizable photo, but it's been getting very heavy-handed with HDR in the past few years.
 
A lot of processing already happens to your smartphone photos. Given what AI can do in general, it's no surprise smartphone manufacturers are also exploring that area to improve their products. We'll probably see more of it in the future.

But my biggest problem is: why wasn't Samsung honest about this in the first place? Whether you like it or not, it's still a nice achievement; all the reviewers until now thought it was real and didn't even notice. But as a customer, I wouldn't want to be lied to.
 
Let's not let iPhone photography off the hook here. Apple might not be inserting fake images, but the post-processing is so over the top now that I hesitate to even open my phone camera around sunset. "Golden hour" photos of people take on this horrible fake-HDR look, so processed that it looks more like an illustration than a photo.

I get that a lot has to happen to turn the output of a tiny photo sensor into a recognizable photo, but it's been getting very heavy-handed with HDR in the past few years.
Now I wonder if this makes the Bokeh effect on iPhone look fake.
 
I think we can expect more and more computational / AI-enhanced photography.

I just wish we got more control over whether these features are enabled, and to what extent.

As a photographer, I already find images from the iPhone quite unnatural looking / HDR-y. You can mitigate some of that by playing with Photographic Styles or using a third-party app, but I'd prefer something more straightforward.
It's like the trend for profile pics on social media.
Photos that are so manipulated that you would be hard pressed to recognize the same person in real life.
 
Using computation to enhance a camera's capacity to take better pictures has been around for a very long time. My first computationally enhanced film camera (meaning it had computer chips offering suggestions on settings based on lighting templates and "scenes") came out around 40 years ago. Digital sensors in actual digital cameras have, from the get-go, had multiple layers of computationally enhanced capabilities, with quantitatively increasing levels of enhancement. This latest "enhancement," though, is of a qualitatively different nature.
Indeed - so much so that now I really don't want to use my phone for photos, because what comes out usually looks... too good. A while back people said they wanted Apple to make an SLR, but it would be awful: all that processing added to full-frame images? Helping you accentuate what is there is fine, but making up detail or replacing detail really doesn't seem like photography anymore.

(I feel the same way about photographers who replace skies in images. You're an artist who works with photography, but the output is no longer a photo.)
 
This makes me wonder: if I were taking a picture of a painting that included a detailed moon, would the Samsung Galaxy S20 replace the painted moon with an ML version? That would be frustrating and disturbing! As this expands, how many other things would be replaced?
 
The thing is, why did Samsung have to lie and make it seem purely optical? They should have bragged about their ML and used this as a showcase, pointing out the science (we only ever see that side of the moon from here anyway). I mean, jeez, so many marketing opportunities. Yet they had to lie.

Huawei must be laughing, since they already did this many moons ago (sure, not with ML, but the end result is practically the same: you're faking it).
 
Now I wonder if this makes the Bokeh effect on iPhone look fake.
In my experience, it's hit and miss. With a simple object it can do a nice job of masking out what's in the foreground to fake depth of field. Try it on a subject with frizzy hair, and it gets real weird real quick. I generally leave it off, and when I do use it, I turn it down to at least ƒ8 or so to keep it from looking quite so cheesy.

To be fair, I haven't tried Portrait Mode on a Pro iPhone with LiDAR, and maybe it works better with that added data?
 