Even editing of existing skies in images bothers me. People forget that the sky is a big part of the original "mood" of the surroundings when the photo was snapped. If it looks kind of "bleah" on an overcast day, attempts to brighten it up or change the hue to make it more visually pleasing are dishonest.
This is what photographers usually do after capturing images, anyway. In most cases, they’re trying to sell their work, and the more pleasing it is, the more people are likely to see and purchase it (or they’re working on contract and the person doling out the money wants all the pictures to look ‘good’). Or they’re trying to win a contest. I suppose it could be labeled as dishonest, but that level of dishonesty has been around since the dawn of photography.
 
Hi Folks,

Below are some of my own insights and knowledge regarding telephoto lunar photography. FWIW, my experience observing and recording the night sky spans a lifetime of amateur astronomy, going back to before I got my first telescope in 1967.

A few quick items that folks should be educated on and familiar with before weighing in publicly on this thread's topic...

1. The, er, "evidentiary" image of the moon purportedly shot with a Sony a7R III/200-600mm lens here...

Fake Samsung Galaxy S21 Ultra moon shots debunked - MSPoweruser
https://mspoweruser.com/fake-samsung-galaxy-s21-ultra-moon-shots-debunked/

...speaks to either A. the photographer's inability to use that camera/lens combo for excellent results, and/or B. an attempt to use fake or misleading evidence for their argument. A simple search for Sony 200-600mm moon images will reveal as much...

Search: Sony 200-600mm moon | Flickr
https://www.flickr.com/search/?group_id=14620456@N20&view_all=1&text=moon

...clearly, that lens and pretty much any Sony body can provide a much more detailed lunar image than the one posted as evidence.
True.
I read this article and felt a bit sorry for the A7 owner. I had much better results 15 years ago with an FZ18.
2. Combining numerous images and applying deconvolution methods in post-processing has been a, um, "thing" in the world of astrophotography since the dawn of digital imaging...

moon deconvolution at DuckDuckGo
https://duckduckgo.com/?t=ffab&q=moon+deconvolution&ia=web

My take on Samsung's "100x Space Zoom" feature is that it's nothing more than an automated stacking and deconvolution software routine labeled/marketed as "AI".
Should be easy to check for someone with the appropriate software: take the 170x170 blurred picture that was used and deconvolve it. See what you can recover.
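For anyone who wants to run that test, here's a minimal sketch (assuming scikit-image and SciPy; "moon.png" is a hypothetical file name): blur a sharp moon shot with a known Gaussian PSF, then see how much Richardson-Lucy deconvolution actually recovers.

```python
# Minimal sketch of the proposed test: blur a sharp moon photo with a
# Gaussian PSF, then attempt recovery with Richardson-Lucy deconvolution.
import numpy as np
from scipy.signal import fftconvolve
from skimage import io, img_as_float
from skimage.restoration import richardson_lucy

def gaussian_psf(size=25, sigma=4.0):
    """Normalized 2-D Gaussian point spread function."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

sharp = img_as_float(io.imread("moon.png", as_gray=True))  # hypothetical file
psf = gaussian_psf()

# Simulate the degraded source image.
blurred = fftconvolve(sharp, psf, mode="same")

# Deconvolution can only sharpen what the blur left behind; it cannot
# reinvent crater detail whose spatial frequencies were wiped out.
recovered = richardson_lucy(blurred, psf, num_iter=30)
io.imsave("recovered.png", (recovered.clip(0, 1) * 255).astype(np.uint8))
```

In practice you get edges back, not the fine crater detail Samsung's output shows.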
A major difference is that in astrophotography, you stack to reverse the negative impact of the atmosphere. Because the atmosphere is not still, each frame carries a different piece of information, and selecting the better frames and stacking them can dramatically improve the results. Here, the source material is already blurred, so stacking won’t help.
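To make that contrast concrete, here's a toy version of the lucky-imaging selection described above (OpenCV assumed; "moon_video.mp4" is a hypothetical file, and the frame alignment real stackers also perform is omitted):

```python
# Toy lucky imaging: score each video frame by sharpness (variance of the
# Laplacian), keep the best ones, and average them to beat down noise.
import cv2
import numpy as np

cap = cv2.VideoCapture("moon_video.mp4")
frames, scores = [], []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64)
    frames.append(gray)
    # Frames caught in moments of steadier air have a crisper
    # (higher-variance) Laplacian response.
    scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
cap.release()

# Keep the sharpest 10% of frames and average them.
keep = max(1, len(frames) // 10)
best = np.argsort(scores)[-keep:]
stacked = np.mean([frames[i] for i in best], axis=0)
cv2.imwrite("stacked.png", stacked.clip(0, 255).astype(np.uint8))
```

This only works because each frame samples the atmosphere differently; run it on copies of one already-blurred still and you just get the same blur back.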
 
1. The, er, "evidentiary" image of the moon purportedly shot with a Sony a7R III/200-600mm lens here...

Fake Samsung Galaxy S21 Ultra moon shots debunked - MSPoweruser
https://mspoweruser.com/fake-samsung-galaxy-s21-ultra-moon-shots-debunked/

...speaks to either A. the photographer's inability to use that camera/lens combo for excellent results, and/or B. an attempt to use fake or misleading evidence for their argument. A simple search for Sony 200-600mm moon images will reveal as much...
When I saw the image of the moon they said was taken with the Sony mirrorless, and the description of not being able to get anything better even while using manual focus, I cringed. Those photos are so bad, so soft, that I thought the same: if the photographer is that bad, they should have asked someone else to take the photos. And if, as you said, they were just trying to mislead readers in order to make the phone images look better, then their entire post goes out the window.
 
This bar trick only works because it's taking a photo of a stationary object like the moon.
It's a good thing for them that you can only photograph one side of the moon, so they can pull this kind of trick. If the object were, say, the Eiffel Tower, with people shooting from multiple angles and sides, their trick would have come undone much sooner.
It's funny that they tried to hide this from their customers and pass it off as a real, legitimate feature.
How are there not a bunch of lawsuits against them over this 'lie'? How are they still not bankrupt yet? Only Apple is still being honest in the photography field... for now.
 
Look over in the iPhone pictures thread here on MR. How many of those are edited in some kind of editing software to enhance the original? Does that make them fake?
I fake my photos in camera. I use a neutral density filter (sunglasses for cameras) to take long exposure shots in broad daylight. I used a polarizer to reduce the glare during a powerboat race. I use a sepia-toned filter to make my Texas photos look like Mexico photos. (Hollywood trope) I've used cut-outs to alter the bokeh: instead of the boring normal bokeh, I get shaped bokeh, hearts, stars, '?', or whatever. :cool: I've used pantyhose for glamour shots when I didn't have a diffusion filter handy. It blurs away wrinkles and harshness and gives the subject a "glow", while still retaining sharpness.

Snapshots are boring in my book. Taking pictures is like driving: anyone can do it, but the gap between regular drivers and highly skilled drivers is wide.
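The ND trick, by the way, is just stop arithmetic: every stop of ND doubles the shutter time needed for the same exposure. A trivial sketch of the math (my own illustration):

```python
# Back-of-envelope ND filter math: each stop halves the light, so the
# shutter must stay open 2**stops times longer for the same exposure.
def nd_shutter(base_shutter_s: float, nd_stops: int) -> float:
    """Shutter time needed after adding an ND filter of nd_stops stops."""
    return base_shutter_s * 2 ** nd_stops

# A 1/1000 s daylight exposure behind a 10-stop ND becomes roughly 1 s,
# long enough to smear moving water or clouds in broad daylight.
print(nd_shutter(1 / 1000, 10))  # ~1.024 s
```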
 
For all the criticism of Samsung, I finally understand what AI means in photography. It is not about computation to process the information that the lens captured. It is a model, trained separately on various objects (in this case the moon, but it could also be faces, meadows, landmarks, snow, etc.), ADDING colour, structure, patterns, etc. from best-in-class photos to your photo. What you end up with is a composite of your photo with information from thousands of other photos of the same subject. Scary.
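To make the point concrete, here's a deliberately crude toy of what "adding learned detail" means (my own construction, nothing like Samsung's actual pipeline): blend the fine detail of a best-in-class reference image into your own capture.

```python
# Toy "detail injection": add the high-pass residual of a reference image
# on top of the coarse structure of your own capture. A real system does
# this with a trained neural network per object class, but the net effect
# is the same: pixels that were never in front of your lens end up in
# your file.
import numpy as np
from scipy.ndimage import gaussian_filter

def inject_detail(capture: np.ndarray, reference: np.ndarray,
                  amount: float = 0.7) -> np.ndarray:
    """Both arrays are grayscale floats in [0, 1] with the same shape."""
    ref_detail = reference - gaussian_filter(reference, sigma=3)
    return np.clip(capture + amount * ref_detail, 0.0, 1.0)
```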
 
This is what photographers usually do after capturing images, anyway. In most cases, they’re trying to sell their work, and the more pleasing it is, the more people are likely to see and purchase it (or they’re working on contract and the person doling out the money wants all the pictures to look ‘good’). Or they’re trying to win a contest. I suppose it could be labeled as dishonest, but that level of dishonesty has been around since the dawn of photography.
Spot on. Humans have been altering photos using knowledge of best-in-class pictures of the same or similar subject since the dawn of the medium, even drawing lines and colours by hand on film (the 'CGI' of the 70s and 80s). But it was an active, conscious step.
 
It is not “enhancement”, it is a different photo “based on what you want to capture”. Take that photo in airplane mode and see how good it is.
By that logic, if I take a blurry photo of myself, will I get a photo of Tom Cruise? Or if I take a photo of my 2014 BMW, will I get a photo of a 2023 car?
 
No amount of “computational photography” can produce such a great photo of the moon from a phone camera. Really, there is not enough light captured to get it right, and the 100x zoom is not a real telescope; a true 100x zoom would work like a telescope (or a microscope) on anything, not just the moon.
 
It is not “enhancement”, it is a different photo “based on what you want to capture”. Take that photo in airplane mode and see how good it is.
By that logic, if I take a blurry photo of myself, will I get a photo of Tom Cruise? Or if I take a photo of my 2014 BMW, will I get a photo of a 2023 car?
I would actually find an app that does this, quite obviously and quite intentionally, hilarious. :)
 
The funniest part about this is that taking a picture of the moon is literally the dumbest thing ever. Apart from its phase, it looks the same all the time. If you want a picture of the moon, you can easily google "moon" and get the same or a better picture than you could take yourself. There's literally nothing unique about a picture of the moon.
 
It is not “enhancement”, it is a different photo “based on what you want to capture”. Take that photo in airplane mode and see how good it is.
By that logic, if I take a blurry photo of myself, will I get a photo of Tom Cruise? Or if I take a photo of my 2014 BMW, will I get a photo of a 2023 car?
I imagine the "AI" in our cameras already enhances our skin tone, teeth, hair, etc. based on images of other people it was trained on.
 
For at least a couple of weeks. Then they realize how hard it is to get film developed and bemoan digital while using it. :)


I may be a bit more into it than the average enthusiast, but I develop most black & white at home, so that helps a ton. It's not difficult at all, actually rather inexpensive, and the process is kinda zen. If you can follow a recipe in a cookbook, you can develop film and have your photos scanned or printed about as quickly as you can manage to dry the negatives.

The color chemistry is a bit more volatile and not so economical, so I don’t mess with that stuff at home. The cost/necessity to ship color film out to a lab… and wait… and wait… and wait… does suck the fun out of it to some extent. I do miss the days of having a 1hr minilab on every corner. Grab some pizza next door and your pictures are ready for pickup by the time you’re done. But I still ultimately enjoy the process of shooting a fully manual/fully mechanical camera, and still very much enjoy the end results, so the additional hassle is not the end of the world for me.
 
Yeah, no kidding. We all get what HDR photography is. The effect is being applied very heavy-handedly at times -- to the point where the resulting image looks overcooked and unnatural.

What we're all used to is that sometimes parts of a photo will naturally be over- or under-exposed, which reflects real-life lighting conditions. That's what we've been used to seeing in photos for decades. Apple is HDR-stacking so aggressively that it tries to weed out all clipped parts of an image, and the result at times just doesn't look like a naturalistic photo at all, but more like some freakish AI interpretation (which is what it is). Again, go out during a nice golden hour when it's clear, plop someone in the direct sunlight, and see what happens to their face when you take a photo of them with a recent iPhone. It's... not great.

I'm not the only one who thinks so:
Eh... you're being hyperbolic, but I do agree with part of your argument here. I think iOS could do a markedly better job with its initial presentation of extremely high dynamic range captures simply by compressing the data less (in terms of distribution across shadows and highlights, not the definition of compression people here might be more familiar with). I can see how it goes wrong pretty easily photographing my dog, who has white fur, in sunlight with dark elements in the scene (e.g. lying in the grass in the sun). It captures dynamic range in the highlights of her head, but compresses the overall dynamic range too much. Instead, in my opinion, they should let the shadows and highlights (highlights in particular; it handles shadows more respectably) fall further toward clipping for a more natural initial presentation. Those highlights could then be recovered in post-processing if someone wished (ProRAW has the bit depth), and people shooting general photographs would start with something that feels closer to what the human eye experienced in the moment.
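To illustrate the trade-off I mean, here's a toy highlight "knee" (my own sketch, not Apple's actual tone mapping): a low knee with a flat slope crushes highlight contrast, while a higher knee with a gentler slope lets highlights roll off more naturally.

```python
# Toy tone curve: linear below the knee, compressed (slope < 1) above it.
import numpy as np

def highlight_knee(x: np.ndarray, knee: float = 0.7,
                   slope: float = 0.3) -> np.ndarray:
    """Map normalized scene luminance to display luminance."""
    return np.where(x <= knee, x, knee + (x - knee) * slope)

scene = np.linspace(0.0, 1.0, 11)
print(highlight_knee(scene, knee=0.6, slope=0.2))   # heavy-handed look
print(highlight_knee(scene, knee=0.85, slope=0.6))  # more natural roll-off
```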

Hi Folks,

Below are some of my own insights and knowledge regarding telephoto lunar photography. FWIW, my experience observing and recording the night sky spans a lifetime of amateur astronomy, going back to before I got my first telescope in 1967.

A few quick items that folks should be educated on and familiar with before weighing in publicly on this thread's topic...

1. The, er, "evidentiary" image of the moon purportedly shot with a Sony a7R III/200-600mm lens here...

Fake Samsung Galaxy S21 Ultra moon shots debunked - MSPoweruser
https://mspoweruser.com/fake-samsung-galaxy-s21-ultra-moon-shots-debunked/

...speaks to either A. the photographer's inability to use that camera/lens combo for excellent results, and/or B. an attempt to use fake or misleading evidence for their argument. A simple search for Sony 200-600mm moon images will reveal as much...
They (in defending Samsung) failed rather terribly at debunking the evidence at hand, and did so by taking Samsung's word at face value. The example given (a heavily Gaussian-blurred moon, which was already subject to its own degradation from atmospheric turbulence and the like) was "recovered" beyond anything even remotely realistic for this sort of technology. The only way this level of "recovery" would be possible is with a machine-learning model specifically trained on the moon, using its knowledge of the moon to inject data that could not have been extrapolated from the source data. I covered the differences between this and "honest" machine learning in my earlier reply.

And if you’re experienced with lunar/planetary photography (lucky imaging and the usual suspects like PIPP, RegiStax, AutoStakkert!), you should also have a passable understanding of data integrity and of what can be extrapolated through conventional means, so you should at least know that this falls well outside those realms.
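For the intuition in numbers, here's a back-of-envelope sketch (my illustration, with an assumed noise floor) of why a heavy Gaussian blur is unrecoverable: the blur multiplies the image spectrum by a Gaussian, and any spatial frequency pushed below the noise floor is simply gone, no matter how many deconvolution iterations you throw at it.

```python
# The Fourier transform of a Gaussian PSF is itself a Gaussian, so blur
# attenuates high spatial frequencies exponentially fast.
import numpy as np

sigma = 4.0                  # assumed blur radius in pixels
freqs = np.fft.fftfreq(170)  # cycles/pixel for a 170x170 crop
mtf = np.exp(-2 * (np.pi * freqs * sigma) ** 2)

noise_floor = 1e-3           # assumed sensor noise level
recoverable = freqs[mtf > noise_floor]
print(f"detail survives only up to {recoverable.max():.3f} cycles/pixel")
```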
 