If I wanted a fake, shady device I would just buy a cheap Motorola... but why people spend top dollar to buy a scamsung device is totally beyond me.
 
It's pretty funny for people on either side to get worked up about this. It's part of Samsung's DNA. Here's a "scandal" from their TV division: https://www.flatpanelshd.com/news.php?subaction=showfull&id=1654235588

Samsung has been caught cheating by designing its TVs to recognize and react to test patterns used by reviewers. The company promises to provide software updates to address the situation... The QN95B not only changes its color and luminance tracking during measurements to appear very accurate, it also boosts peak brightness momentarily by up to 80%, from approx. 1300 nits to 2300 nits.

The difference there is that there are industry standards for display accuracy.
 
Let's do a level set here. Most of the time, companies brag about their AI computational photography. Apple does it, Google does it, and people love the outcome.

I think Samsung's mistake here was not being upfront and honest about it.

Yeah, that’s most of the problem…

I mean, are bokeh effects fake?

It is, but I also see a difference between destroying information with these effects and secretly inventing information and inserting it.

Are Google's Magic Eraser pictures fake?

Look over in the iPhone pictures thread here on MR. How many of those are edited in some kind of editing software to enhance the original? Does that make them fake?

I think the difference is what is done by machine and what is the creative input of the photographer. Dark room manipulation has been around about as long as photography, it’s part of the photographic process. Stalin was removing people from photos long before machine learning could help.

Inpainting is a mix of both. The photographer is involved in choosing what is edited out but the machine is providing the infill rather than the old method of cloning.

The debate here is legitimate— new tools always raise these questions. I’m sure painters in the day thought cameras were automating too much of the artistic painting process.
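For anyone curious what "the machine providing the infill" actually means, here is a toy sketch of classical (pre-ML) inpainting in plain numpy. It repeatedly averages each masked pixel with its neighbours until the hole is smoothly filled from surrounding context; the function name and parameters are invented for illustration, and real tools (clone stamp, Magic Eraser, generative fill) are far more sophisticated.

```python
import numpy as np

def diffusion_inpaint(img, mask, iters=200):
    """Toy diffusion inpainting: fill masked pixels by repeatedly
    averaging their 4-neighbours. The machine invents the infill
    from surrounding context, unlike a clone stamp where the
    photographer chooses the source pixels.
    img  -- 2D float array (grayscale image)
    mask -- boolean array, True where pixels should be replaced
    """
    out = img.copy()
    out[mask] = out[~mask].mean()  # neutral starting guess
    for _ in range(iters):
        # 4-neighbour average via shifted copies (edges wrap,
        # which is fine as long as the mask is interior)
        avg = (np.roll(out, 1, axis=0) + np.roll(out, -1, axis=0) +
               np.roll(out, 1, axis=1) + np.roll(out, -1, axis=1)) / 4.0
        out[mask] = avg[mask]  # only masked pixels ever change
    return out
```

On a smooth region (say, a gradient sky) this reconstructs the hole almost perfectly, which is exactly why simple infill feels "honest" compared to generating brand-new detail.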
 
This makes me wonder: if I were taking a picture of a painting that included a detailed moon, would the Samsung Galaxy S20 replace the painted moon with an ML version? That would be frustrating and disturbing! As this expands, how many other items would be replaced?
The Taj Mahal, the Eiffel Tower, the Leaning Tower of Pisa… all things of which there are already billions of photos in the cloud. Here, why bother adding your crappy pic to the pile? Have a super duper UHD pic generated from the billions of existing ones. Without all the other tourists cluttering up the frame!
 
The debate here is legitimate— new tools always raise these questions. I’m sure painters in the day thought cameras were automating too much of the artistic painting process.
I am sure they did... but still, the photos speak for themselves. Heck, Samsung should have been upfront about it and bragged about it. Everyone uses computational photography. People would be surprised if all the phone camera did was take a picture of what was in front of it and AI did not enhance it in any way.
 
I am sure they did... but still, the photos speak for themselves. Heck, Samsung should have been upfront about it and bragged about it. Everyone uses computational photography. People would be surprised if all the phone camera did was take a picture of what was in front of it and AI did not enhance it in any way.

Yeah, though as I'm reading deeper I'm not finding anywhere that Samsung denied this is just AI inpainting. Most of what they say is that it's a combination of AI and super resolution in an ambiguous enough way that you might think they're using AI for "detail enhancement" but which you can also read as "we saw the moon and fixed it". They're doing the latter and their statements don't seem to directly contradict that.
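The two readings of Samsung's statement really are different pipelines. Here's a hypothetical sketch of both, in numpy, with every name and threshold invented for illustration: (a) "detail enhancement" sharpens what the sensor captured, while (b) "we saw the moon and fixed it" detects a moon-like frame and blends in stored detail that was never captured.

```python
import numpy as np

def sharpen(img, amount=1.0):
    """(a) Unsharp mask: boosts detail already present in the capture."""
    blur = (img +
            np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
            np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

def replace_if_moon_like(img, stored_texture, brightness_thresh=0.8):
    """(b) Hypothetical 'scene optimizer': if the frame looks like a
    moon shot (a bright blob on black), alpha-blend a stored
    high-detail texture over the bright region."""
    moon_mask = img > brightness_thresh
    if moon_mask.mean() < 0.01:   # no moon-like region detected
        return img
    out = img.copy()
    out[moon_mask] = 0.5 * img[moon_mask] + 0.5 * stored_texture[moon_mask]
    return out
```

Both can honestly be marketed as "AI detail enhancement", which is exactly why the ambiguous wording works.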
 
And?

The iPhone's bokeh is entirely fake. Was there an article that ripped Apple for this? A number of the camera's features are done entirely by software.
You can't see the difference between bokeh and this?

Would you like if a camera, at the software level that speaks to the hardware, removed blemishes from your face? Or slimmed your waist?

Why are you OK with the camera capturing detail that isn't there? Bokeh is an aspect of photography that simulates how the eye sees a subject separated from its background. By adding it, you are NOT adding details to the subject that aren't there. This is a huge difference.
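That distinction is easy to see in code. A minimal sketch of software portrait-mode bokeh (function name and parameters invented here): blur only the background while leaving the subject untouched. Blurring strictly discards captured information; it never invents detail the sensor didn't record.

```python
import numpy as np

def fake_bokeh(img, subject_mask, blur_iters=10):
    """Simulated portrait-mode bokeh: repeatedly box-blur the
    background while leaving the subject untouched.
    img          -- 2D float array
    subject_mask -- boolean array, True on the subject
    Blurring only *discards* detail; nothing new is invented.
    """
    blurred = img.astype(float).copy()
    for _ in range(blur_iters):
        acc = blurred.copy()
        for ax, sh in ((0, 1), (0, -1), (1, 1), (1, -1)):
            acc += np.roll(blurred, sh, axis=ax)
        blurred = acc / 5.0
    out = img.astype(float).copy()
    out[~subject_mask] = blurred[~subject_mask]   # background only
    return out
```

A real phone derives `subject_mask` from depth sensing or segmentation and uses a lens-shaped blur kernel, but the principle is the same: information is removed, not added.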
 
Next up, the feature where it automatically uses your movie streaming history to computationally enhance pictures of your average-looking spouse with details from your favorite Hollywood actors!
 
(shakes my Note20 Ultra violently)

Several tiny white figures fell out of it - those looked like AI robots from the movie "I, Robot"

Ah, just as I suspected....
 
The funniest part about this is that taking a picture of the moon is literally the dumbest thing ever. It looks the same all the time depending on what phase it is in.
Sounds like someone who has never taken a picture of the moon before. I've taken hundreds of full moon pictures and guess what? Each full moon looks different because... wait for it... the moon rotates. There is also the color change due to atmospheric conditions--everything from yellow to red to purple to grey.
I think the difference is what is done by machine and what is the creative input of the photographer. Dark room manipulation has been around about as long as photography, it’s part of the photographic process. Stalin was removing people from photos long before machine learning could help.

Inpainting is a mix of both. The photographer is involved in choosing what is edited out but the machine is providing the infill rather than the old method of cloning.

The debate here is legitimate— new tools always raise these questions. I’m sure painters in the day thought cameras were automating too much of the artistic painting process.
Exactly. AI is another tool to add to the arsenal. It's the artist's choice to use it or not. Personally, I don't like it. Hell, I'm not a fan of heavy manipulation in Photoshop either; I prefer to do photo manipulation in camera. But just because I am not a fan doesn't mean I am going to criticize someone who makes use of that tool.

New ways of doing things come along all the time. How many photographers under 40 know how to change film in an SLR? I've forgotten myself, having gone digital nearly 30 years ago.
 
I don’t understand why you all are surprised. Samsung has been replacing people’s faces with fakes for more than a decade. This is just a natural progression of Samsung photography. 🤷‍♂️
 
I don’t understand why you all are surprised. Samsung has been replacing people’s faces with fakes for more than a decade. This is just a natural progression of Samsung photography. 🤷‍♂️
Wait! Is that why I took a selfie on my S23 Ultra and everyone thought I was Alan Thicke???
 
To be honest, this seems like the right thing to do: the moon always looks the same anyway.

I see two cases:
1. The moon is part of a landscape shot. In this case augmenting the details of the moon will result in a better picture overall, even if the moon details were AI-generated.
2. The moon is the main subject. In this case, why are you even taking a picture when there are a billion near-identical images of the moon freely available for you to view on the internet?
 
Interesting. Most of the photos I have seen look blurry and what I would expect from a camera phone. I haven't seen many shots as clear as the shots posted on the article, nor have I seen the change in color temp.

Not a Moontographer™ but happy with my 2nd hand Canon SX60 HS that I picked up for ~$200 (Far less than an iPhone or Ultra Galaxy).

[Attached image: IMG_0695_Original.jpg]
 
I tried the same thing with the image above with my Galaxy S22 Ultra and it didn't change at all. What was different about the experiment?
 
I think the key takeaway here is that you can only realistically do so much with smartphone camera tech, and that you should always be skeptical when a smartphone company promises you the moon and the stars (sometimes literally) with their "revolutionary new camera tech".

It also puts into perspective Apple's iPhone camera performance relative to all the resources they sink into it. The main difference here is that Apple is very careful to not overpromise what sort of camera performance to expect from your iPhone, and when they sometimes seem to fall short of the competition, perhaps the reason isn't because they are falling behind.
 