I get it now!



Those attributes you mentioned (depth, color, dynamic range), are those software or hardware things? If Apple wanted, could they allow you, through software, to turn off all the processing and present a photo of what you're actually seeing?



That makes sense. To paraphrase, it's more of an "I was there!" that's desirable, as opposed to "This is the best looking processing pipeline!"?

It's a combination of both. Smartphone camera sensors (and lenses) are absolutely minuscule compared to most purpose-built cameras, so you run into the problem of not being able to capture much information, full stop: the light, color, texture, and everything else that makes a photograph a photograph.

For the most part you can't get around the physics of a small sensor and lens, and progress in that kind of hardware is a lot slower than progress in software, so naturally the solution was to build massive photo processing pipelines to overcome the inherent weaknesses of smartphone cameras.

The bulk of this processing comes from HDR and various sharpening/smoothing algorithms.

HDR was introduced to combat the poor dynamic range of smartphone camera sensors. Dynamic range is the range of light a sensor can pick up without overexposing (where parts of the image turn pure white) or underexposing (where parts of the image turn pure black). To work around poor dynamic range, the camera takes multiple photos in quick succession, adjusting the exposure (the camera's light sensitivity, roughly speaking) on each one to capture highlights (for example, clouds in a bright sky), mids, and shadow areas (say, the shaded area under a tree), then combines those photos to virtually extend the dynamic range of the final output. This was introduced on the iPhone 4. Here's an example of HDR on the iPhone 4; you can see parts of the highlights are no longer overexposed, and the detail has been brought back:

[Image: iPhone 4 HDR example showing recovered highlight detail]


One potential consequence of HDR is that you kill off the subtle qualities of shadow detail that give the image a sense of depth in the first place. The above photo is a good example, actually: while HDR successfully recovers highlight detail, it comes at the expense of making her face look a little flatter and less '3D.' Smartphones are obviously better at HDR now than the iPhone 4 from a decade and a half ago, but those subtle shadow qualities are still very hard to retain.
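To make the bracketing-and-merging idea more concrete, here's a minimal sketch of exposure fusion, the basic technique behind combining multiple exposures. It illustrates the general idea only, not Apple's actual pipeline; the frames, weighting scheme, and sigma value are all made up for the example.

```python
import numpy as np

def exposure_fusion(frames, sigma=0.2):
    """Blend bracketed exposures, favouring pixels that are neither
    blown out (near 1.0) nor crushed to black (near 0.0)."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    # Weight each pixel by how close it sits to mid-grey (0.5):
    # well-exposed pixels get a high weight, clipped pixels a low one.
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-12          # avoid divide-by-zero
    fused = sum(w * f for w, f in zip(weights, frames)) / total
    return np.clip(fused, 0.0, 1.0)

# Three synthetic "bracketed shots" of the same scene at different exposures.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(4, 4))      # stand-in for true scene light
under  = np.clip(scene * 0.5, 0, 1)             # darker frame keeps highlights
normal = np.clip(scene * 1.0, 0, 1)
over   = np.clip(scene * 2.0, 0, 1)             # brighter frame opens shadows
print(exposure_fusion([under, normal, over]))
```

Real pipelines also align the frames (to handle hand shake and moving subjects) and tone-map the result, which is where a lot of the "look" decisions come in.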

Another major trick is using sharpening, smoothing, and noise-reduction functions to compensate for the lack of detail from a small sensor and for sensor noise from bumping up the ISO (the adjustable light sensitivity of the sensor). One way to adjust exposure is to adjust the 'aperture' of the lens itself, which is the physical size of the lens opening, but most smartphones do not have an adjustable aperture, unlike the Xiaomi 14 Ultra. If you cannot adjust the aperture, you must adjust the ISO instead. Increasing the ISO of a sensor typically results in more noise, so you get rid of it with noise-reduction functions.
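As a rough illustration of that ISO trade-off, here's a toy simulation that assumes ISO is simply analog gain applied after the sensor adds a fixed amount of read noise. The numbers are invented for the example, not real sensor figures.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = 100.0                                   # "true" light hitting one pixel
read_noise = 5.0                                # fixed sensor noise, toy units

# Less light reaching the sensor (smaller lens / shorter shutter) is
# compensated with more gain, so output brightness stays roughly the same...
for gain, light_fraction in [(1.0, 1.0), (4.0, 0.25), (16.0, 0.0625)]:
    samples = gain * (scene * light_fraction + rng.normal(0, read_noise, 10_000))
    snr = samples.mean() / samples.std()
    print(f"gain x{gain:>5}: brightness ~{samples.mean():6.1f}, SNR ~{snr:5.1f}")
# ...but the signal-to-noise ratio drops, and that extra noise is what the
# anti-noise processing then has to scrub out.
```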

The problem with noise reduction is that you're somewhat forced to remove image detail, so to give the illusion of bringing it back you introduce sharpening and smoothing functions. You might sharpen the edges of subjects and smooth/saturate other elements to exaggerate a subject's colors and shape. Apple have many names for this process, like 'Smart HDR' or 'Photonic Engine,' and it has caused a lot of controversy. In the past you could do what you're proposing and turn off Smart HDR to get more realistic images, but Apple removed that option starting with the 12 Pro.
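To show the "denoise, then fake the detail back" idea in code form, here's a minimal sketch: a Gaussian blur stands in for noise reduction and an unsharp mask exaggerates the surviving edges. It illustrates the general technique only; how Smart HDR or the Photonic Engine actually work isn't public, and the image and parameters here are made up.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise_then_sharpen(image, denoise_sigma=2.0, sharpen_amount=1.5):
    # Step 1: "noise reduction" - a blur wipes out the noise,
    # but fine detail goes with it.
    smoothed = gaussian_filter(image, sigma=denoise_sigma)
    # Step 2: unsharp mask - boost the difference between the image and an
    # even blurrier copy, which makes edges pop and fakes some detail back.
    blurrier = gaussian_filter(smoothed, sigma=denoise_sigma * 2)
    return np.clip(smoothed + sharpen_amount * (smoothed - blurrier), 0.0, 1.0)

# Noisy synthetic "photo": a bright square on a dark background.
rng = np.random.default_rng(2)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 0.8
noisy = np.clip(clean + rng.normal(0, 0.1, clean.shape), 0.0, 1.0)

processed = denoise_then_sharpen(noisy)
print("noise before:", float(np.std(noisy[0:8, 0:8])))      # flat dark corner
print("noise after: ", float(np.std(processed[0:8, 0:8])))
```

The flat areas come out cleaner, but anything finer than the blur radius is gone for good; the sharpening only emphasizes the edges that survived, which is where the "oil painting" complaints come from.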

Apple engineers decided the overall 'artistic' look of a final uncropped image on a smartphone is more important to their users than realism or preserving cropped detail. In an interview, Apple said they model iPhone images after the look of oil paintings, which is ironic considering many iPhone customers have complained about an 'oil painting effect' for years.

The photo below is a good example of the shift Apple have taken in image processing. One side is a crop from the 13 Pro, which radically boosted the image processing, and the other is from the 11 Pro, which takes a more moderate approach to post-processing. The 11 Pro has more grain, but the image is more realistic; it's closer to what a typical camera might take. To a casual observer viewing the full image, the 13 Pro may look more aesthetically pleasing (to a point), and indeed many portrait shots I took with my 13 Pro looked a lot nicer on my phone screen than ones from my 11 Pro.

One targets an 'artistic' look, the other a 'realistic' look. A lot of people like the artistic look; they want things to look better than real life.

[Image: crop comparison between the iPhone 13 Pro and iPhone 11 Pro]


Ultimately there is a balance that must be struck. The Xiaomi is interesting because the camera hardware is a lot more capable than most competitors': its main camera uses a "1 inch" sensor (not actually 1 inch lol, that's just the name of that size class), it has an adjustable aperture (hardly any phones have this), and the lenses are naturally bigger, so the phone's dynamic range straight from the sensor (without HDR) is a lot wider. More capable hardware means you have the freedom to tone down post-processing and HDR. Yes, the Xiaomi is doing a lot of post-processing, but it doesn't have to be as dramatic, both because the hardware allows it and because the engineers chose to target a more natural look vs. the highly stylized approach taken by Apple, Google, and Samsung. Leica, Xiaomi's partner, are famous for their '3D look' and realistic, rich colors, so they likely also had a big say in Xiaomi's image target.

I prefer what Xiaomi and Leica are doing. I hope that if Apple use larger sensors in the future they can tone down some of the 'oil painting effect' and HDR processing to make the image look a bit more realistic rather than artistic. Again, it's a balance; I want the best of both worlds and I think it's possible to get there. At the very least, more control over the process on a shot-by-shot basis would be nice.
 
That’s such a detailed explanation. Thank you so much! You didn’t need to go anywhere near that far, yet I’m so grateful you did!
 
How do all the foldable phones show up in pawn shops if no one buys them and you never see them?

Because pawn shops are a place where you can see the results of bad purchasing decisions and failed products. By looking at the quantity of goods in pawn shops by category, it is possible to see which products are failing to make their users happy.
I will say that I own a Fold 4.

Sorry.
 
I agree with many people here. I hope they fall on their face. I'm switching to Samsung mid-year. I was never an Apple fan, but all of my family had them so I had to get one. I'm done now. Android gives me options, and I feel their phones are developed with the consumer in mind.
 
The high price does not help. Between that and the small yearly improvements, many will not be upgrading.
 
Time for Apple to return to the iPhone as its #1 priority. The iPhone hardware has changed little over the last few years other than the camera and SoC.
With all the iPhone-related patents Apple holds that have never come to fruition, it shouldn't be hard to improve the iPhone.
When I see people with 4- and 5-year-old iPhones, it's because of the lack of innovation and the same boring design year after year.
 
I get why this is happening, but as an American, until there is a computer better than the Mac and a tablet better than the iPad, I ain't going nowhere.

Individually the iPhone may be getting its butt kicked, but the seamless/easy-as-hell integration of iPhone, iPad, Mac, Watch, AirPods, and iCloud is second to none.

Many competitors' devices have a better feature here or there, but "whole-istically"? Not even close.
 
Time for Apple to return to the iPhone as its #1 […]
When I see people with 4 and 5 year old iPhones it’s because of lack of innovation and the same boring design year after year
Citation, or is that your opinion? Apple has stated publicly that they don't expect people to keep upgrading serially. Unlike with competitors, those customers can provide revenue through many of Apple's other products and services.
 
What you call "greed" is just how business works. Other companies sell cheaper phones because they wouldn't sell otherwise, not because they're less greedy. The market has no human-like soul. Never.

What you describe as "just how business works" is called greed, plain and simple. People have a human soul and people run the market. The fact that you want to simp for big business and pretend like that's just the natural order of things and that we have no control over it would be laughable if it weren't so sad.
 
Who told you I’m a simp for businesses?
I said that's how it works, never said I like it.
People don’t run the market. Well, they do, but the market naturally selects soulless ones. Nobody who puts ethics over money goes very far.
Never humanize corporations. That's their game; they have teams of people choosing how they want to be perceived by customers and faking it.
 