
scorpio187

macrumors member
Original poster
Sep 22, 2013
Just curious. Does it have to do with the lenses they use, their software or both?
 
I just tried the portrait mode on the Pixel on a friend's phone. It is, I'm sorry to tell you this, far superior in that mode. Compared to the iPhone X, in a side-by-side test, it's night and day. If you believe that the iPhone's portrait mode is better, you're brainwashed.

The hope is that Apple gets to that level in software. The Pixel's portrait mode uses ML to do that stuff, so it's possible to improve the camera with a firmware update.
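Since the point above is that portrait mode is mostly software, here is a toy sketch of the idea: separate the subject from the background, then blur only the background. This is not Google's actual pipeline — the real one uses an ML segmentation model (plus dual-pixel depth data on the Pixel 2) and a depth-dependent blur; the `synthetic_bokeh` function and the uniform box blur below are illustrative assumptions.

```python
import numpy as np

def synthetic_bokeh(image, mask, blur=5):
    """Toy "portrait mode": blur everything outside a subject mask.

    image: (H, W) float array; mask: (H, W) bool, True = subject.
    A real pipeline infers the mask with an ML segmentation model and
    applies a depth-dependent blur; this sketch box-blurs the
    background uniformly.
    """
    H, W = image.shape
    padded = np.pad(image, blur, mode="edge")
    blurred = np.zeros((H, W), dtype=float)
    # Box blur: mean over a (2*blur+1)^2 sliding window
    for dy in range(2 * blur + 1):
        for dx in range(2 * blur + 1):
            blurred += padded[dy:dy + H, dx:dx + W]
    blurred /= (2 * blur + 1) ** 2
    # Subject pixels pass through untouched; background gets the blur
    return np.where(mask, image, blurred)
```

Because only the software changes, shipping a better segmentation model in an update immediately improves the photos — which is exactly the firmware-update point.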
 
OK, folks, if you're gonna just leave comments like "not really better," better to ignore my question entirely. From what I've seen it IS better (how much is for you to decide), which is a shame, because I love my iPhone and I really hope that Apple will improve the camera in the future.

When it comes to phone cameras, my knowledge is very limited. Anyway, could someone explain why it is better?
 
Not really better.
In denial, with no reasoning or sources? Well, that's just splendid... The majority of people vote Pixel over iPhone camera, regardless of whether it's a blind test or not.

It would be a combination of lens, sensor, and software that makes Google's camera stand out. It's quite impressive that it is beating several phones that have two sensors with only one! Google made the right decision in becoming an AI-first company.
 
Yep... and the Google Visual Core chip hasn't been fully enabled yet, so it will take even better pictures after a future software update. With AI learning, the Pixel camera/software will get better over time.
 
Software. Algorithms.

https://www.theverge.com/2016/10/18/13315168/google-pixel-camera-software-marc-levoy

Clearly, this is by far the most competitive Google has ever been in mobile photography. But the Pixel phones, on paper, don't have cutting-edge hardware, relying on an f/2.0 lens without optical image stabilization. Instead, and in typical Google fashion, Google has turned to complex software smarts in order to power the Pixel camera. I spoke with Marc Levoy, a renowned computer graphics researcher who now leads a computational photography team at Google Research, about how software helps make the Pixel camera as good as it is.

"Mathematically speaking, take a picture of a shadowed area — it's got the right color, it's just very noisy because not many photons landed in those pixels," says Levoy. "But the way the mathematics works, if I take nine shots, the noise will go down by a factor of three — by the square root of the number of shots that I take. And so just taking more shots will make that shot look fine. Maybe it's still dark, maybe I want to boost it with tone mapping, but it won't be noisy." Why take this approach? It makes it easier to align the shots without leaving artifacts of the merge, according to Levoy. "One of the design principles we wanted to adhere to was no. ghosts. ever."

"The notion of a software-defined camera or computational photography camera is a very promising direction and I think we're just beginning to scratch the surface," says Levoy, citing experimental research he's conducted into extreme low-light photography. "I think the excitement is actually just starting in this area, as we move away from single-shot hardware-dominated photography to this new area of software-defined computational photography."
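Levoy's square-root claim is just the statistics of averaging independent noise: averaging N aligned frames divides the noise standard deviation by √N, so nine shots cut the noise by a factor of three. A quick simulation (the pixel value and noise level below are made-up numbers for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

signal = 0.1   # dim "shadow" pixel value (arbitrary units, assumed)
sigma = 0.05   # per-shot noise standard deviation (assumed)
shots = 9

# One shot vs. the average of nine perfectly aligned shots,
# simulated over 100,000 independent pixels
one_shot = signal + rng.normal(0.0, sigma, size=100_000)
avg_shot = signal + rng.normal(0.0, sigma, size=(shots, 100_000)).mean(axis=0)

print(one_shot.std())  # ~0.050
print(avg_shot.std())  # ~0.017, i.e. sigma / sqrt(9) = sigma / 3
```

The mean stays at the true (dark) signal either way — that's why Levoy says the averaged shot may still need tone mapping to brighten it, but it won't be noisy.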

The Google Camera app is what saved the Essential's camera.
 
I just tried the portrait mode on the Pixel on a friend's phone. It is, I'm sorry to tell you this, far superior in that mode. Compared to the iPhone X, in a side-by-side test, it's night and day. If you believe that the iPhone's portrait mode is better, you're brainwashed.

The hope is that Apple gets to that level in software. The Pixel's portrait mode uses ML to do that stuff, so it's possible to improve the camera with a firmware update.

No doubt the Pixel 2 has an excellent camera, and it's better in some ways than the X. The Pixel photos are sharper, with more detail, and that is a huge plus. But as far as portrait mode goes, the X is definitely better. The X portraits have a smoother, more natural look as far as the bokeh goes. With the Pixel portraits, people look like cardboard cutouts. It is too defined; there is a hard, cold line from in focus to out of focus. Go look at some photos taken with a DSLR. Notice how on those shots there is a smooth transition to the out-of-focus areas? I am sure Google will get better, but right now you are the one that is brainwashed.
 
The X portraits have a smoother, more natural look as far as the bokeh goes. With the Pixel portraits, people look like cardboard cutouts. It is too defined; there is a hard, cold line from in focus to out of focus.
To contrast, I've found literally the exact opposite result in my time with both cameras. In my experience, the Pixel image processing does a better job with gradual focus of subjects and their surroundings.

The X really, really struggled in particular with portraits of my cat. Now, it's a tabby so I'm sure the fur patterns, etc played a part in 'confusing' it, but I don't find the same issue on my Pixel 2. That's only one example, but any portrait photos I've taken thus far have been just point and shoot and it's done a phenomenal, I'd even say DSLR-like, job.
 
To contrast, I've found literally the exact opposite result in my time with both cameras. In my experience, the Pixel image processing does a better job with gradual focus of subjects and their surroundings.

The X really, really struggled in particular with portraits of my cat. Now, it's a tabby so I'm sure the fur patterns, etc played a part in 'confusing' it, but I don't find the same issue on my Pixel 2. That's only one example, but any portrait photos I've taken thus far have been just point and shoot and it's done a phenomenal, I'd even say DSLR-like, job.
One of my kitties is black and white. My X was blowing out the white parts of her fur when I was trying to get a pic of her lying in a patch of sunlight. The fur almost looked like it was glowing. The Pixel just captures what I see with my own eyes. Now my 7 Plus didn't blow out highlights as strongly as my ex-X and my current 8 Plus do. I think this is something Apple can fix in software and I don't know why they aren't right on it because it is something reviews have mentioned and photography is important to Apple.

I do love the warmth and some aspects of how light is handled in an iPhone 7/8/X photo. But that can be reproduced in editing. What can't be done in editing is recapturing detail that the eye sees but the camera failed to pick up in the first place. And this is where the Pixels excel: they get some nuanced detail and textures the iPhones somehow miss, especially in highlights.

Even though there is no zoom lens, I'm finding the Pixel does a good job holding detail when zooming digitally. I will nevertheless appreciate it if Google adds a second lens. But I won't suffer the poor zoom quality I thought I would with the camera currently available on the Pixel 2.

I do prefer the smooth panning of an iPhone video, though. Apple needs to up its game on the audio, and then they will have a near-perfect smartphone video recorder.
 
Cloud compute is more powerful and far less limited than local processing.

It'll be interesting to see how Google processing handles this shot where Apple portrait mode cuts out part of the glass.

VXrf571.jpg
 
To contrast, I've found literally the exact opposite result in my time with both cameras. In my experience, the Pixel image processing does a better job with gradual focus of subjects and their surroundings.

The X really, really struggled in particular with portraits of my cat. Now, it's a tabby so I'm sure the fur patterns, etc played a part in 'confusing' it, but I don't find the same issue on my Pixel 2. That's only one example, but any portrait photos I've taken thus far have been just point and shoot and it's done a phenomenal, I'd even say DSLR-like, job.

Have any picture examples to share with us?
 
Have any picture examples to share with us?
Sure, I've linked them below, with a brief write-up on each one... not much to say about the Pixel photos, because they are great. Same room, same cat, similar lighting conditions, crazy different results.

Pixel 2 XL - Image 1
Crazy image, captured while he was playing. Notice the soft gradual change in focus towards the background of the picture. Mostly clean edges, which are not easy with the fur detail.
https://i.imgur.com/CqR8jxK.jpg

Pixel 2 XL - Image 2
The cutest cat. Again, gradual focus shift, with nice clean edges. Color and lighting looks fantastic in this one.
https://i.imgur.com/9oALK79.jpg

iPhone X - Image 1
There is some very weird processing going on here. His right eye is blurred for some reason, half of his paw is blurred, there are choppy edges, hard focal edging on the couch, and extreme white blowout behind him. This is an example of bad image processing.
https://i.imgur.com/gnTpxCc.jpg

iPhone X - Image 2

This one isn't quite as bad, but notice the hard/weird focal edge along the bottom of the cat, the random blur at the bend of his leg, and the blowout beyond his face.
https://i.imgur.com/lhvsE0M.jpg
 
The iPhone photos are very warm. I know the Pixel photos tend to be slightly cooler than what the eye sees. Just slightly. Which camera do you think best captures his fur color? And he is adorable, by the way.
Cloud compute is more powerful and far less limited than local processing.

It'll be interesting to see how Google processing handles this shot where Apple portrait mode cuts out part of the glass.

VXrf571.jpg
Whoa, what sorcery is this? :eek: I know that has gone horribly wrong but it also looks kind of cool!
 
The iPhone photos are very warm. I know the Pixel photos tend to be slightly cooler than what the eye sees. Just slightly. Which camera do you think best captures his fur color? And he is adorable, by the way.
Thank you haha. The Pixel images definitely capture the fur color better. The first X image did have a filter on it for the record.
 
Thank you haha. The Pixel images definitely capture the fur color better. The first X image did have a filter on it for the record.
Here are cropped in photos of my cat. Sorry for the cropping but it’s necessary to get the forum to accept the pictures.

The struggle is real. I have photo after photo of blown-out highlights on this cat from my 8 Plus, and overly warm tones under the incandescent lighting.

This is my brand-new iPhone 8 Plus (I returned the prior one due to problems with call quality and a defective Lightning port).


DC1F05CC-4361-49AD-A528-CAB8EBA93455.jpeg


This is from my Pixel 2, the colors are perfect. White balance actually isn’t cold. Her color is as I see her with my eyes.

B4940A52-8F52-4FF8-B2D4-678697548C77.jpeg
 
Cloud compute is more powerful and far less limited than local processing.

It'll be interesting to see how Google processing handles this shot where Apple portrait mode cuts out part of the glass.

VXrf571.jpg
A perfect example of poor Portrait Mode.
 