
sockatume
macrumors member, Original poster
Jul 21, 2010
I took my iPhone 4 (bought last August) in for a replacement (last October or so) and the new handset had the dreaded "coloured blob": a pink cast at the edges and green in the middle. A few weeks ago I had that handset replaced because the home button stopped working, and the new one has the issue too.

Are we just stuck with this now? Clearly there are iPhones that didn't suffer from the problem (my near-launch model, for one), but they've probably added or changed component suppliers since then. If all new iPhone 4s have this problem, then obviously it's not worth going in for a swap.

As a side note, I've seen a lot of speculation about the origin of the issue that's clearly wrong. I see the green blob in two situations:

1) On the viewfinder, but not in photos, in moderate light (e.g. an overcast day)
2) On viewfinder and in photos, in low light (e.g. indoors or at twilight)

It's not dependent upon artificial lighting at all, except that obviously artificial light is much dimmer than daylight. The green spot is clearly visible on the viewfinder if I'm taking pictures outside on a dim day.

It's almost certainly a problem with the sensor itself, probably its baseline calibration at the supplier. It's not as if iPhone 4s are the only handsets with this sort of issue.
 

iPhysicist
macrumors 65816
Nov 9, 2009
Dresden
You will see it on screen and in photos because it's WYSIWYG. Check out Wikipedia for a quick introduction to the wavelengths of daylight and artificial light and you will see that you're mistaken. It's clearly an artificial-light problem. And yes, it's a common issue, but better sensors are expensive, and the iP4 sensor is already one of the better ones in today's smartphones.
 

sockatume
macrumors member, Original poster
Jul 21, 2010
iPhysicist said:
You will see it on screen and in photos because it's WYSIWYG. Check out Wikipedia for a quick introduction to the wavelengths of daylight and artificial light and you will see that you're mistaken. It's clearly an artificial-light problem. And yes, it's a common issue, but better sensors are expensive, and the iP4 sensor is already one of the better ones in today's smartphones.

Maybe you should read a little more closely:

1) There are situations where it is visible on the screen, but not on photos. Don't patronise if you're not going to read properly.

2) It occurs under natural daylight, incandescent light, and fluorescent light, provided the light level is low enough. I have the images in front of me.

I struggle to think of a mechanism by which a reduced colour rendering index of the light source would cause the tint to vary inhomogeneously across the sensor.
 

sockatume
macrumors member, Original poster
Jul 21, 2010
Regarding baseline calibration, I shouldn't have to point out to someone who calls himself "iPhysicist" that baseline errors are more of a problem at low signal levels.

The errors are more prominent on the viewfinder than in photographs because the iPhone's viewfinder runs at 60 fps, i.e. a fixed 1/60 s shutter, while photographs are taken at an arbitrary shutter speed, which in low light will certainly be slower than 1/60 s (a longer exposure). So in low light the viewfinder is working with a weaker signal than the photographs it produces.
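
To put rough numbers on that, here's a toy calculation in Python. The offset and signal figures are made up for illustration, not measured from any iPhone; the point is only that a fixed baseline offset shrinks, in relative terms, as the exposure lengthens:

# Toy model: a fixed black-level offset on the green channel stays constant,
# so its visible effect is largest when the true signal is smallest.
def apparent_green_excess(scene_signal, green_offset=2.0):
    # Fractional green excess at a given signal level (arbitrary units).
    return green_offset / scene_signal

viewfinder_signal = 10.0              # light gathered in a fixed 1/60 s frame
photo_signal = viewfinder_signal * 4  # a 1/15 s photo gathers ~4x as much

print(f"viewfinder: {apparent_green_excess(viewfinder_signal):.0%} green excess")
print(f"photo:      {apparent_green_excess(photo_signal):.0%} green excess")

Same offset, but 20% on the viewfinder against 5% in the photo: the longer exposure buries the error, which matches a blob that shows up live but not in the saved image.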

That's my hypothesis. It's trivial to control for the light level by using a light meter - measure the level when you take a snap under artificial light, and try to get a daylight shot in similar conditions. You should see the same green blotch.
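
If you don't own a light meter, the EXIF data in the photos themselves will do as a rough substitute. Here's a sketch of the idea in Python, assuming Pillow is installed; the file names are hypothetical, the tag IDs are the standard EXIF ones, and _getexif() is Pillow's flattened EXIF helper:

import math
from PIL import Image

EXPOSURE_TIME, F_NUMBER, ISO = 33434, 33437, 34855  # standard EXIF tag IDs

def ev100(path):
    # Exposure value at ISO 100; higher EV means a brighter scene.
    exif = Image.open(path)._getexif()
    t = float(exif[EXPOSURE_TIME])    # shutter time in seconds
    n = float(exif[F_NUMBER])         # fixed f/2.8 on the iPhone 4
    iso = float(exif[ISO])
    return math.log2(n * n / t) - math.log2(iso / 100)

print(ev100("artificial_light.jpg"), ev100("overcast_daylight.jpg"))

If the two shots land within about one EV of each other, the light levels are close enough to compare the blotch fairly.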

(Yes, it is one of the best cameraphone sensors out there. However, it's a little disappointing to see such wide variation between phones that are near-perfect down to the lowest light levels and ones whose colour goes wonky when it's even slightly dim.)
 

patp
Guest
Apr 10, 2008
sockatume said:
Maybe you should read a little more closely:

1) There are situations where it is visible on the screen, but not on photos. Don't patronise if you're not going to read properly.

2) It occurs under natural daylight, incandescent light, and fluorescent light, provided the light level is low enough. I have the images in front of me.

I struggle to think of a mechanism by which a reduced colour rendering index of the light source would cause the tint to vary inhomogeneously across the sensor.

Maybe you should stop acting like such a pompous dick. Amazing: you ask for help and slag off the first person offering some. Whether or not it was exactly what you were looking for, you have no reason to be so condescending.

Take your phone back to Apple and be a dick to them. You'll get lots more bees with vinegar.



Sent from my iPhone using Tapatalk
 

sockatume
macrumors member, Original poster
Jul 21, 2010
You'll have to excuse me if I was a little short with somebody who decided to condescend to me about the fact that photographs on my screen look like the photographs the camera takes, and then pointed me to the Wikipedia page about "wavelengths" because I'm "mistaken" about what my iPhone is actually doing. Repeating and elaborating on what I'd written in the original post is not an inexcusable reaction to such a comment. I've no problem with people attempting to be helpful and getting it wrong, but iPhysicist's post came across as more dismissive than helpful. If I'm mistaken, I can only apologise.

I don't have a lot of patience for people who call me a "pompous dick" either, but I'll let that slide.

Like I wrote, I'd take it back to the Apple store and chat with them, but it's well out of my way, the splotch is not a huge issue, and there's a good chance they're just all like that because of the tolerances in making the sensors. So I'd like to know whether it's a common problem or just bad luck.
 