http://geeksframework.com/what-is-an-isocell-camera/
The ISOCELL camera sensor is an innovative new technology in the mobile phone industry. I first noticed the term when I was going through the specifications of the OnePlus X. I did a little research and found that the ISOCELL sensor delivers higher image quality than the other sensors used in smartphones and tablets. The technology is a brainchild of Samsung and is set to revolutionize the phone industry. ISOCELL takes an approach that is distinct from past camera sensors: it has physical barriers integrated between the individual pixels.
The ISOCELL camera being a relatively new term in the market, one might wonder how it works and what its benefits are over backside-illuminated (BSI) sensors. Picture quality is largely determined by the amount of light a camera can capture in each of its pixels. With the ISOCELL camera’s barriers, crosstalk between pixels is said to be reduced by about 30 percent, which cuts noise in low-light conditions. The cells also build on back-side illumination: each pixel is said to hold about 30 percent more light, enhancing dynamic range.
In conventional BSI sensors, by contrast, photons and photoelectrons leak between pixels, which often reduces image sharpness and color accuracy. According to Samsung, ISOCELL sensors are also slimmer than ordinary BSI sensors, making them suitable for slim smartphones. This fits the current trend, as smartphone manufacturers prefer thinner sensors with smaller pixels to increase resolution.
The ISOCELL camera technology truly has many advantages. As we have seen, simply isolating each pixel with a barrier keeps photons directed to the correct cells, increasing photo clarity. BSI sensors, on the other hand, have higher crosstalk, leading to lower color fidelity. That is why the ISOCELL camera is credited with sharper, richer photos.
Also, with the ISOCELL sensor’s barriers, the photodiode area can be increased. The pixel can then accept more light from wider angles, which gives lens designers more freedom and allows the height of the camera module to be reduced, making it suitable for slim mobile devices.
https://www.androidpit.com/samsung-isocell-sensor
http://www.androidauthority.com/isocell-how-it-works-344628/
Samsung announced ISOCELL in 2013. How is any of this surprising? Just wait as it keeps improving. They're reaping the fruits (non-Apples, BSI sensors) of their labor.
Again, Samsung wins because they are the ones coming up with new breakthrough technologies, then engineering and manufacturing them themselves. Others just stamp their brand on custom designs.
Probably why the iPhone 6 & 7 series is still plagued with an ugly camera hump.
Look up ISOCELL if you don't know what it means. I can already hear iPhone users wishing deep down that they had Samsung's ISOCELL too. Me, me, me, MINE! Then they have the audacity to call Samsung copycats again for offering the same color options as iPhones.
Proving once again that Apple is an inferior original equipment manufacturer compared to Samsung. Until Apple comes up with more homegrown technologies, they will always be at a disadvantage against Samsung.
Thanks for those links! The Android Authority one is the best. The androidpit one is just a couple of paragraphs quoting a Samsung press release, and the Geek Framework article that you've quoted extensively is mostly gibberish-- despite the comment claiming the reader now has "total knowledge" about the sensor.
The first point is that there is a false competition being set up here between BSI and ISOCELL. ISOCELL is BSI. They aren't alternatives. "ISOCELL is actually the commercial name of what Samsung calls 3D-Backside Illuminated Pixel with Front-Side Deep-Trench Isolation (F-DTI) and Vertical Transfer Gate (VTG)."
BSI is meant to get the wiring under the active part of the photodiode so that it doesn't shade the pixel. Typically the photodiode is doped into the silicon wafer, and the wiring is built on top of it so all the wires get between the light source and the photodiode. BSI grinds down the back of the wafer until the photodiode is exposed from the backside, and then lights it up from there, leaving the wiring behind it.
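To picture why moving the wiring matters, here's a toy fill-factor calculation (the numbers are invented for illustration, not real sensor specs): in a front-side design the metal shades part of the pixel, while in BSI the wiring sits behind the photodiode.

```python
# Toy fill-factor comparison for front-side vs back-side illumination.
# All numbers are invented for illustration; they are not real sensor specs.

pixel_pitch = 1.4                  # microns, a plausible small-pixel pitch (assumed)
pixel_area = pixel_pitch ** 2      # ~1.96 um^2

wiring_shadow = 0.6                # um^2 shaded by metal wiring in a front-side design (assumed)

fsi_light_area = pixel_area - wiring_shadow   # front-side: wires sit between light and photodiode
bsi_light_area = pixel_area                   # back-side: wiring is behind the photodiode

print(f"front-side illuminated: ~{fsi_light_area / pixel_area:.0%} of the pixel sees light")
print(f"back-side illuminated:  ~{bsi_light_area / pixel_area:.0%} of the pixel sees light")
```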
As pixels get smaller, the fact that silicon is translucent, combined with the crazy-thin cameras that force lens designs with very oblique angles of incidence, starts to become a bigger problem. Light meant for the green pixel can pass through the silicon at an oblique angle into, for example, the red pixel. This hurts color separation. The other problem is that as the photodiodes start to charge up, electrons migrate into adjacent cells, leading to bloom.
In addition to BSI, ISOCELL uses deep trench isolation and vertical transfer gates to address the color bleed and bloom. The overall objective was to kill cross talk between pixels. We know how to do that-- create a moat around the pixel and fill it with insulator (deep trench isolation). The problem is that you can have either insulator or pixel, but you can't have both in the same place-- as pixels get smaller, it's harder to sacrifice photodiode area to insulator. Samsung's answer was to create a vertical transfer gate to read the pixel data out-- since the gate doesn't have to lie next to the pixel anymore, but can lie below it, the pixel area can be somewhat larger, maintaining dynamic range.
This cut cross talk between pixels from 19% to 12.5%-- or by about 6.5 percentage points. A step in the right direction, but an incremental improvement, for sure.
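To put numbers on that, here's a toy crosstalk model (my own simplification, not the actual sensor physics): treat crosstalk as a fraction of each pixel's signal leaking into its neighbor, and compare the reported 19% and 12.5% figures.

```python
# Toy model of pixel crosstalk (my own simplification, not the actual sensor physics).
# A fraction `leak` of each pixel's signal spills into its neighbor; the rest stays put.

def mix(green_signal, red_signal, leak):
    """Return the measured (green, red) values after symmetric neighbor leakage."""
    green_out = (1 - leak) * green_signal + leak * red_signal
    red_out = (1 - leak) * red_signal + leak * green_signal
    return green_out, red_out

# A strongly green scene: the green pixel should read 100, the red pixel 10.
green, red = 100.0, 10.0

for leak in (0.19, 0.125):   # pre-ISOCELL vs ISOCELL crosstalk figures cited above
    g, r = mix(green, red, leak)
    print(f"leak={leak:.3f}: green reads {g:.1f}, red reads {r:.1f} "
          f"(red error {r - red:+.1f})")

# Absolute vs relative improvement in the crosstalk figure itself:
absolute = 0.19 - 0.125            # 0.065 -> about 6.5 percentage points
relative = absolute / 0.19         # about 0.34 -> roughly a one-third relative cut
print(f"crosstalk drop: {absolute:.3f} absolute, {relative:.0%} relative")
```

Note that 6.5 points off a 19% baseline is roughly a one-third relative reduction, which is presumably where the "30 percent" marketing figure in that article comes from.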
All cool stuff, but it is not unique to Samsung-- this paper from ST Micro does a good job of explaining the technique and the reasons behind it.
The 6s also used deep trench isolation.
ST Micro also has a patent on vertical transfer gates.
So while the S7 certainly holds its own in lower light, I would give more credit to the larger overall sensor size than I would give to ISOCELL.
This also has nothing to do with the camera bump-- the S7 has a bump too. The S6 had a pretty aggressive lump on the back. The S7 bump looks lower profile than the S6's and the iPhone 7's mainly because the phone itself is thicker-- it's 7.9mm with something like a 0.8mm bump, versus the iPhone 7's 7.1mm thickness plus whatever its bump is (I can't find a measurement, but it looks like less than 1.6mm to me).
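For what it's worth, a quick back-of-the-envelope on total depth at the camera (the iPhone 7 bump height here is my eyeball estimate from above, not a measured spec):

```python
# Rough total-depth comparison at the camera (bump heights are estimates, not specs).
s7_body, s7_bump = 7.9, 0.8            # mm; S7 bump height as cited above
iphone7_body, iphone7_bump = 7.1, 1.6  # mm; iPhone 7 bump height is my rough guess

print(f"S7 at the camera:       {s7_body + s7_bump:.1f} mm")
print(f"iPhone 7 at the camera: {iphone7_body + iphone7_bump:.1f} mm (upper bound)")
```

Either way the two land in the same ballpark at the camera; the S7 just hides more of that depth in the body.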
The camera bump is to accommodate a 6 element optically stabilized autofocus lens. Think about that: 6 lens elements, a sensor and housing with room to extend and focus and slide to stabilize all in 7 or 8mm of depth.
It's commendable that Samsung can make competitive image sensors to use in their phones, along with all of the other semiconductor technologies they develop. They are, without a doubt, a top-tier technology company. I don't think it necessarily makes for a better phone, though, if they're forced to only use Samsung-developed technologies. They're smart folks, but they can't be first with all the answers. By sourcing components from third parties, Apple can make the most of everyone's R&D.
The benefit to Samsung, of course, is that they get to keep the profits for the parts, as well as the finished devices, and the profits from the parts they sell to Apple to boot.
Anyway, cool deep dive into the tech, but I don't think it supports your conclusion that Apple is inherently inferior or that the soup-to-nuts model of Samsung has any real advantage in the final product.
I didn't read the last couple of pages, but it seems the argument for the iPhone is that it is more natural, with less saturated colors than the Samsung, whether in the photos or on the AMOLED screen. Meanwhile, we have Apple increasing the color on the iPhone 7 screen by 25% and adding "wide color" capture to the camera.
If the next iPhone has more saturated colors than the Samsung, the argument will then be that the Samsung is too washed out. Remember how the first Galaxy Note was ridiculed by many Apple diehards? And now we have the plus-sized iPhone outselling the small iPhone.
No, you're misunderstanding how color is measured.
Images are stored with color information. Accurate color means making what you see on the display match what the file says. Samsung's super-saturated display mode takes a color and makes it about 30% more colorful than the file says it should be. The new iPhone display supports rendering of files that express a broader range of colors, and renders them true to the file data.
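A crude way to picture the difference (my own illustrative sketch, not either phone's actual display pipeline): an accurate pipeline shows exactly what the file encodes, while a "vivid" one multiplies saturation before display.

```python
import colorsys

def render_accurate(rgb):
    """Color-managed rendering: show exactly what the file encodes."""
    return rgb

def render_vivid(rgb, boost=1.3):
    """Saturation-boosted rendering: roughly '30% more colorful' than the file says."""
    r, g, b = rgb
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb(h, min(s * boost, 1.0), v)

# A muted orange as stored in the file (RGB in the 0..1 range).
file_color = (0.80, 0.55, 0.40)

print("file says:       ", file_color)
print("accurate display:", render_accurate(file_color))
print("vivid display:   ", tuple(round(c, 2) for c in render_vivid(file_color)))
```

Supporting wide color is the opposite of that boost: the file itself can encode more saturated colors (Display P3 rather than sRGB), and a color-managed display still renders them exactly as stored.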