Just what is "SLR quality?"

[...]

In the end, maybe "SLR quality" could turn out to be "quality of a 5-year-old cropped-sensor DSLR." Nobody said, "Quality of today's best available DSLR."

If I had a penny for every "It can't be done" uttered over the past 200 years, my lifestyle would be very different than it is. I'd probably be off right now attending a private chamber music performance, or reading a print book in some exotic, off-the-grid location, away from all this noise. Or maybe I'd own a 4 x 5 view camera with an imaging sensor that would blow today's best DSLRs out of the water.

Outstanding post. :cool:
 
We most assuredly will at some point. Never say never.

The comparison isn't between a moving point and a fixed point. Both types of cameras are growing. Just imagine the kind of image you'll be able to get if you scale up one of these snazzy sensors and put it in a DSLR.
 
Although I think it'll take a year or two to get this tech into an iPhone, I'm excited by the possibilities. I'm already astounded by the quality of the images I can capture with my 6 Plus. Sure, I can still get better results from my DSLR, but I don't have it with me most of the time.
 
Just shows how much more valuable brand is than innovation.

3 billion for Beats, 20 million for incredibly innovative camera technology.

I wouldn't complain about the 20 million, but it's not like this company consisted of one individual... perhaps stock was given out as well.
 
We assume this acquisition was made for camera purposes. I think it could possibly have been made for Apple Watch purposes. The 4X sensor looks similar to the sensors on the back of the Watch.
 
Quality-of-glass is also not a given - manufacturing tolerances must be tighter when producing miniaturized components. There's no law of physics I'm aware of that precludes miniaturization to smartphone dimensions.

Actually, this is the very problem.

Light doesn't 'miniaturize'.

Yes, you can shrink the sensors and lenses and focal length down, but as these decrease, so does the amount of light that passes through the optics.

This is the very argument that precludes smartphone sensors from achieving "SLR"-like results.

Photography is the art of light. Its very nature depends on light. The more light that hits the sensor, the better it can be at understanding what that light represents, the colours it carries, and all that information.

When you miniaturize that, those sensors have to become smaller. And as the megapixel count has gone up, each 'dot' has had to become smaller. That light-catching bucket has less chance of catching light.

The counter to this is driving up the ISO on each bucket. But when there's just not enough light getting to the sensor at all, it picks up nothing. This is essentially what noise is: nothing hitting the sensor in the picture.

So far, physics is a big limitation on how much miniaturization can be done to cell phone cameras. Nobody has yet discovered a way of multiplying the light inside a camera to increase it beyond what the glass and its opening allow.

And there is some very complex math out there (well beyond my head) that backs this up, if you'd like to find it:
http://www.cambridgeincolour.com/tutorials.htm
An awesome site that has tonnes of relevant info.
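The light-bucket argument above can be sketched with back-of-the-envelope arithmetic. All numbers here are illustrative assumptions (a uniform photon flux, rough typical pixel pitches), not measurements of any real sensor; the point is just that photon count scales with pixel area, and shot-noise SNR with its square root.

```python
# Sketch of the "light bucket" scaling argument. Hypothetical numbers:
# a uniform scene delivering a fixed photon flux per square micron.
import math

def photons_collected(pixel_pitch_um, flux_per_um2=100.0):
    """Photons captured by one square pixel of the given pitch (microns)."""
    return flux_per_um2 * pixel_pitch_um ** 2

def shot_noise_snr(photon_count):
    """For Poisson-distributed photon arrivals, SNR = N / sqrt(N) = sqrt(N)."""
    return math.sqrt(photon_count)

# A roughly phone-sized pixel (~1.2 um) vs. a full-frame DSLR pixel (~6 um):
phone = photons_collected(1.2)   # ~144 photons
dslr = photons_collected(6.0)    # ~3600 photons
print(shot_noise_snr(dslr) / shot_noise_snr(phone))  # ~5x SNR advantage
```

Of course, this only models shot noise for a single pixel pair; it says nothing about what clever optics or signal processing might recover.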
 
Looks good...

So, will we all end up with no standalone SLRs anymore? :cool:

What a future we'll live in...

I guess there'll still be a need. The pro photographer probably ain't gonna use an iPhone to snap a close-up of a bug. But if this is the future... I like it.


I guess it could happen... provided software bugs are sorted out first.
 
So if you make an app to apply filters to and share smartphone photos, that's worth $1 billion, but if you make the actual hardware to take those photos, that's only worth $20 million.

Sometimes I really hate this industry.

And now guess how much the scientists are getting paid who discovered the physics to make the digital sensors possible.
 
Sorry, but the reason you won't get "image quality on par with SLR cameras" isn't because of the sensors or resolution, it's because of the lenses. The quality, quantity, and size of glass you can fit into a phone or compact camera is the bottleneck and the reason phones and compacts still haven't, and probably won't replace SLR-sized camera platforms for a loooooong time for professional quality images.

You can pack as many sensors as you want at resolutions that are stupid-high, but all you're capturing is as good as what's coming through the "glass".
Don't be so sure...
Right now you think you need a bigger lens to catch more light and thus increase the quality of the image.
But who knows? Maybe they'll develop better lenses (with new glass materials) and more sensitive sensors that can really compete with SLR quality.

If you look at an iPhone 6 image today, it totally beats most of the pocket cameras of the last decade, and even beats the most advanced pocket cameras available today.
 
This is terrible news. I was hoping for minimum updates on the 6s and planned to skip it. Therefore I used the saved money to buy the watch. Now, I will stew with envy for a year until I can buy the 7. :mad::eek::D

Ha! Relax, it's too early. This technology will probably make it just in time for the 7, so you and I will be the first to enjoy it :D

Hang on to your 6
 
It's all cool, but let's stop pretending; it's not SLR quality.

Sensor size does matter. You will never get the performance of an SLR out of a small sensor.

Never say never. The same was said about digital not matching film, or digital not matching medium format.

Considering that there are SLRs with smallish sensors that perform really well... it could happen. BTW, I'm a photographer. I have a Canon 6D with many L lenses, and the idea of a phone replacing my SLR feels silly, but I wouldn't be surprised if it happens one day.
 
"Hardware [...] can be ripped off by any competitor because patents are becoming worthless" sounds like a good reason to hate this industry. :( (Not trying to be sassy.)

Yea, hate it or love it. I go back and forth personally. I was merely trying to point out that it's difficult to reconcile the "destroy all patents" and "good hardware design should be worth more than silly gimmicky software" arguments that are very frequently vocalized in this forum.

Seems to me we have the system we deserve. I guess that is something to hate.
 

The "there ain't enough photons" argument is based on several assumptions - most of which have to do with whatever the current state of the art happens to be. As far as I know, we are not yet at the quantum mechanical limits.

I happily concede that bigger is indeed better - all other things being equal, larger photo sites collect more photons. In some circles, that's called "brute force engineering." If you took two multi-lens/multi-sensor cameras, one using phone-sized sensors, the other 35mm-sized, the larger sensor would undoubtedly win, for the reasons you present.

But the complete statement is, "All things being equal (which they never are)..."

The point here is that the multi-lens/multi-sensor approach allows signal processing techniques that are unavailable in a single-lens/single-sensor configuration. The only question we can really ask is, "Are those techniques sufficient to overcome the normal shortcomings of that small sensor?"

Now, imaging noise is not the "absence of light." (see Wikipedia: http://en.wikipedia.org/wiki/Image_noise ). A sensor doesn't start generating noise in a zero-photon environment. It's more accurate to say that noise is more apparent in the absence of a masking signal.

Noise in an image sensor has a variety of causes. "Shot noise" is related to a shortage of photons, but it's more specifically due to (per Wikipedia), "statistical quantum fluctuations, that is, variation in the number of photons sensed at a given exposure level." And yes, the fewer photons there are, the greater the variation. Regardless, it's a random fluctuation - multiple sensors focused on the same scene will generate different noise patterns/placements. Compare the images, and the noise component becomes easier to identify and remove. The other noise components are not related to the number of photons hitting the sensor - they're part of the noise floor, artifacts that are present regardless of signal level.

It's the classic analog signal to noise ratio problem (and an imaging sensor is an analog device). The electronics generate constant, low-level noise as a matter of course. Provide enough signal, and that noise is obscured. In low-signal situations, the noise becomes apparent. Push the noise floor lower (such as cooling the sensor to reduce thermal noise), or make it possible to separate the noise component from the signal, and the game changes.
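As a toy illustration of the multi-sensor averaging idea above, the sketch below simulates several sensors reading the same pixel with independent Gaussian read noise, then averages the frames: the random component shrinks by roughly the square root of the sensor count. All numbers (signal level, noise sigma, sensor count) are invented for illustration; real multi-aperture pipelines involve alignment, parallax, and far more.

```python
# Toy simulation: independent noise across sensors partially cancels
# when their readings of the same scene are averaged.
import random
import statistics

random.seed(42)

SIGNAL = 50.0        # "true" brightness of one pixel (arbitrary units)
NOISE_SIGMA = 10.0   # per-sensor random noise (standard deviation)
N_SENSORS = 16       # hypothetical multi-sensor array
N_TRIALS = 2000

def one_reading():
    """One sensor's reading: the true signal plus random noise."""
    return SIGNAL + random.gauss(0.0, NOISE_SIGMA)

# Error of a single sensor vs. error of the 16-sensor average:
single = [one_reading() - SIGNAL for _ in range(N_TRIALS)]
fused = [statistics.mean(one_reading() for _ in range(N_SENSORS)) - SIGNAL
         for _ in range(N_TRIALS)]

print(statistics.stdev(single))  # roughly NOISE_SIGMA
print(statistics.stdev(fused))   # roughly NOISE_SIGMA / sqrt(N_SENSORS)
```

Averaging only helps with the random, uncorrelated component; a fixed-pattern artifact shared by all sensors would survive the average untouched.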

An imaging sensor cannot better "understand what that light represents." Leaving aside the anthropomorphism and romanticism, the sensor simply converts incoming photons to electrons. The light is either detectable with a reasonable amount of accuracy, or not. If you want the scene to be "intelligently analyzed" you run the output of the sensor through a computer.

The notion of "SLR quality" is not some sort of cosmological constant. It is a qualitative judgement, based on an ever-changing baseline. Yesterday's SLR quality is tomorrow's crap. We compare test scores and 100x enlargements and conveniently forget that, in many circumstances, the differences would not be discernible to the naked eye in a double-blind (excuse the term) test.

The practical test of photographic quality has always been, "To what degree can it be enlarged before the defects become perceptible?" Back when a 10x enlargement was the practical upper limit for an exhibition-quality print from a 35mm negative, "SLR quality" was nothing to be proud of. It was (and still is) just one point on a continuum.

So again, it's not about two sensors, one small, one large, going mano a mano. It's not about violating the laws of physics. It's about what comes out the far end, after signal processing.

A rough analogy is what happens with our own eyes. Unless blessed with perfect vision in both eyes (a situation I certainly don't enjoy), our ability to see improves when both eyes are open - the strengths of one eye compensate for the weaknesses of the other, thanks to the power of our brains.
 

Why do you hate it? What did it do to you?
 
SLR quality from an iPhone would be flat out terrific. The iPhone already takes incredible pictures as is. The only thing I want from it is good depth of field on certain shots.

Unless the tech discussed in the article has a huge impact on IQ (which is unlikely), the only way that can happen is if they increase the sensor size, which then increases the size of the lens, which then increases the thickness of the whole module; then you have to accommodate it all in the chassis of the phone, which is the last thing Apple wants to do.

Just look at the Nokia 808... it's still the benchmark in mobile imaging, but it's 18mm thick at the camera module. The capacitors for the Xenon flash also add to that, but even without those you're still looking at 13-14mm, which is way too much for our obsession with thinness.

If they continue to use tiny image sensors, the dream of having "DSLR-like" IQ is far-fetched.

I think Apple should just have a third variety of the iPhone aimed at photographers. A no-compromise imaging device running iOS would appeal to a lot of people out there.

In the year that Ari Partinen's been at :apple:, it should be very interesting to see what he's come up with!

He played a big role in the development of the Nokia mentioned above... so yeah, good things are coming :)
 