


If this is what's to be expected, they're still a long way off. It's good: it does a decent job of smoothing out the noise via software, but there are still significant image-degradation issues, and it in no way matches what you'll get from a large-sensor camera.

9to5 Mac posted some nice examples.

[Attached image: screen-shot-2015-04-14-at-2-04-33-pm.png]
 
This is terrible news. I was hoping for minimal updates on the 6s and planned to skip it, so I used the saved money to buy the Watch. Now I will stew with envy for a year until I can buy the 7. :mad::eek::D

I don't think we will see this tech in the next iPhone; the earliest we'll see it is the iPhone 7 (if that's what it will be called). I am pretty sure the iPhone 6s had already been designed, and its key new feature already settled, by the time they bought this company. That will wee on my fireworks, as I intended to buy the 6s this year; on the other hand, it gives me time to recover from the cost of the Apple Watch.
 
So if you make an app to apply filters to and share smartphone photos, that's worth $1 billion, but if you make the actual hardware to take those photos, that's only worth $20 million.

Sometimes I really hate this industry.
The difference is that the former has already convinced 100 million 'users'/'customers' to use their product. If that company had managed to convince Apple to buy 100 million of their camera modules, then they'd be worth much more than $20 million. Creating a product isn't enough, you also have to succeed in selling it in large numbers.
 
It's more like: if you make an app with 50 million registered users who create their own content, most of whom compulsively check for new content several times per day, translating into 50 million sets of eyeballs for ads, it's worth $1 billion; but if you make commoditized hardware that can be ripped off by any competitor because patents are becoming worthless, that's only worth $20 million.

Yeah, you're right, of course. Still, it's strange. I could build a reasonably functional Instagram clone in a few weeks, but I wouldn't have the first clue how to build a camera lens/sensor. I wouldn't mind the billion, though I'd settle for the $20 million in a pinch.
 
If this is what's to be expected, they're still a long way off. It's good: it does a decent job of smoothing out the noise via software, but there are still significant image-degradation issues, and it in no way matches what you'll get from a large-sensor camera.

Note what I mentioned below. The lack of both banding and edge loss (due to overly aggressive gradient smoothing) is what makes this impressive. Those aren't really the same as noise, which is more an issue of captured signal range.

Here is the difference in picture quality. Absolutely amazing!
Image

The lack of visible banding is really impressive, as it doesn't have that scrubbed look that you get from gradient methods of banding reduction.
 
It's all cool, but let's stop pretending; it's not SLR quality.

Sensor size does matter. You will never get the performance of an SLR out of a small sensor.
If they dedicate an area equivalent to a DSLR sensor to camera modules, the combined sensor area of those camera modules can approach that of a DSLR sensor. Of course, you have to resolve parallax issues for anything within a few metres of the camera which will cost some resolution.
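As a rough sanity check on that combined-area argument, here's the arithmetic (the sensor dimensions below are approximate published figures and an assumption on my part, used purely for illustration):

```python
# Approximate sensor areas in mm^2 (rough figures, for illustration only).
phone_module = 4.8 * 3.6      # ~1/3"-type smartphone sensor
aps_c_dslr   = 23.5 * 15.6    # typical APS-C DSLR sensor

modules_needed = aps_c_dslr / phone_module
print(round(modules_needed, 1))   # roughly 21 small modules match one APS-C sensor's area
```

So an array of a couple dozen phone-sized modules could, in total light-gathering area, approach a cropped-sensor DSLR, before accounting for the parallax losses mentioned above.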

Another solution is to capture images at a rate high enough that there is no significant subject movement during the capture burst (e.g., ten 1/200 s exposures at 100 fps over a 1/10 s burst give the light-gathering equivalent of a 1/20 s exposure, and aligning all the images digitally amounts to digital image stabilisation).
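The burst idea can be sketched as a toy align-and-average loop (purely illustrative, and not a claim about how LinX actually works; I'm assuming the per-frame integer pixel shifts have already been estimated):

```python
import numpy as np

def stack_burst(frames, shifts):
    """Undo each frame's estimated (dy, dx) shake shift, then average.

    Averaging N aligned short exposures keeps the sharpness of a single
    short exposure while cutting random sensor noise by roughly sqrt(N),
    which is the digital-stabilisation idea described above.
    """
    aligned = [np.roll(f, (-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, shifts)]
    return np.mean(aligned, axis=0)

# Toy demo: ten noisy copies of one scene, each offset by simulated hand shake.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, (64, 64))
shifts = [(i % 3, i % 2) for i in range(10)]
frames = [np.roll(scene, s, axis=(0, 1)) + rng.normal(0.0, 0.1, scene.shape)
          for s in shifts]

stacked = stack_burst(frames, shifts)
single_err = np.abs(frames[0] - np.roll(scene, shifts[0], axis=(0, 1))).mean()
stacked_err = np.abs(stacked - scene).mean()   # noticeably smaller after stacking
```

Real pipelines have to estimate sub-pixel shifts and handle moving subjects, which is where most of the difficulty lies.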
 
It's funny how they talk about being close to leaving your SLR at home when in my experience I wouldn't leave my point-and-shoot at home for important pictures. I have an iPhone 5s and a Canon S110. People have asked before why I have so much trouble with the iPhone's camera and I finally figured it out. My hands have a tremor. I simply can't hold the phone still enough to get good pictures most of the time. I don't have this issue with the Canon because of its optical image stabilization. I can also crop my shots with the Canon due to higher megapixels, as well as do optical zoom.

I think the biggest advancement, at least from my perspective, would be to bring optical image stabilization to more phones (I know the iPhone 6 Plus already has it). After that, optical zoom would be next.
 

What about when he predicted the discontinuation of the 17 inch MacBook? Or what about when he predicted an iPad 4 being released a mere 5 months after the iPad 3? Or what about when he predicted that the discontinued iPad 4 would reenter production? He was also the first to report a gold option for the iPhone 5s.

He should enter the lottery; he's very good at guessing.
 
It's all cool, but let's stop pretending; it's not SLR quality.

Sensor size does matter. You will never get the performance of an SLR out of a small sensor.

We most assuredly will at some point. Never say never.
 
FT's @Tim Bradshaw suggested it could perhaps also be used to control depth of field in post. The lenses use set apertures, and you can switch between them and combine their data. I've used a camera like this, the Lytro Illum, which captures light-ray data and re-presents it: you can take a photo and later focus on a different part of the image, for creative effect or otherwise. The photos also have some perspective to them (much like iOS 8's parallax). See this shot I took, and tap anywhere to refocus: https://pictures.lytro.com/lyt-86439183875824/pictures/900015/modal (or try tilting your iPhone around / dragging around on desktop). This comes from the camera RAW being reinterpreted... otherwise it's just better low-light performance, or maybe even faster burst modes?
 
Just what is "SLR quality"? It can't be pixel count, because a smaller imaging sensor can have the same count yet deliver poorer performance. It might be low-noise performance, which is an intrinsic advantage of larger image sensors (all other things being equal); but if a noise-reduction method can do its job without affecting detail/acuity, then a large sensor isn't essential either. Quality of glass is also not a given: manufacturing tolerances must be tighter when producing miniaturized components, but there's no law of physics I'm aware of that precludes miniaturization to smartphone dimensions. One can get into the "no bokeh" and DoF arguments, but those are aesthetic things. Who says the shape of the iris should have an impact on the image? In a no-iris lens, it could be considered an aberration that has been eliminated. With depth data available, DoF effects can easily be constructed, and then combined with a bokeh effect...

And so on.

In terms of the hypothetical (not knowing if this is how LinX does it): as noise is a random function, two sensors imaging the same scene will likely have their noise artifacts in different locations. Sum-and-difference techniques can separate the random anomalies from the elements that are common to both images. The same approach can be used to reduce lens artifacts.
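A toy numerical illustration of that sum-and-difference idea (my own sketch, not LinX's actual pipeline): averaging the two exposures reinforces the common signal while uncorrelated noise partially cancels, and the half-difference isolates the random part.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(0.0, 1.0, (128, 128))   # "true" image, common to both sensors

# Two sensors image the same scene; each adds its own independent noise.
a = scene + rng.normal(0.0, 0.05, scene.shape)
b = scene + rng.normal(0.0, 0.05, scene.shape)

common   = (a + b) / 2    # sum/average: scene reinforces, noise partly cancels
residual = (a - b) / 2    # difference: scene cancels, leaving only the noise term

noise_single = (a - scene).std()        # noise level of one sensor, ~0.05
noise_common = (common - scene).std()   # ~0.05 / sqrt(2), i.e. cleaner
```

The residual carries essentially no scene content, which is what lets a pipeline treat it as a noise estimate rather than as detail to preserve.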

In the end, maybe "SLR quality" could turn out to be "quality of a 5-year-old cropped-sensor DSLR." Nobody said, "Quality of today's best available DSLR."

If I had a penny for every "It can't be done" uttered over the past 200 years, my lifestyle would be very different than it is. I'd probably be off right now attending a private chamber music performance, or reading a print book in some exotic, off-the-grid location, away from all this noise. Or maybe I'd own a 4 x 5 view camera with an imaging sensor that would blow today's best DSLRs out of the water.
 
Note what I mentioned below. The lack of both banding and edge loss (due to overly aggressive gradient smoothing) is what makes this impressive. Those aren't really the same as noise, which is more an issue of captured signal range.



The lack of visible banding is really impressive, as it doesn't have that scrubbed look that you get from gradient methods of banding reduction.

But I see lots of banding and gradients.

For example, in that first picture linked, look at the phone's screen.

Look at the cheek.

Flatness and gradients EVERYWHERE :eek:

I'm going to go fire up my desktop at home and look at it on three different display types to see how it looks.
 
So if you make an app to apply filters to and share smartphone photos, that's worth $1 billion, but if you make the actual hardware to take those photos, that's only worth $20 million.

Sometimes I really hate this industry.

Lol, like that's all Instagram does, like it isn't a big social network and didn't have 100 million active users when it was acquired...

Sometimes I really hate these dumb tech forums.
 
But I see lots of banding and gradients.

For example, in that first picture linked, look at the phone's screen.

Look at the cheek.

Flatness and gradients EVERYWHERE :eek:

I'm going to go fire up my desktop at home and look at it on three different display types to see how it looks.

I can see some, now that I've looked at it more closely than the first time. Note that I'm comparing between the reference images here, not against an SLR or a digital 645-format camera or anything of the sort. There's still some banding, but the color banding isn't as pronounced. In the iPhone shot I see fully formed banding edges (note the well-defined cross-hatch pattern).
 
It's more like: if you make an app with 50 million registered users who create their own content, most of whom compulsively check for new content several times per day, translating into 50 million sets of eyeballs for ads, it's worth $1 billion; but if you make commoditized hardware that can be ripped off by any competitor because patents are becoming worthless, that's only worth $20 million.

"Hardware [...] can be ripped off by any competitor because patents are becoming worthless" sounds like a good reason to hate this industry. :( (Not trying to be sassy.)
 
Sorry, but the reason you won't get "image quality on par with SLR cameras" isn't the sensors or the resolution; it's the lenses. The quality, quantity, and size of the glass you can fit into a phone or compact camera is the bottleneck, and it's the reason phones and compacts still haven't replaced SLR-sized camera platforms for professional-quality images, and probably won't for a loooooong time.

You can pack in as many sensors as you want at stupid-high resolutions, but what you capture is only as good as what's coming through the "glass".

Harvard has that covered:

http://www.seas.harvard.edu/news/2015/02/perfect-colors-captured-with-one-ultra-thin-lens
 
Camera updates are always exciting, so it'll be cool to see what Apple does with this. Maybe better front cameras for iOS devices and Macs in the future too?
 
So if you make an app to apply filters to and share smartphone photos, that's worth $1 billion, but if you make the actual hardware to take those photos, that's only worth $20 million.

Sometimes I really hate this industry.

As with almost everything else in life? Sorry, nothing against you; I just can't help but echo the bigger picture.
 
So if you make an app to apply filters to and share smartphone photos, that's worth $1 billion, but if you make the actual hardware to take those photos, that's only worth $20 million.

Sometimes I really hate this industry.

And if you make an app that replicates a system feature (Messages: Text Messaging and iMessage) you get $19 billion in stock and cash (Facebook, WhatsApp).

:cool:
 