No need to get silly with light field tech. Just look at what's needed (high-end imaging and rapid publishing) and implement a solution for that.
I'm glad not all photographers require high-end imaging and rapid publishing. :rolleyes: Thankfully there are those who do not want to be commercial photographers.

----------

I'm sure you're very familiar with ALL of Apple's patent applications....

This is the internet, where everybody is an expert on everything. ;)
 
I looked at the Lytro camera before. The problem with it was that you could not view the picture to do the refocusing without special apps or plugins. I am sure the technology is improving, but until I can send the picture to a friend or print it out without needing anything special, it will have limited use IMO. I hope that if Apple implements this correctly, these issues will be solved. Then this could be a cool new feature to play with.

:cool:

All image formats need "special" software to view them, strictly speaking... I think what you're meaning to say is that you hope that light field images will get a standard open format across platforms, which can then be incorporated into all image viewing software.

I'm sure this will happen as light field photography proliferates.
 
Phone and tablet device cameras in the near future will just be the sensor. No more moving parts, no more complex stacks of lenses. Light field tech will evolve at an incredible pace. The sensor simply absorbs everything it is exposed to, and software pieces it together the way you want it.
 
All image formats need "special" software to view them, strictly speaking... I think what you're meaning to say is that you hope that light field images will get a standard open format across platforms, which can then be incorporated into all image viewing software.

I'm sure this will happen as light field photography proliferates.

Strictly speaking.... I stand corrected.
 
This is stupid. Nobody has ever had a need to refocus after the shot, because you can focus when you TAKE the shot in the first place. Also, smartphones' small sensors have a huge depth of field anyway. You only get shallow/unfocused images with large sensors.

It's a dead-end technology.

The most important and useful photography technology that Apple could implement would be to add optical image stabilization. The next would be larger sensors.

Other options would be to allow for interchangeable lenses, and to provide Aperture capability on a mobile device.

A professional photographer has a need to edit and publish photos as quickly as possible. The genius thing about smartphones is that they allow the editing/publishing part to happen on mobile devices in the field. The next step would be to implement a higher-quality imaging system (35mm full-frame sensors, various lenses, flash/strobe mounts, other SLR features, etc.).

No need to get silly with light field tech. Just look at what's needed (high-end imaging and rapid publishing) and implement a solution for that.
This is just using light-field technology to refocus on the fly before taking a picture. The iPhone is not taking a light-field picture.
 
This is just using light-field technology to refocus on the fly before taking a picture. The iPhone is not taking a light-field picture.

This is actually already done via on-chip phase-detection systems in many mirrorless digital cameras.

Phase detection uses 2 light-fields, vs 16 or more for the bigger light-field camera.
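For anyone curious how estimating that two-view disparity works in principle, here's a toy numpy sketch (my own illustration, nothing to do with any actual camera firmware). When the lens is defocused, the two sub-aperture views are shifted copies of each other, and the shift that best aligns them tells the camera which way to move focus:

```python
import numpy as np

def estimate_shift(left, right, max_shift=8):
    """Return the circular shift of `right` relative to `left` that
    minimizes the sum of squared differences (a crude phase estimate)."""
    errors = {s: np.sum((np.roll(right, -s) - left) ** 2)
              for s in range(-max_shift, max_shift + 1)}
    return min(errors, key=errors.get)

# Simulated 1-D intensity profile and a defocused (shifted) copy of it.
scene = np.sin(np.linspace(0, 6 * np.pi, 128))
defocused = np.roll(scene, 3)            # 3-pixel phase disparity

print(estimate_shift(scene, defocused))  # -> 3
```

The sign and size of the recovered shift map directly onto the lens movement needed to bring the subject into focus.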
 
Phone and tablet device cameras in the near future will just be the sensor. No more moving parts, no more complex stacks of lenses. Light field tech will evolve at an incredible pace. The sensor simply absorbs everything it is exposed to, and software pieces it together the way you want it.

This would be heaven indeed. As a professional photographer, I would welcome not only the freedom to create and compose freely after the shooting session, but also not having to carry an assortment of (sometimes heavy) lenses around.

One can only dream, I suppose.

I played with a Lytro camera and yeah it has cons like any emerging technology but I definitely enjoy the refocusing capabilities and I'm sure this will take photography to new places.

I can see some people commenting that this tech is going nowhere just because they don't see any advantage or practical use in their personal life, which just makes me so glad the world is vast and full of different minds.
 
As I understand it (not sure if correctly), the problem with light field technology is that it requires software to emulate the blur in the out-of-focus areas, so you won't get the natural bokeh that is the hallmark of any decent lens. It basically just puts something slightly more realistic than a Gaussian blur on things that should be out of focus.
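To make the "emulated blur" point concrete, here's a rough numpy sketch of depth-gated blurring (my own toy model; the function names and the box blur are assumptions for illustration, not how Lytro's software actually works):

```python
import numpy as np

def box_blur(img, k=3):
    """Naive box blur: average over a k*k neighborhood (edges clamped)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def synthetic_refocus(img, depth, focus_depth, tolerance=0.5):
    """Keep pixels near `focus_depth` sharp; replace the rest with a blur.
    This mimics 'emulated bokeh', not the physics of a real lens."""
    blurred = box_blur(img)
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, img.astype(float), blurred)

# Tiny demo: one 'far' pixel gets blurred, the rest pass through untouched.
img = np.arange(25.0).reshape(5, 5)
depth = np.ones((5, 5))
depth[0, 0] = 5.0
result = synthetic_refocus(img, depth, focus_depth=1.0)
```

Real bokeh varies with aperture shape and distance from the focal plane, which is exactly the subtlety a uniform synthetic blur like this one misses.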

As for small cameras such as the one on the iPhone, you have a small sensor, a small aperture and a very short focal length (8 mm), so everything will be in focus anyway. There is no point in focusing other than for macro shots and close-ups. Everything beyond that is at infinity for the lens.

Light field tech is therefore best suited for non-professional applications but then those applications don't even allow selective focusing anyway.
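The "everything is in focus" argument above can be sanity-checked with the standard hyperfocal-distance formula. The numbers below (4 mm lens, f/2.2, 0.004 mm circle of confusion) are illustrative small-sensor assumptions, not exact iPhone specs:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance H = f^2 / (N * c) + f. Focused at H,
    everything from H/2 to infinity is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Illustrative small-sensor numbers (assumed, not a specific phone's specs).
H = hyperfocal_mm(4.0, 2.2, 0.004)
print(round(H / 1000, 2), "m")  # -> 1.82 m
```

With those assumptions, focusing at ~1.8 m renders everything from ~0.9 m to infinity sharp, which is why selective focus on a phone sensor buys you so little outside macro range.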
 
I find it interesting that Apple was awarded such a patent when Lytro already claimed such a technology. I guess this is similar to many camera manufacturers having their own CMOS and CCD chips that pretty much all have the same or similar technology in them.
 
Megapixels aren't what's important.

The Lytro camera's sensor is bigger than the 5S's, but only turns out 1.2MP images because of all the light data it needs to absorb. More megapixels would actually help it.
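The spatial-for-angular trade-off behind that 1.2MP figure is easy to see with back-of-the-envelope numbers. This toy model (the angular-sample count and sensor size are rough assumptions, not Lytro's published pipeline) shows how an ~11 MP sensor ends up around 1.2 MP of output:

```python
def lightfield_output_mp(sensor_mp, angular_samples_per_axis):
    """A plenoptic sensor spends pixels on ray direction instead of
    position: each microlens covers roughly an n*n patch of pixels,
    so spatial output resolution drops by about n^2."""
    return sensor_mp / angular_samples_per_axis ** 2

# ~11 'megaray' sensor with ~3x3 directional samples per microlens:
print(round(lightfield_output_mp(11.0, 3), 1))  # -> 1.2
```

Which is exactly why more megapixels would help: the angular samples come straight out of the spatial budget.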

And Nokia already jumped on lightfield smartphone integration last Spring when they licensed Pelican's Camera Array. Their phone is due 2014.
 
Although this is very cool, I would much appreciate a megapixel update on the next iPhone if possible, Apple... even just a little to keep up with the Nokia Lumia!!

You do realize what Nokia is doing with "41 Mpx" is a marketing gimmick? Why would Apple even bother heading that way?
 
You do realize what Nokia is doing with "41 Mpx" is a marketing gimmick? Why would Apple even bother heading that way?

It's not a marketing gimmick. They're using that 41MP to allow for lossless digital zoom. Apple's digital zoom solution is to zoom directly into a static image and post-process it, creating a blurry piece of junk. Nokia's 41MP sensor allows pixel binning so image data isn't lost.
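A minimal sketch of what pixel binning means, in numpy (a simple 2x2 averaging model; Nokia's actual PureView oversampling is more sophisticated than this):

```python
import numpy as np

def bin2x2(img):
    """Average each 2x2 block of sensor pixels into one output pixel.
    Binning trades resolution for noise: four samples per output value."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

raw = np.arange(16, dtype=float).reshape(4, 4)
binned = bin2x2(raw)        # 4x4 -> 2x2; each value is a block average
print(binned.shape)         # -> (2, 2)
```

When you zoom, the camera can skip the binning over the cropped region and read the underlying pixels directly, which is where the "lossless" part comes from.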
 
It's not a marketing gimmick. They're using that 41MP to allow for lossless digital zoom. Apple's digital zoom solution is to zoom directly into a static image and post-process it, creating a blurry piece of junk. Nokia's 41MP sensor allows pixel binning so image data isn't lost.

If you believe that, you believe anything!
 
You could turn it around backwards, too. Take a re-focusable image of a room. Now you can drop a virtual camera into the scene and move it around. In a game you could place the characters in your environment, and for, say, real-estate sales you could make better presentations, because you have the 3D data to allow perspective changes with viewpoint changes.

Refocusing wouldn't really help with that though - you still need at least a panoramic shot and even still you would only have the front sides of objects in the room.

Now maybe they could develop software that lets you just walk around, combining movement data with the 3D sensor and camera to create a map that could be moved around in 3D. With only a single perspective you will only be able to tilt around slightly before there is no data available.
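The "virtual camera" idea boils down to pinhole reprojection once you have per-pixel depth. A toy sketch (intrinsics are assumed; no rotation, no occlusion handling, so it only models the small viewpoint shifts discussed above):

```python
def reproject(u, v, depth, f, cx, cy, tx=0.0, ty=0.0, tz=0.0):
    """Back-project pixel (u, v) with depth Z through a pinhole camera of
    focal length f (pixels) and principal point (cx, cy), translate the
    camera by (tx, ty, tz), and project again."""
    X = (u - cx) * depth / f
    Y = (v - cy) * depth / f
    Xn, Yn, Zn = X - tx, Y - ty, depth - tz
    return (f * Xn / Zn + cx, f * Yn / Zn + cy)

# A pixel at the principal point, 2 m away, seen from a camera moved
# 0.1 m to the right: it shifts left in the new view.
print(reproject(320, 240, 2.0, f=500, cx=320, cy=240, tx=0.1))  # -> (295.0, 240.0)
```

The occlusion gap is exactly the limitation noted above: translate far enough and you ask for scene points the original viewpoint never captured.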
 