All my Canon DSLRs can preview depth of field. It doesn't matter whether you are using a mirror viewfinder or a digital-only screen. However -- most lenses stay "wide open" to give a bright view and make critical focusing easier. There is usually a button on the lens or camera body that forces the lens to "stop down" to the actually selected f-stop -- this feature lets you "preview" the ACTUAL depth of field based on the lens focal length and f-stop.
 
To me the most impressive aspect is the real time preview of the blur effect. Yes it's artificial blur but no DSLR on the market does that.

All mirrorless cameras do this by simply showing you what the sensor sees on the screen/view finder.
 
I think it's pretty cool, but I don't see the effect improving much with the current hardware.
 
The effect doesn't look real to me. It looks like (good) photoshopping.

The issue is lack of transition between the in focus object and everything else. Bokeh from a DSLR (or your eyes) will have areas that are almost in focus transitioning to completely out of focus. Apple's does not.
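For what it's worth, that gradual transition falls straight out of thin-lens geometry: the blur-circle diameter grows continuously as objects move away from the focal plane, rather than jumping from "sharp" to "blurred". A quick Python sketch (all numbers are illustrative assumptions -- a 50mm f/1.4 focused at 2 m -- not measurements from any camera):

```python
# Blur-circle diameter vs. subject distance for an ideal thin lens.
# Shows that real defocus blur increases smoothly away from the
# focal plane, which is the gradation a simple two-layer mask lacks.

def blur_circle_mm(d_mm, focus_mm=2000.0, f_mm=50.0, n=1.4):
    """Diameter (mm, on the sensor) of the blur circle for an object
    at distance d_mm when the lens is focused at focus_mm."""
    aperture = f_mm / n  # entrance-pupil diameter
    return aperture * f_mm * abs(d_mm - focus_mm) / (d_mm * (focus_mm - f_mm))

for d in (1500, 1800, 2000, 2200, 2500, 4000):
    print(f"{d / 1000:.1f} m: blur circle {blur_circle_mm(d):.3f} mm")
```

An object exactly on the focal plane gets zero blur, and the blur ramps up the further you move in either direction -- that smooth ramp is what a DSLR (or your eyes) gives you for free.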
 
It's pretty astonishing if you think about it.

Only 10 years ago the average consumer was carrying around a point-and-shoot camera, which was all the rage. We're now carrying around a phone that mimics DOF and has greater resolution and better features than many point-and-shoots from 10 years ago.

Yet...people still have something to gripe about!
 
All mirrorless cameras do this by simply showing you what the sensor sees on the screen/view finder.

What confuses the author is that cameras normally keep the aperture fully open so that the viewfinder image is nice and bright and, on mirrorless models, so that the preview image stays fluid. You have to press a button or select a menu option to turn on DOF preview.

Also the button on DSLRs is always tiny, stuck out of the way and invariably unmarked.
 
The effect doesn't look real to me. It looks like (good) photoshopping.

The issue is lack of transition between the in focus object and everything else. Bokeh from a DSLR (or your eyes) will have areas that are almost in focus transitioning to completely out of focus. Apple's does not.

It doesn't always get it right, but it does have gradation. Check out the chair compared to the laptop compared to outside the window:
 
That's pretty damn good.

I think this is the start of something with smartphones in general: make users use their DSLRs less by incorporating these unique features. Not replace, but help.
 
To me the most impressive aspect is the real time preview of the blur effect. Yes it's artificial blur but no DSLR on the market does that.
HUH? That's the point of a DSLR: you get a real through-the-lens viewfinder or live view. You most certainly do get the aperture you are set to. For example, if you have a 50mm f/1.4, the depth of field is super narrow and you get what you see. I think you might be confusing this with stopping down: if you have the 50mm set to f/11, you're still going to see f/1.4 in the viewfinder unless you depress the aperture-preview button, though then the image will darken, or get noisier through live view.
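If anyone wants to sanity-check the "super narrow" claim, the standard hyperfocal-distance formulas give a ballpark. A rough Python sketch (assuming a full-frame 0.03 mm circle of confusion and a subject at 2 m -- illustrative assumptions, not measurements):

```python
# Total depth of field for a 50mm lens at f/1.4 vs f/11, using the
# standard hyperfocal-distance formulas. All distances in millimetres.

def dof_mm(s_mm, f_mm=50.0, n=1.4, coc_mm=0.03):
    """Total depth of field (mm) when focused at distance s_mm."""
    h = f_mm**2 / (n * coc_mm) + f_mm               # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    if s_mm >= h:
        return float("inf")                          # far limit at infinity
    far = s_mm * (h - f_mm) / (h - s_mm)
    return far - near

print(f"f/1.4: ~{dof_mm(2000, n=1.4):.0f} mm of sharp depth")
print(f"f/11:  ~{dof_mm(2000, n=11):.0f} mm of sharp depth")
```

Under these assumptions you get roughly 13 cm of sharp depth at f/1.4 versus over a metre at f/11, which is exactly why the wide-open viewfinder image looks so different from the stopped-down shot.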

The effect on the iPhone is impressive compared to many other attempts, but it is never going to render hair and fur properly no matter what they try to do. It's pretty obvious in all images; it won't fool anyone, but it's close enough to use as an effect, like tilt-shift.
 
I agree, javco. I don't know which DSLRs they are talking about. My Nikon 7200 does that in real time, both in the viewfinder and on the LCD display.
That said, I still think it is a great feature.
 
I don't understand why I keep reading/hearing that DSLRs can't show depth of field live. Of course they can -- on a DSLR it's not an effect that requires processing, it's just the natural characteristic of the lens optics and aperture. If that claim were true, you'd never be able to manually focus an image using live view mode on a DSLR.
There are a lot of asterisks on this. For one, the optical viewfinder of a DSLR normally shows the view with the lens wide open, and you have to press (and usually hold down) a button to get an accurate DOF preview if the picture is not taken with the aperture wide open. Then again, the focussing screens on DSLRs don't show the shallow DOF of lenses faster than f/2.8 (this is a property of the focussing screen; there is some choice, mostly via third parties, to swap in a different focussing screen that gives a better DOF preview with faster lenses). Live View, and thus any mirrorless camera, does not have this problem, but they often also use the lens wide open during focussing and framing, though that varies from camera to camera and can often be changed by the user.
 
Shouldn't have called it portrait though. It'll probably confuse some users into thinking it's only for when the phone is upright. What's wrong with calling it Bokeh?
Most people know what a portrait is, very, very few people (among the general iPhone user base) know what Bokeh is. And the vast majority of people also know that the word 'portrait' can be used to describe the orientation of a non-square rectangle as well as images of (usually) single persons.
 
DSLRs and great lenses will always have a place on my shelf; however, is this development another (final?) nail in the coffin of the point-and-shoot market?

Do Canon and Nikon watch these developments and just think 'crap we need to release another heavy full frame $3000 DSLR'?
 
I need to see more tests, especially from photographers. Curious about how it renders distant point sources of light, as I mentioned in post #178 in the previous thread.

Not expecting the same rendering as from my dslr and wide aperture lens on distant points source lights, but would like to see how close it can come.
 