Most people know what a portrait is; very, very few people (among the general iPhone user base) know what bokeh is. And the vast majority of people also know that the word 'portrait' can describe the orientation of a non-square rectangle as well as images of (usually) single persons.


What, they couldn't have called it the blurry background mode :D
 
I'm sure others have already said it, but you are 1000% incorrect.
...
In summary: you (and Apple!!) are VERY wrong. A DSLR can (and does by default!) give you a preview of a shallow DoF.
I don't think Apple ever said that DSLRs can't do this. Actually, I remember them saying that ONLY DSLRs could do this (until now).
 
Can you please point me to where Apple said "DSLR cameras don't do live DOF preview and the iPhone 7+ does"? Or are we making stuff up as we go now?

http://www.apple.com/apple-events/september-2016/

At 1:16:39

"Even high end digital slrs can't do deep depth preview on their screens"

He is wrong in so many ways:

1. "Deep depth" is the opposite of the effect he's showing! "Deep" depth of field would mean that everything is in focus! This new feature is about SHALLOW depth of field (to give you a blurred background).

2. Every camera that gives any sort of a "live view" through the lens does this. Whether that's through an optical viewfinder via a prism, projected on a focusing screen, or on a mirrorless camera's LCD screen... if you're seeing a live image and it's coming in through a lens with a large aperture, YOU ARE SEEING THIS EFFECT LIVE.

(Pedantic aside: aperture size doesn't actually matter... there is always a depth of field, and things in front of it or beyond it will be blurry... the aperture simply sets how wide the depth of field is. But, realistically, with a small aperture (like the tiny one in a phone camera) the depth of field looks essentially infinite: i.e., everything *looks* in focus.)
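
To put numbers on that, here's a minimal sketch of the standard thin-lens depth-of-field formulas in Python. The focal lengths, f-numbers and circle-of-confusion values below are rough illustrative assumptions, not measured specs.

[CODE]
def dof_limits(focal_mm, f_number, subject_m, coc_mm):
    """Near/far limits of acceptable sharpness (thin-lens approximation)."""
    f = focal_mm
    s = subject_m * 1000.0                  # work in millimetres
    H = f * f / (f_number * coc_mm) + f     # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near / 1000.0, far / 1000.0      # back to metres

# 50mm f/1.8 on full frame (CoC ~0.030mm), subject at 2m: DoF ~1.9m to ~2.1m.
print(dof_limits(50, 1.8, 2.0, 0.030))
# 4.15mm f/2.2 phone camera (CoC ~0.004mm), subject at 2m: ~1.0m to infinity.
print(dof_limits(4.15, 2.2, 2.0, 0.004))
[/CODE]

The phone's depth of field runs from about a metre to infinity, which is exactly why everything *looks* in focus.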
 
I agree that 'bokeh' is not a widely known term among non-photographers. But I also think calling it 'portrait' is problematic, especially since they already have a mode called 'square' that describes the aspect ratio, and 'portrait' is commonly used in that context. 'Blur' or 'Depth' or 'Focused' or something similar might have been a better term.
Maybe that is the photographer in me speaking, but when I hear the word 'portrait' I immediately think of an image of a person (and similarly for 'landscape', I think of an actual landscape) and I only interpret it to mean the orientation of a non-square rectangle if it is used together with the word 'orientation' (or similar).

Note that the 'Portrait' mode also automatically switches you to the longer focal length, thus already indicating to the user that this is for a tighter shot, which is associated with people shots. And people aren't stupid; they know very well that they can turn the phone to switch between portrait and landscape orientation in the default 'Photo' mode (but also in the Time-Lapse, Slo-Mo and Video modes). And once they have tried the 'Portrait' mode a couple of times, they'll know that it blurs the background. And in many languages, 'portrait orientation' actually is described using a word that is not a synonym for 'picture of a person', among them Spanish (formato alto vs retrato), Italian (formato verticale vs ritratto), German (Hochformat vs Porträt), Polish (format pionowy vs portret), Russian (вертика́льный форма́т vs портре́т).
What, they couldn't have called it the blurry background mode :D
There are some limitations on the length of such a label.
 
Getting tired of the whole "modes" swipe list in the camera app; they should rethink the app to be more convenient, or shrink that list somehow.
 
It was interesting when I saw it during the keynote. But when I saw this video, everything changed. Apple is claiming something about taking advantage of the two-camera technology. This can easily be done with any camera and a regular photo app. It can be done better with Pixelmator and Photoshop.
You know it's a Japanese word, right?

No. The difference is that things shouldn't be uniformly blurry. Things closer to the focal plane (i.e. where you've focused) should be less blurry... as you move further away from the focal plane things should get more blurry.

The iPhone 7+ can do this because it uses the two cameras to sense depth... so it can apply the right amount of blur to the right portions of the scene. That's really tough to do by postprocessing a regular photo.

I found this to be a good write-up with several examples of the blur gradient:

https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/
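
For anyone curious how the depth map gets used, here's a toy sketch in Python (NumPy + OpenCV) of the principle: blur each pixel in proportion to its distance from the focal plane. This is just an illustration, not Apple's actual pipeline, and "photo.jpg"/"depth.png" are placeholder inputs.

[CODE]
import cv2
import numpy as np

def depth_blur(image, depth, focus_depth, max_sigma=8.0, levels=5):
    """Blur each pixel in proportion to its distance from the focal plane.

    image: HxWx3 uint8; depth and focus_depth: normalised to [0, 1].
    """
    # Target blur per pixel: zero at the focal plane, growing with distance.
    dist = np.abs(depth - focus_depth)
    sigma_map = max_sigma * dist / max(float(dist.max()), 1e-6)

    # Precompute a small stack of uniformly blurred copies of the image...
    sigmas = np.linspace(0.0, max_sigma, levels)
    stack = [image.astype(np.float32)]
    for s in sigmas[1:]:
        stack.append(cv2.GaussianBlur(image, (0, 0), s).astype(np.float32))
    stack = np.stack(stack)                 # levels x H x W x 3

    # ...then blend the two nearest blur levels for each pixel.
    idx = sigma_map / max_sigma * (levels - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, levels - 1)
    w = (idx - lo)[..., None]
    rows, cols = np.mgrid[0:depth.shape[0], 0:depth.shape[1]]
    out = (1 - w) * stack[lo, rows, cols] + w * stack[hi, rows, cols]
    return out.astype(np.uint8)

# The depth map is the part that actually needs the second camera.
image = cv2.imread("photo.jpg")
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
cv2.imwrite("fake_portrait.jpg", depth_blur(image, depth, focus_depth=0.2))
[/CODE]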
 
This is one of the most embarrassing features I've ever seen them add. The blur looks horrific! Especially around the edges of whatever is in focus, which is particularly obvious on the photo of the succulent plant: half the leaves are blurred away with the background! They should be pin sharp while the background is out of focus.

I realise they've done what they can with limited technology, but what they've done is poor quality.



I felt the same; look at the edges. I understand software can do this, but I still prefer a DSLR. Good for fun pics, but the edges look horrible.

[attached image: example of the blurred edges]
 
This just shows how misinformed, or lacking in information, some Apple fans can be. This dual camera and blur effect wasn't invented by Apple. HTC did it with its M8 phone, and similar implementations already existed on several other phones. Apple is just late to the game, and of course Apple fans will think it is all because of Apple.


Of course there's always going to be a subset of any large group of people who don't follow all tech advances as closely as others. Particularly when you're talking about hundreds of millions of people.

Some people are likely to never have seen this in a phone before and assume it's completely new. I don't really think they should be chastised for having a more limited pool of information.

Those of us who do follow all forms of new technologies know that others have tried this before. We also know that there are many ways to achieve this effect. Some are considerably worse than others.

As for Apple, they are not always first to the table with cutting edge tech. Instead they tend to spend a ridiculous amount of time and money perfecting both the technology and the user experience before bringing it to market.

You need only look at the impact the iPod made on the portable music player market. Or the huge change the iPhone made to the smartphone market. Remember what smartphones were like before the iPhone hit the scene? I do; I had dozens of them, and every one pales in comparison to what the iPhone has achieved.

Apple have a better chance than most of making effective use of the available technology to produce a decent, passable bokeh effect that doesn't just work, but works seamlessly and simply for the end user.

Thus far the most acceptable implementation has probably been from HTC, but even then, with practice and the perfect subject, setting and lighting, it can be horribly inconsistent. That's something I hope to see Apple improve on.

I've never seen bokeh implemented on a smartphone that I'd consider overall acceptable. Even including Apple's implementation as it stands, I still haven't. But this is a first beta of the effect on iPhone, and for a first beta they're doing pretty damn well. I can't wait to see what the final product is like, because if this is anything to go by they may yet achieve what they set out to do: make the best fake bokeh we've seen yet.
 
There are some limitations on the length of such a label.

Hence the :D. It wasn't a serious suggestion. It would be like calling it "that thing that makes some bits of your photo out of focus" :D
DSLRs most certainly CAN do this: it's called autofocus. Why is this so hard to understand? A DSLR doesn't need to apply a blur effect because the blur is inherent to the large lens and wide-open aperture. If you want to be technical, then yes, technically DSLRs aren't capable of displaying a photo at infinite focus with an artificial blur effect applied concurrently through the use of an offset dual-lens design that captures a depth map. But I didn't see any asterisks under Phil when he said what he said. He said it because he has limited knowledge of DSLRs and figured the majority of consumers share this limited knowledge.


The only way I can consider what Phil said acceptable is if he meant the processing involved: that the iPhone does it by constantly analysing the image and applying the effect computationally, as opposed to the natural effect from a DSLR lens.
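
The "depth map from an offset dual-lens design" bit is standard stereo vision. Here's a minimal sketch with OpenCV's block matcher, assuming an already rectified pair of frames; the file names, focal length and baseline are made-up placeholders.

[CODE]
import cv2
import numpy as np

# Hypothetical rectified frames standing in for the two iPhone 7 Plus cameras.
left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# Block matching gives per-pixel disparity; OpenCV returns it scaled by 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Depth falls out of triangulation: z = f * B / d, with f the focal length
# in pixels and B the baseline between the two lenses (both assumed here).
f_px, baseline_m = 700.0, 0.01
depth_m = np.where(disparity > 0, f_px * baseline_m / disparity, np.inf)
[/CODE]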
 
err, yes it does. it has an aperture that changes size, and a max aperture rating based on its widest opening.

saying 'the iPhone doesn't have an f-stop' is like saying a person doesn't have height.
You might be surprised, but at least all iPhones so far (can't be sure about the 7 yet) have a fixed aperture. All pictures taken with my iPhone 4 are taken at f/2.8, all photos from my iPhone 5 at f/2.4 and all from my iPhone 6 at f/2.2. Check the EXIF data on your iPhone pictures.

There are probably two reasons: (1) it simplifies the camera module if there is no need to build in an adjustable aperture, and (2) given the small size of the sensor, even wide open we are already past the optimal aperture, i.e., closing down the aperture further degrades the image quality more (through increasing diffraction) than it reduces other optical aberrations.
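
Both points are easy to check. A short Python sketch: first read the f-number straight out of a photo's EXIF with Pillow ("IMG_0001.JPG" is a placeholder), then estimate the diffraction blur spot; the 1.5 µm pixel pitch is an assumption, roughly an iPhone-6-class sensor.

[CODE]
from PIL import Image

# The fixed aperture is right there in the EXIF of every shot.
exif = Image.open("IMG_0001.JPG").getexif()
f_number = float(exif.get_ifd(0x8769)[0x829D])  # FNumber, in the Exif sub-IFD
print(f"Shot at f/{f_number}")

# Diffraction blur spot (Airy disk): d = 2.44 * wavelength * N, at 0.55 um.
pixel_pitch_um = 1.5                            # assumed pixel pitch
for n in (2.2, 4.0, 8.0):
    d_um = 2.44 * 0.55 * n
    print(f"f/{n}: spot ~{d_um:.1f} um, ~{d_um / pixel_pitch_um:.1f} px wide")
[/CODE]

Even wide open at f/2.2 the spot already spans about two pixels, so stopping down further would only soften the image.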
 
There are some limitations on the length of such a label.

Agreed, surprised Apple didn't give it a new trademarked name like "Depth-o-Rama", which would become the new standard in depth perception via screen-based effect trickery.
 
It will get better of course. But as of right now it's just gimmicky.
With future software upgrades or future cameras on newer phones I'm sure it will give us a lot of amazing possibilities. As of today? Not too impressed
For what it does today, I can see Instagram or someone else quickly pushing out a filter that does the same thing without the need for the separate sensor.

Now if it could get the same results on video, that would be impressive.
 
I don't think of it as a gimmick, but it does feel like another one of those developments where an effect that used to take a bit of photography know-how is now available to the masses. Makes me wonder what professionals in the field think.

As a professional in the field, this is as relevant to us as those sous-vide-at-home gizmos are to an actual chef.

Apple didn't invent names for time-lapse or slo-mo or panorama.

Talk about missed opportunity!

Magic Time-Lapse
Magic Slo-Mo
Magic Panorama

just off the top of my head ;)
 
Well there's always

IBlurry
IGoslo
IPano
IVerylongPhoto
IMakePicturesMove

So many choices :D

Kidding aside, I'd like to see them implement a Lytro style refocus mode. Which of course would be IRefocus.
 
You might be surprised, but at least all iPhones so far (can't be sure about the 7 yet) have a fixed aperture. All pictures taken with my iPhone 4 are taken at f/2.8, all photos from my iPhone 5 at f/2.4 and all from my iPhone 6 at f/2.2. Check the EXIF data on your iPhone pictures.

There are probably two reasons: (1) it simplifies the camera module if there is no need to build in an adjustable aperture, and (2) given the small size of the sensor, even wide open we are already past the optimal aperture, i.e., closing down the aperture further degrades the image quality more (through increasing diffraction) than it reduces other optical aberrations.

fair point... aperture blades that small would be difficult to engineer. you're probably right.

however, its aperture could be quantified (re: original post)
 
Tired after reading 5 pages, will take a rest.

my list:

IGoslo
IPano
IMakePicturesMove
Magic Panorama
 
Apple renamed iCal to Calendar. It renamed iPhoto to Photos. It replaced the iBook with the MacBook. There is no iTV or iWatch. The iPad was the last new 'i' product.
 

I don't think he's claiming what you think. When I watch the entire section on the camera (and don't cut out a single sentence), it seems to me he's talking about the processing going on to generate the live view. He spent a lot of time leading up to his remark talking about the speed of the ISP in the A10, using two cameras to form a depth map, and doing this in real time.

This is like taking a single quote from Steve Jobs about having a stylus and making assumptions about what he really meant (which is that needing a stylus to operate a device is bad, not that a stylus itself is bad).
 
So..cool video, but where are the actual photo samples?
Getting tired of the whole "modes" swipe list in the camera app; they should rethink the app to be more convenient, or shrink that list somehow.
They should change it to use force touch to bring up the camera types, and just one switch to swap from video to photo. Having to swipe through ten options to get from photo to video is terrible.

Also, burying the 4K/1080p and slow-motion speeds in the Settings app is so dumb. I'm never going to switch them when they're in there, and not every video needs to be 4K... why can't I just tap on the "4K" icon in the camera app to change it?
 