I don't think this lives up to what was advertised in the keynote. This is pretty disappointing. I mean, it's good they are marching in this direction, but this is nowhere near a prime-time finish and not what I was expecting from this.

- my $.02
 
Some of those images definitely look like they were made using blur in Photoshop. And that's not a good thing. The definition on the edges must improve so it can have real value over what I could make in a few minutes using a simple editing app... They could shrink the border inside the subject so it wouldn't look so fake, but that's just my theory.
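For illustration, here is a minimal sketch of that "shrink the border inside the subject" idea (my own rough take using OpenCV/NumPy, not anything Apple has described): erode the subject matte a few pixels before compositing, so the blur boundary falls just inside the subject edge instead of haloing around it. The `subject_mask` input is a hypothetical 0/255 matte.

```python
import cv2
import numpy as np

def tighten_subject_mask(subject_mask, shrink_px=3):
    """Erode a binary subject mask (0/255) so the matte edge sits a few
    pixels inside the subject instead of spilling onto the background."""
    k = 2 * shrink_px + 1                      # odd square structuring element
    kernel = np.ones((k, k), np.uint8)
    return cv2.erode(subject_mask, kernel, iterations=1)
```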
 
I don't understand why I keep reading/hearing that DSLRs can't show depth of field live. Of course they can - it's not an effect that requires processing on a DSLR, it's just a natural characteristic of the lens optics and aperture.

Exactly!

People who say that stuff obviously have never used an SLR camera with a prime lens before. They're just regurgitating an incorrect statement that Schiller made at the keynote.
 
This looks really cool, but the size of the Plus is the deal-breaker for me. I'm on the tick cycle, so I went with the 7. Hopefully by the 8, or whatever they call it (anniversary?), this cool dual camera comes to the 4.7-inch screen.
 
What confuses the author is that cameras normally keep the aperture fully open so that the viewfinder image is nice and bright, and on mirrorless models so that the preview image is fluid. You have to push a button or select a menu option to turn on DOF preview.

Also, the DOF preview button on DSLRs is always tiny, stuck out of the way, and invariably unmarked.

No. That is not how mirrorless cameras work.

The aperture on the lens changes as you change it in the camera and shows you what the sensor sees. Since it is a live feed of the actual sensor readout, and not an optical viewfinder design from the 1950s, it simply ups the ISO (i.e. sensor gain) to give you a 'fluid image'. The graininess of the preview of course increases as the ISO gets high, but no, mirrorless cameras do not work like DSLRs. DSLRs with optical viewfinders need to stay wide open so that you can see through the prism, and they only stop down to the actual settings when taking the picture, which is why you have to check your exposure, etc. after the shot. Mirrorless cameras can show you what the image will actually look like before you take the picture.
 
good for you



can you relax?

You know your novel response was an opinion; well, I shared mine too. No need to get all angry about it. This forum is filled with 15-year-old girls, it seems.

All I was saying was that to ME it is NOT worth the MONEY to have that FEATURE when I can get the SAME EFFECT using Photoshop.

No you can't, not in a meaningful way, because doing it in real time is the whole game changer.
And your off-the-cuff, glib initial remark (you think we can't read?) implied that this was something trivial;
it's not, even for you, "little genius" (sic).

PS: I haven't been 15 since around 1980, got that, buddy....
Keep your condescending comments to yourself.
 
This just shows how misinformed or uninformed some Apple fans can be. This dual camera and blur effect wasn't invented by Apple. HTC did it with its M8 phone, and similar implementations already existed on several other phones. Apple is just late to the game, and of course Apple fans will think it's all because of Apple.

Being first to market doesn't automatically make something a game-changer.
 
The issue is the lack of transition between the in-focus object and everything else. Bokeh from a DSLR (or your eyes) will have areas that are almost in focus transitioning to completely out of focus. Apple's does not.
They should do gradation. That's the point of computing 9 depth layers: to apply variable blur.
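For what that would look like in practice, here is a minimal sketch (my own, assuming a 9-slice depth map and OpenCV/NumPy; not Apple's actual pipeline) of gradated, per-layer blur: split the depth map into slices and blur each slice with a progressively larger kernel, so sharpness falls off gradually instead of flipping from in-focus to blurred.

```python
import cv2
import numpy as np

def layered_blur(image, depth, num_layers=9, max_kernel=31):
    """image: HxWx3 uint8; depth: HxW float in [0, 1], where 0 is the focus plane.
    Each depth slice is blurred progressively more, then the slices are recombined."""
    img = image.astype(np.float32)
    out = np.zeros_like(img)
    covered = np.zeros(depth.shape, np.float32)
    edges = np.linspace(0.0, 1.0, num_layers + 1)
    for i in range(num_layers):
        hi = edges[i + 1] if i < num_layers - 1 else 1.0 + 1e-6
        mask = ((depth >= edges[i]) & (depth < hi)).astype(np.float32)
        # Kernel size grows with distance from the focus plane (kept odd for OpenCV).
        k = 1 + 2 * round((max_kernel // 2) * i / max(num_layers - 1, 1))
        blurred = cv2.GaussianBlur(img, (k, k), 0) if k > 1 else img
        out += blurred * mask[..., None]
        covered += mask
    # Normalize in case of stray uncovered pixels (e.g. bad depth values).
    return (out / np.maximum(covered, 1e-6)[..., None]).astype(np.uint8)
```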

I need to see more tests, especially from photographers. Curious about how it renders distant point sources of light, as I mentioned in post #178 in the previous thread.

Not expecting the same rendering as from my DSLR and wide-aperture lens on distant point light sources, but I would like to see how close it can come.
Doubtful it'll emulate specular highlights... sure, it can correctly detect depth via geometry/foreshortening/parallax, and bright light sources are also easy to detect, but blurring is an easier process to emulate than sharpening, let alone the extreme sharpening required to emulate bokeh balls. And moreover, the blurring is the more immediate, desired effect of portraiture for many.
 
I don't understand why I keep reading/hearing that DSLRs can't show depth of field live. Of course they can - it's not an effect that requires processing on a DSLR, it's just a natural characteristic of the lens optics and aperture. If this were true, you'd never be able to manually focus an image using live view mode on a DSLR.

Depends on the DSLR. Depth of field comes from the setting of the iris; if your DSLR has a preview mode where you can get it to stop the iris down to where it belongs, you can preview depth of field. Some of them let you do the focusing, but won't adjust the iris until the instant it takes the shot. Otherwise, the only DoF preview you get is with it wide open, which may not be what you were looking for.
 
Wow! Surprised how good it looks, even when compared to the sony camera!

I actually had to pause the video to view the differences between the 7 Plus and the Sony AS300. I honestly studied the two pictures and could hardly notice the difference, if at all. Then again, perhaps a professional photographer would be able to decipher the differences between the Plus and the Sony camera.
 
People claim the tech existed long ago, that nothing is new, and that Apple is late to the game.

Microsoft tablets existed long before the iPad, but they were a failure.
Music players existed long before the iPod.
Laptops existed long before the MacBook.
Smartphones existed long before the iPhone.
Laptop fingerprint readers existed long before Touch ID (I had one on a Dell XPS and it was garbage).
NFC existed long before Apple Pay.

It is true that a lot of tech existed long before Apple implemented it, but Apple won't release its products until the implementation of the tech is close to perfection and simple to use for most people.

That is a big difference. This philosophy is the main reason why the iPad, iPod, iPhone, and MacBook are so successful.

That is really not true. Apple doesn't do everything better. It's their ecosystem that helps them, not their implementation. The argument that they do it better has no basis. The fragmentation of Android kills them, and that is the reason I don't buy them, not because they don't implement their features correctly. It's the ecosystem, not innovation.
 
You missed the "convenience" part of my post.

An effect like that doesn't take hours. If it takes you hours, then you don't know wtf you are doing in Photoshop.

It takes a second on the phone.

How long will it take you to move a photo from your phone to a computer, manually do your Photoshopping, and move it back to the phone... while "in the field," where people are using the phone?
 
because that's even more misleading and downright wrong.
Synthetic bokeh is still bokeh. Bokeh just means the quality of the blur.

1. No Bokeh
2. Synthetic Bokeh
3. Gaussian blur (Photoshop)

[Image: faux-bokeh comparison of the three]
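To make the difference concrete, here's a rough sketch (my own illustration, assuming OpenCV/NumPy) of why a flat disc kernel looks more like lens bokeh than Photoshop's Gaussian blur: convolve a bright point of light with each, and the Gaussian just smears it into a soft blob, while the disc spreads it into a hard-edged "bokeh ball" like a real out-of-focus highlight.

```python
import cv2
import numpy as np

def disc_kernel(radius):
    """Flat circular kernel, a crude stand-in for a round lens aperture."""
    size = 2 * radius + 1
    kernel = np.zeros((size, size), np.float32)
    cv2.circle(kernel, (radius, radius), radius, 1.0, -1)  # filled circle
    return kernel / kernel.sum()

# A black frame with a single bright "point light".
img = np.zeros((101, 101), np.float32)
img[50, 50] = 255.0

gaussian = cv2.GaussianBlur(img, (31, 31), 0)        # soft smear, no bokeh ball
bokeh_like = cv2.filter2D(img, -1, disc_kernel(15))  # hard-edged disc highlight
```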
 
Meh. I'm not impressed. I still use my Sony a5100 that I bought in 2014 for $400. You don't need an expensive camera to get depth of field. It's an entry-level mirrorless but still better than any phone. Very fast autofocus for an entry-level camera. It even has motion tracking on people or objects for video. The one area where phones won't come close is video. Try switching to video on the iPhone and it looks like crap in low light. Since it's using the 56mm lens with its f/2.8 aperture, low light looks grainy.
 
But how else can people bash the iPhone?

Nobody wants to hear "The iPhone is the best smartphone camera in the world, and their portrait feature is superior to every other phone."

So they change the narrative to:

"Meh, it's not as good as a DSLR and a portrait lens. Therefore it's a useless gimmick."

So many people are upset that Apple has once again taken a feature others have tried before and come up with a superior implementation of it.

Agreed. I know I wasn't the only one who CLEARLY heard Phil Schiller being very careful not to make any false claims about this feature whatsoever.

From my perspective, Apple adding this feature to the 7 Plus is strategic genius. As supply catches up they will drop the update. Brilliant. They know that driving people to buy more Pluses will equate to keeping up their ASP, and that will drive profits while they retool for next year's all-new model.

What's the complaining about? What?

Show me ONE study that proves buyers migrate to Samesong because of a "lack" of functionality or innovation. BS.

I am in the business of creative marketing, and Apple and Samesong DO NOT overlap in every demographic on this planet, people. Get over it. The iPhone is alive and well and so are the Galaxies. Yes, the two co-exist.
 
It could go either way though; you can't know. They could roll out updates that fix it, like soon... or not.
It's a $120 bet I didn't want to make this year.

Next year I'll go back to the Plus club after a redesign. After holding a Note 7 with a bigger screen but a much smaller footprint, I realized how flawed the design of the 3-year-old Plus is. I do miss the screen size, but I don't miss the bulk at all.
 
I think HTC did this better with a different tech. The software technique has already been used by Sony, Smaug and GoPro, all of it a few years back. Not saying that the iPhone 7 is not good, just saying this magic happened a few years ago on other phones, and every brand has improved its tech since. Here is a link from 2014, but I am sure I have seen this tech even earlier. That being said, I am glad Apple finally did it.

http://www.phonearena.com/news/HTC-One-M8-Duo-Camera-explained-always-on-refocus_id54244

So I wonder why HTC abandoned the dual cameras for the M9 and 10?

It's one thing to do it... it's another thing to KEEP doing it in subsequent models.
 
I don't understand why I keep reading/hearing that DSLRs can't show depth of field live. Of course they can - it's not an effect that requires processing on a DSLR, it's just a natural characteristic of the lens optics and aperture. If this were true, you'd never be able to manually focus an image using live view mode on a DSLR.

Because DSLRs always show the frame at the depth of field of f/8, even when taking a photo at f/2.8.

Some more advanced DSLRs have a DOF preview button, but that makes the image look a lot darker.
 
[Attached sample images: IMG_3448.JPG, IMG_3446.JPG]
I'm interested to hear what iPhone 7 Plus users think about this feature. Looks like a gimmick to me.
Not bad for a beta; it can only improve from here. Big buying point for me, and it's amazing a cell phone can do this. Here are a few sample pics I just took after downloading the dev preview (attached above). It introduces a bit of noise, but I imagine in better lighting it will be pretty great. Also, if it's true that it works even better on a human subject using facial recognition, I'm thoroughly impressed.
 
Because DSLRs always show the frame at the depth of field of f/8, even when taking a photo at f/2.8.

Some more advanced DSLRs have a DOF preview button, but that makes the image look a lot darker.
Err, pretty much all DSLRs view at the lens's widest aperture to aid focusing. DOF preview will close down the aperture to the set value.
 
That is really not true. Apple doesn't do everything better. It's their ecosystem that helps them, not their implementation. The argument that they do it better has no basis. The fragmentation of Android kills them, and that is the reason I don't buy them, not because they don't implement their features correctly. It's the ecosystem, not innovation.

I never said Apple does EVERYTHING better. The list I provided is enough to show the consistency of their better implementation of tech. If that has no basis, then what does?

The ecosystem as a whole is an innovation. An individual product feature may not be innovative enough, but a collection of products working together as a whole can be a killer feature, an innovation that makes Apple unique and can't be replaced.

Is that why you are hooked into this ecosystem? Or are you stuck with the Apple ecosystem because there is no alternative?
 