Maybe someone in the know could chime in, but what does optical image stab offer that software image stab doesn't already?

1, SW (= electronic) stabilization doesn't work in still mode, only during video shooting. The 5s has some kind of still "stabilization", but it's pretty useless (as most such electronic systems are) compared to a true, properly implemented OIS system, according to, say, the DPReview test. (See the bottom section at http://connect.dpreview.com/post/7518611407/apple-iphone5s-camera-review?page=4 for more info.)

2, even in video mode, EIS has a VERY detrimental effect on the net result: it severely reduces the FoV, to around a 36mm equivalent on the 5s, 39mm on the 5c/5 and 42mm on the 4S/4. Those figures mean iPhones just can't shoot high-res video with as wide a field of view as most other flagships, particularly Nokia's.
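To make the FoV numbers above concrete, here's a minimal sketch of how a stabilization crop lengthens the 35mm-equivalent focal length. The ~30mm still-mode figure for the 5s and the 17% crop are illustrative assumptions, not measured values:

```python
# Hedged sketch: an EIS crop narrows the field of view, which shows up as a
# longer 35 mm-equivalent focal length. Input figures are assumptions.

def equivalent_after_crop(still_equiv_mm: float, margin_fraction: float) -> float:
    """Cropping `margin_fraction` off each linear dimension for EIS
    lengthens the 35 mm-equivalent focal length by 1 / (1 - margin)."""
    return still_equiv_mm / (1.0 - margin_fraction)

# Assumed ~30 mm-equivalent stills plus a 17% linear crop for EIS:
print(round(equivalent_after_crop(30.0, 0.17), 1))  # -> 36.1 (mm equivalent)
```

Under those assumptions, a roughly 17% border reserved for stabilization is enough to push a ~30mm-equivalent lens to the ~36mm figure quoted above.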

----------

The iPhone 6 is turning more and more into the phone to get
so glad I skipped the 5S

Me too. I'm perfectly happy with the 5, particularly now that it's JB'd and can record calls.
 
The video looks far better than what Apple has been using. Perhaps I will no longer get nauseous and sea-sick when watching my wife's videos in the future.

Now, if we could only get TV producers to stop purposefully using camera motion in an attempt to add action to their shows.
 
Sounds like a waste of money tbh. Why not just put a gyro on the camera's daughter board and feed that signal into your software for stabilization? Why spend all this money for basically the exact same thing from someone else?

Unfortunately, the approach you describe can't be used when shooting stills. There, the electronics outside the sensor receive only one(!) image from the sensor. There's no point in trying to, say, read the sensor 10 times during a, say, 1/15s exposure. Why? Because the result would be useless - the "photon buckets" won't necessarily have been filled during the first 7-8 reads, only in the last two or three. That is, you won't be able to combine those reads or select the "best" of them.

What you've described is only workable for inter-frame stabilization; that is, typically video stabilization. Actually, this is how electronic IS works (with an additional 10-20% outer frame area so that it can also crop). Unfortunately, EIS is still inferior to OIS when it comes to sensor usage. (See my earlier comments here for the whys.)
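The gyro-fed video scheme described above can be sketched in a few lines: convert the accumulated rotation angle into a pixel offset via the focal length, then clamp to the spare border reserved around the frame. The focal length and margin values here are illustrative assumptions, not real iPhone parameters:

```python
import math

# Minimal one-axis sketch of gyro-fed EIS: a small camera rotation is
# converted to a pixel shift of the crop window, limited by the spare
# 10-20% border around the output frame. All numbers are assumptions.

def eis_offset_px(angle_rad: float, focal_px: float, margin_px: float) -> float:
    """Pixel shift that re-centres the crop window after a small rotation."""
    shift = math.tan(angle_rad) * focal_px         # project rotation to pixels
    return max(-margin_px, min(margin_px, shift))  # can't crop past the border

# e.g. a 0.5-degree shake, an assumed 2900 px focal length, 200 px border:
print(round(eis_offset_px(math.radians(0.5), 2900.0, 200.0)))  # -> 25 (pixels)
```

Note how the clamp embodies the limitation from the post: once the shake exceeds what the reserved border can absorb, EIS simply runs out of room, whereas OIS compensates before the light ever hits the sensor.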

All in all, OIS is sorely missing from iDevices and I can only welcome it to the iPhone 6.

----------

The first could be taken with less exposure time due to image stabilization.

You mean the other way around? ;) With a decent OIS, you will have a lot of keepers even among shots taken at 1/4s or 1/8s, as the Nokia 92x / 1020 also show. That keeper rate is plain impossible with no-IS systems like the iPhone, where even most 1/15s shots come out blurred.

BTW, the ability to use longer exposures means using lower ISOs, which, in turn, may result in better color saturation. (Many noise reduction algorithms severely reduce color saturation at higher ISOs in order to reduce color noise, which is, in general, much more disturbing than luminance noise. Hence the seemingly unnecessarily dialed-down color saturation in many consumer cameras.) In this case, however, the "blurred" photo doesn't show the effects of reduced color saturation.
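The shutter/ISO trade above is simple stop arithmetic: every doubling of exposure time buys one stop, which lets the ISO halve for the same overall exposure. A quick sketch with made-up starting values (not iPhone measurements):

```python
import math

# Sketch of the exposure/ISO trade: lengthening the shutter by N stops
# allows the ISO to drop by the same N stops. Starting values are
# illustrative assumptions only.

def iso_after_slower_shutter(iso: float, shutter_s: float, new_shutter_s: float) -> float:
    """ISO needed for the same exposure after slowing the shutter."""
    stops_gained = math.log2(new_shutter_s / shutter_s)  # stops from longer exposure
    return iso / (2.0 ** stops_gained)                   # same exposure, lower ISO

# Hand-held 1/60s at ISO 800 vs an OIS-steadied 1/8s shot of the same scene:
print(round(iso_after_slower_shutter(800, 1 / 60, 1 / 8)))
```

With those assumed numbers, going from 1/60s to 1/8s is almost three stops, dropping the required ISO from 800 to roughly 107, which is exactly where the noise-reduction-driven desaturation stops being an issue.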
 
C'mon Apple, take a cue from Windows Phone's PureView technology. I mean, you are copying everyone else anyway, so why not the better camera addition from Windows Phone?

Sorry for being a bearer of bad news, but no one should expect large (larger than 1/1.5") sensors in any thin (less than 8 mm) phones. It's physically impossible. No wonder Nokia's 808 and 1020 have such huge camera "bumps". We'll be happy if we see as large sensors as 1/2.5" in an iPhone, and even that is highly unlikely, unless Apple increases the thickness of the phones. The latter is VERY unlikely.

Another, completely unrelated issue is having OIS. It's certainly possible to fit a proper OIS module even in so thin a device, as is also proved by the LG G2 or the HTC One.
 
Neat... I like this...

I wonder why Samsung didn't use it in their Galaxy S5. Maybe they didn't need it?
 
The iPhone will become an all-rounder, if Apple pursues tech innovation such as this.

When it comes with large memory (128+ GB), it can be an entertainment center, phone, near-pro camera, jukebox, and internet/email computer, among other things. It takes the place of more and more devices scattered about the home and office that take up space and energy.

"near-pro camera"? I'm not exactly sure what you mean by this, but the sensor in these cameras is tiny.

A professional camera has a sensor around the size of an iPod shuffle. The iPhone camera sensor is about the size of the 'play' symbol on an iPod shuffle.

An iPhone is to a professional camera, as a toothpick is to a professional baseball bat.
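The toothpick-vs-bat quip checks out as rough area arithmetic. Using nominal type sizes (full-frame is 36x24 mm; a 1/3"-type sensor, the iPhone 5s class, is about 4.8x3.6 mm; neither figure is an exact Apple spec):

```python
# Rough sensor-area comparison using nominal type sizes, not exact specs.
full_frame_mm2 = 36.0 * 24.0    # full-frame "pro" sensor: 864 mm^2
one_third_type_mm2 = 4.8 * 3.6  # nominal 1/3"-type sensor: 17.28 mm^2

print(round(full_frame_mm2 / one_third_type_mm2))  # -> 50 (times the area)
```

So a full-frame sensor gathers on the order of fifty times the light per exposure, which is the physical gap no amount of software can fully paper over.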
 