This article is only meant for
- people wanting to shoot video at 60 fps (that is, double the framerate) on their iOS7 iPhone 4S / 5 (no iPad / iPhone 4 users should read further as, to my knowledge, their hardware doesn't support 60fps)
- programmers wanting to support the new 60 fps mode in their apps.
Executive Summary
iOS7 (as opposed to iOS6) will support 60 fps recording. While it does have image quality problems on current, compatible iPhones (4S and 5), at least it works (again).
The full article:
One of Apple's most discussed announcements, at least among camera users, was 60 fps video recording. The following video framegrab (annotation by me) shows Apple put a lot of emphasis on the (re)introduction of 60 fps at the keynote two weeks ago:
Having written several JB tweaks allowing for recording 720p60 with the stock Camera client back in the iOS5 + iPhone 4S times, I've been asked by several of my readers to elaborate on how iOS 7 (re)introduced the same feature.
1.1 A Little Bit of History
Back in the iOS5 times, the iPhone 4S (but no other iOS5-capable iDevice, not even the iPad 3, released long after the 4S) could record 60 frame-per-second videos at 720p. (But not at Full HD, that is, 1080p.) While the image quality of these recordings wasn't the best (see below), at least there were no framedrops.
On jailbroken devices, you could do this right in the stock Camera app by using the framerate changer capabilities of the absolutely essential JB tweak “CameraTweak” (report, with a section on 60 fps, HERE). Or, you could just edit /System/Library/ PrivateFrameworks/ MediaToolbox.framework/N94/AVCaptureSession.plist to make the 720p60 mode the default, as is also explained HERE.
On non-jailbroken devices, you could use third-party, 60 fps-capable camcorder apps like FiLMiC Pro 2, SloPro and Better Camcorder. (Note that Better Camcorder is the worst of the bunch when it comes to actual, recorded framerate, as is also explained in my dedicated article.)
Then came iOS 6 – and all kinds of 60 fps recording, both jailbroken and non-jailbroken, were removed. That is, there is currently no way to record 60 fps videos on the iPhone 4S (or the later 5) if it runs iOS6. This has caused a lot of anger in the iOS community – absolutely understandably.
Now, a year after iOS6 (and a year living without any kind of 60 fps recording capabilities), Apple reintroduced 60 fps. Let's see how it compares to both the ideal case (true 720p resolution) and the iOS5 implementation!
2. What Can You Expect, Image Quality-Wise?
Unfortunately, I have some bad news for you all: Apple's current implementation uses pixel binning. Readers of my past 60 fps articles already know what this means: yes, not only the vertical (as was the case with 60 fps recordings under iOS5) but also the horizontal resolution is halved. This is a major blow: after all, the actual, effective resolution will be 1280/2 * 720/2 = 640*360 pixels only, not counting the antialiasing introduced by the demosaicing done by later components in the image path.
Let me show you proof of this, all recorded on my brand-new iPhone 5 (an exchange unit from Apple; manufacture date: week 22 of 2013). The test setup was as follows:
This is a crop of the binned 720p60 mode's true resolution (mode 12 in the AVCaptureDevice.formats list shown in Section 3.1 below):
(click for the original, full-quality, full size reschart framegrab!)
This is how it looks in non-binned, that is, full-resolution 30 fps 720p mode (here, mode 10):
Again, you won't have the same quality as under iOS5 (on the iPhone 4S only) or most current 60 fps-capable regular, standalone cameras. However, at least you can record at 60 fps now on your iPhone 4S / 5 if you really need to.
2.1 Want to Save Some Storage?
As the effective resolution of the video is 640*360 pixels only, you may ask whether you can recompress the original 1280*720 footage into quarter-sized (that is, 640*360) footage to save a lot of storage (or, say, upload bandwidth, should you later want to upload the file somewhere). 640*360 60p footage, when properly transcoded (with a quality H.264 encoder like x264, also used by the excellent transcoder HandBrake), only needs a bitrate of about 600-1000 kbps, which is at least 3-4 times less than that of 720p60 footage – and at least 15 times less than the original 15 Mbps bitrate of the iPhone 5's recording. That is, you would indeed save a lot of storage.
I've downsized the above framegrab (the first one) to show its effects. As you can see, there's virtually no difference between the rendition of high-frequency components (converging lines beyond the “6” mark both vertically and horizontally). After all, the sensor input just couldn't cope with that kind of resolution and there was just no usable input. I've annotated these regions with red rectangles in the first crop below.
The situation is entirely different in low-frequency regions, that is, areas where there are multiple-pixel thick lines, particularly before the mark 3 (in both dimensions). There, while the sensor still produced an input of effectively 640*360 pixels, the demosaicing / antialiasing algorithms later in the signal processing do smooth the edges. That is, the difference is pretty much visible in low-frequency areas. (Blue has been used below to annotate these regions.)
Let's compare two crops: one from the above 720p framegrab and one from a downsized version (here, I used the default settings of OS X's Preview, exporting the result as a lossless PNG):
(non-downsized)
(downsized)
As you can see, the low-frequency areas are indeed much more pixelated in the 640*360 image, while there's almost no discernible difference in the high-frequency regions.
All in all, while downsizing doesn't have as detrimental an effect on the overall quality of 720p60 footage as you might expect, you still don't want to use it unless you really need to reduce storage usage by any means necessary.
3. Configuring iOS to Record at 60 fps
The following section is only meant for programmers. As iOS7 requires an entirely different way of setting up 720p60 recording than iOS5, existing 60 fps-capable AppStore apps (I've specifically tested SloPro and Better Camcorder in this respect) don't record at 60 fps, as they haven't yet been updated for iOS7. That is, if you want to record 60 fps footage right now, you (or a friend of yours) must be able to compile and deploy apps on your iPhone.
Below, I explain how
- you can get a list of the available video modes,
- you can find the ones with 30+ fps capability and, finally,
- you can make this 60 fps mode the one to record with.
Only the above differs from the pre-iOS7 camera recording workflow; therefore, I elaborate mostly on these three bullets.
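As a preview, those three steps can be sketched in a few lines of code. This is a minimal sketch only: it assumes videoDevice is the back-camera AVCaptureDevice already attached to a capture session, and it omits all error handling.

```objc
// Find the first format whose maximal framerate reaches 60 fps...
AVCaptureDeviceFormat *format60 = nil;
for (AVCaptureDeviceFormat *format in videoDevice.formats) {
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        if (range.maxFrameRate >= 60) {
            format60 = format;
            break;
        }
    }
    if (format60) break;
}

// ...and make it the active one.
if (format60 && [videoDevice lockForConfiguration:NULL]) {
    videoDevice.activeFormat = format60;
    // Pin the frame duration to 1/60 s so the session actually
    // streams at 60 fps, not just "up to" 60 fps.
    videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, 60);
    videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 60);
    [videoDevice unlockForConfiguration];
}
```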
3.1 Getting All Available Modes (incl. 60 fps Ones) Via AVCaptureDevice.formats
A new (publicly – more on this later) iOS7+-only property of AVCaptureDevice is formats. It returns an array of AVCaptureDeviceFormat objects. For the iPhone 5 running beta 2, the complete set (acquired by a simple NSLog(), with the leading, unnecessary info stripped) is as follows, each record trailed by the array index I've added:
'vide'/'420v' 192x 144, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @8.50)> - 0
'vide'/'420f' 192x 144, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @8.50)> - 1
'vide'/'420v' 352x 288, { 1- 30 fps}, fov:51.975, binned, max zoom:76.50 (upscales @4.25)> - 2
'vide'/'420f' 352x 288, { 1- 30 fps}, fov:51.975, binned, max zoom:76.50 (upscales @4.25)> - 3
'vide'/'420v' 480x 360, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @3.40)> - 4
'vide'/'420f' 480x 360, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @3.40)> - 5
'vide'/'420v' 640x 480, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @2.55)> - 6
'vide'/'420f' 640x 480, { 1- 30 fps}, fov:56.700, binned, max zoom:76.50 (upscales @2.55)> - 7
'vide'/'420v' 960x 540, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.91)> - 8
'vide'/'420f' 960x 540, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.91)> - 9
'vide'/'420v' 1280x 720, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.18)> - 10
'vide'/'420f' 1280x 720, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @2.18)> - 11
'vide'/'420v' 1280x 720, { 1- 60 fps}, fov:51.940, binned, supports vis, max zoom:51.75 (upscales @1.05)> - 12
'vide'/'420f' 1280x 720, { 1- 60 fps}, fov:51.940, binned, supports vis, max zoom:51.75 (upscales @1.05)> - 13
'vide'/'420v' 1920x1080, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @1.45)> - 14
'vide'/'420f' 1920x1080, { 1- 30 fps}, fov:53.890, supports vis, max zoom:108.00 (upscales @1.45)> - 15
'vide'/'420v' 2592x1936, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.26)> - 16
'vide'/'420f' 2592x1936, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.26)> - 17
'vide'/'420v' 3264x2448, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.00)> - 18
'vide'/'420f' 3264x2448, { 1- 20 fps}, fov:56.700, max zoom:153.00 (upscales @1.00)> - 19
(Note that, in b1, the first four records all had 2.55 as the maximal lossless zoom factor. This has been fixed in b2.)
Note that, despite their type ('vide'/'420v' or '420f') being video, the last four records (indexes 16...19) are not valid video modes. Should you try to use them, you will get a runtime exception. All the other modes work just fine (I've tested them all). Interestingly, the frame rate range of these four records proves I was right when I stated, in several of my iPhone oversampling-related articles and forum posts, that the iPhone 5 can only sample the full sensor at up to 20 fps (and that's only under good light).
The first field, which has the format 'vide'/'420X' YxZ, defines the following:
- X=v: video range, which means the Y component only uses the byte values from 16 to 235 (for historical reasons); X=f: full range, which uses the full range of a byte, namely, 0 to 255.
- YxZ: the resolution.
Programmatically (instead of just parsing the above NSString representation – which is, after all, not very reliable), you can access the media type (here, always “vide”) as a simple NSString via AVCaptureDeviceFormat.mediaType, and the media subtype (here, either 420v or 420f) via the global function CMFormatDescriptionGetMediaSubType(AVCaptureDeviceFormat.formatDescription). The resolution can be accessed via CMVideoFormatDescriptionGetDimensions(AVCaptureDeviceFormat.formatDescription).width/.height.
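In code, parsing one entry looks roughly like this (a sketch; format stands for any AVCaptureDeviceFormat taken from the device's formats array):

```objc
// '420f' is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
// '420v' is kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.
FourCharCode subType =
    CMFormatDescriptionGetMediaSubType(format.formatDescription);
BOOL fullRange =
    (subType == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange);

// Width / height of the mode, as 32-bit integers.
CMVideoDimensions dims =
    CMVideoFormatDescriptionGetDimensions(format.formatDescription);

NSLog(@"%@ %dx%d, %@ range", format.mediaType,
      (int)dims.width, (int)dims.height,
      fullRange ? @"full" : @"video");
```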
The second field defines the valid frame rate range. It's 1...30 for all traditional video modes (indexes 0 through 11, plus 14 and 15). For the two 720p60 modes, it's 1...60. Programmatically, it can be accessed via ((AVFrameRateRange*)[AVCaptureDeviceFormat.videoSupportedFrameRateRanges objectAtIndex:0]).minFrameRate/.maxFrameRate.
The third field provides you with the field-of-view (FoV) in degrees. (It can be accessed via AVCaptureDeviceFormat.videoFieldOfView.) In full-sensor modes (all modes with the 4:3 aspect ratio), it's the same (56.7 degrees) as in the still modes (the last four records at the bottom). 352*288, which has an aspect ratio of ~1.222:1, has a significantly narrower FoV. Finally, as has also been explained in yesterday's IS article, all 16:9 modes have a narrower FoV, the narrowest, at 51.940 degrees, being modes 12 and 13, that is, 720p60 – the subject of this entire article.
The fourth field, when present, shows whether the given mode is binned. (Access: AVCaptureDeviceFormat.videoBinned.) It's always on for low-resolution modes, which isn't a problem, as the resolution hit introduced by binning isn't an issue with those low-res output formats. No high-res 16:9 mode is binned, except for our own 720p60. Unfortunately, as you can see, there isn't a single 720p60 mode without binning – that is, without a severe effective resolution decrease at that (comparatively) high output resolution.
The fifth field, when present, indicates whether you can explicitly enable electronic video image stabilization (here: “vis”). (Access: AVCaptureDeviceFormat.videoStabilizationSupported.) As has been explained in yesterday's IS article, you will want to make IS optional for your users – after all, not everybody needs stabilization at the expense of a narrower FoV.
Finally, the last field shows the maximal zoom and the upscale ratio. Please read my dedicated article HERE for more info on these fields. Programmatic access is possible via AVCaptureDeviceFormat.videoZoomFactorUpscaleThreshold and AVCaptureDeviceFormat.videoMaxZoomFactor.
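Putting the per-field accessors together, dumping every parsed field for every mode can be sketched as follows (videoDevice is the back-camera AVCaptureDevice; I only read the first frame rate range, as the listing above shows a single range per format):

```objc
for (AVCaptureDeviceFormat *format in videoDevice.formats) {
    // The only frame rate range of the format (see the listing above).
    AVFrameRateRange *range = format.videoSupportedFrameRateRanges[0];
    NSLog(@"fps %.0f-%.0f, fov %.3f, binned: %d, vis: %d, "
          @"max zoom %.2f (upscales @%.2f)",
          range.minFrameRate, range.maxFrameRate,
          format.videoFieldOfView,
          format.videoBinned,
          format.videoStabilizationSupported,
          format.videoMaxZoomFactor,
          format.videoZoomFactorUpscaleThreshold);
}
```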
3.1.1 AVCaptureDevice.formats and iOS6
If you compile your project in Xcode 5 with a deployment target of iOS 6.0, your code accessing AVCaptureDevice.formats/activeFormat will compile just fine – and will also run on iOS6-only devices. This isn't the case under previous, iOS6-only Xcode versions (for example, the now-current 4.6.3), as these properties were still private in the iOS6 times.
The results obtained by reading AVCaptureDevice.formats only contain the recording and encoding resolutions and the available framerates, though; therefore, they'll be of limited utility.
For the iPhone 5 running 6.1.4, it'll be as follows:
'vide'/'420v' enc dims = 480x360, pres dims = 480x360 { 1 - 30 fps }> - 0
'vide'/'420f' enc dims = 480x360, pres dims = 480x360 { 1 - 30 fps }> - 1
'vide'/'420v' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 2
'vide'/'420f' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 3
'vide'/'420v' enc dims = 960x540, pres dims = 960x540 { 1 - 30 fps }> - 4
'vide'/'420f' enc dims = 960x540, pres dims = 960x540 { 1 - 30 fps }> - 5
'vide'/'420v' enc dims = 1280x720, pres dims = 1280x720 { 1 - 30 fps }> - 6
'vide'/'420f' enc dims = 1280x720, pres dims = 1280x720 { 1 - 30 fps }> - 7
'vide'/'420v' enc dims = 1920x1080, pres dims = 1920x1080 { 1 - 30 fps }> - 8
'vide'/'420f' enc dims = 1920x1080, pres dims = 1920x1080 { 1 - 30 fps }> - 9
'vide'/'420v' enc dims = 2592x1936, pres dims = 2592x1936 { 1 - 20 fps }> - 10
'vide'/'420f' enc dims = 2592x1936, pres dims = 2592x1936 { 1 - 20 fps }> - 11
'vide'/'420v' enc dims = 3264x2448, pres dims = 3264x2448 { 1 - 20 fps }> - 12
'vide'/'420f' enc dims = 3264x2448, pres dims = 3264x2448 { 1 - 20 fps }> - 13
For the iPhone 3GS on 6.1.3, the following:
'vide'/'420v' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 0
'vide'/'420f' enc dims = 640x480, pres dims = 640x480 { 1 - 30 fps }> - 1
'vide'/'420v' enc dims = 2048x1536, pres dims = 2048x1536 { 1 - 17 fps }> - 2
'vide'/'420f' enc dims = 2048x1536, pres dims = 2048x1536 { 1 - 17 fps }> - 3
3.1.2 Source Code: Getting and Displaying AVCaptureDevice.formats
I've created a very simple Xcode 5 project to obtain the list above. As usual, I provide you with the full sources – download HERE. Basically, it's very simple: in the header of the view controller (the only place I've changed anything), I #import <AVFoundation/AVFoundation.h>. (Note that, unlike previous Xcode versions, Xcode 5 automatically resolves references to AVFoundation, which means you don't need to add it by hand under General > Linked Frameworks and Libraries.)
Then, I declare two properties:
@property (retain) AVCaptureSession *captureSession;
@property (retain) AVCaptureDevice *videoDevice;
In the implementation file, right upon loading (in viewDidLoad), I create a capture session and add the video input to it. As I don't even start streaming, I don't add an AVCaptureVideoPreviewLayer to display the camera view, or any kind of output (e.g., AVCaptureMovieFileOutput), to the session – just getting the device model-specific video modes doesn't require an active connection.
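In code, that setup can look like this (a sketch of the idea only; the downloadable project may differ in details):

```objc
// Minimal session setup: just enough to query the device's formats,
// with no preview layer or output attached.
self.captureSession = [[AVCaptureSession alloc] init];
self.videoDevice =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice
                                          error:&error];
if (input && [self.captureSession canAddInput:input])
    [self.captureSession addInput:input];
```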
Finally, I just iterate over AVCaptureDevice.formats, postfixing the record's printed representation with its actual array index:
int idx=0;
for (AVCaptureDeviceFormat* currdf in self.videoDevice.formats)
NSLog(@"%@ - %i", currdf, idx++);
Note that, below, when presenting a full 720p60 recorder, I'll provide you with the sources of a much more useful demo app, which parses all the above-explained properties, selects the full range + 60 fps combo (currently the only one on the iPhone 4S / iPhone 5) and sets the current video mode to it.
(continues below)