iOS: How do they do the effects in video?

1458279

Suspended
Original poster
May 1, 2010
1,601
1,514
California
I just started an app involving video and I see other apps that seem to add contrast / brightness / zoom and various effects.

I don't see any of that in the Apple documentation.

My guess is that they capture frames, alter the frames and then feed them into a preview window.

Is that how it's done, or am I not seeing something that's in Apple's documentation?

Anyone know of a site/book that has a tutorial on working with video that actually goes beyond recording and playback?
 

1458279

Suspended
Original poster
May 1, 2010
1,601
1,514
California
They did it with these apps:
Motion
After Effects
Shake
Final Cut Pro
Ok, those are apps for the Mac. I'm asking how these are programmed in ObjC for the iPhone.

For example: an app that takes a video and zooms, converts it to black and white, adjusts contrast, or overlays pictures within the video.

What I'm asking is: how do you program an app to do this?

It looks like it's not built into the APIs that Apple offers, so it's either custom work on the data file or an aftermarket API.

I saw one sample on stackoverflow concerning zoom, but haven't seen anything else out there.
 

larswik

macrumors 68000
Sep 8, 2006
1,552
11
These are things that I was looking into doing in the future too. I have been doing video production for years, and things are manipulated on a per-pixel basis: you take the current frame of video, and within that frame you manipulate the pixels.

Although I have not done it myself, I have seen some Objective-C code for image manipulation. I would take a frame of video, create a context for it, manipulate that frame, then repeat until I reach the end of the file.

Something like Gaussian blur is a mathematical equation for manipulating pixels; read here: http://en.wikipedia.org/wiki/Gaussian_blur

Now I could be off on how to do it, but with the programming skills I have learned so far, that's the direction I would start in.
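
Roughly, something like this (an untested sketch; "frameImage" is a stand-in CGImageRef for however you pull a single frame out of the video):

// Draw one frame into a bitmap context, then walk the raw RGBA bytes.
// This example converts the frame to grayscale with a simple average.
size_t width  = CGImageGetWidth(frameImage);
size_t height = CGImageGetHeight(frameImage);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
uint8_t *pixels = calloc(width * height * 4, 1);
CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8,
                                         width * 4, colorSpace,
                                         kCGImageAlphaPremultipliedLast);
CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), frameImage);

for (size_t i = 0; i < width * height; i++) {
    uint8_t *p = pixels + i * 4;              // p[0..3] = R, G, B, A
    uint8_t gray = (p[0] + p[1] + p[2]) / 3;  // simple average
    p[0] = p[1] = p[2] = gray;
}

CGImageRef processed = CGBitmapContextCreateImage(ctx);
// ... display or encode "processed", then clean up:
CGImageRelease(processed);
CGContextRelease(ctx);
CGColorSpaceRelease(colorSpace);
free(pixels);

You'd then do the same for every frame in the file.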
 

xStep

macrumors 68020
Jan 28, 2003
2,006
99
Less lost in L.A.
I just started an app involving video and I see other apps that seem to add contrast / brightness / zoom and various effects.

I don't see any of that in the Apple documentation.

My guess is that they capture frames, alter the frames and then feed them into a preview window.

Is that how it's done, or am I not seeing something that's in Apple's documentation?

Anyone know of a site/book that has a tutorial on working with video that actually goes beyond recording and playback?
First, what other apps?

This is performed on a frame-by-frame and pixel-by-pixel basis. Some developers may have written their own manipulation code in the past by converting a frame to a Core Graphics image, altering it, and converting back to a video frame image. You can now look at using CIImage and related tools; some conversion may still be needed. There is a way to add custom filters via CIFilter. The Accelerate framework could come in handy too.

I'm also considering using some of these effects in the future. Apple keeps making it easier and more attractive to do so.
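
For example, a brightness/contrast adjustment with a built-in filter might look something like this (a sketch only; "frameCGImage" is a stand-in for one frame already converted to a CGImageRef):

// Wrap the frame, run it through CIColorControls, and render it back out.
CIImage *input = [CIImage imageWithCGImage:frameCGImage];

CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
[filter setValue:input forKey:kCIInputImageKey];
[filter setValue:[NSNumber numberWithFloat:1.2f] forKey:kCIInputContrastKey];
[filter setValue:[NSNumber numberWithFloat:0.1f] forKey:kCIInputBrightnessKey];

CIImage *output = [filter outputImage];

// Render back to a CGImage for display or re-encoding.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef result = [context createCGImage:output fromRect:[output extent]];
// ... use "result", then CGImageRelease(result);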


They did it with these apps:
Motion
After Effects
Shake
Final Cut Pro
You jumped in before comprehending the question. The question is: how do other iOS video apps accomplish these effects?
 

1458279

Suspended
Original poster
May 1, 2010
1,601
1,514
California
First, what other apps?
There's actually a huge number of them: one makes a face look like a cat's face, others draw on the video in an AR style. Some offer zoom beyond what the native Apple camera app offers.

One that looked simple and cool turned the device into a magnifying glass; I saw some sample code for that online.

I saw Apple's sample where it adjusts hue, contrast, etc. I'll dig into that and see how they do it...

So it looks like this is done pixel by pixel.

Ok, so are there any good aftermarket APIs out there?
 

ArtOfWarfare

macrumors G3
Nov 26, 2007
8,580
4,020
There's actually a huge number of them: one makes a face look like a cat's face
That's just a matter of augmented reality. I was under the impression that Apple touted iOS 4 as adding that kind of feature to the SDK? (I'm not actually familiar with this kind of thing... it always struck me as too complicated for not enough of a payoff. I suppose the real limit is my imagination for what could be done with it... the best idea I have is some X-ray vision that would let you see the next subway approaching...)
 

Duncan C

macrumors 6502a
Jan 21, 2008
853
0
Northern Virginia
I just started an app involving video and I see other apps that seem to add contrast / brightness / zoom and various effects.

I don't see any of that in the Apple documentation.

My guess is that they capture frames, alter the frames and then feed them into a preview window.

Is that how it's done, or am I not seeing something that's in Apple's documentation?

Anyone know of a site/book that has a tutorial on working with video that actually goes beyond recording and playback?
There are a couple of different ways to do this sort of thing.

Starting with iOS 5, Apple added Core Image filters. You can apply a fair number of CI filters to video. Contrast/brightness, tint, and quite a few other effects are quite easy to do.

Another way to do this sort of thing is to set up an AV capture session, bring in frames of video in a CVPixelBuffer that is linked to an OpenGL texture, manipulate the texture in OpenGL, and write an output texture back out for display or recording. The ChromaKey demo from Apple (released to WWDC 2011 attendees) shows how to handle the flow of frames of video in a performant way. I'm putting the final touches on an app for my company that used the ChromaKey code as a starting point. It does morphs like "Fat Booth" and Photo Booth, on live video, by doing mesh warping on OpenGL textures.
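
The ChromaKey sample itself isn't public, but the skeleton of a capture pipeline looks roughly like this (a sketch, using Core Image for the per-frame work instead of raw OpenGL; it assumes self adopts AVCaptureVideoDataOutputSampleBufferDelegate and has "filter" and "ciContext" properties configured elsewhere):

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

- (void)setupCapture
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    // Deliver raw frames to a delegate on a background queue.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("video.frames", NULL)];
    [session addOutput:output];
    [session startRunning];
}

// Called once per captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Run the frame through whatever CIFilter chain you've configured.
    [self.filter setValue:frame forKey:kCIInputImageKey];
    CIImage *filtered = [self.filter outputImage];

    // Render the filtered frame for display.
    CGImageRef cgFrame =
        [self.ciContext createCGImage:filtered fromRect:[filtered extent]];
    // ... hand cgFrame to the preview layer, then CGImageRelease(cgFrame);
}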
 

xStep

macrumors 68020
Jan 28, 2003
2,006
99
Less lost in L.A.
There's actually a huge number of them: one makes a face look like a cat's face, others draw on the video in an AR style.
I thought you had something specific in mind. As others have pointed out, there are many things you can do and many frameworks at your disposal. Aftermarket frameworks may be difficult to come by, because once someone has written these apps, they consider the specifics secret. You're more likely to find snippets of sample code.

Some effects only affect a single frame, while others require combining frames and images. Last year I did a proof of concept that required using contiguous video frames and filtering via a dynamic mask; my knowledge of the frameworks gave me enough to demonstrate some of the possible options. Browse all of the imaging and video frameworks to get familiar with their capabilities, and watch the associated WWDC videos too.
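
To give a flavor of the frame-combining side, a mask-based composite can be done with a built-in filter (a sketch only, not my proof-of-concept code; "frameA", "frameB", and "maskImage" are stand-in CIImages built from video frames, and you should check that the filter is available on your minimum iOS version):

// Blend two frames through a grayscale mask: white areas of the mask
// show frameA, black areas show frameB.
CIFilter *blend = [CIFilter filterWithName:@"CIBlendWithMask"];
[blend setValue:frameA forKey:kCIInputImageKey];
[blend setValue:frameB forKey:kCIInputBackgroundImageKey];
[blend setValue:maskImage forKey:kCIInputMaskImageKey];
CIImage *combined = [blend outputImage];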

If you have something very specific in mind, then clearly ask for appropriate leads. A generic question is going to get broad answers that may not be of much help.

Duncan C, please let us know when that app is released. I'd like to see it in action.
 

1458279

Suspended
Original poster
May 1, 2010
1,601
1,514
California
There are a couple of different ways to do this sort of thing.

Starting with iOS 5, Apple added Core Image filters. You can apply a fair number of CI filters to video. Contrast/brightness, tint, and quite a few other effects are quite easy to do.

Another way to do this sort of thing is to set up an AV capture session, bring in frames of video in a CVPixelBuffer that is linked to an OpenGL texture, manipulate the texture in OpenGL, and write an output texture back out for display or recording. The ChromaKey demo from Apple (released to WWDC 2011 attendees) shows how to handle the flow of frames of video in a performant way. I'm putting the final touches on an app for my company that used the ChromaKey code as a starting point. It does morphs like "Fat Booth" and Photo Booth, on live video, by doing mesh warping on OpenGL textures.
I searched all over and can't find that ChromaKey demo; it must be for attendees ONLY :(

Someone in a feed mentioned that it's a lot faster than other routines. I'd like to see it in action and dig into the code.
 

1458279

Suspended
Original poster
May 1, 2010
1,601
1,514
California
I thought you had something specific in mind. As others have pointed out, there are many things you can do and many frameworks at your disposal. Aftermarket frameworks may be difficult to come by, because once someone has written these apps, they consider the specifics secret. You're more likely to find snippets of sample code.

Some effects only affect a single frame, while others require combining frames and images. Last year I did a proof of concept that required using contiguous video frames and filtering via a dynamic mask; my knowledge of the frameworks gave me enough to demonstrate some of the possible options. Browse all of the imaging and video frameworks to get familiar with their capabilities, and watch the associated WWDC videos too.

If you have something very specific in mind, then clearly ask for appropriate leads. A generic question is going to get broad answers that may not be of much help.

Duncan C, please let us know when that app is released. I'd like to see it in action.
Actually what I was hoping for was a comprehensive API and documentation :D No such luck.

I'm just poking around learning how various things are done. These AR, OCR, and object recognition programs look very interesting, but right now contrast, brightness, zoom, etc. would do.
 

Duncan C

macrumors 6502a
Jan 21, 2008
853
0
Northern Virginia
Actually what I was hoping for was a comprehensive API and documentation :D No such luck.

I'm just poking around learning how various things are done. These AR, OCR, and object recognition programs look very interesting, but right now contrast, brightness, zoom, etc. would do.
For basic things like contrast, brightness, tinting, and the like, Core Image filters are what you need. They are designed to be fast enough to work with video.

I've seen some sample apps that apply CI filters to still images, but I'm not sure about video. I suggest googling something like "iOS Core Image Filters for Video".

I found this link using that search string:

http://maniacdev.com/2011/10/using-core-image-filters-in-ios-5-on-a-live-video-feed/
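
And since you mentioned zoom: a simple digital zoom falls out of Core Image as well (another sketch; "frame" is a stand-in CIImage for one video frame):

// Scale the frame up around its center, then crop back to the original
// extent, giving a centered 2x digital zoom.
CGRect extent = [frame extent];
CGFloat zoom = 2.0f;
CGAffineTransform t = CGAffineTransformMakeTranslation(
    -extent.size.width  * (zoom - 1.0f) / 2.0f,
    -extent.size.height * (zoom - 1.0f) / 2.0f);
t = CGAffineTransformScale(t, zoom, zoom);
CIImage *zoomed = [[frame imageByApplyingTransform:t]
                       imageByCroppingToRect:extent];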