
Casey-Scalf — macrumors newbie, Original poster — Feb 9, 2011
Hey,

So I am wondering how I can make better use of the device's video capability. Instead of taking a picture or recording a video, I would like to use the live video feed in real time to analyze what the camera is seeing.

My questions would then be:

--Does any framework in the iOS SDK support live video feeds?
--And if so, is there enough hardware muscle to run the camera and analyze what it is viewing (akin to gesture recognition on an Xbox Kinect)?
 
You can grab each frame out of the live video stream using the AVFoundation framework classes. As to whether there is enough horsepower to analyse what you are seeing: that depends on the analysis. I would note that the Kinect uses multiple cameras (plus a depth sensor), so you are unlikely to be able to match that.
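A minimal sketch of the frame-grabbing approach with AVFoundation, shown in modern Swift (the original thread predates Swift; camera-permission handling and full error handling are omitted, and the `FrameGrabber` class name is just illustrative). Each captured frame arrives on a delegate callback as a `CMSampleBuffer`, which you can analyze in real time:

```swift
import AVFoundation

// Illustrative sketch: receive live camera frames for realtime analysis.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        // Drop frames rather than queue them up if analysis falls behind.
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: frameQueue)
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        session.startRunning()
    }

    // Called once per captured video frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Run your analysis on the raw pixel data here
        // (e.g. hand the CVPixelBuffer to gesture-recognition code).
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        _ = (width, height)
    }
}
```

Note that the delegate runs on a background queue, so any UI updates based on the analysis must be dispatched back to the main queue.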
 
Thanks for the tip!

Although multiple camera angles would be cool, one will have to do for now.
 
You might be interested in this thread. I posted code that will create a video stream, save it to a file, and examine/draw on each video frame as it's captured. Obviously that's a bit different to what you want, but it should give you a start.
 