View Full Version : Feasibility of eye/head-tracking with built-in camera?

Jul 2, 2009, 01:32 PM
So I was wondering if it's even remotely possible to track where the user is looking on-screen based on the video from their built-in iSight. I was thinking you could have a calibration mode where the user would look at flashing dots at the corners, the midpoints of the sides, the center, and various other locations. The images snapped at each dot could then serve as the benchmarks.

During the tracking phase, the image from the camera would be constantly compared (using color vectors maybe? any other suggestions?) to determine which benchmark images it most closely matches. Obviously this strategy isn't going to be nearly as accurate as the more sophisticated methods currently in use, but would it be so frustratingly inaccurate that it's not even worth the time to think about?
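To make the idea concrete, here's a minimal sketch of the calibrate-then-match loop described above. It's purely illustrative: each calibration snapshot is reduced to a crude feature vector (here, just the mean RGB of the frame's pixels), and each live frame is classified by nearest Euclidean distance to a benchmark. All the names and the stubbed pixel data are hypothetical, and actual camera capture is omitted; a real feature (head pose, eye-region crops, etc.) would be needed, since whole-frame color alone would barely change with gaze.

```python
# Hypothetical sketch of the benchmark-matching scheme: reduce each
# calibration image to a feature vector, then classify live frames by
# nearest benchmark. Camera capture/decoding is stubbed out; frames are
# just lists of (r, g, b) tuples.

import math

def feature_vector(pixels):
    """Collapse a frame's (r, g, b) pixels into one mean-color vector."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def nearest_benchmark(frame_vec, benchmarks):
    """Return the label of the calibration vector closest in color space."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(benchmarks, key=lambda label: dist(frame_vec, benchmarks[label]))

# Calibration phase: user looks at each flashing dot; one vector is
# stored per screen location (values here are made up).
benchmarks = {
    "top-left":     (90.0, 80.0, 70.0),
    "center":       (120.0, 110.0, 100.0),
    "bottom-right": (150.0, 140.0, 130.0),
}

# Tracking phase: classify an incoming frame against the benchmarks.
frame = [(118, 112, 101), (122, 109, 99), (119, 111, 102)]
print(nearest_benchmark(feature_vector(frame), benchmarks))  # → center
```

The same structure would work with any richer feature vector; only `feature_vector` would change, and the nearest-neighbor matching step stays the same.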

I realize "accuracy" can mean a lot of different things, but I was thinking of a 200-300pt radius circle of tracking area with around 60-75% accuracy.

Jul 4, 2009, 01:08 PM
there's actually a really good one called iTracker

Jul 5, 2009, 08:39 PM
there's actually a really good one called iTracker

Here's a link to iTracker (http://www.eyetwig.com/151141) for those who are curious. iTracker is doing something different from what I was talking about, however: it translates movement of the head into actual cursor movements, whereas I was trying to see if it would be possible to translate the image of the user's face into an approximation of where on-screen they are looking. Thank you for the reference though; it looks like a pretty impressive piece of software.