Feasibility of eye/head-tracking with built-in camera?

Discussion in 'Mac Programming' started by GorillaPaws, Jul 2, 2009.

  1. GorillaPaws, macrumors 6502a

     #1
    So I was wondering if it's even remotely feasible to track where the user is looking on-screen based on video from the built-in iSight. I was thinking you could have a calibration mode where the user looks at flashing dots at the corners, the midpoints of the sides, the center, and various other locations. The image snapped at each dot would then serve as a benchmark.
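    To make the idea concrete, here's a rough sketch of the calibration step in Python with OpenCV. It's only a sketch of what I'm imagining, not working Mac code; the screen size, target coordinates, feature size, and function names are all placeholders I made up:

    Code:
    import cv2
    import numpy as np

    # Hypothetical targets for a 1440x900 screen: corners, side
    # midpoints, and center. A real app would flash a dot at each one.
    CALIBRATION_POINTS = [
        (0, 0), (720, 0), (1440, 0),
        (0, 450), (720, 450), (1440, 450),
        (0, 900), (720, 900), (1440, 900),
    ]

    def to_feature(frame, size=(32, 24)):
        # Downsample to a small grayscale vector so comparisons are
        # cheap and slightly less sensitive to pixel noise.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        small = cv2.resize(gray, size, interpolation=cv2.INTER_AREA)
        return small.astype(np.float32).ravel()

    def calibrate(capture):
        # Snap one benchmark frame per target while the user fixates it.
        benchmarks = {}
        for point in CALIBRATION_POINTS:
            input("Look at the dot at %s, then press Enter..." % (point,))
            ok, frame = capture.read()
            if ok:
                benchmarks[point] = to_feature(frame)
        return benchmarks

    benchmarks = calibrate(cv2.VideoCapture(0))  # 0 = the built-in camera

    The downsampling is just so the comparison isn't dominated by sensor noise; any cheap feature would do there.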

    During the tracking phase, each frame from the camera would be compared against the benchmarks (using color vectors maybe? any other suggestions?) to find the benchmark image it most closely matches. Obviously this strategy isn't going to be nearly as accurate as the more sophisticated methods currently in use, but would it be so frustratingly inaccurate that it's not even worth the time to think about?
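    The tracking phase would then just be a nearest-neighbor lookup against those benchmarks. Again only a sketch, using plain L2 distance on the downsampled grayscale vectors as a crude stand-in for the "color vectors" idea:

    Code:
    import cv2
    import numpy as np

    def nearest_benchmark(frame, benchmarks, to_feature):
        # Compare the live frame's feature vector against every
        # benchmark and return the screen point whose benchmark is
        # closest in L2 distance.
        feature = to_feature(frame)
        best_point, best_dist = None, float("inf")
        for point, reference in benchmarks.items():
            dist = float(np.linalg.norm(feature - reference))
            if dist < best_dist:
                best_point, best_dist = point, dist
        return best_point, best_dist

    def track(capture, benchmarks, to_feature, frames=100):
        # Poll the camera and print the nearest calibration point.
        for _ in range(frames):
            ok, frame = capture.read()
            if not ok:
                break
            point, dist = nearest_benchmark(frame, benchmarks, to_feature)
            print("Best guess: near %s (distance %.1f)" % (point, dist))

    # Usage, reusing to_feature() and calibrate() from the sketch above:
    # cap = cv2.VideoCapture(0)
    # track(cap, calibrate(cap), to_feature)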

    I realize "accuracy" can mean a lot of different things, but what I have in mind is resolving gaze to within a circle of roughly 200-300 pt radius, correctly around 60-75% of the time.
     
  2. macrumors member

     #2
    There's actually a really good one called iTracker.
     
  3. GorillaPaws, thread starter, macrumors 6502a

     #3
    Here's a link to iTracker for those who are curious. iTracker is doing something different from what I was talking about, however: it translates head movement into actual cursor movement, whereas I was trying to see if it would be possible to translate the image of the user's face into an approximation of where on-screen they are looking. Thank you for the reference, though; it looks like a pretty impressive piece of software.
     
