Gesture displays?

Discussion in 'Mac Programming' started by Kingsly, Aug 23, 2006.

  1. Kingsly macrumors 68040


    I am building a working model of the FTIR multi-touch interface. The screen itself is extremely simple to make, but I don't imagine the controller software is.

    It would have to:
    1) via an IR camera, detect blobs.
    2) decide that, from one frame to the next, it's the same blob (i.e., if one were to drag her finger across the screen).
    3) recognize blob position in relation to the screen.
    4) use this data to make an accurate human interface device, such as controlling a mouse pointer, or better yet, recognize gestures such as tapping (like a trackpad), dragging, pinching, expanding, etc. Such would be excellent for controlling Aperture, Final Cut, or even Safari.
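
    Steps 1 and 2 above can be sketched in a few dozen lines. The following is a minimal, hypothetical illustration (not production tracking code): it thresholds a grayscale IR frame, flood-fills connected bright regions to find blob centroids, and then pairs blobs across frames by nearest-neighbour distance so a dragged finger keeps its identity. All names and the `max_dist` parameter are illustrative; a real system would read frames from a camera and likely use a vision library such as OpenCV.

    ```python
    def find_blobs(frame, threshold=128):
        """Return centroids (row, col) of connected bright regions."""
        rows, cols = len(frame), len(frame[0])
        seen = [[False] * cols for _ in range(rows)]
        blobs = []
        for r in range(rows):
            for c in range(cols):
                if frame[r][c] >= threshold and not seen[r][c]:
                    # Flood fill to collect every pixel of this blob.
                    stack, pixels = [(r, c)], []
                    seen[r][c] = True
                    while stack:
                        y, x = stack.pop()
                        pixels.append((y, x))
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and frame[ny][nx] >= threshold
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    # Centroid = mean pixel position of the blob.
                    cy = sum(p[0] for p in pixels) / len(pixels)
                    cx = sum(p[1] for p in pixels) / len(pixels)
                    blobs.append((cy, cx))
        return blobs

    def match_blobs(prev, curr, max_dist=5.0):
        """Pair each previous blob with the nearest current blob,
        so a moving finger keeps the same identity across frames."""
        matches = {}
        for i, (py, px) in enumerate(prev):
            best, best_d = None, max_dist
            for j, (cy, cx) in enumerate(curr):
                d = ((py - cy) ** 2 + (px - cx) ** 2) ** 0.5
                if d < best_d:
                    best, best_d = j, d
            if best is not None:
                matches[i] = best
        return matches
    ```

    Step 3 then reduces to mapping camera pixel coordinates to screen coordinates (a calibration step), and step 4 to interpreting the resulting tracks, e.g. two matched blobs whose separation shrinks over time would read as a pinch.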

    So, I have no programming knowledge yet (I'm waiting for my Objective-C book), but I am curious about the difficulty of such a program, algorithms and all...
  2. kpua macrumors 6502

    Jul 25, 2006
    I hate to break it to you, but with no programming or research experience, this is probably way out of your league. Algorithms are not a simple thing to come up with. Tons of research has gone into making the demo that you linked to possible. To start from scratch and try to reproduce it would be, I'm afraid, way out of your reach.

    Keep dreaming though, and maybe someday you'll get there.
  3. Kingsly thread starter macrumors 68040


    I know there would be no way on earth I could come up with something like this. What I'm wondering is the likelihood that someone else does. A lot of people have built these displays and are working on drivers; I'm just trying to figure out if someone will succeed.
