Aperture 2 recently added support for iPhone-style gestures like "pinch" on laptops with trackpads.
So, I was thinking... why can't we take the software from the EyeToy that recognizes hand movement and apply it to OS X, letting us use hand gestures to control our programs and such?
How hard would this be to do?
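For what it's worth, the core trick behind EyeToy-style motion recognition is simpler than it sounds: compare consecutive camera frames and see which pixels changed. Here's a minimal sketch of that frame-differencing idea in Python with NumPy; the function name, threshold values, and simulated frames are all my own illustration, not actual EyeToy code.

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, threshold=30):
    """Return the fraction of pixels that changed between two grayscale frames.

    A real gesture system would go further (track WHERE the motion is,
    over time, to recognize swipes etc.), but this is the starting point.
    """
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    moving = diff > threshold  # pixels that changed more than the threshold
    return moving.mean()

# Simulated 8-bit grayscale frames: a "hand"-sized region changes
# between frame one and frame two.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:80, 60:100] = 200  # region where the hand moved

fraction = detect_motion(prev, curr)
print(fraction > 0.05)  # enough changed pixels to count as a gesture event
```

An OS-level version would then have to map detected motion regions to cursor moves or keystrokes, which is where it gets harder.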