At the 2009 Computer-Human Interaction (CHI) conference in Boston, Dr. Pattie Maes and her doctoral student, Pranav Mistry, of the Fluid Interfaces Group at MIT’s Media Lab unveiled the prototype of SixthSense, a wearable gestural interface that augments our physical world with digital information and lets us use natural hand gestures to interact with that information.
Basically, the SixthSense prototype is a mobile projector coupled with a webcam and a cell phone. The projector displays visual information, enabling surfaces, walls, and physical objects around us to be used as interfaces, while the webcam recognizes and tracks the user’s hand gestures and physical objects using computer-vision techniques. SixthSense processes the video-stream data captured by the camera and follows the locations of colored markers on the user’s fingertips, which serve as visual tracking targets. The software then interprets this data as gestures for interacting with the projected application interfaces.
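The fingertip-marker tracking described above can be sketched with simple color thresholding: find all pixels close to a marker's color and take their centroid as the fingertip position. This is a minimal illustration, not the actual SixthSense code; the function name, color tolerance, and synthetic test frame are my own assumptions.

```python
import numpy as np

def track_marker(frame, target_rgb, tol=40):
    """Locate the centroid of a colored fingertip marker in an RGB frame.

    frame: (H, W, 3) uint8 array; target_rgb: the marker color to match;
    tol: per-channel tolerance (hypothetical value). Returns (row, col)
    in pixel coordinates, or None if no matching pixels are found.
    """
    diff = np.abs(frame.astype(int) - np.array(target_rgb))
    mask = np.all(diff <= tol, axis=-1)        # pixels close to marker color
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())  # centroid of matching pixels

# Synthetic 100x100 black frame with a red 10x10 "marker" at rows/cols 40-49
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 40:50] = (255, 0, 0)
print(track_marker(frame, (255, 0, 0)))  # -> (44.5, 44.5)
```

Tracking the marker positions frame by frame yields fingertip trajectories, which a gesture recognizer can then map to commands such as framing a photo or drawing on a projected surface.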
The current SixthSense prototype supports several types of gesture-based interactions, demonstrating the usefulness, viability, and flexibility of the system. It allows the user to project information from the phone onto any surface — walls, the body of another person, or even your own hand. Interestingly, the current prototype is quite inexpensive, costing approximately $350 to build.
In the following videos, Dr. Pattie Maes and Pranav Mistry demonstrate SixthSense technology and show how it creates a mobile interface that will integrate into many parts of our lives, giving us access to information for making better decisions throughout the day.
I am very impressed with the smart ideas and potential of SixthSense technology. I can’t wait until such technology becomes available to consumers. I anticipate SixthSense technology will significantly change the way we work as well as the way we teach and learn in the classroom.