You have probably heard of the Microsoft Surface, which was a great device that responded to hand gestures and real-world objects. Now Microsoft is working on something more powerful: LightSpace. LightSpace combines elements of surface computing and augmented reality research to create a space whose surfaces respond interactively to hand gestures and motions.
The LightSpace setup uses multiple depth cameras and projectors. LightSpace is a small, enclosed cube-shaped system in which one can use many interaction methods across the interactive displays and the space around them. The cameras in the LightSpace room are calibrated with respect to real-world 3D coordinates so that graphics can be projected realistically.
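To get a feel for what that calibration buys you, here is a minimal sketch of the standard pinhole projection that maps a 3D room coordinate to a 2D projector pixel. The intrinsic and extrinsic values below are illustrative placeholders, not numbers from the LightSpace project:

```python
import numpy as np

# Hypothetical intrinsics for a projector treated as an inverse camera.
# Focal lengths (800) and principal point (320, 240) are made-up values.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Extrinsics placing the projector in the shared room coordinate frame
# (identity rotation and zero translation, chosen purely for illustration).
R = np.eye(3)
t = np.array([0.0, 0.0, 0.0])

def project_point(p_world):
    """Map a 3D room coordinate to a 2D projector pixel (pinhole model)."""
    p_cam = R @ p_world + t      # room frame -> projector frame
    uvw = K @ p_cam              # apply intrinsics
    return uvw[:2] / uvw[2]      # perspective divide

# A point one metre straight ahead lands at the image centre.
print(project_point(np.array([0.0, 0.0, 1.0])))  # → [320. 240.]
```

Once every camera and projector shares one room coordinate frame like this, graphics can be drawn so they appear glued to a tabletop or a wall regardless of which projector is rendering them.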
Andy Wilson and Hrvoje Benko of Microsoft Research explain their LightSpace research project in this video. (You need Silverlight installed for the video to play.)
By using a depth camera to project graphics selectively, LightSpace can emulate interactive displays on ordinary surfaces, such as a table. A user can then interact with an object projected onto the tabletop through multitouch input. Isn’t that great? Users can pick up the object and move it around easily.
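A common way to turn a plain table into a touch surface with a depth camera is to compare each depth pixel against the known distance to the tabletop: anything sitting in a thin band just above the surface is treated as a touching fingertip. The sketch below illustrates that idea with made-up distances and thresholds; it is not code from the LightSpace system:

```python
import numpy as np

# Illustrative depth-based touch detection. The tabletop distance and the
# touch band are assumed values for the sketch, not LightSpace parameters.
TABLE_DEPTH_MM = 1000.0                   # calibrated camera-to-tabletop distance
TOUCH_MIN_MM, TOUCH_MAX_MM = 10.0, 40.0   # band just above the surface

def touch_mask(depth_mm):
    """Pixels whose depth sits in a thin band above the table count as touches."""
    height = TABLE_DEPTH_MM - depth_mm    # height of each pixel above the tabletop
    return (height >= TOUCH_MIN_MM) & (height <= TOUCH_MAX_MM)

# Toy 1x4 depth row: hand far above, fingertip near the surface,
# bare table, and a noisy reading beyond the table.
depth = np.array([[500.0, 975.0, 1000.0, 1100.0]])
print(touch_mask(depth))  # → [[False  True False False]]
```

The same per-pixel test, run over the full depth image, yields the blobs of contact points that a multitouch interface can track as fingers press, drag, and lift.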