Multi-touch tabletop computers are useful tools, and the User Interface group at Autodesk Research has explored ways to make them even better with a system called Medusa. Imagine a world where the tabletop can recognize multiple users, differentiate between right and left hands, and support non-touch, virtual-reality-style gestures like those in the sci-fi movie Minority Report.
It all starts by hacking a Microsoft Surface with 138 proximity sensors and Phidget Interface Kits. These sensors extend the touch capabilities of the computer surface, letting it determine user proximity and the location of their hands. The sensors are not only inexpensive, they also remove complications like setting up cameras or requiring users to wear gloves or tracking markers. In a future incarnation, these sensors could be built into the table for a better aesthetic, so users would not need to worry about them at all.
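To make the idea concrete, here is a minimal sketch of how a ring of proximity sensors around a table edge could localize a user. This is a hypothetical illustration, not the published Medusa implementation: it assumes the 138 sensors are evenly spaced around the rim and takes a weighted average of their positions (as unit vectors, so the angle wrap-around is handled) to estimate where someone is standing.

```python
import math

NUM_SENSORS = 138  # sensor count reported for the Medusa prototype

def sensor_angle(i, n=NUM_SENSORS):
    """Angle (radians) of sensor i, assuming evenly spaced sensors on a ring."""
    return 2 * math.pi * i / n

def estimate_user_angle(readings):
    """Estimate a user's angular position around the table.

    readings: list of proximity values (0 = nothing sensed, larger = closer).
    Returns the angle (radians) of the weighted centroid of activations,
    or None if no sensor fired. Averaging unit vectors instead of raw
    angles keeps the estimate correct across the 0/2*pi boundary.
    """
    x = y = total = 0.0
    for i, r in enumerate(readings):
        if r > 0:
            a = sensor_angle(i, len(readings))
            x += r * math.cos(a)
            y += r * math.sin(a)
            total += r
    if total == 0:
        return None
    return math.atan2(y, x) % (2 * math.pi)

# Example: a user standing near sensor 0 triggers sensors 137, 0, and 1.
readings = [0.0] * NUM_SENSORS
readings[137] = 0.5
readings[0] = 1.0
readings[1] = 0.5
angle = estimate_user_angle(readings)  # close to 0 radians
```

A real system would of course have to calibrate each sensor and filter noise over time, but the same weighted-centroid idea extends to tracking hands hovering above the surface.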
All of this adds up to allow Medusa to support the following user interactions:
- User Position Tracking
- User Differentiation
- Independent Left and Right Hand Tracking
- Hand Gestures (Pre-Touch Functionality)
- Touch + Depth Gestures
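As a rough sketch of how user differentiation and left/right hand tracking could fit together, the snippet below attributes a touch point to the nearest tracked user and labels the hand by which side of that user's facing direction the touch lands on. All the names and the geometry here are illustrative assumptions, not the actual Medusa algorithms.

```python
import math
from dataclasses import dataclass

@dataclass
class User:
    name: str
    pos: tuple     # (x, y) position at the table edge
    facing: tuple  # unit vector pointing toward the table center

def attribute_touch(touch, users):
    """Return (user, hand) for a touch point (x, y).

    The touch goes to the nearest user; the hand label comes from the sign
    of the 2D cross product of the user's facing vector and the vector from
    the user to the touch (positive = left of facing = left hand).
    """
    user = min(users, key=lambda u: math.dist(u.pos, touch))
    to_touch = (touch[0] - user.pos[0], touch[1] - user.pos[1])
    cross = user.facing[0] * to_touch[1] - user.facing[1] * to_touch[0]
    hand = "left" if cross > 0 else "right"
    return user, hand

# Two hypothetical users on opposite sides of a 100x100 table.
alice = User("Alice", (50, 0), (0, 1))   # facing +y
bob = User("Bob", (50, 100), (0, -1))    # facing -y
user, hand = attribute_touch((40, 20), [alice, bob])  # Alice's left hand
```

The pre-touch and touch+depth gestures listed above would build on the same tracked state, reacting to a hand's height and motion before and after it contacts the surface.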
These interactions were tested with a prototype UI creation application called Proxi-Sketch, which allows users to collaboratively develop new graphical user interfaces. You can see it all in action in the following video. If you want to know more about building the system or how parts of it worked, please refer to the Medusa publication.