In this comic, Rubaiat gets into his motivation and process for creating such tools and offers inspiration for researchers, UX designers, and software developers alike. Beyond that, it's a great read, and you can get a taste of it below!
The CHI conference, which showcases the very best advances in computer science, cognitive psychology, design, social science, human factors, artificial intelligence, graphics, visualization, multimedia design, and more, is approaching, with Autodesk participating both as a proud sponsor and as a presenter. The theme for CHI 2015 is "Crossings": crossing borders, crossing boundaries, crossing disciplines, crossing people and technology, crossing past and future, crossing physical and digital, crossing art and science, … crossing you and me.
This year Autodesk Research has three papers receiving Honorable Mentions (the top 5% of all submissions):
Fraser Anderson, Tovi Grossman, Daniel Wigdor (Department of Computer Science, University of Toronto) and George Fitzmaurice look at ways to conceal your usage of mobile devices and stay connected without offending your co-workers.
There has been a longstanding concern within HCI that even though we are accumulating great innovations in the field, we rarely see these innovations develop into products. Our panel brings together HCI researchers from academia and industry who have been directly involved in technology transfer of one or more HCI innovations. They will share their experiences around what it takes to transition an HCI innovation from the lab to the market, including issues around time commitment, funding, resources, and business expertise. More importantly, our panelists will discuss and debate the tensions that we (researchers) face in choosing design and evaluation methods that help us make an HCI research contribution versus what actually matters when we go to market.
Parmit K Chilana, Management Sciences, University of Waterloo, Waterloo, Canada
Mary P Czerwinski, Microsoft Research, Redmond, United States
Tovi Grossman, Autodesk Research, Toronto, Canada
Chris Harrison, Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, United States
Ranjitha Kumar, Computer Science, University of Illinois at Urbana-Champaign, Champaign, United States
Patrick Baudisch, Hasso Plattner Institute, Potsdam, Germany
Shumin Zhai, Research @ Google, Mountain View, United States
Digital 210 King is part of Project Dasher, a research project designed to study buildings as living organisms. The Autodesk Toronto office was laser scanned, creating a point cloud that could be used to create a Building Information Model (BIM).
To support the community in studying how buildings operate, the Autodesk Research team has provided the dataset for the building. Kai Kostack has worked with the point cloud data to create a very artistic look at the building, which you can watch below.
Multi-touch tabletop computers are useful tools, and the User Interface group at Autodesk Research has explored ways to make them even better with a system called Medusa. Imagine a world where the tabletop can recognize multiple users, differentiate between right and left hands, and support non-touch, virtual-reality-style gestures like those in the sci-fi movie Minority Report.
It all starts by hacking a Microsoft Surface with 138 proximity sensors and Phidget Interface Kits. These sensors extend the touch capabilities of the computer surface to determine user proximity and the location of their hands. The sensors are not only inexpensive, but they also remove complications like setting up cameras or requiring users to wear gloves or tracking markers. In a future incarnation, these sensors could be built into the table for a better aesthetic and to keep users from needing to worry about them.
Medusa's sensors are arranged in three rings. An outward-facing ring of 34 sensors is mounted beneath the lip. Two upward-facing rings atop the table are made up of 46 sensors on the outer ring and 58 sensors on the inner ring.
All of this adds up to allow Medusa to support the following user interactions:
User Position Tracking
Independent Left and Right Hand Tracking
Hand Gestures (Pre-Touch Functionality)
Touch + Depth Gestures
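To get a feel for how user position tracking might work, here is a minimal sketch (not taken from the Medusa paper) of one plausible approach: treat the 34 outward-facing rim sensors as evenly spaced around the table and estimate a user's angular position as the proximity-weighted circular mean of their readings. The sensor count matches the description above, but the even spacing, the normalized readings, and the function names are all assumptions for illustration.

```python
import math

NUM_OUTER_SENSORS = 34  # outward-facing ring mounted beneath the table lip


def sensor_angle(index, total=NUM_OUTER_SENSORS):
    """Angle (radians) of a rim sensor, assuming even spacing around the table."""
    return 2 * math.pi * index / total


def estimate_user_angle(readings):
    """Estimate a user's angular position around the table.

    readings: one proximity value per rim sensor, normalized to [0, 1],
    where higher means closer. Returns an angle in [0, 2*pi), or None
    if no sensor detects anyone.
    """
    # Proximity-weighted circular mean: sum unit vectors scaled by reading.
    x = sum(r * math.cos(sensor_angle(i)) for i, r in enumerate(readings))
    y = sum(r * math.sin(sensor_angle(i)) for i, r in enumerate(readings))
    if x == 0 and y == 0:
        return None
    return math.atan2(y, x) % (2 * math.pi)
```

A circular mean avoids the wrap-around problem a plain average would have when a user stands near the 0/2π boundary; the real system would layer smoothing and per-user tracking on top of raw estimates like this.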
This was tested with a prototype UI creation application called Proxi-Sketch, which allows users to collaboratively develop new graphical user interfaces. You can see it all in action in the following video. If you want to know more about building the system or how parts of it worked, please refer to the Medusa publication.