
September 2014

It is Indeed Possible to Type 30 Words Per Minute on a Smart Watch

Ironically, as cell phones get bigger, ultra-small-screen devices such as smart watches are growing in popularity. With these smaller screens, we need to find ways to work with them more efficiently or risk these new devices being regarded as novelty items. The same old interfaces don't work.

Autodesk Research Swipeboard Smart Watch Text Entry
What time is it? It's time for Swipeboard!

One of the most common things to do on a mobile device is to enter text. We've learned to enter text with our thumbs, so we can continue to learn new techniques. The problem with a smart watch is that it's a one-handed device, and the screen really only has room for a single finger without obscuring too much of it. Not content to revert to hunt-and-peck Typing 101, the Autodesk Research User Interface group set out to find a solution.

Enter Swipeboard

Autodesk Research Swipeboard Title

Swipeboard takes inspiration from Morse code and gestural input to create an easy-to-master text entry paradigm that sees users entering more than 30 words per minute (wpm).

Morse Code
The fastest recorded Morse code entry is 140 wpm.

Swipeboard uses a QWERTY keyboard broken up into blocks of three or four characters. The user simply taps the block containing the character and then swipes to identify the specific character within it. Some users have achieved a level of comfort with the system that allows them to enter text without looking at the screen.

Autodesk Research Smart Watch Text Entry Swipeboard
First a QWERTY style keyboard is shown for selecting the character region
Autodesk Research Smart Watch Text Entry Swipeboard
After a tap, the keyboard zooms in to prompt for a gesture to define the specific character
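
To picture the two-step scheme, and why the speed is plausible: at the standard five characters per word, 30 wpm works out to about 150 characters per minute, or one tap-plus-swipe pair roughly every 0.4 seconds. Below is a minimal sketch of how such a pair could resolve to a character; the block layout and swipe directions are illustrative assumptions, not the published Swipeboard design.

    # Illustrative two-step decoder: the first tap picks a 3-4 character block,
    # the second stroke (a swipe or a tap) picks the character within it.
    REGIONS = {
        "upper_left": "QWE", "upper_mid": "RTY", "upper_right": "UIOP",
        "home_left":  "ASD", "home_mid":  "FGH", "home_right":  "JKL",
        "lower_left": "ZXC", "lower_right": "VBNM",
    }

    SWIPES = {"swipe_left": 0, "tap": 1, "swipe_right": 2, "swipe_down": 3}

    def decode(tapped_block, second_stroke):
        """Resolve one character from a (tap, second stroke) pair."""
        block = REGIONS[tapped_block]
        index = min(SWIPES[second_stroke], len(block) - 1)
        return block[index]

    print(decode("upper_left", "swipe_right"))   # -> 'E'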


Hard to believe? Watch the video of Swipeboard in action below. Note that the video is not sped up - you're seeing it work in real time.


 

What's next for Swipeboard?

Well, we'll be talking about it at UIST 2014, the User Interface Software and Technology Symposium, for starters.

Autodesk Research Swipeboard Glasses
Swipeboard could be applied to other wearable devices such as glasses

For future work, it could be interesting to explore this on other wearable devices like glasses and rings. It could also be interesting to see Swipeboard expanded from characters to complete words. What do you think?

If you liked this post, you might also like to read about Duet, a research project that looks at making a smart watch and smart phone work well together. Duet shows that 1 + 1 can equal more than 2.


Kitty is a Drawing Tool for Interaction Authoring

Hopefully you're familiar with Project Draco, our answer to the question:

Can animation be made as easy as drawing?


We've discussed Draco here on the blog, and there's a video overview of what we showed at this year's SIGGRAPH conference in Vancouver to catch you up.

Kitty builds on Draco, looks further into the animation question, and asks:

Can we make Draco interactive?


In the image above you'll see two interactions happening:

  • the user can move the dragon's head into the frame
  • the user can move the baby dragon into the pot

With the egg going into the pot, you'll notice that the monster's eyes follow the egg and that the egg causes a particle splash as it enters the pot.

This opens up a lot of possibilities for interactive storytelling.

  • How would children like this for an ebook on a tablet?
  • Does it make web content more dynamic?
  • Could it be useful for game authoring?
  • Is it useful for training and instructions?

Kitty builds on Draco, but how does it work?

We've introduced a simple node network to define the relationships between objects. Let's look at the picture below of a different egg going into a different pot - yes, we like cooking here at Autodesk Research.

Autodesk Research Kitty Interaction Authoring Dynamic Drawings

We've set up the scene as you would in Draco, with steam and splashing particles coming from the pot. In the following image you can see a simple node graph that gets overlaid on the picture. This helps reduce UI clutter while keeping the events and relationships in context.

Autodesk Research Kitty Interaction Authoring Dynamic Drawings

You can see the path the egg takes to get into the pot as well as two blue circles representing the particle events. The user is making a connection from the egg to the circle on the right to tell the splash to only happen when the egg is close.

Autodesk Research Kitty Interaction Authoring Dynamic Drawings

When the connection is made between the two nodes - the egg path and the splash - the user can then choose how to link the events. In this case, the movement of the egg is connected to the emission of the particles. The inlaid square defines the timing of the event.

Autodesk Research Kitty Interaction Authoring Dynamic Drawings

The curve can be redrawn to control what happens. The horizontal axis represents the object that triggers the event (the egg), and the vertical axis represents the object being driven (the particle splash). When the line is flat, no particles are emitted.
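
One way to picture what that curve does, as a rough sketch rather than Kitty's actual implementation: the trigger object's progress along its path is read off the horizontal axis, and the curve returns the value that drives the target - here, a particle emission rate. The curve points and the piecewise-linear evaluation below are illustrative assumptions.

    # Illustrative Kitty-style relationship: the egg's progress along its path
    # (0 to 1) drives the splash emitter through a user-drawn mapping curve.
    def evaluate_curve(points, x):
        """Piecewise-linear lookup of the user-drawn curve."""
        points = sorted(points)
        if x <= points[0][0]:
            return points[0][1]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x <= x1:
                t = (x - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)
        return points[-1][1]

    # Flat at zero until the egg is about 80% of the way to the pot,
    # then the splash ramps up sharply and dies away.
    splash_curve = [(0.0, 0.0), (0.8, 0.0), (0.9, 1.0), (1.0, 0.0)]

    for progress in (0.5, 0.85, 0.9, 1.0):
        rate = evaluate_curve(splash_curve, progress)
        print(f"egg progress {progress:.2f} -> emission rate {rate:.2f}")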

In the image below, we explore using Kitty to explain how an electric doorbell works.

Autodesk Research Kitty Interaction Authoring Dynamic Drawings

You can learn more about Kitty and see how easy it is to author these kinds of behaviours in the video below. More information is available on the Draco project page.

We'll be presenting this latest research at this year's UIST, the ACM User Interface Software and Technology Symposium, in Hawaii in October. If you are there, stop by to see the demo or attend the talk. 

Whether you are at UIST or not, please let us know what you think about these tools and the possibilities that they open up for you. 


One Awesome Tool for User Experience Designers and Researchers

Here's a great tool for those of you exploring the User Experience - the Paper Forager!

Autodesk Research Paper Forager for User Experience Design

There is lots of good material nowadays for exploring user experience - perhaps so much that it's hard to get started or find what you need. And that's where the Paper Forager steps in to help! For those with access to the ACM Digital Library, the Paper Forager lets you explore more than 5,000 research papers from ACM CHI and UIST.

The Paper Forager is easy to use and provides things like:

  • searching
  • filtering by date
  • most popular authors
  • thumbnail views
  • paper overviews

All without having to download the paper and open it in a viewer. This can really speed up your research.

PaperForagerUI

To make things even faster, when browsing papers, the Paper Forager preloads adjacent papers so you can quickly move forwards and backwards through your search results.
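
The preloading pattern itself is simple to sketch. The snippet below is only an illustration of the general idea - prefetching the neighbours of whatever is on screen - and the function and cache names are invented, not taken from the Paper Forager code.

    # Illustrative prefetch: when a paper is shown, quietly fetch the ones on
    # either side of it in the result list so paging through results feels instant.
    from concurrent.futures import ThreadPoolExecutor

    cache = {}                                   # paper id -> rendered preview
    pool = ThreadPoolExecutor(max_workers=2)

    def fetch_preview(paper_id):
        return f"preview of {paper_id}"          # stand-in for the real download/render step

    def prefetch(paper_id):
        if paper_id not in cache:
            cache[paper_id] = fetch_preview(paper_id)

    def show_paper(results, index):
        """Display one result and warm the cache for its neighbours in the background."""
        prefetch(results[index])
        for neighbour in (index - 1, index + 1):
            if 0 <= neighbour < len(results):
                pool.submit(prefetch, results[neighbour])
        return cache[results[index]]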

Please enjoy this video overview of the Paper Forager, complete with a toe-tapping, finger-snapping beat. If you're at UIST 2014, you can talk to members of the Autodesk Research team and compliment them on their musical tastes :)


How would Autodesk make the Apple Watch work with the iPhone?

For the impatient, the answer is that we would make both devices active participants for display and input. Like two musicians playing a mathematical duet, the beautiful music they create together can add up to more than two (perhaps it can even go as high as eleven).

Antique Clock Phone
Composite of images by Tim G. Photography and Garry Knight under the Creative Commons License

OK. Confession time: we do not have an Apple Watch or a new iPhone, and we did this research before they were announced. We used devices that are publicly available. But that should not make this research any less interesting.

With the premise that two tools working together can create greater value, the User Interface group at Autodesk Research, with partners at the University of Toronto and Carnegie Mellon University, set out to explore these possibilities:

  • new input methods 
  • new security possibilities
  • new operational abilities 

New Input Methods

The watch has an accelerometer in it, so it can provide initial information about how the hand it is attached to is working as an input device - specifically, the orientation of the hand relative to the phone. Knowing the orientation of the hand means a person is not limited to the traditional fingertip press. People could now also enter data with:

  • the side of the finger
  • the back of the finger, also known as the knuckle
Autodesk Research Knuckle Input with Smartphone and Smartwatch
A person may use their knuckle as an alternative to their fingertip or as an additional tool

What could this do for reading email? You could have one finger touch point for navigation (move through a message, go to the next message, etc.), another for email management (archive, delete, etc.), and another for things like cut, copy, and paste.
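
To make that concrete, here's a minimal sketch of the idea: classify the touch from the wrist orientation the watch reports at the moment of contact, then pick a command set for each pose. The pose names, angle thresholds, and command bindings below are all illustrative assumptions, not the Duet implementation.

    # Illustrative sketch: guess the touch pose from the watch's reported wrist
    # roll at touch time, then choose a command set for an email app.
    def classify_pose(wrist_roll_degrees):
        if wrist_roll_degrees < 30:
            return "pad"        # ordinary fingertip press
        if wrist_roll_degrees < 75:
            return "side"       # side of the finger
        return "knuckle"        # back of the finger

    COMMANDS = {
        "pad":     ["scroll", "open message", "next message"],
        "side":    ["archive", "delete", "flag"],
        "knuckle": ["cut", "copy", "paste"],
    }

    pose = classify_pose(wrist_roll_degrees=80)
    print(pose, "->", COMMANDS[pose])            # knuckle -> ['cut', 'copy', 'paste']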

Of course, a person is not limited to entering data on the phone. Wearing the watch on the inside of the wrist, so that the watch screen is oriented in the same manner as the phone screen in the hand, a person could gesture across devices:

  • swipe from phone to watch (to mute the phone and set the watch to buzz mode)
  • swipe from watch to phone (to unmute the phone and turn the buzz off on the watch)
  • close pinch across devices (to mute both devices)
  • open pinch across devices (to unmute both devices)

  Autodesk Research Multi Device Gestures on Smartphone and Smartwatch
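
A sketch of how those cross-device gestures might be wired up is below. The gesture names and the device methods are invented for illustration; the real Duet prototype is more involved, so treat this as a picture of the mapping rather than the implementation.

    # Illustrative mapping from recognized cross-device gestures to actions.
    class Device:
        def __init__(self, name):
            self.name, self.muted, self.buzz = name, False, False
        def mute(self):           self.muted = True
        def unmute(self):         self.muted = False
        def set_buzz(self, on):   self.buzz = on

    phone, watch = Device("phone"), Device("watch")

    ACTIONS = {
        "swipe_phone_to_watch": lambda: (phone.mute(),   watch.set_buzz(True)),
        "swipe_watch_to_phone": lambda: (phone.unmute(), watch.set_buzz(False)),
        "pinch_close":          lambda: (phone.mute(),   watch.mute()),
        "pinch_open":           lambda: (phone.unmute(), watch.unmute()),
    }

    ACTIONS["swipe_phone_to_watch"]()
    print(phone.muted, watch.buzz)               # True True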

And of course you could tap the phone and watch together or tap and flip devices to initiate additional commands.

Autodesk Research Smartphone and Smartwatch Double Bump

New Security Possibilities

Knowing a phone and watch are paired could improve security - you can only use the phone if your watch matches. Plus, a new gesture could be made for unlocking the phone.

Autodesk Research Smartphone and Smartwatch Security
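
As a rough sketch of the pairing idea (with invented names, and not meant as an actual security design): the phone only accepts an unlock attempt while it can see its own paired watch nearby and the unlock gesture matches.

    # Illustrative pairing check: unlock only when the paired watch is in range
    # and the cross-device unlock gesture was performed correctly.
    PAIRED_WATCH_ID = "watch-1234"

    def can_unlock(visible_watch_ids, unlock_gesture_ok):
        return PAIRED_WATCH_ID in visible_watch_ids and unlock_gesture_ok

    print(can_unlock({"watch-1234"}, unlock_gesture_ok=True))    # True
    print(can_unlock({"watch-9999"}, unlock_gesture_ok=True))    # False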

New Operational Abilities

This is where it really gets cool!

What if you used the watch as a tool to zoom in on a map without losing your position?

Autodesk Research Smartwatch to zoom in on Smartphone Display

What if you used the watch as a tool palette for the phone?

Autodesk Research Smartwatch Smartphone Tool Palette

New gestures and connections open up a lot of possibilities. You can read more about this research in the Autodesk Research paper entitled Duet: Exploring Joint Interactions on a Smart Phone and a Smart Watch, as well as by watching the video below. What kinds of things could you imagine doing with these abilities?


What is the latest in state-of-the-art baseball analytics?

What is the latest in state-of-the-art baseball analytics?

Who said Autodesk was only about designing stuff?

How do you visualize and analyze massive amounts of video data?

Read on for answers to these pressing questions and more.

Video Lens is one of the freshest pieces of work from the User Interface group at Autodesk Research. What's really interesting is that it uses baseball as the foundation for this exploration into big data and analytics.

Autodesk Research Video Lens User Interface Overview
The Video Lens User Interface is Easy to Use and Provides Real Time Feedback

As we know from stories like Moneyball, sports analytics is a serious business, and lots of data is captured. The team worked with about 8,000 video clips and the metadata captured by the PITCHf/x toolset to quickly visualize what happens in a game of baseball and analyze specific plays. Video Lens had four design goals:

  1. Maximize Information Density: Reveal as much as possible about the underlying data
  2. Treat Events as the Primary Entity: Events are the important parts of the data; each data source (video clip) may have multiple events
  3. Support Rapid Playback of Video Results: If you have to wait for things to happen, the system is of lower value
  4. Encourage Exploration of the Data: Make trying things out low cost


Imagine you want to see all the knuckleball pitches thrown by a southpaw that were high and to the right. You would circle the target in the pitch zone UI, then mark the pitch type and the pitcher type. Three quick steps in this interactive tool and all the matching video clips play back.

Autodesk Research Video Lens Strikezone User Interface
The Video Lens Strikezone User Interface Allows the User to Quickly Isolate Specific Pitches
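
The same query is easy to express over the event metadata. The minimal sketch below shows how those three interface steps boil down to a filter over pitch events that then drives playback; the field names and sample records are invented for illustration, not the actual PITCHf/x schema.

    # Illustrative filter over pitch events; field names are made up, not PITCHf/x.
    pitches = [
        {"clip": "clip_0412.mp4", "type": "knuckleball", "hand": "L", "loc": (0.8, 0.9)},
        {"clip": "clip_0413.mp4", "type": "fastball",    "hand": "R", "loc": (0.4, 0.5)},
        {"clip": "clip_0511.mp4", "type": "knuckleball", "hand": "L", "loc": (0.2, 0.3)},
    ]

    def high_and_right(loc):
        x, y = loc
        return x > 0.7 and y > 0.7               # toy stand-in for the circled target area

    playlist = [p["clip"] for p in pitches
                if p["type"] == "knuckleball" and p["hand"] == "L" and high_and_right(p["loc"])]

    print(playlist)                              # ['clip_0412.mp4']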

Want to narrow that down to the team you're facing next week? 

Autodesk Research Video Lens Teams User Interface
The Video Lens Teams User Interface

What if you were a sports commentator and had to fill one of those awkward moments when a streaker was being escorted off the field? You could go to Video Lens and talk about potential strategies, with real-time feedback to illustrate. Or maybe you just want to see how many balls are hit to the area of the field your streaker was spotted in and talk about the chances of him being hit by a live ball.

Autodesk Research Video Lens Field User Interface
The Video Lens Field User Interface

Intrigued? Here's a video showing the Video Lens system in action:

Taking this back to Autodesk, here are some possibilities for use in other tools:

  • TV broadcasters looking for B roll footage in their database of a rainy day at a famous landmark
  • Film editor sorting through possibilities in their storytelling
  • Advanced learning possibilities for tools like Autodesk Screencast
  • Connecting security camera data to your BIM
  • Connecting video recordings of your real-world product stress testing (e.g., crash test dummies, force testing chairs, etc.) to your digital product design

We think Video Lens is a great toolset for so many things. Let us know what you think here on the blog or on our Video Lens Research page where you can also find more info and contact information.

By the way, if you happen to be in Hawaii around October 8th, please stop by the UIST Symposium where we'll be talking about Video Lens in person.

Play ball!