Two representatives from Tobii visited Avanade, where I work. They gave a presentation on Tobii's background, how their technologies are being used, and how developers can build on them through the EyeX SDK.
Later that day we held a brainstorming session and went on to develop a proof of concept using Tobii EyeX and Leap Motion to control a Spotify player.
Trying out EyeX
During the day we were able to try out Tobii's EyeX controller on Windows 8.1, using Modern UI apps such as Bing Maps, Windows Store, and Twitter. Because these kinds of apps are designed with touch in mind, they suit Eye Interaction well: the 'hit target', so to speak, is much larger than in UIs designed for a mouse pointer, which improved accuracy when invoking UI elements such as tiles. Eye Interaction was triggered by holding down a specially bound key, which also let us switch between modes such as zooming in and out, or panning across a map.
Designing for Eye Interaction
Tobii shared their principles on what to consider when designing for eye interaction:
- Eyes are made for looking around
- Eyes and hands work well together
- Eyes are curious
- Eye movements provide information
Using these principles, we began a whiteboard session to explore how we use our eyes when using computers. We agreed that our eyes are “passive”, and that the clues our eyes give should supplement another method of interaction.
We grounded this theory in a study by UIE on how users find flyout menus and rollovers, which discovered:
“We found users follow a pattern: they decide what they are going to click on before they move the mouse.”
- Users Decide First; Move Second by Erik Ojakaar, UIE
In keeping with our Natural User Interface (NUI) theme, we wanted to try and combine Tobii EyeX with another gestural technology. We were fortunate to have both Microsoft Kinect v2 and Leap Motion available to us, which gave us some interesting capabilities to try and combine.
The concept we developed that day was a Spotify controller using Tobii EyeX and Leap Motion. EyeX detected when the user was looking at the Spotify icon in the task bar, while Leap Motion provided an interface through which the user could control Spotify with hand gestures. Gestures recognised by Leap Motion were only honoured if the user was looking at the Spotify icon while performing them. The proof of concept application supported the following gestures:
- Poke to play or pause
- Wave right to play next track
- Wave left to play previous track
- Circle clockwise to increase volume
- Circle anticlockwise to decrease volume
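The gaze-gating idea above can be sketched in a few lines. This is only an illustrative sketch, not our actual implementation: the function and gesture names are hypothetical, and real EyeX and Leap Motion integration would use their respective SDK event callbacks rather than plain function arguments.

```python
# Hypothetical sketch of gaze-gated gesture handling.
# Gesture names and the handle_gesture interface are illustrative only;
# the real Leap Motion and EyeX SDKs deliver these as event callbacks.

# Map each recognised gesture to a Spotify action.
GESTURE_ACTIONS = {
    "poke": "play_pause",
    "wave_right": "next_track",
    "wave_left": "previous_track",
    "circle_clockwise": "volume_up",
    "circle_anticlockwise": "volume_down",
}

def handle_gesture(gesture: str, gaze_on_spotify_icon: bool):
    """Return the Spotify action to perform, or None if the gesture
    should be ignored because the user is not looking at the icon."""
    if not gaze_on_spotify_icon:
        # The gaze gate: gestures are only honoured while the user
        # is looking at the Spotify icon in the task bar.
        return None
    return GESTURE_ACTIONS.get(gesture)
```

For example, `handle_gesture("wave_right", True)` yields `"next_track"`, while the same wave with the user's gaze elsewhere, `handle_gesture("wave_right", False)`, is ignored. The gate keeps stray hand movements from triggering playback changes.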
Why Leap Motion Instead of Kinect v2?
We chose Leap Motion over Kinect for our Spotify controller for the following reasons:
- The user needs to be beside the computer, as the EyeX controller has a limited range of view.
- Leap Motion has a much smaller desktop footprint, which suits close range interaction.
- Leap Motion specialises in hand gestures, detecting each finger and thumb.
Having eye tracking and motion capture as separate pieces of hardware quickly clutters your workspace. Being separate peripherals also makes the setup poorly suited to a laptop: it requires a desk, and moving from one place to another is quite cumbersome.
Many computers, like the one we used for the prototype, have media keys for changing the volume, skipping or returning to a track, and playing or pausing. Although we did not measure it formally, those media keys appeared considerably faster to use than the gestures.
Nevertheless, that day was a very thought-provoking experience. The capabilities on show were very impressive, and it will be interesting to see how they develop and are leveraged in the future.