A couple of weeks ago, we attended a fascinating NUX event called 'Making More Senses of UX'. The workshop, run by Alastair Somerville of SensoryUX.com, looked at the opportunities and threats of designing interfaces for different senses.
Alastair has built a career in sensory cognition and in creating accessible interfaces for people with disabilities. With the Apple Watch about to push wearables into the mainstream, and smart environment technology like Nest disrupting traditional markets, Alastair's background gives him a unique and fascinating view of an area of user experience design that is about to explode -- one that most UX designers currently have little experience in themselves.
The key takeaway from the talk was that wearables and smart technology aren't just about smaller screens.
We are already seeing the limitations of visual interfaces, as shown in the image below:
We are also seeing interesting uses of other types of interface, such as sound and touch -- including the ability to draw on the Apple Watch:
Challenges and opportunities
Wearables bring opportunities but also challenges to us as interface designers.
Wearable technology is much more personal and so much more emotional. Whether we like it or not, these interfaces will trigger emotions, and sometimes in ways we don't expect -- just look at the negative reaction that Google Glass ultimately got.
We are so accustomed to designing for sight that we don't yet understand the challenges of different sensory interfaces. To demonstrate this, Alastair put us to the test, blindfolding us and then getting us to order drinks and communicate that someone had spilt our pint through touch alone -- which, it turns out, is a very, very tricky task.
In the same way we learned how mobile and touch were different from traditional mouse interfaces, the next step is to test and learn how customers emotionally respond to wearables, and find creative and successful ways to bring sensory interfaces to life. Exciting times ahead!