One fundamental step to unlocking the full transformational potential of smart mobile technology is to significantly improve the usability of multi-function devices. As features have been added to mobile phones, each new feature has tended to detract from the overall ease of use of the device:

It’s harder for users to locate the exact function that they wish to use at any given time;

It’s harder for users to understand the full set of functions that are available for them to use.

This has led to feelings of frustration and disenchantment. Devices are full of powerful functionality that is under-used and under-appreciated.

Recognising this problem, companies throughout the mobile industry are exploring approaches to improving the usability of multi-function devices.

One common idea is to try to arrange all the functionality into a clear logical hierarchy. But as the number of available functions grows and grows, the result is something that is harder and harder to use, no matter how thoughtfully the functions are arranged.

A second common idea is to allow users to select the applications that they personally use the most often, and to put shortcuts to these applications onto the homescreen (start screen) of the phone. That’s a step forwards, but there are drawbacks with this as well:

The functionality that users want to access is more fine-grained than simply picking an application. Instead, a user will often have a specific task in mind, such as “phone Mum” or “email Susie” or “check what movies are showing this evening”;

The functionality that users want to access the most often varies depending on the context the user is in – for example, the time of day, or the user’s location;

The UI for creating these shortcuts can be time-consuming or intimidating.

In this context, I’ve recently been looking at some technology developed by the startup company Intuitive User Interfaces. The founders of Intuitive previously held key roles with the company ART (Advanced Recognition Technologies) which was subsequently acquired by Nuance Communications.

Intuitive highlight the following vision:

Imagine a phone that knows what you need, when you need it, one touch away.

Briefly, the technology works as follows:

An underlying engine observes which tasks the user performs frequently, and in which circumstances;

These tasks are made available to the user via a simple top-level one-touch selection screen;

The set of tasks on this screen varies depending on the user’s context.
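The three steps above can be sketched in a few lines of code. This is purely an illustrative sketch of the general idea, not Intuitive's actual engine: it records (context, task) pairs as the user acts, and fills the one-touch screen with the tasks most frequently performed in the current context.

```python
from collections import Counter, defaultdict

class TaskSuggester:
    """Illustrative sketch of a context-aware task suggester.

    Observes which tasks a user performs in which context (here a
    simple (time-of-day, location) pair) and surfaces the most
    frequent tasks for the current context on a one-touch screen.
    """

    def __init__(self, screen_size=6):
        self.screen_size = screen_size       # slots on the one-touch screen
        self.counts = defaultdict(Counter)   # context -> task frequencies

    def observe(self, context, task):
        """Record that `task` was performed in `context`."""
        self.counts[context][task] += 1

    def suggest(self, context):
        """Return the most frequent tasks for this context, best first."""
        top = self.counts[context].most_common(self.screen_size)
        return [task for task, _ in top]

# Usage: simulate a few days of one user's habits
s = TaskSuggester(screen_size=2)
s.observe(("morning", "home"), "phone Mum")
s.observe(("morning", "home"), "phone Mum")
s.observe(("morning", "home"), "check email")
s.observe(("evening", "home"), "check movie listings")

print(s.suggest(("morning", "home")))   # morning suggestions, most-used first
```

A real engine would need far richer context signals and some way of generalising across similar contexts, but even this toy version shows why the suggestion screen changes as the user moves through the day.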

Intuitive will be showing their system, running on an Android phone, at the Mobile World Congress in Barcelona next week. Ports to other platforms are in the works.

Of course, software that tries to anticipate a user’s actions has sometimes proved annoying rather than helpful. Microsoft’s “paperclip” Office Assistant became particularly notorious:

It was included in versions of Microsoft Office from 1997 to 2003 – with the intention of providing advice to users when it deduced that they were trying to carry out a particular task;

It was widely criticised for being intrusive and unhelpful;

It was excluded from later versions;

Smithsonian magazine in 2007 called this paperclip agent “one of the worst software design blunders in the annals of computing”.

Whether the context-dependent suggestions provided to the user are seen as helpful or annoying comes down to the quality of the underlying engine. Intuitive describe the engine in their product as “using sophisticated machine learning algorithms” in order to create “a statistically driven model”. Users’ reactions to suggestions also depend on the UI of the suggestion system.
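To make the phrase “statistically driven model” concrete, here is one hypothetical way such a model could work: a naive Bayes classifier with Laplace smoothing, which scores each task by how likely it is given the observed context features. To be clear, this is my own assumed sketch, not a description of Intuitive's algorithm; the smoothing is what keeps rarely seen contexts from producing wild suggestions.

```python
from collections import Counter, defaultdict
import math

class NaiveBayesTaskModel:
    """Hypothetical statistically driven task model: naive Bayes with
    Laplace smoothing. An illustrative assumption, not Intuitive's engine."""

    def __init__(self):
        self.task_counts = Counter()                 # how often each task occurs
        self.feature_counts = defaultdict(Counter)   # task -> context-feature counts
        self.total = 0                               # total observations

    def observe(self, features, task):
        """Record one performance of `task` under the given context features."""
        self.total += 1
        self.task_counts[task] += 1
        for f in features:
            self.feature_counts[task][f] += 1

    def score(self, features, task):
        """Smoothed log-probability score of `task` given the context features."""
        n = self.task_counts[task]
        # Prior: P(task), with add-one smoothing over known tasks
        s = math.log((n + 1) / (self.total + len(self.task_counts)))
        # Likelihood: P(feature | task) for each observed context feature
        for f in features:
            s += math.log((self.feature_counts[task][f] + 1) / (n + 2))
        return s

    def rank(self, features):
        """All known tasks, ordered from most to least likely in this context."""
        return sorted(self.task_counts,
                      key=lambda t: self.score(features, t), reverse=True)

# Usage
m = NaiveBayesTaskModel()
m.observe(["morning"], "phone Mum")
m.observe(["morning"], "phone Mum")
m.observe(["evening"], "check movie listings")
print(m.rank(["morning"])[0])
```

Because the scores are just counts, the model updates instantly after every user action, which matters on a phone where there is no server round-trip and the user's habits drift over time.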

Personally, I’m sufficiently interested in this technology to have joined Intuitive’s Advisory Board. If anyone would like to explore this technology further, in meetings at Barcelona, please get in touch!

For other news about Intuitive User Interfaces, please see their website.