Watching what you’re watching (Image: Caroline Purser/Getty)

Bored of using a mouse? Soon you’ll be able to change stuff on your computer screen – and then move it directly onto your smartphone or tablet – with nothing more than a glance.

A system called EyeDrop uses a head-mounted eye tracker that simultaneously records your field of view, so it knows where you are looking on the screen. Gazing at an object – a photo, say – and then pressing a key selects that object. It can then be moved from the screen to a tablet or smartphone just by glancing at the second device, as long as the two are connected wirelessly.
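The interaction loop described above can be sketched roughly as follows. This is a hypothetical illustration, not the researchers' actual code: the class name, method names, and object identifiers are all invented for the example.

```python
# Illustrative sketch of the EyeDrop flow: gaze picks out an object,
# a key press confirms the selection, and a glance at a wirelessly
# paired device completes the transfer.

class EyeDropSession:
    def __init__(self, screen_objects, paired_devices):
        self.screen_objects = screen_objects    # objects visible on screen
        self.paired_devices = paired_devices    # wirelessly connected targets
        self.selected = None

    def on_key_press(self, gaze_target):
        """Select whatever the eye tracker reports the user is fixating."""
        if gaze_target in self.screen_objects:
            self.selected = gaze_target
        return self.selected

    def on_glance_at_device(self, device):
        """Transfer the selected object when gaze lands on a paired device."""
        if self.selected and device in self.paired_devices:
            transferred, self.selected = self.selected, None
            return (device, transferred)
        return None
```

For example, selecting a photo and then glancing at a paired tablet would hand the photo over: `EyeDropSession({"photo"}, {"tablet"})`, then `on_key_press("photo")`, then `on_glance_at_device("tablet")` returning `("tablet", "photo")`.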

“The beauty of using gaze to support this is that our eyes naturally focus on content that we want to acquire,” says Jayson Turner, who developed the system with colleagues at Lancaster University, UK.


Turner believes EyeDrop would be useful to transfer an interactive map or contact information from a public display to your smartphone or for sharing photos.

Midas touch

A button needs to be used to select the object you are looking at; otherwise you end up with the “Midas touch” effect, whereby everything you look at gets selected, says Turner. “Imagine if your mouse clicked on everything it pointed at,” he says.
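The contrast Turner describes can be made concrete with a toy sketch (invented for illustration, not the researchers' code): a gaze-only policy selects every object the eyes land on, while pairing gaze with a deliberate button press selects only the intended one.

```python
# Toy illustration of the "Midas touch" problem: without an explicit
# trigger, every fixation becomes a selection.

def gaze_only(fixations):
    # Naive policy: every object the eyes land on is selected.
    return list(fixations)

def gaze_plus_button(fixations, button_pressed_at):
    # EyeDrop-style policy: a fixation counts only when it coincides
    # with a deliberate button press (indices of pressed moments).
    return [obj for i, obj in enumerate(fixations) if i in button_pressed_at]

fixations = ["clock", "icon", "photo", "menu"]
gaze_only(fixations)               # selects all four objects
gaze_plus_button(fixations, {2})   # selects only "photo"
```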

Christian Holz, a researcher in human-computer interaction at Yahoo Labs in Sunnyvale, California, says the system is a nice take on getting round this fundamental problem of using gaze-tracking to interact. “EyeDrop solves this in a slick way by combining it with input on the touch devices we carry with us most of the time anyway and using touch input as a clutching mechanism,” he says. “This now allows users to seamlessly interact across devices far and close in a very natural manner.”

While current eye-trackers are rather bulky, mainstream consumer devices are not too far away. Swedish firm Tobii is developing gaze-tracking technology that can be installed in laptops and tablets and is expected to be available to buy next year. And the Google Glass headset is expected to include eye-tracking in a future iteration.

Turner says he has also looked at how content can be cut and pasted, or dragged and dropped, using a mix of gaze and taps on a touchscreen. The system was presented at the Conference on Mobile and Ubiquitous Multimedia in Luleå, Sweden, last week.