The biggest problem with smartwatches, beyond the fact no one really knows what to do with them, is their small screens. Scrolling through text or swiping a notification is particularly frustrating when your finger obscures whatever it is you're trying to see. This is why you can't tap out a text message, let alone play games.

But some really smart designers in Carnegie Mellon's Human-Computer Interaction Group found a way to make your arm part of the user interface. Over the past few years, they've come up with several novel ways of thinking beyond the edges of the typical smartwatch screen. One let people tilt and twist the bezel to control the watch like a joystick. Skin Buttons projected tappable icons onto the wearer's skin. Now there's SkinTrack, a project that explores how your arm might function as a touchscreen for wearables.

In the project video, a finger swipes and pokes at skin like it’s a touchscreen. As the finger navigates a hairy forearm, a cursor reacts to the movement on the smartwatch screen. There’s no projection and little lag between the finger's movement and movement on the screen. Other projects that explore similar interactions use a camera to track finger motion. Those systems track continuously, says Gierad Laput, who co-wrote the paper describing SkinTrack, "but then, you have a Kinect on your shoulder."

Carnegie Mellon’s method works differently. Instead of a camera, the researchers developed a ring that sends a high-frequency alternating-current signal into your finger. When your finger touches or hovers above your arm, that signal propagates outward along your skin to a wristband embedded with electrodes. By comparing when the oscillating signal arrives at different pairs of electrodes, a measurement known as phase difference, SkinTrack can determine the position of your finger with impressive accuracy.
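The core geometry is simple enough to sketch in a few lines. The toy model below is not CMU's actual algorithm, and the signal frequency and propagation speed are illustrative guesses; it only shows how a phase difference between two electrodes a known distance apart maps to a finger's offset along the line between them.

```python
import math

# Illustrative constants, NOT measured values from the SkinTrack paper.
FREQ_HZ = 80e6       # hypothetical high-frequency AC signal from the ring
PROP_SPEED = 1.5e8   # assumed propagation speed along the skin, m/s
WAVELENGTH = PROP_SPEED / FREQ_HZ

def path_difference(phase_diff_rad: float) -> float:
    """Convert a measured phase difference into a path-length difference.

    A full cycle (2*pi radians) of phase corresponds to one wavelength of
    extra travel distance.
    """
    return (phase_diff_rad / (2 * math.pi)) * WAVELENGTH

def finger_position(phase_diff_rad: float, electrode_gap: float) -> float:
    """Estimate the finger's offset from the midpoint between two electrodes.

    1-D simplification: the finger is assumed to lie on the segment between
    the electrodes, so d1 + d2 = electrode_gap and d2 - d1 = path difference.
    Positive output means the finger is closer to the first electrode.
    """
    delta = path_difference(phase_diff_rad)
    d1 = (electrode_gap - delta) / 2   # distance to the first electrode
    return electrode_gap / 2 - d1      # offset from the midpoint
```

With two such electrode pairs on perpendicular axes, the same arithmetic yields a 2-D coordinate on the arm; a zero phase difference puts the finger exactly between the electrodes.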

The team offered some fairly straightforward applications for SkinTrack. In one example, the user stretches the slingshot in Angry Birds across his forearm. Another shows the wearer using his arm to create app shortcuts. Swiping down scrolls through music, while swiping right selects. It's not perfect—without a projection to show you exactly what you're touching, there's an element of guesswork involved in using your arm as an input device. Still, the system can tell whether you're touching or hovering above your arm with 99 percent accuracy, even when you're wearing a shirt.

Laput says SkinTrack is a natural extension of his previous research, most of which explores using the area beyond the touchscreen (i.e. your body) to improve the user experience. For Laput, the body is just another input and sensing platform. “Looking at the bigger picture, you can make your arm into an actual sensor,” he says. “If you imbue your arm with computation, you’re basically augmenting the human experience.” Laput’s vision for this subtle variety of cyborg-ism revolves around two ideas: using the arm as an input device, and using the body to augment other activities. The latter is best illustrated by CMU’s EM-Sense project, which uses the body’s natural conductivity to sense what a hand is touching. This allows your devices to surface context-specific applications without you having to pull anything up.

The technology isn't quite ready for consumers. Right now, SkinTrack must be calibrated to each user because electricity passes through each body differently. Laput says the system will work better once a projection can give visual feedback, but that’s still years away. The researchers are also studying how the electrical signals might affect devices like pacemakers.

Limitations aside, the lab’s projects provide a compelling vision of how the human body might be used as an interface. Stroking your arm to open Spotify might sound weird today, but the day is coming when it’ll seem as natural as tapping a screen.