Chris Harrison doesn’t mean to parse words, but if you really want to know, Apple’s new 3D Touch feature isn’t technically 3D touch. “I thought it was sort of amusing Apple called its technology 3D Touch because I don’t really think of it as a 3D manipulation,” he says. “Pushing your finger into the screen—I would call that pressure sensing.”

Go ahead and roll your eyes, but Harrison has reason to call it a misnomer. As head of Carnegie Mellon’s Future Interfaces Group and a co-founder of interface tools startup Qeexo, the guy knows 3D touch when he sees it. In fact, Harrison himself (with his team at Qeexo) has developed what he believes to be true 3D touch, with a new product called FingerAngle.

FingerAngle might not have quite the same marketing ring as 3D Touch, but it’s an incredibly cool interaction feature that allows a touchscreen to register not just where you’re touching but how you’re touching—all through software. Harrison explains that when you touch most phone or smartwatch screens, your finger registers as a big black blob, which signifies the point of capacitance. “That’s how most touchscreens figure out the x and y of where you’re touching,” he says, referring to the horizontal and vertical coordinates of your finger placement on a two-dimensional surface.

Harrison and his team found that even when our fingers aren’t fully on the screen, the device still registers a weak connection. Our gadgets rightfully filter out these weak signals—after all, you don’t want to activate an app when your finger is an inch away from the screen—but it’s that seemingly hazy data that Harrison is interested in. “We said, hm, there’s something interesting there; let’s embrace the weak signal,” he says.

By measuring a finger’s angle relative to the screen’s surface, the phone is able to register the x- and y-rotation of a touch. This opens a whole new dimension for touchscreens, and creates a richer vocabulary of interactions we can use on our ever-shrinking screens.
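The article doesn’t describe Qeexo’s actual algorithm, but the principle can be sketched in a toy illustration. Assuming access to a raw capacitive sensor frame (which real touch controllers rarely expose to apps), a tilted finger shows up as an elongated blob: fitting a principal axis to the intensity-weighted blob recovers both the conventional x/y touch point and an in-plane angle. The function name and the elongation heuristic here are hypothetical, for illustration only:

```python
import numpy as np

def estimate_touch_angle(cap_image):
    """Toy estimate of a touch's in-plane angle from a capacitive frame.

    A finger tilted toward the screen produces an elongated blob whose
    long axis points along the finger; a fingertip pressed straight down
    produces a round one.
    """
    ys, xs = np.nonzero(cap_image > 0)
    w = cap_image[ys, xs].astype(float)
    # Intensity-weighted centroid: the conventional x/y touch point.
    cx, cy = np.average(xs, weights=w), np.average(ys, weights=w)
    # Weighted covariance of the blob; its leading eigenvector is the
    # blob's long axis.
    dx, dy = xs - cx, ys - cy
    cov = np.array([
        [np.average(dx * dx, weights=w), np.average(dx * dy, weights=w)],
        [np.average(dx * dy, weights=w), np.average(dy * dy, weights=w)],
    ])
    evals, evecs = np.linalg.eigh(cov)
    major = evecs[:, np.argmax(evals)]
    # Angle of the long axis in the screen plane, folded to [0, 180).
    yaw = np.degrees(np.arctan2(major[1], major[0])) % 180
    # Elongation (major/minor axis ratio) grows as the finger tilts —
    # a crude proxy for how far from vertical the finger is.
    elongation = float(np.sqrt(evals.max() / max(evals.min(), 1e-9)))
    return (cx, cy), yaw, elongation
```

A round blob yields an elongation near 1 (angle undefined in practice); a streak at 45 degrees yields a yaw near 45 with high elongation. A shipping implementation would also need the filtering Harrison alludes to, since the same weak edge signals carry noise that ordinary touch stacks deliberately throw away.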

Imagine this: Instead of dragging a tiny scrubber across your Moto 360 to increase volume, all you have to do is place the tip of your finger on the screen and twist clockwise. Or maybe you want to zoom in on Google Maps; this same twisting motion could replace pinching or tapping. A particularly compelling example shows how a finger can orient a spaceship in a smartwatch game along the x-, y-, and z-axes.

Harrison says building these new kinds of interactions into the lexicon of touchscreen gestures means starting slow. “You sort of have to ease into it,” he says, explaining that it would make sense to introduce it as a simple, replicable behavior—like twisting to increase or decrease volume. He imagines that, eventually, the technology behind FingerAngle will be layered on top of interactions like 3D Touch and multitouch, ultimately creating an expressive ecosystem of gestures we can use with these touchscreen devices. That means your smartwatch won’t just be able to tell that you’re touching it at a 45-degree angle, but also how hard you’re touching it at that 45-degree angle. “All of these technologies are complementary,” he says. What will be possible when our smartphones come loaded with this sort of software? That’s up to the developers, Harrison says. “We’re just giving developers the interesting building blocks to unlock new interactive experiences.”