When Steve Jobs introduced the first iPhone in 2007, its most spellbinding feature—the one that set it apart from BlackBerrys and Nokias, and the one that would forever change the way we interacted with our devices—was the touchscreen. All it took was a light press of the fingertip and an icon on the screen would spring open, as if tapped with a magic wand.

Touchscreen technology has since become inseparable from nearly every “smart” interaction. We have interactive screens on our laptops and televisions and watches and refrigerators; children, so immersed in a touchscreen world, have been known to press on inanimate objects expecting them to respond. The touchscreen has become ubiquitous; the touchscreen has become passé.

So Google, in building its next phone, wants to introduce the next big thing: a way to control our screens like an orchestra conductor brandishing an invisible baton.

When Google's next flagship smartphone, the Pixel 4, arrives this fall, it will respond to a series of gesture interactions—a pinch of the fingers, or a wave of the hand—without the user ever needing to touch the screen. Google calls these controls, collectively, “Motion Sense.” A teaser video shows a woman unlocking her new Pixel with a glance, then waving her hand to cycle through a series of songs playing on her phone. The video is only a few seconds long, but it calls to mind the way Jobs described the original iPhone: “It works like magic.”

When the Pixel 4 comes out, it will have only a few gesture controls: snoozing alarms, skipping songs, silencing phone calls. But by the time Pixel owners get used to pinching their fingers together and rotating their thumbs on invisible dials, a seismic shift will already be underway. Gesture technology will further turn our devices into extensions of ourselves: we move our fingers, and the feedback shows up on a screen. That type of interaction won’t end with phones. One day, we might control every screen with a flick of the wrist.

Google’s gesture technology is merely a glimpse of a touchless future. Just as the iPhone taught millions of people to interact with their world by tapping and swiping, the Pixel may train us in a new kind of interaction, changing how we expect to engage with all of our devices going forward.

Do Not Touch

Ivan Poupyrev, the technical projects lead at Google’s Advanced Technology and Projects division, has been working toward that future for years. Five years ago, Poupyrev founded Project Soli, a skunkworks lab at Google dedicated to building better gesture controls using miniaturized radar technology. The team developed a sensor-studded chip smaller than a nickel. The chip emits electromagnetic waves; when those waves bounce off a moving hand, the reflected signal is captured and translated into data about the motion. Google says the technology works better than 3D cameras at capturing fine motions, like pinching together two fingers to “press” a button on the screen.

In January, the Federal Communications Commission gave the technology its nod of approval, noting that the Soli chip “will serve the public interest by providing for innovative device control features using touchless hand gesture technology.” The Pixel 4, expected to land in October, will be its commercial debut.

Google is far from the only company hoping to crack gesture controls. Touchless interaction has liberated gamers from joysticks, offered more flexibility in virtual reality, and allowed people to beckon their drones, no remote control needed. Some carmakers, like BMW, have included basic gesture controls that let drivers adjust the volume or accept phone calls by waving their hands. Even Apple has been inching toward this: to silence an alert on your Apple Watch, you can simply place your palm on the display.