If the rumors are true, you will be able to use the Pixel 4 without touching it or talking to it.

Google is allegedly adding its proprietary radar chip to its next flagship phone: make one subtle hand motion and your volume goes up, or your droids will be free to pass any Imperial checkpoint.



(Image credit: Tom's Guide)

According to 9to5Google, Project Soli could debut in the Pixel 4. The technology, which received a special FCC permit for testing because it needs to broadcast an electromagnetic signal more powerful than current rules allow, lets users control virtual knobs, buttons, and sliders with subtle finger and hand movements, without having to touch the device at all.

MORE: New Pixel 4 Model Leaks Along with Facial ID Details

Ivan Poupyrev, Project Soli's founder, has been working on the technology at Google's Advanced Technology and Projects group since 2015. According to Poupyrev, the chip can sense the tiniest motion and works at just about any scale, from close proximity on small devices like smartwatches to across an entire room for larger gadgets like speakers or TVs. The first implementation, however, may be coming in the Pixel 4 and Pixel 4 XL, 9to5Google says.

The rumor has allegedly been corroborated by XDA, which found code in Android Q showing that Google is working on new gestures that require an "Aware" sensor, the consumer name for Project Soli's hand-gesture feature. XDA claims the sensor could debut in the Pixel 4, too.

Would Soli be useful?

It’s yet to be seen how practical this technology will be. Waving your hands like a Jedi to make things happen without having to touch anything is objectively awesome from a futuristic nerd point of view.

Google's Project Soli chip could enable gestures on the Pixel 4. (Image credit: Google)

But your phone is naturally in your hand most of the time, so why rub your thumb and index finger together to raise the volume or scroll when you can just swipe a finger on the screen? And when your phone is charging on the table, wouldn't it be easier to say "Hey Google" and tell the assistant to pump up the volume?

It just seems to add a layer of user-experience complexity on top of a set of touch conventions that most people already find complicated enough to learn in full.

So far, the story of hand-waving as a user interface has been plagued with failures. LG tried it with the LG G8, but it came off as gimmicky. Same with Samsung, which debuted "Air Gestures" years ago on the Galaxy S4.

But perhaps Google's implementation will be so refined and useful that it becomes a genuine selling point for consumers beyond the first five-minute "hey-mom-look-I'm-Obi-Wan-Kenobi" novelty factor. Or maybe it will be Google's version of Apple's 3D Touch, which also added a hidden layer of complexity most people never used, and which is reportedly being killed because most people don't give a damn about it.

Be sure to catch up on all of the latest Pixel 4 rumors in our hub.

