Flipping off your television may gain a whole new meaning thanks to a technology being developed by a team of researchers at the University of Washington. The team, led by Assistant Professor of Computer Science and Engineering Shyam Gollakota, developed a system dubbed WiSee, which uses radio waves from Wi-Fi to sense human body movements and detect command gestures from anywhere within a home or office. The results of the WiSee team's research have been submitted to the ACM's 19th International Conference on Mobile Computing and Networking (Mobicom '13).

Unlike other "machine vision" sensors such as Microsoft's Kinect, the system can sense gestures anywhere within a house or office environment using the Wi-Fi signals created by devices already in the environment. The user doesn't need to be within line of sight of the WiSee receiver—or even in the same room.

"The nice thing about Kinect is that it does motion detection without you having to carry anything around," said Shwetak Patel, an assistant professor of computer science and electrical engineering at UW and one of the lead researchers behind WiSee, in a phone interview with Ars. "We started out looking to see if there is a way to do what Kinect does in a larger area, and we started looking at RF." The ubiquity of Wi-Fi and the multiple antennas of newer MIMO Wi-Fi routers were a natural fit for the research.

WiSee "sees" gestures by detecting subtle changes in the radio signals bouncing off of and passing through human bodies as they move. Changing the body's position or moving a hand or foot causes a small Doppler frequency shift in Wi-Fi signals that a receiver can detect. When Wi-Fi signals hit a human body, "some is absorbed, and some is reflected," said Sidhant Gupta, another UW researcher contributing to WiSee. "The reflections cause a very subtle shift in frequency, in the tens of Hertz." Wi-Fi protocols are generally robust enough to handle those variations, but WiSee uses them to detect motion.
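The size of that shift follows from the standard Doppler relation for a signal reflecting off a moving body, roughly 2·v·f/c. A back-of-the-envelope sketch (the speeds below are illustrative assumptions, not figures from the WiSee paper) shows why the shifts land in the tens of Hertz:

```python
# Approximate Doppler shift of a Wi-Fi signal reflecting off a moving body.
# The factor of 2 accounts for the round trip: the moving body both receives
# and re-radiates the signal.
C = 3e8  # speed of light, m/s

def doppler_shift_hz(speed_m_s: float, carrier_hz: float) -> float:
    """Frequency shift of a reflection off a body moving at speed_m_s."""
    return 2 * speed_m_s * carrier_hz / C

# Illustrative body-movement speeds (assumed, not from the paper):
for v in (0.25, 0.5, 2.0):        # slow hand wave ... fast arm swing
    for f in (2.4e9, 5.0e9):      # common Wi-Fi carrier bands
        print(f"{v} m/s at {f/1e9:.1f} GHz -> {doppler_shift_hz(v, f):.1f} Hz")
```

At typical hand and arm speeds, the 2.4 GHz and 5 GHz bands yield shifts of roughly 4 to 70 Hz, tiny compared with the 20 MHz width of the Wi-Fi channel itself.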

Using algorithms that screen out ordinary variations in devices' broadband signals and correct for gaps in their transmissions, WiSee can separate the signatures of a series of movements from the rest of the broadband signal. "The Wi-Fi channel by itself is 20 MHz wide," Gupta said. "You can't just look at that whole spectrum and find subtle changes of 2 Hz." WiSee's detection algorithm breaks the broadband Wi-Fi spectrum into smaller chunks and processes them to uncover the shifts hidden within.
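The chunking matters because frequency resolution is limited by observation time: an FFT over a window of T seconds can only resolve shifts down to about 1/T Hz. A toy simulation (not WiSee's actual pipeline; the numbers and sample rate here are invented for illustration) shows how a few-Hz shift can be recovered once the signal has been reduced to a narrowband form:

```python
import numpy as np

# Toy illustration: recover a small Doppler offset from a narrowband signal.
# FFT frequency resolution is 1/T, so a 1-second window resolves 1 Hz bins.
fs = 1000.0          # sample rate of the narrowband signal, Hz (assumed)
T = 1.0              # observation window, s
t = np.arange(0, T, 1 / fs)

shift_hz = 8.0       # simulated Doppler shift from a moving body
signal = np.exp(2j * np.pi * shift_hz * t)  # idealized, noise-free reflection

spectrum = np.fft.fft(signal)
freqs = np.fft.fftfreq(len(signal), 1 / fs)
detected = freqs[np.argmax(np.abs(spectrum))]
print(f"detected shift: {detected:.1f} Hz")
```

Shorter windows would blur neighboring bins together, which is one reason subtle few-Hz shifts cannot simply be read off the raw 20 MHz channel.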

By using multiple antennas and a Wi-Fi receiver with multiple-input, multiple-output (MIMO) capability, WiSee can use an antenna to "lock on" to a specific user among a group of other people in a space.

"You can determine where a gesture is coming from," said Gupta, "whether it's in the kitchen or the dining room. That's one way of compartmentalizing who the gesture is coming from." And gestures in different locations of the house can mean different things, or be localized to where the user is—for example, by changing the appropriate lighting based on what room the user is in.

The UW team applied machine learning to the RF signatures of a series of body movements to help the system identify them as gestures associated with a command. WiSee identified a set of nine gestures with 94 percent accuracy in the team's experiments. Patel said the team stuck to nine gestures because of the limits of the simplified machine learning process used for this first effort; the next version of WiSee will use a hidden Markov model to look for sequences of gestures and allow a much larger vocabulary of commands. "The intent is to make this into an API where other researchers can build their own gesture vocabularies," Patel said.
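As a rough illustration of the classification idea (the feature vectors and template-matching scheme below are invented for this sketch; the team's actual pipeline is more involved), each gesture can be reduced to a short sequence of Doppler-shift features and matched against stored templates:

```python
import numpy as np

# Hypothetical gesture templates: each entry is a short sequence of
# Doppler-shift features (+1 = body moving toward the receiver,
# -1 = moving away, 0 = still). These are illustrative, not from the paper.
templates = {
    "push":  np.array([+1.0, +1.0, 0.0, 0.0]),
    "pull":  np.array([-1.0, -1.0, 0.0, 0.0]),
    "punch": np.array([+1.0, -1.0, +1.0, -1.0]),
}

def classify(observed: np.ndarray) -> str:
    """Return the name of the gesture template nearest to the observation."""
    return min(templates, key=lambda g: np.linalg.norm(observed - templates[g]))

# A noisy observation of a "push" still matches the right template:
print(classify(np.array([0.9, 1.1, 0.1, -0.1])))  # -> push
```

A hidden Markov model, which the team plans for the next version, would extend this by scoring whole sequences of such features probabilistically rather than matching a single fixed-length vector.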

Two of the research team members working on WiSee brought experience from two similar projects sponsored by Microsoft Research that also sensed body movements. One, Humantenna, detected gestures and motions through changes in electrical "noise" and other background radio-frequency radiation picked up by the human body as it moved. The other, SoundWave, used speakers and a microphone to detect a Doppler shift in reflected sound waves.

But both of those earlier projects required the user to be in a specific room or directly in front of the device they were interacting with. WiSee can "see" through walls, making it more practical for applications like home automation as well as the usual Minority Report-like interactions with media and computing devices. "We haven't tested the upper limits of gesture recognition," Gupta said. Patel added that the practical limits of gestures "depend on the entropy of the signal. It won't ever be as smooth and crisp as Kinect."

One of the next concerns to be addressed will be "how do you make it secure," said Gupta. "Someone walking by your house should not be able to turn your kettle on by waving his arms." Currently, users can be distinguished by a "startup" gesture that identifies them before allowing a command gesture. But the team is looking at how to "geofence" commands within a specific room or the limits of the house.

This story was updated with additional data from the University of Washington research team.