University of Washington

Researchers have created the ability to interact with our devices not just through touchscreens, but through gestures in the space around them. Some smartphones are starting to incorporate camera-based 3D gesture sensing, for example, but cameras consume significant power and require a clear view of the user’s hands.

UW engineers have developed a new form of low-power wireless sensing technology that could soon contribute to this growing field by letting users “train” their smartphones to recognise and respond to specific hand gestures near the phone.

The technology – developed in the labs of Matt Reynolds and Shwetak Patel, associate professors – uses the phone’s wireless transmissions to sense nearby gestures, so it works when a device is out of sight in a pocket or bag and could easily be built into future smartphones and tablets.

“Today’s phones have many different sensors built in, ranging from cameras to accelerometers and gyroscopes that can track the motion of the phone itself,” Reynolds said. “We have developed a new type of sensor that uses the reflection of the phone’s own wireless transmissions to sense nearby gestures, enabling users to interact with their phones even when they are not holding the phone, looking at the display or touching the screen.”

Team members will present their project, called SideSwipe, at the Association for Computing Machinery’s Symposium on User Interface Software and Technology in Honolulu.

When a person makes a call or an app exchanges data with the Internet, a phone transmits radio signals on a 2G, 3G or 4G cellular network to communicate with a cellular base station. When a user’s hand moves through the space near the phone, the user’s body reflects some of the transmitted signal back toward the phone.

The new system uses multiple small antennas to capture the changes in the reflected signal and classifies those changes to detect the type of gesture performed. In this way, tapping, hovering and sliding gestures could correspond to various commands for the phone, such as silencing a ring, changing which song is playing or muting the speakerphone.
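The pipeline described above – capturing per-antenna changes in the reflected signal and classifying them into gesture types – could be sketched roughly as follows. This is an illustrative assumption, not the team’s actual implementation: the features and the nearest-neighbour matching are stand-ins for whatever signal processing SideSwipe really uses.

```python
import numpy as np

def extract_features(samples):
    """Reduce one gesture recording, an array shaped (time samples x antennas)
    of reflected-signal amplitudes, to a compact feature vector: per-antenna
    mean and variance of the frame-to-frame changes, plus peak-to-peak swing."""
    diffs = np.diff(samples, axis=0)  # frame-to-frame amplitude changes
    return np.concatenate([
        diffs.mean(axis=0),
        diffs.var(axis=0),
        np.ptp(samples, axis=0),
    ])

def classify(recording, templates):
    """Nearest-neighbour match of a recording's features against per-gesture
    template vectors (e.g. learned during the user's calibration phase)."""
    feats = extract_features(recording)
    return min(templates, key=lambda g: np.linalg.norm(feats - templates[g]))
```

A large amplitude swing across the antennas would then be matched to a template such as a tap, while a faint swing would match a distant hover – the classifier simply picks whichever stored gesture template is closest in feature space.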
Because the phone’s wireless transmissions pass easily through the fabric of clothing or a handbag, the system works even when the phone is stowed away.

“This approach allows us to make the entire space around the phone an interaction space, going beyond a typical touchscreen interface,” Patel said.

A group of 10 study participants tested the technology by performing 14 different hand gestures – including hovering, sliding and tapping – in various positions around a phone. Each time, the phone was calibrated by learning the user’s hand movements, then trained itself to respond. The team found it recognised gestures with about 87 per cent accuracy.

“SideSwipe’s directional antenna approach makes interaction with the phone completely self-contained, because you’re not depending on anything in the environment other than the phone’s own transmissions,” Reynolds said. “Because the SideSwipe sensor is based only on low-power receivers and relatively simple signal processing compared with video, we expect SideSwipe would have a minimal impact on battery life.”
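The calibration step – the phone “learning” a user’s hand movements before it can respond to them – might amount to nothing more than averaging several recorded examples of each gesture into a per-user template. The sketch below is a self-contained assumption of how that could look; the peak-to-peak feature and the gesture labels are hypothetical, not drawn from the paper.

```python
import numpy as np

def ptp_features(recording):
    """Per-antenna peak-to-peak amplitude swing of one gesture recording,
    an array shaped (time samples x antennas)."""
    return np.ptp(recording, axis=0)

def calibrate(examples_by_gesture):
    """Average the features of several example recordings of each gesture
    into one template vector per gesture, for later nearest-match lookup."""
    return {
        gesture: np.mean([ptp_features(r) for r in recordings], axis=0)
        for gesture, recordings in examples_by_gesture.items()
    }
```

Averaging over repeated examples smooths out the natural variation in how a user performs the same gesture, which is presumably why each participant in the study re-calibrated the phone with their own hand movements before testing.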