Engineering students at Taiwan's National Chung Hsing University have demonstrated a clever use of the motion sensors in Apple Watch to interpret hand gestures, enabling them to remotely control real-world devices in a manner reminiscent of the Force powers depicted in Star Wars.

The Force Awakens: use the Dong

A group of five researchers, including civil engineering PhD student Mark Ven and university professor Yang Ming-der, has been working at PVD+ since 2013, developing software they call Dong coding to interpret hand gestures, notes a report by Reuters.

Simply wearing an Apple Watch provides enough motion sensing, thanks to the device's gyroscope and accelerometer, to allow the researchers to pilot a Parrot AR Drone 3.0 using hand movements, or alternatively to turn on Philips Hue HomeKit lamps with a clap and then activate a given color by tracing the outline of a character (such as drawing an "R" to turn the lamp red).
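The clap gesture described above can be approximated from raw accelerometer samples: a sharp clap registers as a brief spike in acceleration magnitude that stands out against the steady ~1 g pull of gravity. A minimal illustrative sketch in Python, not PVD+'s actual algorithm; the threshold and sample data are hypothetical:

```python
import math

def detect_clap(samples, threshold=2.5):
    """Return True if any sample's acceleration magnitude (in g)
    exceeds the threshold -- a crude proxy for a clap's sharp jolt."""
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax**2 + ay**2 + az**2)
        if magnitude > threshold:
            return True
    return False

# Simulated readings: mostly near 1 g (gravity alone), one sharp spike.
still = [(0.0, 0.0, 1.0)] * 10
clap = still + [(1.8, 2.1, 1.5)] + still

print(detect_clap(still))  # quiet wrist: False
print(detect_clap(clap))   # spike crosses threshold: True
```

A production recognizer would do considerably more, such as filtering noise, requiring the spike to be short-lived, and distinguishing a clap from, say, bumping a table, but the same raw sensor stream is the starting point.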

Ven demonstrated using PVD+ software to fly a drone in Taichung City (above), where he was interviewed by Reuters. Ven also demonstrated using Apple Watch to remotely control a Sphero robotic toy and control HomeKit-capable devices.

"Previously we've needed complicated controls to fly drones, but now we can use a wearable device, and through human behavior and gestures directly interact with them - using a hand to control and fly drones directly," he said.

Google's YouTube appends a dreadfully annoying 45-second ad to the one-minute, 47-second video clip published by Reuters, but you can also watch the video ad-free, albeit using Adobe Flash (above).

PVD+ is seeking to patent and commercialize the technology, an interesting new application of wearables. Apple entered that emerging market over the past year with Apple Watch, which it rapidly turned into a $7 billion business across its first nine months on the market.

Swiss Watch industry annual sales ~$25 billion. First three quarters of Apple Watch sales ~7 billion.. — Ben Bajarin (@BenBajarin) December 31, 2015

Apple in motion

Apple has been working on enabling technologies related to hardware motion sensitivity since the iPhone first appeared in 2007 with a proximity sensor, a 3-axis accelerometer (for tilt, motion and bump/shake detection) and WiFi location features.

The company subsequently gave iPhone 3G full Global Positioning System support, added a digital compass to iPhone 3GS, and then complemented the accelerometer with a three-axis gyroscope in iPhone 4, enabling six-axis motion sensing capable of determining pitch, yaw and roll (twisting movements).
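Pitch and roll can actually be estimated from the accelerometer alone when the device is held still, by measuring how gravity projects onto each axis; yaw (rotation about the gravity vector) is what the gyroscope adds, since gravity looks the same from every heading. A rough sketch using one common convention (not Apple's Core Motion implementation):

```python
import math

def pitch_roll(ax, ay, az):
    """Estimate pitch and roll in degrees from a static accelerometer
    reading (in g), where (0, 0, 1) means the device lies flat, face up."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(pitch_roll(0.0, 0.0, 1.0))   # flat on a table: (0.0, 0.0)
print(pitch_roll(-1.0, 0.0, 0.0))  # tipped up on edge: pitch 90.0
```

Because the accelerometer cannot tell a slow tilt from a sideways push, real devices fuse gyroscope and accelerometer data to keep orientation estimates both fast and drift-free.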

In addition to making motion sensors available to developers, Apple also began introducing novel motion gesture applications of its own, including shake-to-shuffle playback on iPod nano and shake-to-undo in iOS, as well as more sophisticated health- and sports-related motion tracking managed by HealthKit.

Apple has also incorporated novel support for proximity and micro-location sensing through Bluetooth- and WiFi-based geofencing features, including Continuity Handoff and retail-oriented iBeacons.

Apple has likewise built sophisticated low-power logic for managing motion-related data into the M-series motion coprocessors used in iPhones and iPads, along with software frameworks that make background tracking of motion sensor data easy for developers to access and use once the user grants their apps permission.