https://github.com/liebeskind/leapdrone

What I’m Doing

I start my Node.js server, plug my Leap Motion into the computer’s USB port, and hold my hand in the air. I make a gesture that looks like left-clicking a mouse, though I’m not actually touching anything. Suddenly, with a whoosh, the rotors on my drone buzz to life. This particular model, the AR Drone 2.0, is a quadcopter: four rotors arranged in a square, which gives it enhanced stability. The drone lifts into the air and hovers there, waiting for me to issue a command. I move my hand forward and the drone edges away from me.

The further away from center I move my hand, the faster the drone moves until I realize, almost too late, that it is quickly approaching a tree. I drive my hand down and to the right and the drone dodges to the right and under the looming branches, narrowly escaping disaster. I point with my finger and make a circle in the air counter-clockwise and the drone rotates so that it is facing me. Moving my hand forward again, the drone accelerates towards me. Glancing back at my computer, I can see myself getting closer in the drone’s video camera, which is streaming in my browser. This is only the beginning of my drone journey.

Why Node Opens the Door for Programming Robotics with JavaScript

A few years ago, it was virtually impossible to control robots with JavaScript alone: the language’s interpreters were so slow that any application requiring a reasonable response time simply wouldn’t work.

With Google’s V8 JavaScript engine, JavaScript’s day has arrived. V8 is written in C++ and compiles JavaScript directly to native machine code, so it is very fast. Recent benchmarks put it ahead of PHP, Ruby, and Python – second only to C itself. Despite being initially designed to run in Google’s Chrome browser, V8 has since been adopted by several other projects, including Node.

Node makes web applications responsive by moving JavaScript to the server and handling every request on a single, non-blocking event loop, so multiple data streams can be queried simultaneously. This non-blocking I/O means that Node can start processing a second and third command without waiting for the first to finish. Node also leans heavily on callbacks (functions that run when data arrives, or fails to) to chain instructions, so you can build a series of commands in which each one runs only after the previous command completes.
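As a self-contained sketch of that chaining pattern (not the project’s actual code): `sendCommand` below stands in for a real drone instruction. A real client library would invoke the callback once the drone acknowledges the command; here it fires immediately so the sketch runs anywhere.

```javascript
var executed = [];

// Stand-in for a real drone instruction. Node convention: the callback's
// first argument is an error, or null on success.
function sendCommand(name, done) {
  executed.push(name);
  done(null);
}

// Run an array of commands strictly in order: each command starts only
// after the previous one's callback reports success.
function runSequence(commands, done) {
  if (commands.length === 0) return done(null);
  sendCommand(commands[0], function (err) {
    if (err) return done(err); // stop the chain on failure
    runSequence(commands.slice(1), done);
  });
}

runSequence(['takeoff', 'forward', 'land'], function (err) {
  console.log(executed.join(' -> ')); // takeoff -> forward -> land
});
```

The same shape works whether the callbacks fire instantly or after a network round-trip to the drone, which is what makes the pattern useful for sequencing flight maneuvers.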

The implication for robotics is that multiple commands can either be processed simultaneously or chained to run in a particular sequence. In languages like Ruby and Python, I/O is synchronous by default: commands block one another, which can end in disaster if a single command gets stuck and takes a long time to process.

Why I Used Leap Motion and the Future of Controllers

Leap Motion is the first viable product in a paradigm shift that is changing the way we interact with technology. For those who don’t know, Leap Motion is a small sensor that plugs into a computer’s USB port. It can detect and track each of your hands and every finger’s movement in the half-dome-shaped space above it. There are several applications, including playing video games, computer interfacing and, now, flying drones. I used the JavaScript library leap.js to translate hand coordinates into drone commands, published those instructions to my Node server using Faye (a simple publish-subscribe messaging system), and from there issued movement commands to the drone.
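To illustrate the publish-subscribe leg of that pipeline without pulling in Faye itself, here is a self-contained miniature of the pattern. In the real project a `faye.Client` plays the role of this in-process `channels` object, and the channel name `/drone/move` is made up for illustration.

```javascript
// A tiny in-process publish-subscribe system, mimicking the pattern
// Faye provides over HTTP.
var channels = {};

function subscribe(channel, handler) {
  (channels[channel] = channels[channel] || []).push(handler);
}

function publish(channel, message) {
  (channels[channel] || []).forEach(function (h) { h(message); });
}

// The drone side subscribes to movement messages...
var received = [];
subscribe('/drone/move', function (msg) {
  received.push(msg); // a real handler would call the drone client here
});

// ...and the Leap Motion side publishes hand data as it arrives.
publish('/drone/move', { direction: 'forward', speed: 0.4 });
```

Decoupling the two sides through a channel means the gesture-translation code never needs a direct reference to the drone connection, so either side can be swapped out independently.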

At this point, Leap Motion’s software needs an upgrade: it isn’t very effective at detecting finger movements if, for instance, you turn your hand on its side. As a result, I used hand movements for everything except takeoff / landing (a pointer-finger gesture, as though left-clicking a mouse) and rotation (a circle drawn in the air with the pointer finger). I have heard that Leap Motion is upgrading its firmware in the next few weeks, and one of the features I’m excited about is much more precise finger tracking. In the meantime, I am using the hand position Leap Motion reports along its X, Y, and Z axes to control left/right, up/down, and forward/back movement.
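A minimal sketch of that axis mapping (the dead zone, range, and 200 mm resting height below are my own illustrative numbers, not the project’s): leap.js reports a hand’s `palmPosition` as `[x, y, z]` in millimeters relative to the device, and each axis can be mapped to a speed between -1 and 1, with a dead zone around center so the drone hovers while the hand is still.

```javascript
// Assumed tuning values: 30 mm of slack around center so the drone can
// hover, full speed at 150 mm from center.
var DEAD_ZONE = 30;
var MAX_RANGE = 150;

// Convert one axis reading (mm from center) into a speed in [-1, 1].
function axisToSpeed(mm) {
  var magnitude = Math.abs(mm);
  if (magnitude < DEAD_ZONE) return 0; // hover when the hand is near center
  var speed = (magnitude - DEAD_ZONE) / (MAX_RANGE - DEAD_ZONE);
  return (mm < 0 ? -1 : 1) * Math.min(speed, 1); // farther from center = faster
}

// Map a leap.js palmPosition [x, y, z] (millimeters) to named speeds.
function handToCommands(palm) {
  return {
    rightward: axisToSpeed(palm[0]),     // x axis: left/right
    upward: axisToSpeed(palm[1] - 200),  // y axis: up/down, offset by resting height
    backward: axisToSpeed(palm[2])       // z axis: forward/back
  };
}

console.log(handToCommands([90, 200, -150]));
// → { rightward: 0.5, upward: 0, backward: -1 }
```

Scaling speed with distance from center is what produces the behavior described earlier: the further the hand moves from center, the faster the drone goes, up to a capped maximum.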

Devices for interacting with the world around us are rapidly becoming more capable. Imagine using something like Leap Motion without being tied to a computer. In early 2014, Thalmic Labs is releasing the Myo, an armband that detects the electrical activity in the muscles associated with finger movements and uses it to wirelessly control digital technologies. In the near future, I may be able to build a pocket-sized autonomous personal drone that follows me down the sidewalk while sending a video feed to my Google Glass, then disengage the autopilot and steer my drone assistant with hand gestures. I could also send my drone on missions to pick up a burrito, survey surrounding traffic, mow my lawn, take an aerial picture of me and my buddies, and thousands of other possibilities.

In part 2 of this article, I’ll go through some of the challenges facing the emerging drone industry and why we are on the cusp of a hardware revolution.

To Be Continued…