Hacker pilots a drone with Google Glass using just his head movements

Developer Blaine Bublitz created code that can be used with Google Glass

When the wearer moves their head, the drone moves accordingly

The code has been made available online for other Glass wearers to try



A developer has created code that makes it possible for Google Glass wearers to pilot drones using the slightest of head movements.



Blaine Bublitz, from coding company IcedDev, developed the app for an annual event called Nodebots Day, which is for fans of robots controlled with JavaScript running on a platform called node.js.

He developed the code to work with wheeled robots moving back and forth before adapting it to add right and left movements, as well as up and down, specifically to be used with a NodeCopter.






Blaine Bublitz, pictured right, is shown flying the NodeCopter, seen in the bottom left-hand corner, using his Google Glass device. Bublitz used JavaScript to create the code that responds to his head movements to make the drone move left or right

BEER DRONE WILL DELIVER DRINKS TO FESTIVAL GOERS FROM ABOVE

Festival-goers in South Africa this summer will be able to order beer from their smartphones and have it delivered by a flying drone dropping a can attached to a parachute. The drone has been developed by Darkwing Aerials and was tested at the Oppikoppi music festival in the Limpopo province of South Africa this August. Customers will be able to place their drink orders through an iOS app that will send their GPS coordinates to the drone operators.



On his blog, Bublitz explained that he began by creating code that controlled wheeled robots on a table top, before adding left and right steering using a palm-controlled device.



However, he couldn't keep the robot in view as he moved his head, so he wanted to control one that could be steered and seen from at least eye height.



During a video demonstration, Bublitz, from Phoenix, explained that he developed the code using the Face app that was unveiled during a talk at the event in Portland.



The app can send sensor information - the data needed to fly the drone, which is usually sent to the device via a joystick or remote control - to a web server, which then rebroadcasts it.



This new broadcast can then be picked up and used by the Google Glass headset.



In the video Bublitz adds that he's used pitch and roll movements but could also add a rotation option.

Landing and take-off were controlled using A and B buttons on the Google Glass.
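The control mapping described above - pitch and roll steering the drone, two buttons handling take-off and landing - might look something like this. The function names, dead-zone threshold, and sign conventions are illustrative assumptions, not Bublitz's actual code:

```javascript
// Ignore tiny head movements below this threshold (assumed value).
const DEADZONE = 0.05;

// Map head pitch/roll to a direction command for the drone.
// Convention assumed here: positive pitch = head tilted forward,
// positive roll = head tilted right.
function headToCommand(pitch, roll) {
  const cmd = { front: 0, right: 0 };
  if (Math.abs(pitch) > DEADZONE) cmd.front = pitch > 0 ? 1 : -1;
  if (Math.abs(roll) > DEADZONE) cmd.right = roll > 0 ? 1 : -1;
  return cmd;
}

// Take-off and landing mapped to two buttons, as in the demo.
function buttonToCommand(button) {
  if (button === 'A') return 'takeoff';
  if (button === 'B') return 'land';
  return null;
}
```

A dead zone like this keeps the drone hovering steadily while the wearer's head is roughly level, rather than twitching with every small movement.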



He said: 'With JavaScript, we are able to write a relatively small amount of code that can serve as a basis for swappable hardware components.

'Turns out that I was driving the drone at full speed in each direction I tilted my head.



'I should have had the speed at about 0.3 instead of 1. Lesson learned.
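The fix Bublitz describes amounts to a single scaling constant. A hedged sketch, with hypothetical names and the 0.3 value he mentions:

```javascript
// The mistake: commanding full speed (1.0) in whichever direction the
// head tilted. The fix: scale the tilt down by a gentler factor.
const SPEED = 0.3; // was effectively 1.0 in the first attempt

function tiltToSpeed(tilt) {
  // Clamp the raw tilt to [-1, 1], then scale it down.
  const clamped = Math.max(-1, Math.min(1, tilt));
  return clamped * SPEED;
}
```

With the multiplier at 1.0 any head tilt past the threshold sends the drone off at maximum speed; scaling to 0.3 keeps the response proportional and much easier to control.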

Bublitz developed the app for an annual event called Nodebots Day, which is for fans of robots controlled by a JavaScript program called node.js. He developed the code to work with wheeled robots moving back and forth before adapting it to add right and left movements specifically to be used with a NodeCopter, pictured



'I would have also liked to add the ability to rotate the drone left and right based on the Glass' azimuth value, but I guess that will have to be in the future.'

Bublitz has now put the coding details on his blog to make it possible for other developers to adapt their Google Glass devices to be able to control NodeCopters.



In May, a Jetstream aircraft became the first to fly 'unmanned' across UK shared airspace.

The pioneering flight, largely controlled by a pilot on the ground, took off from Warton, near Preston in Lancashire, and landed in Inverness.

Described as 'a new chapter in aviation history,' it could pave the way for passenger planes controlled from the ground by 'drone' operators.