We’ll find out Monday, Nov. 9, when professor Roel Vertegaal and his students at Queen’s University’s Human Media Lab in Canada unleash their “BitDrones” at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina.

Programmable matter

Vertegaal believes his BitDrones invention is the first step towards creating interactive self-levitating programmable matter — materials capable of changing their 3D shape in a programmable fashion, using swarms of tiny quadcopters. Possible applications: real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics, and online information visualization.

“BitDrones brings flying programmable matter closer to reality,” says Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”

Vertegaal and his team at the Human Media Lab created three types of BitDrones, each a self-levitating display of a different resolution.

PixelDrones are equipped with one LED and a small dot matrix display. Users could physically explore a file folder by touching the folder’s associated PixelDrone, for example. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it. Files in this wheel are browsed by physically swiping drones to the left or right.
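The wheel-of-files interaction can be sketched in a few lines. The following is a minimal, hypothetical geometry sketch (the function names, the 0.3 m hover offset, and the swipe-as-rotation mapping are assumptions, not the lab’s actual code): file drones are spaced evenly on a horizontal circle below the folder drone, and a swipe rotates the wheel by one slot.

```python
import math

def wheel_positions(center, radius, n, angle_offset=0.0):
    """Target positions for n file drones evenly spaced on a horizontal
    circle hovering 0.3 m below the folder drone at `center` (x, y, z)."""
    cx, cy, cz = center
    positions = []
    for i in range(n):
        theta = angle_offset + 2 * math.pi * i / n
        positions.append((cx + radius * math.cos(theta),
                          cy - 0.3,
                          cz + radius * math.sin(theta)))
    return positions

def swipe(angle_offset, n, direction):
    """Advance the wheel one slot; direction is +1 (right) or -1 (left)."""
    return angle_offset + direction * 2 * math.pi / n
```

Each swipe simply updates the shared `angle_offset`, and every drone flies to its recomputed slot, which is what makes the browsing feel like turning a physical carousel.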

ShapeDrones are augmented with a lightweight mesh and a 3D-printed geometric frame; they serve as building blocks for real-time, complex 3D models.

DisplayDrones are fitted with a curved, flexible high-resolution touchscreen, a forward-facing video camera, and an Android smartphone board. A remote user could join a Skype call through a DisplayDrone and move it around the local space for telepresence. The DisplayDrone automatically tracks and replicates the remote user’s head movements, letting the remote user virtually inspect a location and making it easier for the local user to understand the remote user’s actions.
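The head-following behavior amounts to mapping the remote user’s tracked head pose onto a target pose for the drone. A minimal sketch, assuming a simple linear mapping relative to a local anchor point (the function name, `anchor`, and `scale` are illustrative assumptions, not the published system):

```python
def follow_head(head_pos, head_yaw, anchor, scale=1.0):
    """Map the remote user's head position (metres, relative to their own
    seat) and yaw onto a target pose for the DisplayDrone, offset from a
    fixed anchor point in the local room."""
    ax, ay, az = anchor
    hx, hy, hz = head_pos
    target_pos = (ax + scale * hx, ay + scale * hy, az + scale * hz)
    return target_pos, head_yaw  # drone flies here and turns its screen to match
```

Because the drone mirrors both translation and rotation, the screen behaves like the remote person’s head: when they lean in or turn, the local user sees the display do the same.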

All three BitDrone types are equipped with reflective markers, allowing them to be individually tracked and positioned in real time via motion capture technology. The system also tracks the user’s hand motion and touch, allowing users to manipulate the voxels in space.
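The control loop implied here can be sketched as: the motion-capture system supplies each drone’s position and the user’s hand position; a grabbed drone’s voxel target follows the hand (direct manipulation), and otherwise the drone is steered back to its assigned voxel. The following is an illustrative sketch under those assumptions (the proportional-step controller, the 10 cm grab radius, and all function names are hypothetical):

```python
def step_toward(pos, target, gain=0.5):
    """One proportional control step toward the target position."""
    return tuple(p + gain * (t - p) for p, t in zip(pos, target))

def grabbed(hand, drone, reach=0.1):
    """Treat a drone as grabbed when the tracked hand is within `reach` metres."""
    return sum((h - d) ** 2 for h, d in zip(hand, drone)) ** 0.5 < reach

def tick(drone_pos, voxel_target, hand_pos):
    """One tick of the loop: a grabbed drone's voxel target follows the
    hand; otherwise the drone flies back toward its voxel position."""
    if grabbed(hand_pos, drone_pos):
        voxel_target = hand_pos
    return step_toward(drone_pos, voxel_target), voxel_target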

“We call this a ‘real reality’ interface rather than a virtual reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset,” says Vertegaal.

The system currently supports only a dozen comparatively large drones, 2.5 to 5 inches in size, but the team is working to scale it up to support thousands of drones measuring under half an inch, allowing users to render more seamless, high-resolution programmable matter.

Other forms of programmable matter

BitDrones are somewhat related to several other programmable-matter efforts: MIT Media Lab scientist Neil Gershenfeld’s “programmable pebbles,” reconfigurable robots that self-assemble into different configurations (see A reconfigurable miniature robot); MIT’s “swarmbots,” self-assembling swarming microbots that snap together into different shapes (see MIT inventor unleashes hundreds of self-assembling cube swarmbots); J. Storrs Hall’s “utility fog” concept, in which a swarm of nanobots called “foglets” can take the shape of virtually anything and change shape on the fly (see Utility Fog: The Stuff that Dreams Are Made Of); and Autodesk Research’s Project Cyborg, a cloud-based meta-platform of design tools for programming matter across domains and scales.



Human Media Lab | BitDrones: Interactive Flying Microbots Show Future of Virtual Reality is Physical

Abstract of BitDrones: Towards Levitating Programmable Matter Using Interactive 3D Quadcopter Displays

In this paper, we present BitDrones, a platform for the construction of interactive 3D displays that utilize nano quadcopters as self-levitating tangible building blocks. Our prototype is a first step towards supporting interactive mid-air, tangible experiences with physical interaction techniques through multiple building blocks capable of physically representing interactive 3D data.