While walking around the Bay Area Maker Faire this weekend, I stumbled across an amazing piece of technology: Valve's "Lighthouse" tracking system. Valve's demos were (supposedly) a major contributor to Oculus' fundraising efforts and ultimate sale to Facebook, and this device may have been a key piece of those demos. I'd heard rumors about this system for many months. It was designed for augmented and virtual reality -- namely, for head and controller tracking, where it needs to meet some insane specifications for positional accuracy, angular accuracy, and especially (!!!) latency. Honestly, the rumors didn't do it justice: it's really elegant! The solution is exceedingly simple, low-cost, lightweight, and performant. It's much (much!) better than the image-processing techniques I've seen to date. Most importantly... I think this technology could be a "big deal" for robotics too. I had a chance to speak with Valve's Alan Yates about how the Lighthouse system works; I didn't get all the specifications, but I did get some interesting information -- so read on!

Valve's "Lighthouse" Optical Tracking System:

I captured a few photos (with permission) at Maker Faire. The transmitter (highres here) has a few key components: a bank of infrared LEDs, and two spinning IR lasers -- one that sweeps across the X axis, and one that sweeps across the Y axis. These lasers are different from normal laser rangefinders' "point" lasers, in that each of the Lighthouse lasers is a line laser. The two lasers are mechanically timed so that they're swept roughly 180 degrees out of phase.

Unlike a laser rangefinder, the transmitter does not estimate the distance or pose of the tracked points -- it just emits light. Each "receiver" is a simple photodiode, which can be integrated (with other photodiodes) into a rigid unit. Here's Valve's prototype for inclusion into a game controller (highres here). You can see a number of photodiodes (little black squares); IIRC, Alan said it had 17 individual photodiodes so that several would be unobstructed regardless of orientation.

I made an animated GIF to explain how the system works; let's look at the light as seen by one of the photodiodes:

The IR LEDs provide the start of a timing sequence. A microcontroller (attached to the photodiode) starts a counter (with fine-grained time resolution) when it receives that initial sync signal, and then waits for the X and Y line lasers to illuminate the diode. Depending on the time elapsed, the microcontroller can directly map the time delay to X and Y angular measurements. Using multiple photodiodes and knowing the rigid body relationship between them, the microcontroller can calculate the entire 6-DoF pose of the receiver.
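To make that timing math concrete, here's a minimal Python sketch. The sweep rate (60 Hz) and timer frequency (48 MHz) are my own illustrative assumptions -- Valve didn't quote specs -- but the structure is exactly what's described above: count ticks between the sync flash and the laser hit, then scale by the rotation period.

```python
# Minimal sketch of sync-then-sweep timing. The 60 Hz sweep rate and
# 48 MHz counter are illustrative assumptions, not confirmed Valve specs.
import math

TIMER_HZ = 48e6                      # assumed microcontroller counter rate
SWEEP_HZ = 60.0                      # assumed rotor revolutions per second
TICKS_PER_REV = TIMER_HZ / SWEEP_HZ  # counter ticks per 360-degree sweep

def ticks_to_angle(ticks_since_sync):
    """Map counter ticks (sync flash -> laser hit) to the sweep angle,
    in radians, at which the line laser crossed this photodiode."""
    return 2.0 * math.pi * ticks_since_sync / TICKS_PER_REV

# One X-sweep hit and one Y-sweep hit give two angles -- effectively a
# bearing (azimuth/elevation) from the base station to the diode:
az = ticks_to_angle(400_000)  # hypothetical X-laser hit, ticks after sync
el = ticks_to_angle(200_000)  # hypothetical Y-laser hit, ticks after sync
print(f"azimuth={math.degrees(az):.3f} deg, elevation={math.degrees(el):.3f} deg")
```

With four or more diodes at known positions on the rigid body, each contributing a bearing like this, a standard perspective-n-point-style solve recovers the full 6-DoF pose.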

This solution is elegant for a few reasons:

The computation overhead is minimal, especially compared to image processing.

It's super-low latency. Unlike image-tracking techniques, this system doesn't need to wait while data-intensive images are transmitted, processed, and analyzed. The microcontroller's counter value can be quickly and accurately mapped to angles in (basically) one instruction cycle. This is super-critical for VR and AR, where angular errors or latency can create artifacts that make the experience unusable.

It relies on high(ish) time resolution at the receiver to determine angles. This has major benefits over systems (like Raskar's, which I'll discuss later) that use a similar technique but are limited by spatial resolution rather than time resolution; see the back-of-the-envelope comparison after this list.

The receiver hardware is stupidly cheap: e.g., a $0.01 photodiode. It's also very small and lightweight, so it could be included in almost any object.
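Here's the back-of-the-envelope comparison I promised: under the same made-up numbers as the earlier sketch, a timer-based sweep beats a typical camera's pixel resolution by a couple orders of magnitude.

```python
# Time resolution vs. spatial (pixel) resolution -- all numbers are
# illustrative assumptions, not measured specs.
import math

# Swept laser: one timer tick is the smallest resolvable slice of a rotation.
timer_hz, sweep_hz = 48e6, 60.0                  # assumed clock/sweep rates
laser_res = 2 * math.pi / (timer_hz / sweep_hz)  # radians per tick

# Camera tracker: one pixel is the smallest resolvable slice of the FOV.
pixels, fov = 1000, math.radians(90)             # assumed sensor and lens
camera_res = fov / pixels                        # radians per pixel

for name, res in [("swept laser", laser_res), ("camera", camera_res)]:
    print(f"{name:>11}: {res * 1e6:8.2f} urad  ({res * 5.0 * 1000:.3f} mm at 5 m)")
# swept laser:     7.85 urad  (0.039 mm at 5 m)
#      camera:  1570.80 urad  (7.854 mm at 5 m)
```

In practice, optics, mechanical jitter, and photodiode rise times will eat into that theoretical number, but the headroom is enormous.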

One thing to note: I simplified this discussion a little bit. Like many IR systems, the LEDs and lasers are actually modulated (Alan said, "on the order of MHz"). This is useful for a few reasons: (1) to distinguish the desired light signals from other IR interferers such as the sun; and (2) to permit multiple transmitters with different modulation frequencies. This is a pretty obvious enhancement, but it muddles the layperson description.
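To illustrate why the modulation helps, here's a toy demodulation sketch: score the power at each candidate carrier and ignore unmodulated light like sunlight. The carrier frequencies and sample rate below are made up for illustration; I have no idea what Valve actually uses beyond "on the order of MHz."

```python
# Toy carrier detection using the Goertzel algorithm: unmodulated light
# (e.g., sunlight) lands at DC, while each transmitter shows up at its
# own carrier frequency. All frequencies here are invented examples.
import math

def goertzel_power(samples, sample_hz, target_hz):
    """Signal power in a single frequency bin (Goertzel algorithm)."""
    w = 2.0 * math.pi * target_hz / sample_hz
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Fake photodiode capture: a 1.8 MHz carrier plus constant "sunlight."
fs, f_tx = 12e6, 1.8e6
samples = [0.5 + math.sin(2 * math.pi * f_tx * n / fs) for n in range(240)]

for f in (1.8e6, 2.2e6):  # two hypothetical base-station carriers
    print(f"{f / 1e6:.1f} MHz power: {goertzel_power(samples, fs, f):.1f}")
# The 1.8 MHz bin dominates, identifying which transmitter is flashing.
```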

I was kinda strapped for time, so I didn't pry too much about detailed specs. So no information on sensing rates or pose accuracy (I think I heard 1mm accuracy over a 5m range, but my memory is hazy). When I asked about availability, they said they're keen to get units into the hands of Makers to explore new applications. I didn't really inquire about licensing or IP (hands over ears... "lalalala..."), but Alan did say they're more interested in ensuring standards compliance than in extracting revenue.

They had a few other prototypes sitting around too:

According to recent news reports (here and here), this tracking system will be a critical component of the HTC Vive VR system:

Unfortunately the Maker Faire is over,** so you probably won't get a chance to see the units in person.

** Sorry. I meant to get this post online in time for people attending on Sunday to swing by and check it out. But I was busy working all weekend to finish some deliverables for my day job at Google[x] -- the same sorry reason that I haven't been active on Hizook for months!

Here's what the poster said:

- Lighthouse Optical Position Tracking - A simple optical position tracking & navigation technique for maker projects. Want your robot or quadcopter to know where it is and where it is pointing in free space? Lighthouse is a scalable way to implement 3 or 6-DoF navigation for your project.

Robotics Applications of the Lighthouse Tracking System

There are all kinds of robot localization systems that use two-part tracking systems: GPS (TX on satellites, RX on robot), fiducials (e.g., tracking colored blobs, QR codes, ARToolKit, AprilTags, etc.), my work on long-range RFID, etc. A tracking system like Lighthouse has the potential to replace or augment many of those. For example:

Tracking your quadrotor indoors so that you can pull off those crazy quadrotor demos at home without a $50k Vicon system!

Tracking an inflatable robot's kinematics (or just end effector) -- an otherwise difficult task.

Replacing the fiducial tracking for systems like Kiva Systems, FIRST Robotics, and RoboCup.

Integrating range measurements at the base station: You could trivially use the line lasers plus a camera to do ranging (like Morgan Quigley's borg scanner) or add a ToF camera. Then the base station would get range information as well as pose information from the "tags." (A quick sketch of the laser-plus-camera idea follows this list.)

There are a lot of interesting possibilities and permutations....
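As promised above, here's a quick 2-D sketch of the laser-plus-camera ranging idea -- classic structured-light triangulation, in the same spirit as Quigley's borg scanner. The baseline and angles are invented example values, not anything Valve (or I) have built.

```python
# 2-D triangulation: intersect the known laser ray with the camera's
# sight ray to the lit point. Baseline/angles are illustrative only.
import math

def triangulate(baseline, laser_angle, camera_angle):
    """Laser ray leaves the origin; camera sits at (baseline, 0).
    Both angles are measured from the baseline axis. Returns the
    (x, y) position of the illuminated point."""
    denom = math.sin(camera_angle - laser_angle)   # parallel rays -> 0
    t = baseline * math.sin(camera_angle) / denom  # range along laser ray
    return (t * math.cos(laser_angle), t * math.sin(laser_angle))

# The laser's sweep angle comes "for free" from the same timing used for
# tracking; the camera's pixel column gives the camera-ray angle.
x, y = triangulate(baseline=0.3, laser_angle=math.radians(75),
                   camera_angle=math.radians(100))
print(f"lit point at ({x:.3f}, {y:.3f}) m, range {math.hypot(x, y):.3f} m")
```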

A Quick Note on Related Work

If you're interested in Valve's Lighthouse system, you might also be interested in an analogous method by Ramesh Raskar (MIT professor), Johnny Lee (of Project Tango fame), Jay Summet (a friend of mine), et al. from sometime around 2005-2007. Instead of using line lasers, they use projectors (LED/lasers with masks, DLP light projectors, or any number of pico projectors) to project a series of 2D IR images onto a scene. A single photodiode can then determine its angular location based on its observed light-dark signals. Combine multiple photodiodes for a full 6-DoF pose.

I don't want to spend the time to go over the details... which can be found in: "Lighting-Aware Motion Capture Using Photosensing Markers and Multiplexed Illumination" (website, PDF) and "Moveable Interactive Projected Displays Using Projector Based Tracking" (PDF). But here is an image that describes the basic idea -- its relevance to Lighthouse should be obvious.
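If you want a feel for how a lone photodiode can localize itself under a projector, here's a tiny sketch of Gray-code stripe decoding -- generic structured light, not necessarily the papers' exact encoding.

```python
# Decode a photodiode's column from a sequence of Gray-coded stripe
# patterns (MSB pattern first). Generic structured-light illustration.

def gray_to_binary(g):
    """Convert a Gray-coded integer back to ordinary binary."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def decode_column(bright_bits):
    """bright_bits[i] is True if the diode saw 'bright' in pattern i."""
    g = 0
    for bit in bright_bits:
        g = (g << 1) | int(bit)
    return gray_to_binary(g)

# Ten patterns resolve 1024 columns; a diode that saw this bright/dark
# sequence knows it sits in projector column 300:
seen = [False, True, True, False, True, True, True, False, True, False]
print(decode_column(seen))  # -> 300
```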

Maker Faire Impressions

This is the third time I've attended Maker Faire. I attended the inaugural Maker Faire in 2006, and it was awesome. It seemed like the audience and Makers were mostly hardcore hackers, and there was a large variety of projects. After we moved back to California, we attended in 2013.... and it sucked. The entire event was dominated by commercial companies peddling their wares, and entirely too many 3D printers (yes, I get it... you can assemble a MakerBot). We attended again this year on a whim, and I was impressed. The quality, quantity, and variety of hobbyist Makers was vastly improved. The only gripe I might make: It was bloody crowded! Next year, I'll probably attend as a Maker so that I can get those Friday VIP passes to avoid the crowds. Maybe I'll demo my wirelessly powered robot swarm -- or perhaps something entirely new....