Wide angle, low distortion camera tracking for the Oculus Rift

I thought I would write a quick demo about one problem of the Oculus Rift DK2 that has not been addressed: the coverage of the positional tracking camera. Generally it works really quite well, but if you move outside its field of view then tracking stops, immersion is lost and the experience degraded. The camera is pretty standard; they’ve not designed anything new, just adapted a fast, reliable (and cheap) off-the-shelf sensor. It doesn’t have an amazing field of view, so it’s quite easy to move outside its range.

52º is actually pretty narrow…

If Oculus hopes to have a system whereby you can navigate a whole room then 52º just can’t cut it. A 90º view would mean you could place the camera in the corner of a room and it would be able to look along the walls.
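The corner-placement claim is easy to sanity-check. The sketch below (my own, not part of the demo) models an idealised camera with a purely horizontal field of view sitting in the corner of a square room, aimed along the diagonal: a 90º FOV sees every point in the room, while 52º leaves gaps along the walls.

```python
import math

def covered(px, py, cam=(0.0, 0.0), aim_deg=45.0, fov_deg=90.0):
    """True if point (px, py) falls inside the camera's horizontal FOV."""
    angle = math.degrees(math.atan2(py - cam[1], px - cam[0]))
    return abs(angle - aim_deg) <= fov_deg / 2

# Sample a grid of points in a unit-square room; camera in corner (0, 0)
# aimed along the diagonal (45 degrees).
points = [(x / 10, y / 10) for x in range(1, 11) for y in range(1, 11)]
print(all(covered(x, y) for x, y in points))                 # 90 deg: True
print(all(covered(x, y, fov_deg=52.0) for x, y in points))   # 52 deg: False
```

With a 90º view the worst-case point (hugging a wall at the far end) sits about 39º off the diagonal, comfortably inside the ±45º half-angle; a 52º lens only reaches ±26º, so points near either wall are lost.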

Not everyone will want to sit near a corner…

Ideally we’d have a system with 180º coverage, so it can be placed on any suitable wall and the user is in little danger of moving outside its field of view.

So just get a wide angle camera right?



Actually this isn’t an optimal solution, since wide-angle lenses not only create a huge amount of distortion, but the center of the image, where the user is likely to spend most of their time, is compressed, so the number of ‘pixels on target’ is actually quite low. Great for expressive photography, but not so much for tracking LEDs at sub-millimeter accuracy…
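To put a rough number on that, here’s a back-of-the-envelope sketch (my own figures, not from any Oculus spec) of the pixels per degree an ideal rectilinear lens delivers at the image center as the field of view grows, for a fixed 640-pixel-wide sensor:

```python
import math

def center_px_per_degree(sensor_px, fov_deg):
    """Pixels per degree at the image centre of an ideal rectilinear lens.

    Rectilinear projection: x = f * tan(theta). The focal length f (in
    pixels) is fixed by requiring the half-FOV to land at the sensor edge.
    """
    half_fov = math.radians(fov_deg / 2)
    f = (sensor_px / 2) / math.tan(half_fov)   # focal length in pixels
    return f * math.pi / 180                    # dx/dtheta at theta = 0

for fov in (52, 90, 120, 160):
    print(f"{fov:3d} deg lens: {center_px_per_degree(640, fov):5.2f} px/deg")
```

Widening the lens from 52º to 160º on the same sensor cuts the pixels on a centred target by more than a factor of ten, exactly the resolution that sub-millimeter LED tracking can least afford to lose.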

Instead, a better solution is simply to use multiple cameras at slight angles to each other, providing 180º+ coverage. By this I mean three camera sensors on a single circuit board, not three separate cameras.

I created a quick playable demo in Unity to show the idea. A camera on either side of the central camera gives us significant overlap. The three bottom ‘screens’ show what each camera can see; when you move to the side the LED markers are handed over to the other camera.
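The hand-over logic amounts to checking which sensors’ fields of view contain a marker’s bearing. A minimal sketch (the ±54º sensor yaws are my assumption; they make three 72º views tile 180º with 18º of overlap at each seam):

```python
def visible_sensors(azimuth_deg, yaws=(-54.0, 0.0, 54.0), fov_deg=72.0):
    """Indices of the sensors whose horizontal FOV contains the marker."""
    return [i for i, yaw in enumerate(yaws)
            if abs(azimuth_deg - yaw) <= fov_deg / 2]

# A marker straight ahead is seen only by the centre sensor...
print(visible_sensors(0.0))    # [1]
# ...while one at 30 deg sits in the overlap and is tracked by two.
print(visible_sensors(30.0))   # [1, 2]
```

Because the overlaps are generous, a marker is always seen by at least one sensor across the full −90º to +90º range, and by two near the seams, so a hand-over never drops tracking.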

We don’t have to worry about strictly lining up the images since they won’t be displayed. The cost is marginally higher, but imaging sensors are really very cheap, so the camera would only be a couple of dollars more expensive.** There is a slight processing overhead when a marker moves from one sensor to the next, as more markers would need to be tracked during the hand-over.

It is inevitable that Oculus will move to using cameras on the HMD to track position (and pass a picture through to the user), but that may not come for another couple of consumer versions. In the meantime, using multiple low-field-of-view cameras together to give wide-field-of-view tracking is quite possible.

You can ‘play’ the demo here. Press 1 and 2 to switch between a 90º camera watching three 72º overlapping cameras and a 160º camera in the same position, and move your mouse. You can see the sphere behind the ‘displays’. In a real application the cameras would not move, the user would, but the demo lets you move the camera to show how the views would overlap and still give you a wide, undistorted field of view.

** A quick Google reveals the sensors in the Oculus camera are actually about $9 each! More than I hoped, but still not crazy money…

The experts over on Reddit had this to say:

3rd_Shift: “It seems utterly preposterous to pursue a multiple camera solution with the added cost and complexity that entails when you could achieve the same result with a wide-angle lens and a higher resolution camera.”

Randomoneh: “Have you ever used a fisheye lens? It seems like you’re confusing fisheye for rectilinear lens.”

My reply: