Real-time videogrammetry has hit new levels of public exposure as Microsoft released its Holoportation video last month, featuring Skype-like telepresence through what appear to be eight Kinect-like cameras and a HoloLens. Personally, I would like to see this technology accessible through existing consumer peripherals - and that brings us to the potential of Microsoft's Kinect 2.0 with libraries like libfreenect2.



Last fall I began working with libfreenect2's free multi-Kinect-2.0 support. The photo above shows two instances of its example program, Protonect, running side by side to prove the setup's multi-Kinect capability. You can see my friend Ashook demonstrating in MxR at USC.

Above: Kinect 2.0 in action.

Hardware



I highlight this since there are hardware requirements for running multiple Kinects:

Each capture point requires a Kinect 2.0 (aka Kinect for Xbox One) and a Kinect for Windows Adapter, which run about $100 and $50 respectively. But that is not all.



Due to the high volume of data a Kinect 2.0 transfers while capturing, each device requires not only its own USB 3.0 port but also its own USB bus within the computer. Most computers with USB 3.0 support have at least two buses: one for the front ports, and at least one for the back ports. I can't speak too authoritatively on this side of the hardware, but USB ports connected directly to the motherboard seem more likely to have their own bus, and I have heard that each PCIe USB card provides its own bus as well.

This means that while it is possible to get multiple Kinects working on one computer, it is not immediately guaranteed.
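If you want to sanity-check the bus situation before buying a second adapter, and you happen to be on Linux, `lsusb` can show the topology. This is only a rough heuristic I'm sketching here, not something from my original notes: each top-level "Bus" entry in the tree corresponds to a root hub, which usually (not always) means a separate controller.

```shell
# Show the USB device tree. Each top-level "Bus NN" line is a root
# hub; two Kinects should end up under two different ones.
lsusb -t

# With both Kinects plugged in, check which bus each landed on.
# 045e is Microsoft's USB vendor ID, so this filters to their devices.
lsusb | grep 045e
```

On Windows the equivalent check is digging through Device Manager's "View > Devices by connection" tree, which is considerably less pleasant.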

Above: building Protonect with libfreenect2.



Software



libfreenect2 is also not plug-and-play when it comes to making your own builds: it has extensive library dependencies and may require some troubleshooting. I have done fresh installs on two computers so far, and neither was a quick process. I have a few pages now of custom notes to help with future setups.
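For the curious, the happy path on Linux is the usual CMake flow, roughly as below. Take this as a sketch, assuming the dependencies (libusb-1.0, TurboJPEG, OpenGL headers, and so on) are already installed; the libfreenect2 README is the real authority, and the troubleshooting lives in the gaps between these commands.

```shell
# Grab the source
git clone https://github.com/OpenKinect/libfreenect2.git
cd libfreenect2

# Out-of-source CMake build, installed to a local prefix
mkdir build && cd build
cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/freenect2
make
make install

# Run the example viewer (Kinect on its own USB 3.0 bus, of course)
./bin/Protonect
```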

However…

Distributing compiled builds (executables) made with libfreenect2 appears to be almost trivial. When shipped with all of the needed files in one folder, the only change required on the target computer was swapping in the proper drivers for the Kinects.

Distribution, for exes, is easy.

Above: Kinect’s greatest limitation here - cable length.



Conclusion



And that's where this truly has potential. My plan is to create software on top of libfreenect2 that can easily capture videogrammetry data - animated point clouds or models - so that moving physical spaces can be captured, distributed, and experienced in VR. No fancy hardware (a sub-$1k setup if you already have a VR rig), no deep knowledge base required; it just lets people capture data to distribute and experience.

Easily archiving the present with the best consumer tech we have, for the future to have a better view into the past. That’s the plan.

Or 3D cat videos. That works too.

