The x24 is so named because it has 24 cameras; the x6, meanwhile, has -- you guessed it -- six cameras. While the x24 looks like a giant beach ball with many eyes, the x6 is shaped more like a tennis ball, which makes for a less intimidating look. Both are designed for professional content creators, but the x6 is obviously meant to be a smaller, lighter and cheaper version.

Both the x24 and the x6 are part of the Surround 360 family. And as with version one (which is now called the Surround 360 Open Edition), Facebook doesn't plan on selling the cameras itself. Instead, the company plans to license the x24 and x6 designs to a "select group of commercial partners." Still, the versions you see in the images here were prototyped in Facebook's on-site hardware lab (cunningly called Area 404) using off-the-shelf components. The x24 was made in partnership with FLIR, a company mostly known for its thermal-imaging cameras, while the x6 prototype was made entirely in-house.

But before we get into all that, let's talk a little bit about what sets these cameras apart from normal 360-degree ones. A traditional 360 camera captures the world from a single fixed position. So if you're viewing that content (also known as stereoscopic 360) in a VR headset and you move your head, the view doesn't shift to match; there's no parallax, which isn't how the real world behaves. This makes the experience pretty uncomfortable, takes you out of the scene and makes everything less immersive.

With content that's shot with six degrees of freedom, however, this is no longer an issue. You can move your head to a position where the camera never was and still view the world as if you were actually there. Move your head from side to side, forward and backward, and the camera is smart enough to reconstruct what the view looks like from different angles. Instead of just looking around the inside of a photosphere, you now have movement within it. This lets you explore 360-degree video freely, the same way you can in VR spaces. All of this is due to some special software that Facebook has created, along with the carefully designed pattern of the cameras. According to Brian Cabral, Facebook's engineering director, it's an "optimal pattern" to get as much information as possible.

I had the opportunity to look at a couple of different videos shot with the x24 at Facebook's headquarters (using the Oculus Rift, of course). One was of a scene shot in the California Academy of Sciences, specifically at the underwater tunnel in the Steinhart Aquarium. I was surprised to see that the camera's view followed my own as I tilted my head from left to right and even when I crouched down on the floor. I could even step to the side and look "through" where the camera was, as if it weren't there at all. If the video had been shot with a traditional 360 camera, I likely would have seen the camera tripod when I looked down. But with the x24, I just saw the floor, as if I were a disembodied ghost floating around.

Another wonderful thing about videos shot with six degrees of freedom is that each pixel has depth. Each pixel is literally in 3D. This is a breakthrough for VR content creators, and it opens up a world of possibilities in visual-effects editing. It means you can add 3D effects to live-action footage, a feat that usually would require a green screen.
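To see why per-pixel depth matters, here's a toy sketch of the underlying idea. The pinhole-camera model, the intrinsics and the helper functions below are my own illustrative assumptions, not Facebook's actual pipeline (real rigs like the x24 use fisheye lenses and proprietary reconstruction software); the point is simply that once a pixel has depth, it becomes a 3D point that can be re-projected for a viewpoint where no physical camera ever was:

```python
import numpy as np

# Assumed pinhole-camera intrinsics for a hypothetical 640x640 image.
fx = fy = 500.0          # focal length in pixels (illustrative value)
cx = cy = 320.0          # principal point (image center)

def backproject(u, v, depth):
    """Lift a pixel (u, v) with known depth into a 3D camera-space point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def reproject(point, head_offset):
    """Project a 3D point into a virtual camera translated by head_offset.

    This is the essence of six-degrees-of-freedom playback: because each
    pixel carries depth, the renderer can synthesize the view from a
    position the camera never occupied.
    """
    p = point - np.asarray(head_offset)   # move the eye, not the world
    u = fx * p[0] / p[2] + cx
    v = fy * p[1] / p[2] + cy
    return u, v

# A pixel at the image center, 2 meters away:
point = backproject(320.0, 320.0, 2.0)
# Viewer steps 10 cm to the right: the pixel shifts left in the new view,
# producing the parallax a fixed 360 camera can't deliver.
u2, v2 = reproject(point, [0.1, 0.0, 0.0])
```

Nearby points shift more than distant ones under the same head movement, which is exactly the depth cue that makes the footage feel like a real place rather than a photosphere.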

I saw this demonstrated in the other video, which was of a scene shot on the roof of one of Facebook's buildings. Working with Otoy, a Los Angeles-based cloud-rendering company, Facebook added effects directly to the scene. Examples include floating butterflies, which wafted around when I swiped at them with a Touch controller. I also observed a visual trick where I could step "outside" the scene and encapsulate the entire video in a snow globe. All of this is possible because of the layers of depth that the footage provides.

That's not to say there weren't bugs. The footage had shimmering around the edges, which Cabral said is a flaw in the software that his team is working to fix. Plus, the camera is unable to see what's behind people, so there's a tiny bit of streaking along the edges.

Still, there's lots of potential with this kind of content. "This is a new kind of media in video and immersive experiences," said Eric Cheng, Facebook's head of Immersive Media, who was previously the director of photography at Lytro. "Six degrees of freedom has traditionally been done in gaming and VR but not in live action." Cheng said that many content creators have told him that they've been waiting for a way to bridge live action into these "volumetric editing experiences."

Indeed, that's partly why Facebook is partnering with a lot of postproduction companies like Adobe, Foundry and Otoy in order to develop an editing workflow with these cameras. "Think of these cameras as content-acquisition tools for content creators," said Cheng.

But what about other cameras, like Lytro's Immerge, for example? "There's a large continuum of these things," said Cabral. "Lytro sits at the very, very high end." It's also not nearly as portable as the x24 and x6, which are designed for a much more flexible and nimble approach to VR capture.

As for when cameras like these will make their way down to the consumer level, well, Facebook says that will come in future generations. "That's the long arc of where we're going with this," said CTO Mike Schroepfer.

"Our goal is simple: We want more people producing awesome, immersive 360 and 3D content," said Schroepfer. "We want to bring people up the immersion curve. We want to be developing the gold standard and say this is where we're shooting for."
