Last month I had a disorienting and quite frankly unnerving experience: I saw myself from behind.

This bit of magic happened at the LAUNCH festival in San Francisco, where a few hundred baby startups get together to show off their ideas. This year, reflecting the broader trend in the industry, several firms had VR products.

Twenty-five years ago I was deeply involved in the development of first-generation virtual reality systems. I truly believed VR to be The Next Big Thing™. It seemed obvious that by 2015 - twenty-five years in the future - we’d all be using virtual reality gear to visit virtual worlds and communicate with one another.

That future didn’t quite turn out as planned. Sega VR - meant to bring the immersive experience to millions of video game kiddies - disappeared before it reached the market, under a cloud of safety concerns. When Sega pulled the plug, most of the VR industry went down the gurgler.

For nearly 20 years, VR disappeared from view. But the idea of 3D virtual worlds, well, that went on to become the multi-billion dollar industry of video gaming.

When Oculus launched its Kickstarter campaign, a new generation of enthusiasts - too young to remember the crash-and-burn of the early 90s - became the new True Believers.

I’ve rarely been impressed by virtual worlds. Most feel like barebones theatre sets. It’s a lot of effort to create a richly textured world - we observe more detail than we acknowledge, until deprived of it - and even the most fanatically designed video game worlds (GTA, for instance) still feel like a Hollywood set. Step too far outside the viewing frustum, and the illusion disappears.

On the other hand, the idea of ‘telepresence’ - being able to immerse your senses in another location - only grows more exciting. Twenty-five years ago I did some basic engineering work on ‘telepresence toys’, similar to the sorts of robot-cars-with-wifi-cameras-beaming-back-to-your-smartphone that you can buy everywhere today. Imagine being able to chase your cat around the house, at their level. How much fun would that be?

So when I walked up to the VideoStitch booth at LAUNCH, I was ready. I’d been ready for almost half my life. I donned the still-too-heavy and still-too-tethered Oculus DK2 headset they offered, and found myself in another world - about two meters to my posterior.

No one should really have the experience of seeing themselves from behind. We’re all more-or-less familiar with what we look like from the front, because we’ve seen ourselves in mirrors. But our backsides? That’s a view we never get, not even in a clever arrangement of reflections. It’s not pleasant. Some of that comes from unfamiliarity, some from the sense of the uncanny, and some because it utterly undermines our self-image.

That a little tech demo could leave me so thoroughly harrowed speaks to the power of immersive video. I could look anywhere I wanted, in any direction, and see the world around me. When I removed the Oculus, VideoStitch founder Nicolas Burtey pointed to a tripod, two meters behind me, studded with six GoPro cameras. Each camera fed its HD video into a PC running his software, which knitted the six views into a seamless bubble of streaming video piped back into the Oculus.
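VideoStitch’s actual pipeline is proprietary, but a common way such software represents the stitched ‘bubble’ is a single equirectangular frame - longitude across, latitude down - which the headset then samples based on where you look. A minimal sketch of that sampling step (the function name and frame layout are my own illustrative assumptions, not VideoStitch’s):

```python
def direction_to_equirect(yaw_deg, pitch_deg, width, height):
    """Map a viewing direction (yaw, pitch in degrees) to pixel
    coordinates in an equirectangular frame of the given size.
    Yaw 0 is straight ahead, pitch 0 is the horizon."""
    # Longitude spans -180..180 degrees across the full frame width
    u = (yaw_deg + 180.0) / 360.0 * width
    # Latitude spans 90 (up) .. -90 (down) from top to bottom
    v = (90.0 - pitch_deg) / 180.0 * height
    return int(u) % width, min(int(v), height - 1)

# Looking straight ahead at the horizon in a 3840x1920 frame
# lands dead centre of the stitched panorama
print(direction_to_equirect(0, 0, 3840, 1920))  # (1920, 960)
```

The stitcher’s real work - aligning the six overlapping GoPro views and blending their seams into that one frame - is the hard part; the headset side is essentially this lookup, run per pixel, sixty-plus times a second.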

The view from inside the display seemed relatively low resolution, because even UHD video (4K) provides barely enough pixels to fill the world with any detail. VideoStitch offers a ‘military-grade’ version of their software, and although Burtey remained cagey about specifics, I got the sense the software could handle much higher resolutions.
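The arithmetic behind that low-resolution feel is unforgiving: the 4K frame has to cover the entire sphere, but the headset only shows you a narrow slice of it at any moment. A rough back-of-envelope (the 100-degree field of view is an assumption, in the ballpark of the DK2’s):

```python
def pixels_in_view(frame_width, fov_deg):
    """How many horizontal pixels of a full-360-degree frame
    actually land inside a headset's field of view."""
    return round(frame_width * fov_deg / 360.0)

# A 4K-wide (3840 pixel) sphere seen through a ~100-degree FOV:
print(pixels_in_view(3840, 100))  # 1067 - barely SD-era sharpness
# Doubling to 8K still only reaches ~2K across the visible slice:
print(pixels_in_view(7680, 100))  # 2133
```

So even ‘military-grade’ resolutions are really just clawing back the pixels the spherical projection spreads out of sight.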

Later that same day, I donned Samsung’s Gear VR (better than Oculus by a long shot) and watched an LA Lakers basketball game from a seat at centre court. I’d always known immersive video would be huge, but this gave me my first taste of just how big it would be. Pretty much every event of any consequence - from sports to red carpets to political rallies to live crosses to war zones - will be broadcast in immersive video within the next few years.

Google has already added immersive video capabilities to YouTube with its ‘YouTube 360’ program, and Facebook, partnering with VideoStitch, has announced the same feature.

Immersive video is coming, and the punters are going to love it. Even if most won’t don a head-mounted display to watch the footie, they’ll love the fact that they’ll always be able to pan the view on their big-screen telly. It’s a feature we won’t use all the time (we’ll leave that to well-trained directors and camera operators) but it’ll be another important feature of 21st century broadcasting.

To do immersive video well requires extraordinary amounts of bandwidth. VideoStitch can take the output from an array of RED cameras - each capturing its own UHD stream - and integrate them into a stream that could easily be 8K (Ultra Ultra High Definition), even 16K. Try getting that over any broadband that doesn’t have nearly a gigabit of capacity, and you’re going to have all sorts of problems.
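A crude estimate shows why. Compressed bitrate scales roughly with pixels per second; the bits-per-pixel figure below is an illustrative assumption for a modern codec, not a measured number:

```python
def stream_bitrate_mbps(width, height, fps=30, bits_per_pixel=0.1):
    """Rough compressed-stream estimate: pixels per second times an
    assumed bits-per-pixel ratio (illustrative, codec-dependent)."""
    return width * height * fps * bits_per_pixel / 1e6

for label, w, h in [("4K", 3840, 2160),
                    ("8K", 7680, 4320),
                    ("16K", 15360, 8640)]:
    print(f"{label}: ~{stream_bitrate_mbps(w, h):.0f} Mbps")
```

Even with that conservative ratio, a 16K stream lands around 400 Mbps; bump the frame rate or the quality and you are knocking on a gigabit.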

UHD streaming is beyond the carrying capacity of most Australian broadband, and even the cheap-and-cheerful 50Mbps NBN will - in a best-case situation - strain to offer multiple streams. That’s in 2015. When, in 2018, there are multiple immersive video streams on offer from YouTube and Facebook and Al Jazeera and god knows who else, Australians will be left staring at cricket ball-sized pixels, because we can’t shift enough bits down the pipe to bring things into focus.

That’s our broadband - stretched to its limits today, and blind tomorrow.

Meanwhile, American cable companies fall over one another, upgrading their customers to 2 Gbps service. They can already see how immersive video will drive traffic skyward.

The future always comes, and it comes with a list of demands. It’s ours to decide how we’ll meet those challenges. Looking backward in a decade’s time, will we be unnerved at what we see? Will we be able to see anything at all? ®