In the Darwin-like world of software development, the spoils often go to developers with a knack for spotting the next lucrative platform opportunity and evolving their skills toward it in time to seize the early-mover advantage. For example, many of the first programmers to reinvent themselves as mobile developers and stock Apple's App Store with some of its earliest third-party apps went on to earn six- or seven-figure incomes from the bedrooms of their apartments.

But as such opportunities go, there are many platforms on which developers can place their bets. Given the investment in time and re-education, the magic question is: which one? Today, at Facebook's F8 developer conference in San Jose, the company's CEO Mark Zuckerberg and his various lieutenants presented a vision for what could be that next lucrative developer opportunity: a yet-to-be-defined category, under the broader rubric of "building community," that lives at the intersection of mobile computing, artificial intelligence, and messaging, with augmented reality as its centerpiece. Along with that vision came a slew of announcements of new capabilities and associated developer tooling cutting across Facebook's various offerings, including Facebook itself, Messenger, and Oculus Rift.

Though not a total pivot, the company's new vision represents a slight recalibration of its original direction for virtual and augmented reality, one that may have been influenced by the blockbuster success of Pokémon Go, which superimposed a game and its images onto the reality we roam with nothing more than a smartphone and its built-in camera. In other words, no fancy headgear (i.e., Oculus Rift) required. Meanwhile, a strong vision for Oculus Rift itself remains, particularly in Facebook's Project Santa Cruz: a virtual reality headset that looks like the standard Oculus Rift offering but needn't be tethered to a smartphone or PC in order to deliver a rich, immersive experience.

Unfortunately, as augmented reality goes, Pokémon Go is not a platform. It's a one-off application, custom-built for a single purpose, that neither end users nor developers can leverage to author their own augmented reality experiences and applications. But it is instructive on the practicalities of bringing augmented reality to the mainstream using tools that are already in the hands of billions of people.

Citing the smartphone as a basic but ubiquitous platform for building augmented reality experiences, Zuckerberg said, "We used to think glasses were the way that augmented reality would work. But we've seen a primitive use case of augmented reality where the phones are the device for this." No doubt reflecting on the role that cameras now play in driving the world's hottest social applications (e.g., Snapchat, Instagram, Pinterest, and Facebook's live video service), Zuckerberg went on to say that photos and videos are now more central than text, and that the camera therefore needs to be more central to the end user's experience than the text box.

After lamenting that smartphone cameras are ubiquitous yet give users no easy way to create, share, or collaborate in augmented reality, Zuckerberg announced that Facebook is "going to make the camera the first mainstream augmented reality platform." In a nutshell, Facebook's vision is evolving toward shared, collaborative augmented experiences, with the camera and the smartphone as the primary authoring tools. Facebook, Messenger, and Oculus Rift, each with its own unique capabilities for collaboration and automation (e.g., bots on Messenger), will serve as the vehicles for a variety of shared, user-driven augmented reality experiences.

In a simple example, a Facebook user will be able to leave a virtual sticky note on the menu board at a fast food restaurant that's viewable by his or her friends (through their smartphones).

More complicated, however, is the underlying platform's ability not only to convert a two-dimensional image into a 3D scene, but to parse the resulting space into its component parts. Under the hood, the technology recognizes flat surfaces like floors, walls, and countertops, and can identify objects such as refrigerators, cups, bowls, humans, and dogs.

Left: Objects are identified with varying degrees of confidence. Right: The platform detects the dimensionality of flat surfaces (aka "planes").

The platform can inherently recognize a coffee cup and augment it with steam that stays true to the location of the cup even as the 3D perspective changes. Such augmentations can be static or dynamic, and the depth of the experience (particularly real-time interactions with friends that, in the richest instantiation, can involve life-like, augmentable avatars) is demonstrably better with headgear than without.
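To make the steam example concrete, here is a minimal sketch, assuming a hypothetical anchor-and-effect model (the names `Anchor`, `Effect`, and `world_position` are illustrative, not Facebook's actual API): the augmentation stays registered to the cup because it is positioned relative to the tracked object, not relative to the screen.

```python
# Illustrative sketch of object-anchored augmentation.
# All class and function names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Anchor:
    """A tracked real-world object the platform has recognized."""
    label: str
    position: tuple  # (x, y, z) in world space, metres

@dataclass
class Effect:
    """A virtual augmentation attached to an anchor."""
    name: str
    offset: tuple  # displacement from the anchor, in the anchor's frame

def world_position(anchor: Anchor, effect: Effect) -> tuple:
    """Resolve the effect's world-space position from its anchor."""
    return tuple(a + o for a, o in zip(anchor.position, effect.offset))

cup = Anchor("coffee_cup", (1.0, 0.0, 2.0))
steam = Effect("steam", (0.0, 0.12, 0.0))  # 12 cm above the cup's rim

# The effect's world position depends only on the cup, so it stays in
# place no matter where the camera moves.
print(world_position(cup, steam))  # → (1.0, 0.12, 2.0)
```

Because the steam's position is derived from the cup's frame each time the scene is rendered, any change in camera perspective simply re-projects the same world-space point.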

While the sophistication of Facebook's augmented reality platform will, by itself, enable myriad end-user experiences, collaboration, and commerce (no doubt how Facebook will monetize the environment), it is developers and their imaginations that will really unlock its power.

As the company's CTO Mike Schroepfer put it, developers will now be able to code against the real world. Take everything that the underlying platform has determined about the reality it sees through the camera (the walls, floors, countertops, cups, glasses, bottles of wine, people, pets, and so on), along with end-user-applied augmentations like notes about the vintage, quality, and cost of a wine. By making that data available to developers through APIs and SDKs, which Facebook intends to do, the potential for building immersive augmented reality applications could grow by several orders of magnitude.
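What "coding against the real world" might look like can be sketched as follows; this is an assumption-laden illustration, not Facebook's SDK (the `SceneObject` type, the confidence threshold, and the `notes` field are all hypothetical). The idea is that a developer receives the platform's detections, with their confidence scores and any user-applied annotations, and builds application logic on top.

```python
# Hypothetical sketch of consuming scene-understanding data.
# Names and structure are illustrative, not a real Facebook API.
from dataclasses import dataclass, field

@dataclass
class SceneObject:
    label: str          # e.g. "wine_bottle", "countertop"
    confidence: float   # detector's confidence, 0..1
    notes: list = field(default_factory=list)  # user-applied augmentations

def confident_objects(scene, threshold=0.8):
    """Keep only detections the platform is reasonably sure about."""
    return [o for o in scene if o.confidence >= threshold]

# A toy scene as the platform might report it.
scene = [
    SceneObject("wine_bottle", 0.93, ["2014 vintage, drinks well now"]),
    SceneObject("countertop", 0.88),
    SceneObject("dog", 0.41),  # too uncertain to act on
]

for obj in confident_objects(scene):
    print(obj.label, obj.notes)
```

An app built this way might, for instance, surface a friend's tasting notes whenever a recognized wine bottle enters the camera's view, while ignoring low-confidence detections.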