With all the phone and watch and TV and game and chip and other chip news coming out of Apple's big event last week, it was easy to forget the company's longest-running background process: an augmented-reality wearable. That's by design. Silicon Valley's advent calendar clearly marks September as the traditional time for Apple to talk finished hardware, not secretive projects.

But those secretive projects have a weird habit of poking their heads into the light. A slew of features and language discovered recently inside iOS 13 and 13.1 seem to explicitly confirm the very thing Apple executives have steadfastly refused to acknowledge—an honest-to-Jobs AR headset. In fact, taken in conjunction with acquisitions and patent filings the company has made over the past several years, those hidden features have painted the clearest view yet of Apple's augmented dreams.

Hard to StarBoard

First came StarBoard. At the very beginning of September, a leaked internal build of iOS 13 was found to contain a "readme" file referring to StarBoard, a system that allows developers to view stereo-enabled AR apps on an iPhone. The build also included an app called StarTester to accomplish exactly that. That marked the first explicit mention of stereo apps—i.e., those that output to separate displays, like those found in AR/VR headsets—in Apple material.
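To make "stereo-enabled" concrete: a stereo app renders the same scene twice from two slightly offset camera positions, one per eye, so a headset can fuse them into a 3D image. The sketch below is purely illustrative—none of these names come from Apple's private StarBoard framework; it just shows the per-eye offset idea using a typical interpupillary distance.

```swift
import simd

// Illustrative sketch only — not StarBoard API. A stereo renderer derives
// two eye positions from a single camera position by offsetting each eye
// half the interpupillary distance (IPD) along the camera's right axis.
func eyePositions(camera: SIMD3<Float>,
                  interpupillaryDistance ipd: Float) -> (left: SIMD3<Float>, right: SIMD3<Float>) {
    // Assume +x is the camera's right axis for simplicity.
    let halfOffset = SIMD3<Float>(ipd / 2, 0, 0)
    return (left: camera - halfOffset, right: camera + halfOffset)
}

// A common average adult IPD is about 64 mm (0.064 m).
let (left, right) = eyePositions(camera: SIMD3<Float>(0, 1.6, 0),
                                 interpupillaryDistance: 0.064)
```

Each eye's view is then rendered separately; on a phone-driven headset, the phone would composite both views into a single side-by-side frame.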

Not long after, on the day of the hardware event, Apple released Xcode 11, the newest version of the company's macOS development environment. Inside that set of tools lurked data files for what appeared to be two different headsets, codenamed Franc and Luck. The same day, iOS developer Steve Troughton-Smith found the StarBoard framework in the official "golden master" of iOS 13; he also pointed out references to "HME," which many speculated stood for "head-mounted experience." (HMD, or head-mounted display, is a common term for a VR/AR headset.)

So far, so unprecedented. When Apple first released ARKit in 2017, it was the beginning of a long journey to familiarize developers with augmented reality and get them playing with the possibilities. Yet, the company has always been careful to situate AR as a mobile technology, people peeking through iPhones or iPads to shop or play with Legos, or even experience public art installations. Finding this kind of data, even hidden deep within OS developer files, marks an uncharacteristic transparency from Apple—as though the company is planning something sooner rather than later.

What that thing might be depends on who you ask. Reports from Bloomberg News and Taiwanese analyst Ming-Chi Kuo have long claimed that Apple would begin production on an AR headset this year for release in 2020—one that acts more like a peripheral than an all-in-one device, depending on the iPhone to handle the processing power.

Troughton-Smith came to a similar conclusion after poking through iOS 13. "The picture of Apple’s AR efforts from iOS 13 is very different to what one might expect," he tweeted. "It points to the headset being a much more passive display accessory for iPhone than a device with an OS of its own. The iPhone seems to do everything; ARKit is the compositor."

That idea of a passive display accessory got fleshed out late last week, when another developer got StarTester up and running on a beta of iOS 13.1, which officially comes out today.

That developer also found numbers in the iOS framework specifying the fields of view for the two headset codenames: 58 and 61 degrees for Luck and Franc, respectively. (A third codename, Garta, seems to refer to a testing mode rather than a specific device.)
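For a sense of what those field-of-view figures mean in practice: the visible width at a distance d from the viewer is 2 · d · tan(fov/2). The snippet below applies that standard bit of geometry to the leaked 58- and 61-degree values; the function name and distance are illustrative assumptions, not anything from the iOS files.

```swift
import Foundation

// Illustrative geometry, not from Apple's files: the width of the visible
// region at distance d for a given horizontal field of view.
func visibleWidth(fovDegrees: Double, atDistance d: Double) -> Double {
    let halfAngle = fovDegrees * .pi / 180 / 2
    return 2 * d * tan(halfAngle)
}

// At one meter, the leaked Luck (58°) and Franc (61°) values cover
// roughly 1.11 m and 1.18 m of width, respectively.
let luck  = visibleWidth(fovDegrees: 58, atDistance: 1.0)
let franc = visibleWidth(fovDegrees: 61, atDistance: 1.0)
```

Both figures sit well below a typical VR headset's roughly 100-degree field of view, consistent with the reports of a lighter, phone-driven AR accessory.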