Google Glass and the Apple Watch have both flopped for the same two reasons. First, neither is very attractive. Second, neither proved as essential as the phone that’s already in your pocket.

But a new concept called Monitorless, developed by Samsung’s internal C-Lab, recognizes something vital: if we’re going to carry around any second screen, and especially if we’re going to wear something as obtrusive as glasses, then it needs to augment our reality not as some secondary experience, but as the OS of our lives.

Featured on RoadtoVR, the glasses look something like Snap’s Spectacles. They’re an extra-thick pair of Ray-Ban knock-offs that put a display in each eye. But the clever part is how they’re thought to work. The glasses connect via Wi-Fi to your phone, so you can see screens from your phone floating right in front of you. Then, your phone can connect via 4G to your computer. So if you’re at your desk using Photoshop, maybe you’ll just use it as usual on your desktop monitor. Or maybe you’ll use your phone to stream Photoshop from your PC directly into your eyes (in which case, your phone becomes a touchscreen controller, too).

Then, for the pièce de résistance, the glasses can glide seamlessly on a gradient between augmented reality (graphics that sit on top of but don’t block your view) and virtual reality (graphics that totally shut out the world around you), changing their immersiveness to fit the mood. Want to play a game that tunes the world out with full-bleed graphics? Fine! Want to just have a semi-transparent clock in your field of view? Also fine! The glasses acknowledge that you may want to give them different levels of attention within different contexts.

Now I’m not saying Samsung has painted the perfect portrait of augmented reality here. For instance, do we really want our Android apps floating in the shape of our phone screen in front of our face? Is that really the best way to shape and place such content?

Probably not. But the company has acknowledged, and attempted to solve, a serious infrastructure reckoning through user interface. Namely, AR and VR will be the most intimate user interfaces we’ve ever experienced. They can be with us anywhere at any time. And yet, our smartphones are already horrible at talking to our computers (think how laborious it is just to send a picture or document from your iPhone to your MacBook). So how will adding yet another screen fit into the equation? Samsung suggests that AR is not a replacement for smartphones and PCs, but an extension of them that can access each at will. And it’s perfectly fitting that Samsung called its design “Monitorless.” Because executed properly, these glasses would be the only “monitor” you’d need, fed information by all of the devices you already own.

A skeptic might ask: Wouldn’t it be better if Monitorless beamed straight to the cloud like some really advanced Google Doc, leaving all of your old devices from the 20th century behind? Again, sure. But progress is slower than we tend to imagine, and old habits die hard. The phone and desktop PC aren’t going anywhere any time soon. Monitorless just suggests how AR could fit into our lives as they are today.