It may only be two years old, but Oculus’ Connect developer conference has already established some cherished traditions. Many look forward to John Carmack’s annual assault on developers’ brains with an information overload, but Chief Scientist Michael Abrash’s own talk, which usually follows the primary keynote, is also not to be missed.

This year, Abrash made some bold predictions about what VR headsets will look like half a decade from now.

At the start of his talk, however, he also mentioned that he’d made some predictions back at the first Oculus Connect in 2014 that had largely come true.

So what exactly did Abrash predict? What was he right about? What’s still to come? We stepped back in time to take a look and found four statements the head of Oculus Research had made that are still an important part of the conversation today.

Prediction: VR will spur a massive acceleration in graphics

What he said: “The reason that graphics is going to have to up its game in a hurry is that because VR engages so much more of the perceptual system than a monitor does, it’s held to a much higher standard. Parallax, accurate positioning, wide field of view and stereovision together provide vastly more information to the perceptual system, which responds very strongly to VR done right and complains just as strongly about VR that isn’t quite right.”

What he means: With VR, you can’t accept any substitutes. We need graphics that can cope with the demands of imitating real life, whether that’s seeing more of the world than a traditional first-person shooter shows you, or running top-tier games under the added strain of stereoscopic display. VR is demanding, and it’s on GPU manufacturers to provide consumers with hardware that delivers believable experiences, not just in visual fidelity but also in framerate and performance.

Was he right?: Absolutely. In the months following the release of both the Oculus Rift and HTC Vive, Nvidia and AMD have upped their game with more powerful GPUs, like the GeForce GTX 1080, and more affordable offerings, like the Radeon RX 480. There’s still a long way to go, but we’ve already made great progress.

Prediction: Eye-tracking will play a big part in the future of VR

What he said: “Eye tracking will almost certainly be a part of the future of VR. We wouldn’t actually draw anything close to 32K by 24K pixels [on a display], we’d take advantage of the steep drop in resolution away from the fovea, the tiny high-resolution area of the retina, to render only as much resolution as the eye can detect. But, of course, we’d need to know where the fovea was pointed every frame, which means we’ll need to integrate eye-tracking with new display, optics, GPU, and rendering technology in order to make a truly great VR visual experience possible.”

What he means: Foveated rendering. Abrash himself went much more in-depth on the idea at this year’s event — but it’s interesting to see just how early on he had conceptualized it. Essentially it refers to using eye-tracking to pinpoint where a user is looking on a display and then only fully rendering the area the retina is focusing on. Less detail is rendered where you won’t notice it, freeing up resources for other tasks.
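The core idea can be sketched in a few lines: render detail falls off with distance from the tracked gaze point, full resolution inside the fovea and progressively less toward the periphery. Here’s a toy Python illustration of that falloff; the function name, radius, and clamp value are all made up for the example and don’t come from any real VR SDK.

```python
import math

def foveation_scale(px, py, gaze_x, gaze_y, fovea_radius=200.0):
    """Return a render-resolution scale for a pixel (1.0 = full detail).

    Detail is highest inside the foveal region around the gaze point
    and falls off smoothly with distance, clamped to a minimum so the
    periphery is still rendered at coarse quality.
    """
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0  # foveal region: render at full resolution
    # Inverse-distance falloff toward the periphery, never below 1/8 detail
    return max(0.125, fovea_radius / dist)
```

A real implementation would work in visual angle rather than pixels and feed these scales into the GPU’s variable-rate shading, but the principle, spend pixels only where the eye can see them, is the same.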

Was he right?: Yes, but we’re not there yet. Companies are starting to dabble with foveated rendering, but only one VR headset, FOVE, currently offers eye-tracking. Now Abrash is predicting we’ll need a solution far more advanced than what’s currently available to achieve big gains in VR quality.

Prediction: Social VR will be a core type of experience

What he said: “Once we have virtual places to be in, we’ll want to see ourselves and others in that space. Sensors that can capture the inter-personal cues humans key off of: head-pose, eye-movement, facial expression, hand gestures, body posture and movement, and then map them onto avatars in real time will enable social interaction in these virtual spaces.”

What he means: Experiences in which you can meet up with other players and believe they’re standing with you in VR environments are going to be hugely important to the future of the technology. We’ll want to share our amazing life-like moments with others in real time, just as we would when we meet up in real life.

Was he right?: Abrash’s vision came to life at OC3 this year when Facebook demonstrated its own social VR experience that used Rift and Touch to deliver the expressions he mentioned two years ago, at least in a cartoon-ish fashion. We’re far off from doing this photo-realistically, but what we have now is still pretty convincing.

Prediction: VR will grow to include all senses

What he said: “VR is about driving the perceptual system, the more of it the better. So, eventually, it will involve all the senses: audio, tactile, balance, kinesthetic, maybe even smell and taste. The entire body will become the sensor, not just the eyes, and the display will consist of not just pixels, but of everything that drives perception.”

What he means: Headsets are just the start. VR has a long list of senses to conquer before we can truly consider it the “alternate reality” Abrash predicted it would become as he wrapped up. We want to walk into rooms and feel the surfaces of desks and sofas, smell the sea air and feel the sand between our toes as we walk along the beach and, in time, maybe even taste the pizzas we grab infinite slices of as we sit down to watch a movie with friends halfway across the world.

Was he right?: He probably will be, one day. Work in haptics and audio still has some way to go before it can effectively stand in for the real thing, and the kinds of technologies that would replicate taste and smell may not even exist yet. If those problems are solved in the decades to come, however, we’ll have VR that is indistinguishable from real life.