By the mid-60s, the computer-graphics pioneer Ivan Sutherland had described a concept, sometimes called the “Ultimate Display,” that would form the basis for later VR apparatuses, including the Oculus Rift and its brethren. Sutherland’s idea seems obvious in retrospect, but nothing of the sort had existed before, nor was it feasible at the time he imagined it: a computer-rendered 3-D virtual world viewed through a head-mounted display, with tactile and audio feedback, and user interaction.

The name “virtual reality” didn’t become popular until the 1980s, when Jaron Lanier’s company VPL Research began using the term for its headsets, gloves, and related paraphernalia. The head-mounted displays were heavy and expensive, running tens of thousands of dollars at the time (the equivalent of well over $100k today).

But if virtual reality was conceived in the ‘60s and born in the ‘80s, it came of age in the 1990s. Not commercially—although not for lack of trying. SEGA and Nintendo, the gaming rivals of the console-war era, both attempted VR in the mid-90s: Nintendo’s Virtual Boy flopped so badly it was discontinued within a year, and SEGA’s announced Sega VR headset never even made it to market. No, the ‘90s were important for VR because that decade’s media set the terms for the fantasy of virtual reality.

Star Trek: The Next Generation had already introduced the holodeck back in 1987, but the simulation chamber has more in common with what we call augmented reality today: computer-generated experiences that overlay themselves atop the ordinary world. The holodeck still inspires new devices, including the Microsoft HoloLens, a technology capable of displaying holographic projections for the wearer of a special set of goggles, and the mysterious, well-funded startup Magic Leap, which appears to project computer graphics directly onto the user’s retina.

The holodeck and its progeny promise a temporary world, separate from but contiguous with the real world. But VR always promised to replace the real world. Sensory immersion set the stage for the implied or actual transformation of the human world into the machine world. VR, in other words, is fundamentally distrustful of “meatspace,” and it ultimately seeks to suspend the real world or supersede it with an alternative reality.

Media of the 1990s, particularly film, was obsessed with virtual reality as alternative reality—and often as dystopia. In the 1992 film The Lawnmower Man, for example, an intellectually disabled man named Jobe becomes the subject of a researcher’s experiments to increase the intelligence of apes using virtual reality. Eventually, Jobe becomes telekinetic and uploads himself into the computer network, overcoming the limitations of the human world.

Like The Lawnmower Man, most films of the ‘90s that address VR imply a deviance or danger in the technology. Sensory immersion implied the potential to abandon humanity, or to abscond from it. Take Kathryn Bigelow’s 1995 film Strange Days, set in a fictional Los Angeles of the turn of the millennium (that’s 1999, back then). An illegal device called a SQUID (Superconducting Quantum Interference Device) can record experiences directly from the brain, which another viewer can play back from a “deck,” encountering all of the recorder’s mental and physical sensations. Amplified SQUID signals can “cook off” their viewers, rendering them brain-dead—the film’s not-so-subtle rendition of overdose, which also connects SQUID to the dangers and prohibitions of narcotics. The film’s plot revolves around solving a series of murders captured on SQUID discs.