Black Mirror seems simultaneously attainable and impossible. Each episode explores the potential pitfalls of technology, both futuristic and around the corner. But what makes the show border on creepy instead of simply fascinating is this question: How possible is it that this thing will happen? How far into the future is — or isn’t — it?

The answers vary. We took five of our favorite episodes and asked experts when we could expect the plots’ technology to launch. And honestly, it’s all feeling a little too real now.

Episode: “Hated in the Nation”

Technology: Surveillance Honeybee Drones That Kill

ETA: 2019 (Maybe now?)

Molly McHugh: In “Hated in the Nation,” the sixth episode of Black Mirror Season 3, we are confronted with a very real problem: the near extinction of bees. Drone bees have been created to supplement the dying population, but it turns out they are also being used as a means for mass government surveillance. They can be hacked, too, and then used to target and kill citizens. Also, it all seems entirely probable!

“This is actually a really funny concept that could potentially happen,” said Kyle Foley, president of the drone technology company Skyworks Project, via email. “The way I would imagine it working is using object detection [which is already happening] to differentiate between flowers and people. All you need onboard is a very tiny computer and camera to tell the two apart. The technology is there, it’s just a matter of miniaturizing it … It would need to have some type of ‘stinger’ on the drone that can puncture the skin and inject a very small amount of some very poisonous fluid, or maybe a way to release some type of powder. With a drone that small, it would be very difficult to carry a large amount of this substance, so it would likely be a one-shot, one-kill type of deal.”
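The decision loop Foley describes — detect an object, classify it as flower or person, and act only on confident flower detections — can be sketched in a few lines. Everything here is hypothetical: the `detections` list stands in for the output of a real onboard object detector, and the labels and confidence threshold are invented for illustration.

```python
# Hypothetical sketch of the onboard pipeline Foley describes:
# an object detector labels what the camera sees, and the drone
# approaches only objects it is confident are flowers.

def choose_targets(detections, threshold=0.9):
    """Keep only detections confidently labeled as flowers.

    `detections` mimics object-detector output: (label, confidence) pairs.
    """
    return [
        (label, conf)
        for label, conf in detections
        if label == "flower" and conf >= threshold
    ]

# Simulated detector output for one camera frame.
frame = [("flower", 0.97), ("person", 0.99), ("flower", 0.62)]
print(choose_targets(frame))  # only the high-confidence flower survives
```

The unnerving part of the episode is, of course, that the same filter flipped around — keeping the `"person"` detections instead — is an equally small change to the code.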

Foley says that in the next two to three years, drones are going to get smaller and smaller — they may get as small as, say, a honeybee. “Currently they are pretty small, but not bumblebee small.” He says right now there are drones the size of goldfinches that “could definitely do what you are talking about.” He gives it two to three years until this could be a real thing. “Kinda scary to think about …”

Another expert, David Wright, who’s a managing partner at Trilateral Research and wrote a paper called “Ethical dilemma scenarios and emerging technologies,” says the Defense Advanced Research Projects Agency has been working on creating “dragonfly drones” for years now, as has the U.K. government. “I would assume the U.S. and U.K. would have fully functional dragonfly drones in a few years, probably less than five. Some speculate they might already be among us,” Wright said via email. He adds that the hacking story line figures, too. “Internet of Things (IoT) devices can be hacked and become part of a botnet, so I would assume it would be possible to hack a dragonfly drone and convert it from a surveillance tool to a weapon — although given its tiny size, I’m not sure how dangerous a payload it could carry — a few drops of nitroglycerin, perhaps?”

Basically, we are mere years away from (or already living in) a world where this could happen, though it seems the drone bees will be able to carry only (currently only?!) a small supply of a deadly substance.

Episode: “The Entire History of You”

Technology: Thought-Controlled Contact Lenses

ETA: 2021

Victor Luckerson: Several Black Mirror episodes, including Season 1’s “The Entire History of You” and Season 3’s “Nosedive” and “Men Against Fire,” use some variation of a camera-equipped contact lens that can scan and record the user’s field of vision. Think of it as a more powerful, miniaturized version of Google Glass.

Sony and Samsung have filed patents for smart contact technology, but there are a lot of challenges to bringing such a device to reality. Excess heat generated by the lens could be damaging to the eye. Electronic components small enough to fit into a lens form factor are expensive. But by offloading a lot of the computing effort to the cloud or a nearby paired device like a smartphone, the complexity of the lens itself could be kept in check.

Kohitij Kar, a researcher at the MIT Department of Brain and Cognitive Sciences, believes lenses with functionality largely similar to Google Glass could be available to the market within five years, though they’d likely be used for medical or military purposes at first. And they’d probably have to be controlled by a separate device, like a phone, or via voice commands. The ability to control a smart contact via thoughts, as the characters in “The Entire History of You” do, could eventually be feasible as well, though.

“There is recent work showing that imagined objects have similar brain activity patterns compared to when those objects were shown to the subjects,” Kar says. “This means, you can ask someone to imagine a car or an elephant (for example) and then given their brain activity, reliably predict which object they imagined … We can also decode particular faces from our brain. Hence it is theoretically possible to turn on videos about particular friends or family members, given some link between the imagined face and the correct switch for the ocular device. Although all this is theoretically feasible, integrating the brain decoder with the intra-ocular optical device will be challenging. Most complex brain data comes from functional magnetic resonance imaging or invasive recordings (which currently require pretty extensive and expensive setups). Instead of trying to get hold of thoughts, just using voice commands might be a good starting point.”
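Kar’s decoding claim — that given someone’s brain activity you can reliably predict which object they imagined — can be illustrated with a toy nearest-centroid decoder. All the numbers below are fabricated stand-ins for fMRI voxel patterns; real decoding uses far richer data and models.

```python
import math

# Toy illustration of Kar's point: if imagining an object evokes an
# activity pattern similar to actually seeing it, then a simple
# nearest-centroid classifier can "decode" the imagined object.
# The vectors are fabricated stand-ins for fMRI voxel patterns.

templates = {              # average activity while *viewing* each object
    "car": [0.9, 0.1, 0.3],
    "elephant": [0.2, 0.8, 0.7],
}

def decode(activity):
    """Return the template label closest (Euclidean distance) to `activity`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(templates[label], activity))

# A noisy "imagined elephant" pattern still lands on the right template.
imagined = [0.25, 0.75, 0.6]
print(decode(imagined))  # elephant
```

The hard part, as Kar notes, isn’t this classification step — it’s getting clean enough brain data without a room-sized scanner, which is why voice commands are the more realistic near-term interface.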

Episode: “Men Against Fire”

Technology: IRL Snapchat-Style Face-Filtering

ETA: “Hopefully” in the 2030s

Claire McNear: Let’s get it out of the way: “Men Against Fire” is not Black Mirror’s best episode. But I would posit that it’s one of the series’ most chilling. While most episodes depict technology disrupting the banalities of daily life, here we see it co-opted by a real-world organization that is often responsible for taking tech to its darkest, deadliest places: the military.

In “Men Against Fire,” we find soldiers hunting down and killing “Roaches,” a race of humanoids we’re told have a blood disease that causes them to be grossly disfigured. The soldiers are assisted in their quest by military-supplied neural implants that equip them with all manner of augmented reality capabilities: maps of battlefields, visual links to nearby drones, additional data about combatants, etc. The twist comes when one soldier has his implant damaged and loses AR functionality — revealing to him that the Roaches he and his fellow soldiers had been gleefully shooting are in fact perfectly normal, disease-free human beings. The military was simply using the implants to make the Roaches — who we learn are a politically persecuted minority — appear monstrous so that soldiers tasked with eliminating them wouldn’t be distracted or experience the trauma that necessarily comes from fighting fellow humans.

There’s a lot here, including the ability to revise memories, but let’s focus on the episode’s main item. Could the military use AR technology to give its soldiers the impression that enemy combatants are zombie-like monsters?

“I see it as a natural evolution of the user’s media experience,” says Mark Skwarek, a lecturer at NYU’s Tandon School of Engineering who works on augmented reality, of advanced AR like the implants in “Men Against Fire.” “We won’t be looking at 2-D smartphones for too much longer — people will be surrounded by a 3-D interface.”

If you’ve ever vomited up an AR rainbow, you know that live-filter tech in its current state is very good. Snapchat still owns the game, but other contenders have shown themselves just as capable of creating realistic digital filters and masks that are highly responsive to the size, movements, and angles of whatever face they’re fixed on. Take a gander at the Halloween masks designed by Prisma and rolled out for use across Facebook Live last month: If you want to make somebody look like a creepy-ass Voldemort knockoff, you can do it with ease. “[T]his would definitely be doable on our end,” says Ellen Taylor of Facetune, a photo editing app dedicated to facial tweaks, while acknowledging that what happened in “Men Against Fire” is “a bit of a different direction from how we work.”

The insidious part of “Men Against Fire” is that the Roach-ifying is automatically applied to a specific ethnic group that we learn has been systematically registered by the government. The technology to allow this is, more or less, here as well — in the same way that facial recognition software suggests that you tag a known friend, it might suggest a default filter for a recognized face. In the not-so-distant future, you could set a default filter for yourself that would appear across devices. Don’t like that scar or mole on your cheek? An app could remember that and filter it out, even when you appear on a friend’s device.
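The tag-suggestion analogy maps onto a very small data model: a lookup from a recognized face to that person’s registered filter, consulted by whatever device happens to be rendering them. The face IDs and filter names below are invented for illustration.

```python
# Minimal sketch of the "default filter follows the face" idea:
# once a face is recognized, any viewing device looks up and applies
# the filter registered for that face. IDs and names are invented.

default_filters = {}  # face_id -> filter name, synced across devices

def register_filter(face_id, filter_name):
    """Record the filter that should always be applied to this face."""
    default_filters[face_id] = filter_name

def render(face_id):
    """Return the filter a viewing device should apply, if any."""
    return default_filters.get(face_id, "none")

register_filter("face:alice", "hide-cheek-mole")
print(render("face:alice"))     # applied even on a friend's device
print(render("face:stranger"))  # no registered filter
```

The episode’s dark turn is just a change of who controls the table: instead of you registering a filter for your own face, a government registers one for an entire group’s.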

“It’s definitely not a stretch to imagine this applied to different sorts of livestreams,” says Stav Tishler, also of Facetune.

But automatic filtering can only do so much while the devices that enable AR remain niche products — no one’s taking the bus while wearing an Oculus Rift. Face-filtering “won’t really be great [until] it’s seen through some type of lightweight glasses,” says Skwarek. “It will have the ability to create hyper-realistic real-world people, objects, environments, and experiences.”

A brain implant would seem to solve that issue (and open up the possibility of darker uses than cosmetic mole removal), and this, at least, is still quite a ways off. That’s not to say that scientists aren’t working on it: Last year, a researcher at DARPA’s Biological Technologies Office proposed a “cortical modem” that would connect directly to the brain’s visual cortex, potentially working around blindness and creating Terminator-style optical displays.

Once VR is truly immersive, the difference between what’s real and what’s perceived begins to bend. “VR tests with children show that immersive experiences are remembered in the part of the brain with real-life memories,” says Skwarek. “TV shows, videos, movies, and 2-D images like photos are remembered at another location. It will be a very powerful experience.”

Episode: “San Junipero”

Technology: Eternal Life Via Uploaded Consciousness

ETA: 2026 (for Snails); 2036-plus (-ish, Maybe, for Humans)

Alyssa Bereznak: In “San Junipero,” two women named Kelly and Yorkie meet in a beachside town and form a special connection. As the episode unfolds, however, we learn this isn’t any simple tourist destination, but a virtual reality playground for the dead and terminally ill, where you’re assigned era-specific outfits and dope Barbie cars. As each creeps closer to death in the flesh, the two grapple with whether they should continue their relationship in the virtual world — to “pass over” rather than “pass away.” In other words, they live in a future where they can choose to either disappear, or have a replica of their human consciousness uploaded to the all-powerful cloud, existing in a fantasy land forever.

The whole thing sounds pretty awesome and horrifying. But Mikhail A. Lebedev, a senior research scientist at Duke’s center for neurobiology, says we’re very far off from seeing it enacted within our lifetimes.

“We are currently at the stage of [this being] science fiction,” Lebedev told me. “There is no practical way of uploading the entire consciousness of a human to an external carrier.”

To get to some “San Junipero”–level stuff, we have some work to do. First experts need to build an artificial brain — research that’s currently being funded by the BRAIN Initiative, Blue Brain Project, and Human Brain Project. Though these initiatives have helped with things like mapping brain connectivity, Lebedev argues they lack a clear plan for delivering an artificial brain anytime soon. Instead, he predicts researchers will likely copy the brain of a much simpler organism — maybe a snail — within the next 10 years.

But once we theoretically have an artificial brain, there’s the requirement for scientists to actually understand what consciousness is. (Cue “mind blown” GIF.) There are many theories in this field, but Lebedev is skeptical. “We know absolutely nothing about the mechanisms of consciousness,” he said. “I believe that this quest for the origin of consciousness is doomed.”

If scientists somehow jump those hurdles, they must then figure out how to read the content of the brain. Here, again, Lebedev is not optimistic. “Theoretically, you could interrogate every neuron in the brain and then set neurons in the artificial brain the same way. However, this will definitely fail: Errors in reading out the information from individual neurons will mount to a huge overall error when billions of neurons are reproduced, so the resultant ‘soul’ would not be anywhere near to the original one.”
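Lebedev’s point about mounting errors is easy to make concrete: even with an implausibly accurate per-neuron readout, the odds of copying every neuron correctly collapse once you multiply across tens of billions of them. The per-neuron accuracy figure below is an assumption for illustration.

```python
import math

# If each of N neurons is read correctly with probability p, the
# chance the *whole* brain state is copied without error is p**N.
# Even a 99.9999%-accurate readout fails utterly at brain scale.

N = 86_000_000_000   # rough neuron count in a human brain
p = 0.999999         # hypothetical per-neuron read accuracy

# p**N underflows to zero if computed directly, so work in log space:
log_prob = N * math.log(p)   # natural log of p**N, roughly -86000
print(log_prob)              # the probability itself is about e**-86000

expected_errors = N * (1 - p)
print(f"{expected_errors:,.0f} neurons misread on average")
```

At that error rate, tens of thousands of neurons are miscopied on every attempt — which is exactly why Lebedev expects the resulting “soul” to drift far from the original.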

So how far off, then, are we from a euphoric afterlife like Kelly and Yorkie’s? Well beyond 20 years. That’s how long Lebedev estimates it’ll take for good artificial brains to emerge. And who knows what will happen after that? In the meantime we can probably expect a commercial product that mixes AI and robotics to reproduce the thinking and behavior of a deceased person, much like the experiments many artists and technologists have released in the past year.

Episode: “Playtest”

Technology: Implanted VR Horror Experiences

ETA: Never, Probably

Kate Knibbs: In “Playtest,” a broke traveler takes a gig testing a new, top-secret virtual reality horror game. He gets a spinal implant to connect his brain to the gaming interface, for gameplay customized to his worst nightmares. When do today’s actual scary, brain-warping VR horror-makers think we can expect that type of technology in consumer gaming products? The good news: Not anytime soon, and probably not ever.

“The ‘Playtest’ Black Mirror episode was great and theoretically possible, but we are light-years away from something like that,” Shawn Hitchcock, who created the horror VR game “Emily Wants to Play,” said via email. “So for now, if anyone wants to stick something into the back of your neck, don’t let them!”

Hitchcock’s not the only horror gaming expert with doubts. “For actual ‘spinal Wi-Fi’ I would say ‘never, or at least not on any timeline we can guess,’” horror VR game developer Jarod Pranno, who works for Chicago-based Phosphor Games Studio, said.

Pranno noted that there are already some games with brain-wave-reading interfaces, including Mattel’s Mindflex — but, of course, nothing made by Mattel requires a drill to the spine. “I think the main barrier to that kind of VR is the brain-computer interface; at least in the game development world, no one is seriously considering or working on a spinal uplink of any kind, and even if we had that, no one would know what to do with it or how to ‘upload’ to the brain.”

Hitchcock elaborated on why “Playtest” is too far-out to happen. “In the episode they directly interface with the user’s brain. The device is physically attached to their nervous system through the back of their neck. This itself would be very dangerous and could cause all sorts of problems, from numbness in limbs to complete paralysis. It also wouldn’t work correctly unless you mapped out the entire central nervous system of the user. Then you would need to perfectly attach wires to the nerves. If that worked out, you would need software that could send information to the user’s brain and also get information back from it,” Hitchcock said. “Imagine the complexity, and how would you send a signal that would cause any type of experience?”

“Another important aspect is receiving information back from the user’s brain. You would need to tell the software what the person is doing and seeing to correctly project the proper sights and sounds. We can barely get any info from the brain and make sense of it. We know if there is activity and where the activity is happening but we can’t look at the signal and determine it is from a person experiencing a roller-coaster ride or their 20th birthday party,” Hitchcock said.

So: It can’t be done! I, personally, find this enormously comforting. But at least one of my expert friends disagrees. “It’s kind of a shame because if you think about it, the human brain is the most advanced virtual reality experience generator in existence, as we discover every night in our dreams,” Pranno said.