We got lucky with the weather in Washington State. It’s a clear afternoon with a few scattered clouds, low wind speeds—ideal flying conditions. Mike Dubbury’s calm briefing helps, too. Honeywell’s senior test pilot talks me through the upcoming trip in the Beechcraft King Air C90, which I’ll be piloting.

I can’t quite relax, though. Not just because I’ve never flown a plane before, but because I’ll be doing it without touching the flight controls—by thought alone. And except for the guy who came up with this contraption, I’ll be the first person to fly the mind-controlled King Air.

While Dubbury delivers his safety instructions, Santosh Mathan wires me up. A neurotechnology researcher at Honeywell Aerospace, he invented this system. Mathan helps pull what looks like a navy blue swimming cap, dotted with a series of holes, onto my head. He squirts cold conducting gel into each hole, then threads 32 electrodes through the cap and onto my scalp. I’m left with an old-fashioned ribbon cable—the kind that would have connected a computer and a dot matrix printer—hanging down like a ponytail. I look like a steampunk Andre Agassi.

“We’ll be doing a bunch of basic maneuvers—climbs, descents, turns—around Puget Sound,” says Mathan, picking up from Dubbury. And by we, he means me, a guy with no pilot’s license and no flying experience. That’s what I’m thinking as we idle on the runway, waiting for our takeoff slot.

Honeywell has built a brain-computer interface into this six-seat twin turboprop’s autopilot. The system interprets patterns of electrical activity in the brain, watching for signals that almost anyone can produce with a few minutes of training. In this plane, those patterns translate to commands to climb or bank left or drop a few thousand feet. “We saw control of an aircraft as a nice target in order to develop, refine, and test our neurotechnology,” Mathan says. Today’s flight is the culmination of 12 years of work. His team has shown that the system works in a simulator, and now they want to push it further by taking it to the air, with the minimal (but real) risk of death by crashing.

A few minutes after Dubbury handles takeoff, the system is ready for me. Mathan tells me to make the plane climb.

While not quite as simple as “think up, fly up,” it’s close. I’m sitting in front of an iPad-sized screen, which has arrows for up, down, left, and right, plus a level flight indicator in the center. A green box flashes around each command, one at a time, seemingly at random. My job is to focus on the arrow that reflects what I want to do.


When the box surrounds the command in question, my brain produces an electrical signal called an event-related potential. Those signals, which originate in the visual perception area at the back of the brain and ripple across the cortex, aren’t easy to spot. It’s not just that they register at under 10 microvolts, one-tenth the amplitude of normal brain activity. Or that muscle movements like blinks create their own signals and obscure what I’m trying to tell the plane. Or that I had just 15 minutes of practice in a simulator before takeoff.

It’s also that the very act of focusing in the noisy, crowded, stressful environment of a small cockpit is difficult. I have air traffic control squawking in my ears, sunlight glinting off the gauges, propeller noise, and the disconcerting knowledge that I’m trying to fly a plane by thinking about a bunch of green arrows.

I relax and put all my mental energy into watching the up arrow. To make sure it’s picking up an intentional command and not an eye twitch, the computer waits until it’s registered several signals in a row. Then the plane climbs. Just like that.
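The confirmation logic described above can be sketched in a few lines of code. This is purely illustrative: the command names, the threshold of three consecutive hits, and the function itself are assumptions for the sake of the example, not details of Honeywell’s actual system.

```python
# Hypothetical sketch of the confirmation rule described above: the
# autopilot accepts a command only after the classifier flags the same
# flashed arrow several times in a row, filtering out one-off artifacts
# like an eye blink. All names and thresholds here are illustrative.

COMMANDS = ["up", "down", "left", "right", "level"]
CONSECUTIVE_HITS_NEEDED = 3  # assumed value, not from the article


def select_command(detections):
    """Return the first command detected CONSECUTIVE_HITS_NEEDED times
    in a row, or None if the stream ends without a confident match.

    `detections` is the per-flash classifier output: the command whose
    flash coincided with an event-related potential, or None when no
    potential was detected for that flash.
    """
    streak_cmd, streak_len = None, 0
    for detected in detections:
        if detected is not None and detected == streak_cmd:
            streak_len += 1
        else:
            streak_cmd = detected
            streak_len = 1 if detected is not None else 0
        if streak_cmd and streak_len >= CONSECUTIVE_HITS_NEEDED:
            return streak_cmd
    return None


# A stray blink mimics "left" once, but only "up" repeats enough to count.
print(select_command(["up", "left", "up", "up", "up"]))  # → up
```

The point of the streak requirement is exactly the trade-off the flight exposes: it makes a false command from a twitch unlikely, at the cost of the seconds of sustained focus each maneuver demands.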

The first few maneuvers, I can’t believe it’s actually me in command. And then comes the euphoria. I am swooping through clouds, climbing, diving, flying in circles, all at my whim.

Except it doesn’t feel quite that free. Each move takes at least 10 seconds of hard concentration, sometimes longer, trying to ignore everything that’s going on around me. (Mathan says that if he had a couple of days to calibrate the system to my brain, he could speed that up.)

And then there’s the lack of feedback. The plane isn’t an extension of my body. Where birds and Chuck Yeager fly by feel and instinct, I have to look up to see whether my command is being implemented, and then look quickly back at the screen to start focusing on the next direction. And I’m only flying within the confines of the autopilot system. I’m issuing simple updated commands to it, nothing as complicated as taking off or landing.

By the time Dubbury puts us back on the ground, a headache is setting in. I can’t tell if it’s from focusing hard, the noise and bright sunlight in the small plane, or the physical pressure of the skullcap with headphones clamped over the top of it.


Whatever: I flew a plane with my thoughts. This is the stuff of my childhood science-fiction fantasies. Mathan and I grab a selfie in front of the plane to mark this moment. I suspect this test stressed him as much as me. “It’s one thing for us to do this in the lab, but seeing another user, with a limited amount of training data, well, that part made me curious,” he says.

Brain-computer interfaces already control cursors on a screen and fly small drones. They can even work in both directions, providing a sense of touch from an artificial hand, and researchers hope they’ll someday help people with disabilities communicate and interact with their environment. The most advanced work comes out of a consortium called BrainGate, which has so far implanted BCIs in about a dozen humans, aiming to help them deal with paralysis from ALS or strokes. Some have even controlled a robot arm. That requires more precision from the signals, so most of those people get electrodes implanted under their skulls. (Honeywell accepts lower resolution as the price for skipping invasive surgery. My recovery process will be limited to lather, rinse, and repeat.)

But the swimming cap approach is less reliable, says neuroscientist Beata Jarosiewicz, who has worked on the project at Brown and Stanford Universities. “I wouldn’t depend on this to fly a plane in a fast-paced scenario where you’re trying to dodge cliffs or other planes, but it’s definitely an interesting example—how often will it do the right thing within a reasonable amount of time?”


Even implanted sensors don’t match the accuracy and speed of natural human movement. Not yet, anyway. Darpa keeps a close eye on this space, and Jarosiewicz says that yes, someday, people might control electronics as naturally as their own muscles.

Still, the pilots of the future will not fly by thought. It’s a high-risk, low-return use for the technology. The planes of tomorrow will fly themselves, without plugs connected to human heads.

Mathan says his research could help in the cockpit, however. Pilots won’t fly by focusing on commands on a screen, but they might use the technology when they read through a checklist or zoom in on a map or flick switches for distracting, noncritical tasks. That would keep their hands free for other things. And the rest of us may get devices that let us scroll through pages or Tinder profiles on our phones just by thinking about it.

“If the consequences are not as dire, and you just accidentally end up on the wrong webpage or hit the back button, then sure, it’s totally fun to play around with these skullcap electrode-array-type interfaces,” says Jarosiewicz.

But Mathan says the real potential of his system builds on research into how to keep pilots engaged and attentive. “We all know there are limitations to how well a human can perform,” he says. “We hope that the work we’re doing will help create more robust technologies to monitor cognitive states that might affect pilots.” The same could apply to drivers, particularly of increasingly automated cars. A noninvasive way to measure how much or how little they’re paying attention could help the computer decide when to step in, and how much information to relay if it needs to hand control back to the human.

As I peel the skullcap off and start rubbing the conducting gel out of my hair, I can’t imagine pilots wanting to put one of these things on every time they fly. But once I’m relegated to the back of the plane again, I’d be perfectly happy if some ridiculous cap was helping my captain to function at peak performance.