"You have to try this."

Steven Bathiche, Director of Research at Microsoft's Applied Sciences Group, hands me a pen. It's a new pen, a prototype, but it's actually the screen he's excited about. In the meticulously organized lab he runs, inside Microsoft's sprawling Redmond, WA, campus, he's built a computer with almost zero latency—when you do something, the computer reacts instantly. Here, Bathiche has solved that infuriating problem where you write or draw on a screen, and the ink is always a half-second behind your finger. The latency is minimized to the point of inconsequentiality, and it feels amazing, like I'm actually writing ink on the glass. Next to it, there's another demo where the gap between the glass and the display is gone, so it's as if you're writing directly on the pixels. That one feels even better.

The whole lab is dedicated to projects like these. There are dozens of prototypes, of all sorts: One is a ludicrously complicated light switch that takes Bathiche about fifteen seconds to turn on and prompts a delightful round of "how many brilliant scientists with 45 patents to their name does it take to change a lightbulb" jokes. But most of them involve a pen and a screen. There's this vision at Microsoft, one that's making its way around the company and the tech industry alike: The pen is back.

To explain why, Bathiche talks about keyboards. Think about typing the character 'a' on a keyboard. Fast, right? Just the one keystroke. "It is fast," Bathiche says, "if you make some certain assumptions. That the position of the letter 'a' is where you intended it to be, the font is the way you intended it to be, the size of the 'a' is what you intended it to be." All those decisions are made before you hit the key, and you often don't have a choice. "But with ink, you can dictate all those things, almost simultaneously as you're writing. I can put my 'a' here, or here, and I can make it as big as I want, as hard as I want."

Much of the time, that doesn't matter—it's not like font size or brushstroke waviness matters in your Google searches and tweets. Nobody, not even Bathiche, thinks the keyboard or touchscreen is going to die. But he believes forcefully that the pen in your pocket should also be a powerful computing tool. He's equally clear that a digital pen should still look like the one in your pocket. We've spent literally hundreds of years perfecting these implements—they're comfortable and natural in our hands, and you instantly know how to use them. Even as technology has morphed the shape and size of everything from our cars to our thermostats, the pen's form hasn't changed. What's changing, dramatically, is what we can do with it.

Pointing to the Future

Study after study shows we remember things better when we write them—our brain stores the letter-writing motion, which is much more memorable than just the mashing of a key that feels like every other key. We think in fragments, too, in shapes and colors and ideas that just don't come through on a keyboard. "Think about how many things that are built start as a drawing," Bathiche says. "Most things, right? Everything you're wearing probably started as a drawing."

You can't type out the folds of a dress, or the gentle curves of a skyscraper. Drawing with your stubby finger on a touchscreen isn't much better. Humans are tool-based creatures: Our fingers can do amazingly intricate things with a pen, a brush, or a scalpel that we can't replicate with a mouse or the pads of our fingers. Our computers are giving back that kind of detailed control. In turn, the pen is opening up new forms of digital expression, new tools for communication, new ways to interact with our tech.

Over the last few years, our computers have become powerful enough to figure out what we're trying to write or draw. They understand more and more complex inputs. And with pens like Livescribe and Phree, turning pen input into digital output happens more naturally than ever. You can write with the Livescribe pen, in a Moleskine notebook just like the one you've always used, and all of your handwriting is instantly and perfectly digitized. With Phree, you can write almost anywhere—on the table, on the edge of the couch, on the ceiling—and have it show up on your phone's screen in real time. Samsung's Galaxy Note line has been improving its text recognition for years, and apps like Evernote offer excellent handwriting recognition.

Draw Something

The possibilities are kind of incredible, actually. For instance, the Microsoft team has been experimenting with a search tool that lets you draw your search query. Bathiche sketches a crude Eiffel Tower, and Bing-powered image results populate. OK, what about at night? He scribbles a dark sky, and quickly switches the pen's digital ink to yellow—lights! The search results change instantly. In another demo, he sketches arrows, shapes, and symbols. Rather than scroll through lists or try to figure out whether he wants "Medium Left Arrow" or "Medium Left Arrow with Big Head," he just draws what he wants. The app recognizes what he's making, smoothing out lines and sharpening corners so it looks like an actual arrow.


That's all great, and it means a pen can be much more versatile than any other input we have. It's also mostly possible right now—handwriting recognition and even computer vision have been around for years. The core problem is something else: Writing on a screen is terrible with our current technology. It's slow, it feels slick, and there's a disconnect between you and the ink. It's just too hard, and recreating that same satisfying pen-on-paper experience out of transistors and pixels is difficult work.

"We're all carrying these supermassive, powerful computers in our pockets, but the ways we want to interact with them are more and more natural," says Gilles Bouchard, CEO of digital-pen-maker Livescribe. "We're in the phase where technology is learning to adapt to you as a human being. It's a really fundamental change."

A pen can be much more versatile than any other input we have.

Everyone has different ideas for how it'll work. Livescribe's pen uses special paper that helps the system digitize your writing. With Phree, input and output are separated, like moving a mouse in your hand while watching the cursor move on the screen—nothing comes out of the pen when you scribble with it, but the exact lines show up on the screen. Wacom makes both a product where you write directly on the screen (Cintiq) and products with a pressure-sensitive pen-input tablet that sits on your desk (Intuos).

Microsoft's researchers believe, however, that the only way to advance pen-based input is for you to write directly on the screen. The Surface, in many ways, isn't so much "the tablet that can replace your laptop" as it is a playground for Bathiche and his teammates to experiment with bringing the feeling of pen and paper to the digital era. The screen's aspect ratio was chosen to mimic a piece of letter-sized paper, and the Surface Pen was always supposed to feel like, well, a pen.

"The good ones show up between 7 and 10mm in thickness," says Ralf Groene, the bespectacled, German-accented head of design for Surface at Microsoft. The goal is to have a pen that looks and feels like the Bic or Mont Blanc, but is dramatically more powerful. "You want to have a device you want to write and sketch with."

"The other thing," Groene says, "is that if you have a good idea and you have your Moleskine and a pen there, it's almost immediate." So the Surface team worked with Microsoft's OneNote team to build a feature that launches an empty page as soon as you click the Surface's pen, even above the lock screen, so you can start taking notes as fast as flipping open your notebook. (OneNote has always been designed for free-form thinking—you can just tap anywhere on the page and start drawing or writing.)


As he speaks, Groene touches his Moleskine, a battered yellow notebook with colored tabs sticking out all over. There are sketches and notes on almost every page, and you get the sense that this, not his OneNote library, is his most prized notebook. "There were always stylus and pens in the Palm Pilots, pokers and things," Groene says. That's still most of what's available for tablets, phones, and everything else. "We looked around, and just said let's go and create something that we have here, which is like a Moleskine." He talks about the coating on the paper, the sound it makes when you write on it. When I ask if it's even conceivable to build a screen that feels this good, Bathiche just smiles. "Yes. Yes it is."

Sketching the Surface

A state away, in a secretive lab in Oregon, Microsoft's Surface Hub team is working on this problem too. The pen is the primary tool for its 84-inch, conference-room-in-a-screen device that's now on sale for between 7 and 20 grand. The Hub team's pen is a little bigger, more like a dry-erase marker. Chad Roberts, the project's design lead, even points to a dry-erase marker on the (now empty, pointless, and sad) whiteboard in the small conference room where they've set up three Surface Hubs to show me. "You're used to drawing with a pen that size," he says. "It would feel weird" to use something more like the standard Surface Pen. There's also a special film on the giant screen that makes it feel a little less like glass, and more like you're drawing on a plain ol' whiteboard. Only super high-tech.

The Surface Hub just wouldn't work without great pen support. It's made to be collaborative; it's also made to be totally simple and intuitive. Even the simple act of hunting around for the keyboard is too much. When you walk in the room, the Hub team hopes, you'll just pick up a pen. When you do, a blank whiteboard screen shows up, and your meeting is off and running. They hope you never notice you're working on a giant Windows computer, at least not until you decide to Skype someone in using the Hub's two cameras and many multi-directional speakers and microphones. "I love the way the pen feels," says Peter Oehler, the Hub's hardware lead and the former director of technology at Perceptive Pixel, the company Microsoft bought and whose technology makes up much of the Hub. "I love the way you forget you're working on an electronic device, and you just start telling a story."

Surface just wouldn't work without great pen support

Isn't that the whole goal of technology in 2015? To get out of the way, let us do whatever we want, and then be smart and powerful enough to do interesting and useful things with it? We know how to use pens, we've evolved to be suited for them. Now it's time to make that digital. And lest you think it's just about taking notes and drawing pictures of the Eiffel Tower, the possibilities for pen input are much, much larger than that.

Think about virtual reality, where you're in a completely new world you need to learn to navigate and interact with. "If you have some kind of cursor floating in your virtual reality space, and you have something in your hand, your hand-eye coordination still works really well," Opher Kinrot says. "Your brain is wired that way." Kinrot is co-founder of OTM Technologies, which makes Phree, and he thinks about this stuff a lot.

The best VR demo I've ever seen was on HTC's Vive, which gave me a huge controller I used to paint in the air in front of me. Once I'd finished—I drew my name, and then the worst tree anyone has ever created—I could walk around it, seeing this free-floating creation from all angles.

Imagine a world where there's such fine control in your hands that you can chisel a statue out of virtual marble, or create a digital watercolor that is stroke for stroke exactly how it would look in real life. That's possible, and it's only possible with a pen. You already know how to use it, as famous stylus-hater Steve Jobs might say. And it gives you the kind of freedom and control every great technology should.

Groene compares the current thinking in pen-based input to early digital photography: maybe the experience isn't perfect yet, but there are so many upsides that it's worth pursuing anyway. And with tech like Microsoft's, Phree's, and Livescribe's, the experience is going to get good, fast.

As for me, I only see one problem: I still can't draw for shit.