In the late 1870s, Edgar Degas began work on what would become one of his most radical paintings, Jockeys Before the Race. Degas had been schooled in techniques of the neoclassicist and romanticist masters but had begun exploring subject matter beyond the portraits and historical events that were traditionally considered suitable for fine art, training his eye on café culture, common laborers, and—most famously—ballet dancers. But with Jockeys, Degas pushed past mild provocation. He broke some of the most established formulas of composition. The painting is technically exquisite, the horses vividly sculpted with confident brushstrokes, their musculature perfectly rendered. But while composing this beautifully balanced, impressionistically rendered image, Degas added a crucial, jarring element: a pole running vertically—and asymmetrically—in the immediate foreground, right through the head of one of the horses.

Degas wasn't just “thinking outside of the box,” as the innovation cliché would have it. He wasn't trying to overturn convention to find a more perfect solution. He was purposely creating something that wasn't pleasing, intentionally doing the wrong thing. Naturally viewers were horrified. Jockeys was lampooned in the magazine Punch, derided as a “mistaken impression.” But over time, Degas' transgression provided inspiration for other artists eager to find new ways to inject vitality and dramatic tension into work mired in convention. You can see its influence across art history, from Frederic Remington's flouting of traditional compositional technique to the crackling photojournalism of Henri Cartier-Bresson.

Degas was engaged in a strategy that has shown up periodically for centuries across every artistic and creative field. Think of it as one step in a cycle: In the early stages, practitioners dedicate themselves to inventing and improving the rules—how to craft the most pleasing chord progression, the perfectly proportioned building, the most precisely rendered amalgamation of rhyme and meter. Over time, those rules become laws, and artists and designers dedicate themselves to excelling within these agreed-upon parameters, creating work of unparalleled refinement and sophistication—the Pantheon, the Sistine Chapel, the Goldberg Variations. But once a certain maturity has been reached, someone comes along who decides to take a different route. Instead of trying to create an ever more polished and perfect artifact, this rebel actively seeks out imperfection—sticking a pole in the middle of his painting, intentionally adding grungy feedback to a guitar solo, deliberately photographing unpleasant subjects. Eventually some of these creative breakthroughs end up becoming the foundation of a new set of aesthetic rules, and the cycle begins again.


For the past 30 years, the field of technology design has been working its way through the first two stages of this cycle, an industry-wide march toward more seamless experiences, more delightful products, more leverage over the world around us. Look at our computers: beige and boxy desktop machines gave way to bright and colorful iMacs, which gave way to sleek and sexy laptops, which gave way to addictively touchable smartphones. It's hard not to look back at this timeline and see it as a great story of human progress, a joint effort to experiment and learn and figure out the path toward a more refined and universally pleasing design.

All of this has resulted in a world where beautifully constructed tech is more powerful and more accessible than ever before. It is also more consistent. That's why all smartphones now look basically the same—gleaming black glass with handsomely cambered edges. Google, Apple, and Microsoft all use clean, sans-serif typefaces in their respective software. After years of experimentation, we have figured out what people like and settled on some rules.

But there's a downside to all this consensus—it can get boring. From smartphones to operating systems to web page design, it can start to feel like the truly transformational moments have come and gone, replaced by incremental updates that make our devices and interactions faster and better.

This brings us to an important and exciting moment in the design of our technologies. We have figured out the rules of creating sleek sophistication. We know, more or less, how to get it right. Now, we need a shift in perspective that allows us to move forward. We need a pole right through a horse's head. We need to enter the third stage of this cycle. It's time to stop figuring out how to do things the right way, and start getting it wrong.

In late 2006, when I was creative director here at WIRED, I was working on the design of a cover featuring John Hodgman. We were far along in the process—Hodgman was styled and photographed, the cover lines written, our fonts selected, the layout firmed up. I had been aiming for a timeless design with a handsome monochromatic color palette, a cover that evoked a 1960s jet-set vibe. When I presented my finished design, WIRED's editor at the time, Chris Anderson, complained that the cover was too drab. He uttered the prescriptive phrase all graphic designers hate hearing: “Can't you just add more colors?”

I demurred. I felt the cover was absolutely perfect. But Chris did not, and so, in a spasm of designerly “fuck you,” I drew a small rectangle into my design, a little stripe coming off from the left side of the page, rudely breaking my pristine geometries. As if that weren't enough, I filled it with the ugliest hue I could find: neon orange—Pantone 811, to be precise. My perfect cover was now ruined!

By the time I came to my senses a couple of weeks later, it was too late. The cover had already been sent to the printer. My anger morphed into regret. To the untrained eye, that little box might not seem so offensive, but I felt that I had betrayed one of the most crucial lessons I learned in design school—that every graphic element should serve a recognizable function. This stray dash of color was careless at best, a postmodernist deviation with no real purpose or value. It confused my colleagues and detracted from the cover's clarity, unnecessarily making the reader more conscious of the design.

But you know what? I actually came to like that crass little neon orange bar. I ended up including a version of it on the next month's cover, and again the month after that. It added something, even though I couldn't explain what it was. I began referring to this idea—intentionally making “bad” design choices—as Wrong Theory, and I started applying it in little ways to all of WIRED's pages. Pictures that were supposed to run large, I made small. Where type was supposed to run around graphics, I overlapped the two. Headlines are supposed to come at the beginning of stories? I put them at the end. I would even force our designers to ruin each other's “perfect” layouts.

At the time, this represented a major creative breakthrough for me—the idea that intentional wrongness could yield strangely pleasing results. Of course I was familiar with the idea of rule-breaking innovation—that each generation reacts against the one that came before it, starting revolutions, turning its back on tired conventions. But this was different. I wasn't just throwing out the rulebook and starting from scratch. I was following the rules, then selectively breaking one or two for maximum impact.

Once I realized what I'd stumbled on, I started to see it everywhere, a strategy used by trained artists who make the decision to do something deliberately wrong. Whether it's a small detail, like David Fincher swapping a letter for a number in the title of the movie Se7en, or a seismic shift, like Miles Davis intentionally seeking out the “wrong notes” and then trying to work his way back, none of these artists simply ignored the rules or refused to take the time to learn them in the first place. No, you need to know the rules, really master their nuance and application, before you can break them. That's why Hunter Thompson could be a great gonzo journalist while so many of his followers and imitators—who never mastered the art of traditional reporting and writing that underlay Thompson's radical style—suffer in comparison.

Why does Wrong Theory work? After all, symmetry is naturally pleasing. Put two faces in front of a 1-year-old and she will immediately pick the more symmetrical one. But what if we're after something deeper than simple pleasure? It turns out that, while we might initially prefer the symmetrical and seamless, we are more challenged and invested in the imperfect. Think of Cindy Crawford's mole or Joaquin Phoenix's scar. Both people are stunning, but they stand out for their so-called imperfections. A better thought experiment might be to put that child in a room with 99 symmetrical faces and one asymmetrical one. Which one do you think she'll be drawn to?

A 2001 study conducted by Baylor College of Medicine and Emory University might begin to answer that question. In it, neuroscientists conducted fMRI scans on 25 adults who received squirts of fruit juice or water into their mouths in either predictable or unpredictable patterns. The scans showed that the subjects who got the unpredictable sequence registered noticeably more activity in the nucleus accumbens—an area of the brain that processes pleasure.

Yes, our minds learn to prefer activities that we repeatedly enjoy, because we recognize those patterns and come to expect a payoff. But the study suggests that when our predictions are wrong—when we walk into a surprise party instead of a planned dinner, for instance—that's when our pleasure centers really light up. We may find comfort in what we know we like, but it's the aberrations that bring us to attention.

How might these findings be applied to technology design? It's still a bit early to say. Right now we are late in the second stage of the design cycle—applying agreed-upon rules to an ever-widening array of products, apps, sites, and services. Put another way, designers are still trying to get things right, not deliberately make them wrong. But even as they do so, they are learning how to push up against once-sacrosanct conventions. As a result, they're giving us glimpses of what “wrong” technology might look like.

Take Instagram. When Kevin Systrom and Mike Krieger were first developing the photo-sharing social network, they wrote a sentence on a whiteboard that summed up the accepted wisdom around photo sharing: “Today online, people post photos that they take with cameras, and they store them in albums to share with only their friends.” Then, systematically, they began replacing words. “Cameras” became “phones,” “in albums” became “as single photos,” “only their friends” became “everyone.” In the process, they stumbled upon an innovative insight about how people's behavior would change. This isn't really an example of Wrong Theory—the result was incredibly appealing, not intentionally off-putting. But the method they used to create it, understanding and then subverting explicit established rules, suggests the kind of thinking that can move us into this new era.

Indeed, we're starting to see that kind of thinking everywhere. Snapchat built a multibillion-dollar empire on a notion that seems deeply wrong at first blush—actively preventing users from archiving and accessing their communication. And Netflix undercut the entire structure of television by deciding to release every episode of its original series at once. That meant trading off some of the pleasure of the weekly cliffhanger and the day-after watercooler chatter for more complicated plotlines—like the maybe-too-byzantine Arrested Development reboot—and the joys of binge-watching. Or take a look at the growing subgenre of intentionally frustrating videogames—like Flappy Bird or Super Hexagon—that ignore standard on-ramping and throw players directly into chaos.

All of these examples point the way toward the next challenge for technology design. What happens after you've learned how to make technology that is supremely appealing and functional? A whole new range of opportunities opens up. By breaking those rules, we can create technology that is more than merely useful or beautiful or natural. We can imagine technology that is complicated and personal—nostalgic, funny, self-deprecating, abrasive. Yes, there will be missteps. For every Kind of Blue there were about a million Metal Machine Musics—unlistenable exercises in self-indulgence. But only by courting failure can we find new ways forward. It's time for us to create the next wave of technology. It's time for us to be wrong.

Wrong Theory, a History

Throughout history, artists and innovators have advanced their fields by making deliberately “wrong” choices. Here are some great moments in Wrong Theory. —CORY PERKINS

1903 | Paris' fashion elite recognized Paul Poiret at a young age for his skilled drawings, but where other designers focused on cages and corsets, his work featured draped fabric and natural silhouettes.

1913 | Igor Stravinsky's ballet score The Rite of Spring was a departure from traditional composition: The rising star abandoned harmonic consonance in favor of harsh, tense tones that incited a riot at its first performance.

Early/Mid-20th Century | In developing the Epic Theater style, dramatists like Bertolt Brecht consciously reminded audiences of the play's artifice, encouraging actors to break the fourth wall and temper the authenticity of their performance.

1964 | Sick of the utilitarianism dominant at the time, Robert Venturi designed his Vanna Venturi House to include blatantly unnecessary features—like the facade's nonsupporting arch and an interior stairway leading to nowhere—that are now hallmarks of postmodernism.

1989 | Will Wright created SimCity, a cutting-edge videogame. Instead of building a closed ecosystem—like most developers before him—he handed the tools over to players to map their own gamescape.

1997 | Industrial designer Hella Jongerius molded perfectly proportioned tableware, then fired it at exceedingly high temperatures, slightly deforming each piece.