Ruminations on the heavy weight of software design in the 21st century.

Recently I took a monthlong sabbatical from my job as a designer at Basecamp. (Basecamp is an incredible company that gives us a paid month off every 3 years.)

When you take 30 days away from work, you have a lot of time and headspace that’s normally used up. Inevitably you start to reflect on your life.

And so, I pondered what the hell I’m doing with mine. What does it mean to be a software designer in 2018, compared to when I first began my weird career in the early 2000s?

The answer is weighing on me.

As software continues to invade our lives in surreptitious ways, the social and ethical implications are increasingly significant.

Our work is HEAVY and it’s getting heavier all the time. I think a lot of designers haven’t deeply considered this, and they don’t appreciate the real-life effects of the work they’re doing.

Here’s a little example. About 10 years ago, Twitter looked like so:

How cute was that? If you weren’t paying attention back then, Twitter was kind of a joke. It was a silly viral app where people wrote about their dog or their ham sandwich.

Today, things are a wee bit different. Twitter is now the megaphone for the leader of the free world, who uses it to broadcast his every whim. It’s also the world’s best source for real-time news, and it’s full of terrible abuse problems.

That’s a massive sea change! And it all happened in only 10 years.

Do you think the creators of that little 2007 status-sharing concept had any clue this is where they’d end up, just a decade later?

Seems like they didn’t:

People can’t decide whether Twitter is the next YouTube, or the digital equivalent of a hula hoop. To those who think it’s frivolous, Evan Williams responds: “Whoever said that things have to be useful?”

Considering these shallow beginnings, is it any surprise that Twitter has continually struggled to run a massive, serious global communications platform, one that now affects the world order?

That’s not what they originally built. It grew into a Frankenstein’s monster, and now they’re not quite sure how to handle it.

I’m not picking on Twitter in particular, but its trajectory illustrates a systemic problem.

Designers and programmers are great at inventing software. We obsess over every aspect of that process: the tech we use, our methodology, the way it looks, and how it performs.

Unfortunately we’re not nearly as obsessed with what happens after that, when people integrate our products into the real world. They use our stuff and it takes on a life of its own. Then we move on to making the next thing. We’re builders, not sociologists.

This approach wasn’t a problem when apps were mostly isolated tools people used to manage spreadsheets or send emails. Small products with small impacts.

But now most software is so much more than that. It listens to us. It goes everywhere we go. It tracks everything we do. It has our fingerprints. Our heart rate. Our money. Our location. Our face. It’s the primary way we communicate our thoughts and feelings to our friends and family.

It’s deeply personal and ingrained into every aspect of our lives. It commands our gaze more and more every day.

We’ve rapidly ceded an enormous amount of trust to software, under the hazy guise of forward progress and personal convenience. And since software is constantly evolving—one small point release at a time—each new breach of trust or privacy feels relatively small and easy to justify.

Oh, they’ll just have my location.

Oh, they’ll just have my identity.

Oh, they’ll just have an always-on microphone in the room.

Most software products are owned and operated by corporations, whose business interests often contradict their users’ interests. Even small, harmless-looking apps might be harvesting data about you and selling it.

And that’s not even counting the army of machine learning bots that will soon be unleashed to make decisions for us.

It all sounds like an Orwellian dystopia when you write it out like this, but it's not fiction. It's reality.

See what I mean by HEAVY? Is this what we signed up for, when we embarked on a career in tech?

15 years ago, it was a slightly different story. The Internet was a nascent and bizarre wild west, and it had an egalitarian vibe. It was exciting and aspirational — you’d get paid to make cool things in a fast-moving industry, paired with the hippie notion that design can change the world.

Well, that motto was right on the money. There’s just one part we forgot: change can have a dark side too.

If you’re a designer, ask yourself this question…

Is your work helpful or harmful?

You might have optimistically deluded yourself into believing it’s always helpful because you’re a nice person, and design is a noble-seeming endeavor, and you have good intentions.

But let’s be brutally honest for a minute.

If you’re designing sticky features meant to maximize the time people spend using your product instead of doing something else with their lives, is that helpful?

If you’re desperately trying to inflate the number of people on your platform so you can report corporate growth to your shareholders, is that helpful?

If your business model depends on using dark patterns or deceptive marketing to con users into clicking on advertising, is that helpful?

If you’re trying to replace meaningful human culture with automated tech, is that helpful?

If your business collects and sells personal data about people, is that helpful?

If your company is striving to dominate an industry by any means necessary, is that helpful?

If you do those things…

Are you even a Designer at all?

Or are you a glorified Huckster—a puffed-up propaganda artist with a fancy job title in an open-plan office?

Whether we choose to recognize it or not, designers have both the authority and the responsibility to prevent our products from becoming needlessly invasive, addictive, dishonest, or harmful. We can continue to pretend this is someone else’s job, but it’s not. It’s our job.

We’re the first line of defense to protect people’s privacy, safety, and sanity. In many, many cases we’re failing at that right now.

If the past 20 years of tech represent the Move Fast and Break Things era, now it’s time to slow down and take stock of what’s broken.

At Basecamp, we’re leading the charge by running an unusually supportive company, pushing back on ugly practices in the industry, and giving a shit about our customers. We design our product to improve people’s work, and to stop their work from spilling over into their personal lives. We intentionally leave out features that might keep people hooked on Basecamp all day, in favor of giving them peace and freedom from constant interruptions. And we skip doing promotional things that might grow the business, if they feel gross and violate our values.

We know we have a big responsibility on our hands, and we take it seriously.

You should too. The world needs as much care and conscience as we can muster. Defend your users against anti-patterns and shady business practices. Raise your hand and object to harmful design ideas. Call out bad stuff when you see it. Thoughtfully reflect on what you’re sending out into the world every day.

The stakes are high and they’ll keep getting higher. Grab those sociology and ethics textbooks and get to work.

If you like this post, hit the 👏 below or send me a message about your ham sandwich on Twitter.

https://basecamp.com/svn