

There's a song in the Rodgers and Hammerstein classic South Pacific that perfectly encapsulates the point of this article and the three-part series it belongs to. It's called "You've Got to Be Carefully Taught."

You've got to be taught to hate and fear,

You've got to be taught from year to year,

It's got to be drummed in your dear little ear,

You've got to be carefully taught.

While that song is about a truly serious problem (how people are taught to be racist), it also applies to another feeling... well, not hatred, but an extreme dislike and frustration that all too many people share. Today we've reached a point where it's practically a pastime—hating computers.

No one really starts out hating computers. People may be a bit fearful of them depending on when they first run into them, but no one starts out hating them. You learn to hate computers, and you mostly learn it from software. By and large, hardware in and of itself is pretty innocuous in terms of pissing people off. But one step at a time, one stupid thing at a time, we learn to hate computers. And when we stop to look at the stuff that drives us to this end point, it's often not a single big thing. Large, isolated problems are somewhat easier to fix. It's the little things, over and over again.

This situation isn't born of malice. The real cause is "thoughtlessness" in the literal, non-pejorative sense of the word. No one wants to write software or create webpages that drive people batty. But like the proverbial boiling frog, it happens slowly. And step by step, inch by inch, people learn to hate their computers. A rather large part of my IT career has been spent managing this problem, helping people deal with the fallout from decisions made with good intentions but implemented in ways that create yet another small pain point. "Being nibbled to death by baby ducks" is not a bad way to describe it. It's the low-hanging fruit like IBM (née Lotus) Notes, but companies you don't expect it from, like Apple, contribute, too. It's big companies with thousands of developers just as much as small indie devs. It's applications and websites.

But don't worry, this three-part series will pound its head against the desk so you don't have to. We'll take a look at some of the most common (and some of the most egregious) software traits that foster computer hatred. Today, we start with a few of the truly little things, saving the most Advil-requiring culprits for parts two and three.

Security, an evil Web it spins

Websites can be particularly at fault here, all in the name of "security." To protect users, website devs are willing to do some of the... let's say silliest things. These are things so silly that I wonder at times if they were done on a bet of some sort. For example, one of the vendors I deal with in my personal life uses Intuit to manage its online payments. In and of itself, this isn't bad; Intuit does have some experience in the field. But when I go to pay my monthly bill, it seems like Intuit is somehow punishing me. Namely, the site requires me to enter a bank account number—a long string of numbers. As we know, if you type in a long string of numbers, you have a good chance of making some form of silly mistake. There's a way to manage this, right? Copy and paste... except Intuit doesn't allow that. You have to enter your bank account number twice. You can paste it in the first field, but in the second field you have to type it in by hand. If you try to paste it in, you get a pop-up message that says, "Please enter your account number by hand to ensure accuracy."

I'll pause a moment while you clear the mental segfault that craziness just caused. I go through that process every month because I don't think anyone can get used to the utter logic failure that is behind the idea that "manually typing in a long string of numbers is more accurate than copy/paste." I'm sure if pressed, Intuit would retreat behind security concerns or a similar defense to justify the decision. But it still makes no sense. If someone is hacking your clipboard, keystroke logging is not exactly difficult.

Intuit's system is just one example of the frustrating decisions websites regularly make in the name of security. A few others:

Session timeouts measured in seconds.

Password rules so arcanely nonsensical (and in many cases, hidden) that even someone with 20 years in IT can't figure out how they work. (As I write this, I'm "celebrating" just over 20 years in IT.)

Insanely short password aging rules, along with a previous password list that stretches into years.

Designs that make refreshing a page the same as a logout.

Sites that still demand specific browser versions and lock you out unless you match. (This is particularly fun when the site requires "Firefox 4 or later" and won't let you in with Firefox 20.)
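One plausible mechanism behind that last bug (a sketch, not any actual site's code) is comparing version numbers as strings rather than numbers, so "20" sorts before "4" lexicographically:

```python
# Hypothetical browser-sniffing bug: a "minimum version" check done
# with string comparison instead of numeric comparison. Function names
# are illustrative only.

def allowed_naive(user_version: str, min_version: str) -> bool:
    # String comparison looks at characters left to right,
    # so "20" < "4" because "2" sorts before "4".
    return user_version >= min_version

def allowed_numeric(user_version: str, min_version: str) -> bool:
    # Comparing as numbers gives the intended "4 or later" behavior.
    return float(user_version) >= float(min_version)

print(allowed_naive("20", "4"))    # False: Firefox 20 gets locked out
print(allowed_numeric("20", "4"))  # True
```

The fix is trivial, which is rather the point: nobody watching a real user with a modern browser would ship the naive check.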



And on and on—I could spend the next week building a list of all the ways website designers and programmers torment people who just want to get work done. Every time, it's another drop of water on the forehead, another baby duck taking a bite. Every little quizzical irk is another push away from what should be, if not an enjoyable experience, at least a neutral one. Instead, the interaction becomes something many don't want to bother with.

Experiences like this make me wonder: do any of the developers of these programs use their own work? Do they ever sit and watch people try to deal with this nonsense? It's not just an experience issue. I'm so tired of dealing with Verizon's B2B password rules, hoops, and dances that any time I have to log in these days, I just click "forgot my password." It's actually less annoying to do that. Though not by much—occasionally, the site decides that the password I just created and got the verification e-mail for is wrong. Really?

All of this is a primary reason people hate software—and security. Especially security, which is all too often just IT-speak for "I am going to make your life suck even more, with no real benefit." Users learn that software programs are always going to do random, stupid things that seem to have no purpose beyond making them frustrated and annoyed. They learn that "security" measures are largely a joke. You constantly hear of company after company getting attacked, and breaches never come from some awesomely difficult operation that only a planet full of Vulcans could dream up. Even if they aren't a joke in theory, security measures are (and I can only think it's on purpose) designed to be so hard to implement that it's no wonder no one uses them.

And that's a shame. There are a number of things you can do to, for example, secure e-mail in a way that would make it much harder for phishing attacks to work. The number of people actually using any of them? Not even a rounding error. Why? Because the procedures for setting them up are beyond user-unfriendly. It has become some kind of nerd "prove your worth" gauntlet that you run on a rickety walkway suspended over a bog of eternal stench. Here is one of the more (almost) non-nerd-friendly articles on the subject: How to secure your e-mail under Mac OS X and iOS 5 with S/MIME.

If you know what my colleague Iljitsch is talking about, no, it's not that hard. He explains it well. But for a non-technical user? Good luck getting that key exchange thing going. Heck, good luck explaining even the basics of how PKI works. (Though here's an awesome attempt using finger puppets. Seriously.)

Any security method that requires users to understand the inner workings of what they're doing is going to fail. Can you imagine if car alarms worked like S/MIME does? No one would ever use them. The reason everyone uses car alarms is that the entire interface is a button on a key fob. Sorry, two buttons: one that locks your car and enables the alarm, one that does the opposite. There are no gotchas there, no "gosh, I need to program this into my car." Push the button, the alarm comes on. Push the other button, the alarm turns off. How does your alarm work? You don't need to know, and you don't need to care.