A Non-Programmer’s Apology

In his classic A Mathematician’s Apology, published 65 years ago, the great mathematician G. H. Hardy wrote that “A man who sets out to justify his existence and his activities” has only one real defense, namely that “I do what I do because it is the one and only thing that I can do at all well.” “I am not suggesting,” he added,

that this is a defence which can be made by most people, since most people can do nothing at all well. But it is impregnable when it can be made without absurdity … If a man has any genuine talent he should be ready to make almost any sacrifice in order to cultivate it to the full.

Reading such comments one cannot help but apply them to oneself, and so I did. Let us eschew humility for the sake of argument and suppose that I am a great programmer. By Hardy’s suggestion, the responsible thing for me to do would be to cultivate and use my talents in that field, to spend my life being a great programmer. And that, I have to say, is a prospect I look upon with no small amount of dread.

It was not always quite this way. For a long while programming was basically my life. And then, somehow, I drifted away. At first it was small steps — discussing programming instead of doing it, then discussing things for programmers, and then discussing other topics altogether. By the time I reached the end of my first year in college, when people were asking me to program for them over the summer, I hadn’t programmed in so long that I wasn’t even sure I really could. I certainly did not think of myself as a particularly good programmer.

Ironic, considering Hardy writes that

Good work is not done by ‘humble’ men. It is one of the first duties of a professor, for example, in any subject, to exaggerate a little both the importance of his subject and his own importance in it. A man who is always asking ‘Is what I do worthwhile?’ and ‘Am I the right person to do it?’ will always be ineffective himself and a discouragement to others. He must shut his eyes a little and think a little more of his subject and himself than they deserve. This is not too difficult: it is harder not to make his subject and himself ridiculous by shutting his eyes too tightly.

Perhaps, after spending so much time not programming, the blinders had worn off. Or perhaps it was the reverse: that I had to convince myself that I was good at what I was doing now, and, since that thing was not programming, by extension, that I was not very good at programming.

Whatever the reason, I looked upon the task of actually having to program for three months with uncertainty and trepidation. For days, if I recall correctly, I dithered. Thinking myself incapable of serious programming, I thought to wait until my partner arrived and instead spend my time assisting him. But days passed and I realized it would be weeks before he would appear, and I finally decided to try to program something in the meantime.

To my shock, it went amazingly well and I have since become convinced that I’m a pretty good programmer, if lacking in most other areas. But now I find myself faced with this dilemma: it is those other areas I would much prefer to work in.

The summer before college I learned something that struck me as incredibly important and yet known by very few. It seemed clear to me that the only responsible way to live my life would be to do something that would only be done by someone who knew this thing — after all, there were few who did and many who didn’t, so it seemed logical to leave most other tasks to the majority.

I concluded that the best thing to do would be to attempt to explain this thing I’d learned to others. Any specific task I could do with the knowledge would be far outweighed by the tasks done by those I’d explained the knowledge to. It was only after I’d decided on this course of action (and perhaps this is the blinders once again) that it struck me that explaining complicated ideas was actually something I’d always loved doing and was really pretty good at.

That aside, having spent the morning reading David Foster Wallace, it is plain that I am no great writer. And so, reading Hardy, I am left wondering whether my decision is somehow irresponsible.

I am saved, I think, because it appears that Hardy’s logic to some extent parallels mine. Why is it important for the man who “can bat unusually well” to become “a professional cricketer”? It is, presumably, because those who can bat unusually well are in short supply and so the few who are gifted with that talent should do us all the favor of making use of it. If those whose “judgment of the markets is quick and sound” become cricketers, while the good batters become stockbrokers, we will end up with mediocre cricketers and mediocre stockbrokers. Better for all of us if the reverse is the case.

But this, of course, is awfully similar to the logic I myself employed. It is important for me to spend my life explaining what I’d learned because people who had learned it are in short supply — much shorter supply, in fact (or so it appears), than people who can bat well.

However, there is also an assumption hidden in that statement. It only makes sense to decide what to become based on what you can presently do if you believe that abilities are somehow granted innately and can merely be cultivated, not created in themselves. This is a fairly common view, although rarely consciously articulated (as indeed Hardy takes it for granted), but not one that I subscribe to.

Instead, it seems plausible that talent is made through practice, that those who are good batters are that way after spending enormous quantities of time batting as a kid. Mozart, for example, was the son of “one of Europe’s leading musical teachers” and said teacher began music instruction at age three. While I am plainly no Mozart, several similarities do seem apparent. My father had a computer programming company and he began showing me how to use the computer as far back as I can remember.

The extreme conclusion from the theory that there is no innate talent is that there is no difference between people and thus, as much as possible, we should get people to do the most important tasks (writing, as opposed to cricket, let’s say). But in fact this does not follow.

Learning is like compound interest. A little bit of knowledge makes it easier to pick up more. Knowing what addition is and how to do it, you can then read a wide variety of things that use addition, thus knowing even more and being able to use that knowledge in a similar manner. And so, the growth in knowledge accelerates. This is why children who get started on something at a young age, as Mozart did, grow up to have such an advantage.
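The compound-interest analogy can be made concrete with a minimal sketch. The growth rate and starting ages below are illustrative assumptions of mine, not figures from the essay; the point is only that the same yearly rate, started earlier, compounds into a widening gap.

```python
def knowledge(start_age, rate=0.1, until=18, initial=1.0):
    """Knowledge at age `until`, compounding yearly from `start_age`.

    Models the essay's analogy: each year, what you already know
    makes it proportionally easier to learn more (hypothetical rate).
    """
    years = until - start_age
    return initial * (1 + rate) ** years

# An early starter (age 3, as Mozart began) versus a later one (age 8):
early = knowledge(start_age=3)
late = knowledge(start_age=8)
print(round(early / late, 2))  # → 1.61: a ~61% head start by age 18
```

Because the ratio depends only on the five-year head start, the early starter keeps that multiplicative edge no matter how long both keep learning.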

And even if (highly implausibly) we were able to control the circumstances in which all children grew up so as to maximize their ability to perform the most important tasks, that still would not be enough, since in addition to aptitude there is also interest.

Imagine the three sons of a famous football player. All three are raised similarly, with athletic activity from their earliest days, and thus have an equal aptitude for playing football. Two of them pick up this task excitedly, while one, despite being good at it, is uninterested and prefers to read books. It would not only be unfair to force him to use his aptitude and play football, it would also be unwise. Someone whose heart isn’t in it is unlikely to spend the time necessary to excel.

And this, in short, is the position I find myself in. I don’t want to be a programmer. When I look at programming books, I am more tempted to mock them than to read them. When I go to programmer conferences, I’d rather skip out and talk politics than programming. And writing code, although it can be enjoyable, is hardly something I want to spend my life doing.

Perhaps, I fear, this decision deprives society of one great programmer in favor of one mediocre writer. And let’s not hide behind the cloak of uncertainty, let’s say we know that it does. Even so, I would make it. The writing is too important, the programming too unenjoyable.

And for that, I apologize.


May 27, 2006