Through my career writing software and observing people in companies large and small, I have come to recognize a myth shared by many software engineers and managers: the Myth of the Sufficiently Smart Engineer. This myth is the misguided belief that engineers are like Laplace’s Demon: they maintain an accurate mental model of the system, foresee all the consequences of their actions, predict where the business is going, and are careful enough to avoid mistakes. As an ideal, it is appealing but harmful; as a standard, it is ruinous. Embracing a different ideal, The Humble Learner, creates a more humane culture and better technology. In this short essay I hope to show how the Myth arises, present an alternative, and suggest the changes our industry needs to make in order to get there.

The Myth

As we code or operate a system, we develop a theory of the problem and a theory of the solution. Put another way, we have a mental model of the system. Unfortunately, all models are wrong. The mismatch between our mental model and reality eventually manifests as an error. Maybe we introduce a bug, choose a bad design, or mistakenly delete the production database. The point is that something will be less good than we could imagine it being. As we dig into the issue and discover where we went wrong, our theory is refined.

Treachery occurs when our new self, with the power of hindsight, imagines a version of our past self that was a little bit smarter -- Sufficiently Smart -- that would not have made the error. Some people further betray their humanity by believing their current and future self is now Sufficiently Smart. Others fear they will be found out for having fallen short and question if they belong, suggesting that the myth contributes to the prevalence of Imposter Syndrome. Similarly, when we see an error introduced by someone else and we believe (rightly or wrongly) that it is not an error we would have made, we use ourselves as proof that a Sufficiently Smart Engineer exists.

Part of the reason the Myth is the Sufficiently Smart Engineer is that the imagined engineer is always just a little smarter or more careful than whatever led to the error. The other part is that this imagined ideal becomes the expectation.

Through work and struggle, we can seem to become Sufficiently Smart. Driven by ambition, marked by scars from past missteps and aided by habits of mind, the time between discovering model/reality discrepancies lengthens. These extended periods of high fidelity may even convince some people that they have become Sufficiently Smart.

If you have never been convinced of this, please take it as a sign of self-awareness. Optimism Bias, the biased belief that we “are at a lesser risk of experiencing a negative event compared to others”, is detrimental. It is easy to fall into the trap of believing you are Sufficiently Smart, which can lead to becoming a Brilliant Jerk. It is also easy to fall into the trap of believing the organization as a whole is Sufficiently Smart, which can lead to organizational failure.

If we can sometimes believe that we are Sufficiently Smart, then we may be seduced into thinking we can be so all the time. It becomes what we expect. Sometimes it is even hard to imagine how others -- if they are good at their job -- might not feel that way, or might not experience the world and act as we do. This Illusory Superiority impedes empathy and leads to poor judgment.

Occasionally, someone else’s gap gets exposed. Maybe you don’t think you are Sufficiently Smart, but you believe that Alejandra wouldn’t have made the error that Yao did. We use this evidence -- real or imagined -- that someone else would not have made that mistake at that time to turn our bad feelings about the error into a value judgment that the person is less than. We do it to ourselves, we do it to each other, and our incentive structures reinforce it.

It’s sad. It’s hurtful. It’s wrong.

While blameless postmortems are on-trend in popular thinking, I frequently observe experienced engineers self-flagellate in incident reviews, claiming they should have known better. Though they would never use that language about someone else’s actions, the desire to perform this form of self-abuse -- and its relative acceptability -- is due to the Myth. "People tend to feel embarrassed - but it's important to capture why they did what they did (because it made sense to them at the time they did it!). In hindsight, it may seem foolish, but, we don't always create an environment where people feel empowered to share these 'missteps'" - Nora Jones.

The industry is losing good people. I know great engineers who won’t apply to top-tier tech companies because of their belief that they are not Sufficiently Smart. Sadly, I have seen the brutality of feeling insufficient drive people from our craft to other roles where there is less ambiguity.

A new standard: The Humble Learner

The Humble Learner accepts the limits of human capacity while seeking to grow their technical and empathetic skills.

The Humble Learner recognizes their models probably contain errors, are not comprehensive and probably have gaps they could not even imagine. They “get curious before they get furious”. They seek to understand before being understood. The Humble Learner asks open-ended questions and realizes they will sometimes say the wrong thing.

This is an ideal we can realize and a standard we should uphold.

“We suck compared to how great we are going to be” - Reed Hastings, often. This phrase works because we are perpetually in the “we suck” phase and are perpetually working to be greater. We’ll always imagine the next step of growth as an aspiration.

A new culture

To encourage and uphold the new standard, we need to change engineering culture.

Leaders should talk about their errors. Make it okay and normal to diminish the ideal expectation. This is easy to say, but hard to do. It is even harder to do in a way that encourages more sun-shining without retribution. Doing this well requires a special sensitivity to the asymmetrical risk that talking about errors poses to members of under-represented groups.

Reward revelation, punish shaming. Beyond normalization, sharing theories (and their gaps) helps teammates deepen their understanding. We should celebrate the consequences of learning instead of focusing on the negative consequences of failure -- this also suggests building systems that minimize the cost of failure rather than attempting to prevent it. Shame destroys Psychological Safety, inhibits learning (or teaches the wrong lesson), makes teams perform worse, and leads to burnout and turnover.

Stop telling people to be more careful. If being more careful worked, we could become Sufficiently Smart. "It has been actively disproven as a way of helping people to work "safer" and with less errors. Many new views have come into account since that time, however software and other industries still commonly operate under the "be more careful" and "we might not hit our SLAs this month!" way of encouraging people not to do "bad things". "Just Culture", on the other hand, has come about more recently. It means that a culture in which operators aren't reprimanded for their actions, omissions, or decisions taken by them that are inline with their expertise and training. However, gross negligence, and destructive acts are not tolerated." - Nora Jones

Avoid solitary action. Hopefully, our mental models are wrong in different ways. Through collaboration and gathering diverse perspectives, our blind spots can be filled in by our peers. This does not mean that groups are Sufficiently Smart, but they are smarter than individuals.

Evaluate performance differently. Think beyond direct impact: What are we learning? What are we teaching? How kind are we?

Take human performance seriously.

Fix your interviews. If your interview question was previously an open problem whose result was worthy of publication, you are doing it wrong. (Here's how my team is trying to approach interviewing.)

Remain vigilant. Since the Myth is the shadow of hindsight, it is not a disease that can be eradicated. The battle against it will last forever, and we will have to fight it with each new programmer. By becoming aware of it, naming it, talking about it and changing our incentive structures with it in mind, we can take the higher ground.

Conclusion

The Myth is pervasive, hurtful and wrong. Our industry suffers from this misbelief. By daylighting it, we have the opportunity for change. We can embrace a new ideal, The Humble Learner. To make this happen, a shift in culture is required. For ourselves and our industry, we can and must take on this work.

Afterword

I would like to thank the following Humble Learners for setting positive examples for me, helping me to understand what I am trying to say and what I should be saying in early drafts of this document: (in alphabetical order) Emily Burns, Lorin Hochstein, Nora Jones, Ryan Kitchens, Peter Stout, and William Thurston.

This essay takes its name from an old programmer meme about a theoretical Sufficiently Smart Compiler.

If you ever want to talk about this stuff, please reach out to me here on LinkedIn or on Twitter.
