Nearly every part of our lives is influenced by code. It’s the infrastructure that makes our digital technologies operate – the software that’s changing our world in innumerable ways – and knowing how to code opens up a new world of opportunities. Some would even argue it’s a prerequisite in our increasingly algorithmic existence.

So it’s no surprise that so many people – from Barack Obama (it “makes sense” for coding to be written into high school curricula) to NBA superstar Chris Bosh (it’s “simply about understanding how the world functions”) – argue that everyone should learn to code, and that coding ought to be a required part of a complete education. Starting really young, too, “because it is code, not Mandarin, that will be the true lingua franca of the future.”

Knowing how to code – the narrative goes – will help us navigate life, snag a lucrative job, and stay competitive with other countries. And then there’s the digital version of the “teach a man to fish” adage, where a software engineer decided he would teach a homeless man to code. Though it caused a vitriolic reaction among tech bloggers, the homeless man was reportedly on his way to finishing up his first app.

That is, until he was arrested and his belongings were seized by the NYPD. Because “fairy tales don’t scale” in the real world.

All that compiles is not gold. Coding is a panacea only in a world where merit is all it takes to succeed – in other words, a starkly different world from the one we actually live in, where social structures, systemic biases, and luck may matter more.

#### About Jathan Sadowski

Jathan Sadowski is a freelance writer and Ph.D. student in the [Consortium for Science, Policy & Outcomes](http://www.cspo.org/) at Arizona State University. His research focuses on the social, political, and ethical side of technologies. Follow him on Twitter @jathansadowski.

So is it wrong to teach a person to code? No. I don’t deny that coding is a useful skill to have in a modern, ubiquitous-computing society. It can help people personalize and understand the devices and services they use every day. It’s also good news that methods for teaching kids how to code are becoming more effective, and that kids can ostensibly learn on their own when left to their own devices.

The problem is elevating coding to the level of a required or necessary ability. That, I believe, is a recipe for further technologically induced stratification. Before jumping on the everybody-must-code bandwagon, we have to look at the larger societal effects – or risk running headlong into an even wider inequality gap.

For instance, the push to add coding to curricula ignores the fact that the English literacy rate in America is still abysmal: 45 million U.S. adults are “functionally illiterate” and “read below a 5th grade level,” according to data gathered by the Literacy Project Foundation. Almost half of all Americans read “so poorly that they are unable to perform simple tasks such as reading prescription drug labels.” The reading proficiency of Americans is much lower than that of most other developed countries, and it’s declining.

We have enough trouble raising English literacy rates, let alone increasing basic computer literacy: the ability to effectively use computers to, say, access programs or log onto the internet. Throwing coding literacy into the mix means further divvying up scarce resources. Teaching code is expensive. It requires more computers and trained teachers, which many cash-strapped schools don’t have the luxury of providing. As software engineer Chase Felker has argued:

“I’m not sure it’s even possible to teach everyone how to code, but I do know that to mandate programming as a general education requirement would displace something else that we’re already failing to teach, and that’s not good, either.”

Focusing on the additional, costly skillset of coding – rather than the other more essential, but still lacking, types of literacy – is the product of myopic technical privilege. There’s a reason such arguments arise primarily from the digerati: In that world, basic access is rarely a problem.

What’s more, a society where people are expected to know how to code is one where powerful players like big corporations and governments are more likely to shirk responsible design obligations, such as building in privacy protections and making sure technologies are reasonably transparent about how they operate. It would be like saying safety features in vehicles aren’t necessary because everyone learned the basics in driving school… or worse, that no one needs to learn at all because autonomous cars will do it for us anyway.

Why should a startup or tech goliath worry about whether it’s exploiting users when people can just examine the source code themselves? When members of the coderati can customize changes, live in private clouds, and protect only themselves?

Even worse, the “everybody should learn to code” mindset adds fuel to the wrongheaded tech-solutionist ideology – the belief that technical hacks can fix all our problems. And if you don’t have those abilities, well then, perhaps you’re incompatible with our high-tech economy and digital environment. If coding becomes a required skill for navigating a technological environment, then the large part of the population without the privilege of becoming fluent in code will be left behind. The result would be a gap between the coding haves and have-nots – a world where filter bubbles and sameness rule over the messy realities of life.

But a world where coding dictates the future is not inevitable. Instead of making people adapt to technologies – in the process leaving behind large swaths of society – technologies should adapt to our needs and values. As the media theorist Marshall McLuhan said, “There is absolutely no inevitability as long as there is a willingness to contemplate what is happening.” There’s still plenty of room for our contemplation – and deeper consideration – of how to advance our future without leaving anyone behind.

Wired Opinion Editor: Sonal Chokshi @smc90