HackerRank has published its 2018 Developer Skills Report. The report examines a number of things essential to understanding the developer landscape, such as the perks coders demand from their workplaces, the technologies they prefer to use, and how they entered the software development industry in the first place.

While perusing the paper, something struck me as particularly interesting. One of the questions HackerRank asked its community was when they started coding. It then organized the data by age and country.

Almost immediately, you notice an interesting trend. Those in the 18 to 24 age group overwhelmingly started their programming journey in their late teens: 68.2 percent started coding between the ages of 16 and 20.

When you look at older generations, you notice another striking trend: a comparatively larger proportion started programming between the ages of five and ten. 12.2 percent of those aged between 35 and 44 started programming then.

It’s obvious why that is. That generation was lucky enough to be born at the start of the home computing revolution, when machines bearing the logos of Acorn and Commodore first entered the living rooms of ordinary people.

Back then, if you wanted to play a game or run a bit of software, chances were high you’d have to build it yourself. While there was a nascent software publishing market, it was common for people to build their own programs, often cribbed entirely from computer magazines.

That’s right. Just speak to someone in their late thirties or early forties. If they owned (or had access to) a computer back then, they’ll inevitably tell you about typing out entire programs, literally word for word.

HackerRank observed that the UK in particular (closely followed by Australia) had a larger proportion of developers who started coding between the ages of five and ten: 10.7 percent of Brits and 10.3 percent of Aussies.

HackerRank attributes this partly to the Computers for Schools scheme run by Cambridge’s Acorn Computers and supermarket giant Tesco. Families who shopped at Tesco earned vouchers, which their school could redeem for a computer.

This was hugely helpful in bringing computers into the classroom, at a time when they were prohibitively expensive.

Unfortunately, we’re past that point. Computers are far more sophisticated now, and unlike in the 1980s, it’s possible to use one without ever writing your own software or delving into its murky internals.

Another factor behind developers starting at a later age is that programming is harder now. The BASIC dialects many people learned with in the 1980s have largely taken a back seat to more powerful (but more complicated) languages like Python, JavaScript, and Java.

We’re never going to get back to the 1980s, but I don’t feel particularly sad about that. Even though today’s programming languages might, at least at first glance, be a bit more complicated, we’re living in a golden age of self-starters.

With YouTube, Stack Overflow, and GitHub, it’s easy enough to find your feet. And the internet lets you share your work and get feedback from well-qualified strangers on the other side of the globe.

The HackerRank report has lots of other interesting takeaways. If you’re curious, you can read it here.

2018 Developer Skills Report on HackerRank
