I first used a computer to do real work in 1985.

I was in college in the Twin Cities, and I remember using the DOS version of Word and later upgrading to the first version of Windows. People used to scoff at the massive gray machines in the computer lab, but secretly they suspected something was happening.

It was. You could say the information age started in 1965, when Gordon Moore formulated what became known as Moore's Law: his observation that the number of transistors on a chip would double every year, a rate he later revised to roughly every two years. It was all about escalating computing power, and he was right about the coming revolution. Some would argue the information age started long before then, when electricity replaced steam power. Or maybe it was when the U.S. library system began to expand in the 1930s.

Who knows? My theory is it started when everyone had access to information on a personal computer. That was essentially what happened for me around 1985 — and a bit before that in high school. (Insert your own theory here about the Apple II ushering in the information age in 1977. I’d argue that was a little too much of a hobbyist machine.)

We can agree on one thing: information is everywhere. That's a given. Now, prepare for another shift.

In their book Machine, Platform, Crowd: Harnessing Our Digital Future, economists Andrew McAfee and Erik Brynjolfsson suggest that we're now in the "machine learning" age. They point to another momentous occasion that might be as significant as Moore's Law. In March 2016, an AI finally beat a world champion player in Go, winning four out of five games.

Of course, pinpointing the start of the machine learning age is difficult. Beating a Go champion was a milestone, but my adult kids have been relying on the GPS in their phones for years. They don't know how to read paper maps, and without a phone they would get lost. They are already relying on a "machine" that essentially replaces human reasoning. I haven't looked up movie showtimes in a browser for several years now; I leave that to Siri on my iPhone. I've been using an Amazon Echo speaker to control the thermostat in my home since 2015.

In their book, McAfee and Brynjolfsson make an interesting point about this radical shift. Anyone working in the field of artificial intelligence knows this will be a crowdsourced endeavor, and it's about more than creating an account on Kickstarter. AI comes alive when it has access to the data generated by thousands or millions of users. The more data it has, the better it becomes. To beat the Go champion, Google DeepMind trained its system on a database of actual human-to-human games. AI cannot exist without crowdsourced data. We see this with chatbots and voicebots: the best bots adapt to the user, building on previous conversations to improve.
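The "more data, better AI" point can be shown with a toy sketch (my own illustration, not from the book): imagine a system trying to learn one hidden pattern from noisy contributions by individual users. With a handful of contributors the estimate is shaky; with thousands, it converges on the truth.

```python
import random

# Toy illustration: a "model" learns a hidden value from noisy,
# user-contributed observations. More crowdsourced data means a
# better estimate. (Names and numbers here are invented.)
random.seed(42)
TRUE_VALUE = 0.7  # the pattern hidden in the crowd's behavior

def learn_from_crowd(n_users):
    # Each "user" contributes one noisy observation of the true value.
    observations = [TRUE_VALUE + random.gauss(0, 0.2) for _ in range(n_users)]
    # The "model" is simply the average of what the crowd reports.
    return sum(observations) / len(observations)

few = learn_from_crowd(10)        # a rough guess
many = learn_from_crowd(10_000)   # close to the true value
print(f"10 users: {few:.3f}   10,000 users: {many:.3f}")
```

Averaging is a stand-in for real training, but the dynamic is the same one the authors describe: the machine's accuracy is borrowed from the size of the crowd behind it.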

Even the term "machine learning" has crowdsourcing implications. The machine learns from the crowd, typically by gathering data. We are currently seeing this play out more vividly with autonomous cars than in any other machine learning domain. Cars analyze thousands of data points from sensors that watch how people drive on the road. A Tesla Model S is constantly crowdsourcing. Now that GM is testing the self-driving Bolt on real roads, it's clear the entire project is a way to make sure the cars understand all of the real-world variables.

The irony here? The machine age is still human-powered. In the book, the authors explain why the transition from steam power to electric power took a long time. People scoffed at the idea of using electric motors in place of a complex system of gears and pulleys. Not everyone was on board. Not everyone saw the value. As we experiment with AI, test and retest the algorithms, and deploy bots into the home and workplace, it’s important to always keep in mind that the machines will only improve as the crowdsourced data improves.

We’re still in full control. For now.