In the spring of 2011, Sebastian Thrun was having doubts about whether the classroom was really the right place to teach his course on artificial intelligence. Thrun, a computer-science professor at Stanford, had been inspired by Salman Khan, the founder of the online Khan Academy, whose videos and discussion groups have been used by millions to learn about everything from arithmetic to history. And so that summer, Thrun announced he would offer his fall course on Stanford’s website for free. He reorganized it into short segments rather than hour-long lectures, included problem sets and quizzes, and added a virtual office hour via Google Hangout. Enrollment jumped from 200 Stanford undergraduates to 160,000 students around the world (only 30 remained in the classroom). A few months later, he founded an online for-profit company called Udacity; his course, along with many others, is now available to anyone with a fast Internet connection.

Meanwhile, two of Thrun’s Stanford colleagues, Daphne Koller and Andrew Ng, founded another for-profit company, Coursera, that posts courses taught by faculty from leading universities such as Princeton, Michigan, Duke, and Penn. Three million students have signed on. Not to be outdone, Harvard and MIT announced last spring their own online partnership, edX, a nonprofit with an initial investment of $60 million. A new phenomenon requires a new name, and so MOOC—massive open online course—has now entered the lexicon. So far, MOOCs have been true to the first “o” in the acronym: Anyone can take these courses for free.

Many people outside academia—including New York Times columnists David Brooks and Thomas L. Friedman—are gushing that MOOCs are the best thing to happen to learning since movable type. Inside academia, however, they have been met with widespread skepticism. As Joseph Harris, a writing professor at Duke, recently remarked in The Chronicle of Higher Education, “I don’t see how a MOOC can be much more than a digitized textbook.”

In fact, MOOCs are the latest in a long series of efforts to use technology to make education more accessible. Sixty years ago, the Ford Foundation funded a group of academics to study what was then a cutting-edge technology: television. In language almost identical to that used today, a report on the project announced that television had the power to drive down costs, enable the collection of data on how students learn, and extend “the reach of the superior teacher to greater numbers of students.” From 1957 to 1982, the local CBS channel in New York City broadcast a morning program of college lectures called “Sunrise Semester.” But the sun never rose on television as an educational “delivery system.”

In the 1990s, my own university, Columbia, started a venture called Fathom, using the relatively new technology of the Web. The idea was to sell online courses taught by star faculty such as Simon Schama and Brian Greene to throngs of supposedly eager customers. But the paying customers never showed up in the anticipated numbers, and by the time Fathom was shut down, it had cost Columbia, according to some estimates, at least $20 million. Looking back, the project’s director, Ann Kirschner, concluded that she and her colleagues had arrived too soon—“pre-broadband, pre-videocasting and iPods, and all the rest.”