Fifty years ago, Isis, the undergraduate magazine for which I wrote occasional pieces, decided to run reviews of the local lecturers. This was not an attempt to bring to the University of Oxford the US practice of formulating elaborate student evaluations of courses and teachers - a practice that has had pernicious effects on grade inflation and on the tenure process. Rather, it was a mildly frivolous exercise, reviewing the star lecturers of the day, such as Isaiah Berlin, much as students might today review the appearances of Anna Wintour or Pamela Anderson at the Oxford Union. In any event, the university told us to stop, and we moved on to less controversial subjects, such as the threat of nuclear warfare in Cuba or Germany.

Our light-heartedness about the whole business partly reflected the fact that in those distant days lectures played a small part in the educational process in the humanities and social sciences. You read whatever books and articles tutors suggested, wrote the essays they set, read the essays to your tutors, failed to answer the questions they asked you and repeated the process. Some lectures you went to because you knew that the eventual book would be wonderful but would not appear for a few years: Peter Strawson’s lectures on Immanuel Kant’s Critique of Pure Reason were an example. I diligently took notes that I only began to understand when teaching Kant three years later.

But we also thought lectures were redundant. Indeed, one star performer, A.J.P. Taylor, who also attracted vast audiences to his television lectures, told us so, although we all suspected that he said it only to prove himself wrong by telling us all sorts of things we would not find in books and would not hear from tutors. The thought was obvious enough: the Gutenberg Revolution had rendered lectures redundant as a means of imparting knowledge. Before the invention of moveable type and the possibility of producing books on a mass scale, oral transmission of knowledge (or speculation) depended on carefully constructed, often dictated, lectures and on students with ready pens and excellent memories. By 1960, lectures had long been redundant - since shortly after 1440. But lecturers were paid to lecture, so lecture they did.

Of course, all this has to be taken with a grain of salt: much university teaching takes place in small classes, whether in labs or in seminars. An audience of fewer than 30 lends itself to give-and-take discussion - beyond that you need the genius of the likes of Michael Sandel, Anne T. and Robert M. Bass professor of government at Harvard University, who somehow contrives to create a dialogue with 1,000 students in a lecture theatre.

But the arrival of massive open online courses (MOOCs) again brings into focus the question of whether old-fashioned lectures, understood as a device for handing out information, aren’t redundant. The MOOC is the latest stage in a series of technological changes that have made life easier for students and their teachers but have made the traditional lecture look ever more of an endangered species. The question is whether we have now reached a tipping point.

One reason why the Gutenberg Revolution hadn’t made lectures redundant by 1960 was that the popular availability of the photocopier, the personal computer and the internet was some way off. The tedium of correcting typescripts hammered out on old-fashioned typewriters and correcting master copies for duplicating machines was enough to ensure that handwritten lecture notes lasted longer than they really should have. Cheap photocopying has been bad for trees but very good for ensuring that students’ notes are the notes that their teachers wish them to take away. PowerPoint slides have dumbed down presentations, but where the information imparted lends itself to snapshots and bullet points, they are a way of keeping an audience’s attention.

MOOCs scale this up. When the creators of the first successful example said “massive”, they meant it: the course on artificial intelligence that former Stanford University professor Sebastian Thrun and Peter Norvig, director of research at Google, posted last year enrolled 160,000 students. It was equally true to its label in terms of openness: given a computer and internet access, anyone could take the course. Soon after, Stanford’s Andrew Ng co-founded Coursera, and all sorts of high-powered institutions including Stanford and the California Institute of Technology have piled in. The obvious questions have yet to be answered: who is going to pay for it, will the courses carry credit, how can you assess students’ understanding? It is hard to believe that they will remain unanswered for long.

Less dwelled upon are the utopian and dystopian mirror images of a future in which MOOCs are the order of the day. The dystopian vision that chills the soul of even the least Luddite among us is of undergraduate education dominated by uniform courses, no doubt put together by wonderful teachers but turning everyone beyond the course builders themselves into something like the monitors of the 19th-century Bell-Lancaster schools, checking their students’ work against a schedule determined elsewhere, with little or no scope for their own pedagogical ideas. “Course delivery”, in the awful idiom of the Quality Assurance Agency, will be almost everyone’s lot. The utopian version is the reverse, the anarchist’s vision of an almost magically decentralised education, with no authority determining course content as everyone listens and responds. It is very reminiscent of the two sides of Marx’s vision of the future - uniformity, efficiency and the elimination of effort on the one hand, and the liberation of imagination and frictionless cooperation on the other. I’d bet on the first.