
This talk was given today to Eddie Maloney’s class at Georgetown University (specifically, its Learning and Design program) on “Technology & Innovation By Design”

A couple of weeks ago, I saw an educator say on Twitter something about how the changes we’ll see in the next 30 years are “so radical” that history won’t be much help. “I’m beginning to suspect,” he tweeted, that history is “less and less relevant in understanding the near (and far) future.”

I decided not to pick a fight on the Internet – not then, at least. But I want to talk a little bit today about why I think this claim regarding the irrelevance of history is quite wrong and perhaps even quite dangerous. (Obviously – full disclosure – I’m pretty invested in history being relevant as I’m writing a book about education technology in the mid-twentieth century, about the rise of education psychology, the automation of education, and teaching machines.)

I do not disagree, however, that the next 30 years will likely bring about great upheaval. If nothing else, I believe that we may be facing a cataclysm of global proportions due to climate change. The planet could, quite conceivably in coming decades, become uninhabitable for many of its current life forms. Is this unprecedented? Perhaps. But we aren’t moving into the future without any knowledge or understanding.

We have science, sure. But we also have history.

We know – from history (and not just from climatology or paleontology) – what happens during environmental catastrophes; we know – from history – what happens during mass migrations and dramatic shifts in demographics. We know how these events have played out in the past – for rich people and poor people; for white people and brown people; for those from the Global North and those from the Global South; for men, for women, for children.

Ideally, we heed science and history, and we recognize the implications of global climate change on the planet but also on people, on institutions. We act sensibly; we act responsibly; we act justly. We learn from the past. We try to do better. Scientists do this too, you know.

The present can never be extricated from what’s come before. History will never be irrelevant (even if our leaders and musical guest stars on Saturday Night Live appear to be ignorant of it).

Now, I’m not sure if this particular educator was referring to global climate change or not when he dismissed history. I don’t want to put words in his mouth – any more than I might have already – but I don’t think he was. And I think it’s fair to say that it’s far more likely that when you hear this sort of quip – that history doesn’t really matter any more – that it is in reference to some sort of other profound and unprecedented shift people believe we are facing: a shift in technology, in digital technologies in particular.

“Technology is changing faster than ever” – this is a related, repeated claim. It’s a claim that seems to be based on history, one that suggests that, in the past, technological changes were slow; now, they’re happening so fast and we’re adopting new technologies so quickly – or so the story goes – that we can no longer make any sense of what is happening around us, and we’re just all being swept along in a wave of techno-inevitability.

I’ve written previously about this. (And I probably should have recommended you read that article as I don’t want to get too side-tracked here.) Needless to say, I don’t think the claim is true – or at the very least, it is a highly debatable one. It depends on how you count and what you count as technological change and how you measure the pace of change. Some of this, I’d argue, is simply a matter of confusing technology consumption for technology innovation. Some of this is a matter of confusing upgrades for breakthroughs – Google updating Docs more regularly than Microsoft updates Office or Apple releasing a new iPhone every year might not be the best rationale for insisting we are experiencing rapid technological change. Moreover, much of the pace of change can be accounted for by the fact that many new technologies are built atop – quite literally – pre-existing systems: railroads followed the canals; telegraphs followed the railroads; telephones followed the telegraphs; cable television followed the phone lines; most of us (in the US) probably use an Internet provider today that began as either a phone company or a cable company. If, indeed, Internet adoption has moved rapidly, it’s because it’s utilized existing infrastructure as much as because new technologies are somehow inherently zippier.

“Technology is changing faster than ever.” It makes for a nice sound bite, to be sure. It might feel true. (It’s probably felt true at every moment in time.) And surely it’s a good rhetorical hook to hang other directives upon: “you simply must buy this new shiny thing today because if you don’t then the world is going to pass you by.”

That directive is a particularly powerful one in education as it often works in tandem with another narrative: the story that education hasn’t changed in one hundred (or more) years. It’s another historically dubious claim (and another topic I’ve written about – indeed, I have an article in Vice coming soon pushing back on this notion). Nevertheless, it is an incredibly popular claim, particularly among education reformers and education technologists and venture philanthropists – current Secretary of Education Betsy DeVos has invoked it, as has former Secretary Arne Duncan. Laurene Powell Jobs, the widow of a famous Steve, has funded a massive initiative to “rethink school” that also rests on this narrative that school hasn’t changed. Khan Academy’s Sal Khan likes to tell this story too, as does the CEO of edX, Anant Agarwal. (He says education hasn’t changed in 500 years, which I think is supposed to be shorthand for “since the printing press.”)

These are powerful, influential people shaping education policy, and they have no idea what they’re talking about.

This story – that school hasn’t changed in a hundred or more years – has a corollary, of course: one that contends that this unchanging system was modeled on the factory. Schools look the way they do now – which is exactly how they looked 100 years ago – because Horace Mann went to Prussia in the 1830s and brought the “Prussian model,” “the factory model of education” back to the US – or something like that. There weren’t a lot of factories in Prussia back then, but no matter. People really like this story: the reason there are bells in schools, they tell us, is that it helps shape young students in preparation for work, so that they’re prepared to respond to the bells and whistles of the factory floor. (This story is wrong. And I’m sorry, but another aside: if you get your history of education from a guy who calls Thomas Jefferson’s slaves “employees” then you might have some real problems discerning the kinds of sources – primary or secondary – that you should take seriously.)

Now, the popular (libertarian) story about bells should immediately give you pause – I hope it gives you pause – as education technologists and instructional designers, because school bells are, no doubt, “ed-tech.” And you should be asking, “what do school bells do,” “why do we have them,” “where does the technology come from,” “who sells bells to teachers,” “who makes bells for schools,” and perhaps even “how are we replicating and hard-coding this analog technology in our digital world?”

Technology, as physicist Ursula Franklin reminds us, is a practice, not simply an object. “Technology,” she writes, “is not the sum of the artifacts of the wheels and gears, of the nails and electronic transmitters. Technology is a system. It entails far more than its individual material components. Technology involves organization, procedures, symbols, new words, equations, and most of all, a mindset.” As such, technology always has a history – and not just a history of invention or adoption, but a much broader one: the whole context in which practices are imagined and developed and rationalized (and perhaps even rendered invisible, in a way, by becoming utterly ubiquitous, commonplace).

Why bells? When bells? What kinds of bells? Who gets to ring them? How has this changed over time?

Why windows? What kinds of windows? Which classrooms, whose classrooms have sunlight? Which doors have locks? Who has the key? Which schools have metal detectors? Which schools have surveillance cameras? When were these technologies installed, and why?

Why blackboards? When blackboards? What kinds of pedagogical practices led to the adoption of blackboards? What kinds of practices have emerged from them?

Many instructional technologists can answer none of these questions. Much more troubling, I’d say: they don’t think they should have to. They don’t think it’s relevant, when in fact, these questions – really, any deep and critical historical thinking about the objects and practices of everyday school life (recognizing that “everyday school life” has looked very different in urban and rural areas, in Black communities in the South, in Native communities in the Northwest, and so on) – can help to crack open all sorts of institutional legacies, on- and offline: what does a space look like, how are we supposed to move around in it, who has power and privilege and access? And I don’t just mean that we crack these open so we can mouth some pithy condemnations about “factory models” and “weapons of mass instruction.”

I’ll skip the history of the school bell (although I’d add that with the increased focus lately on school shootings and on “school safety” and the adoption of more and more surveillance technologies, you should, as education technologists, know a bit about when and why bells and alarms came to be, and how that’s changed the culture and the environment of a school. And spoiler alert: the introduction of bells wasn’t thanks to Frederick Winslow Taylor). I’ll skip the history of the window too (although I’d add that with those safety fears and with the never-ending concern that students are suffering from “distraction,” perhaps it is worth thinking about historical arguments for more or less light). But I will talk briefly about the history of the blackboard, and I’ll talk about it for a couple of reasons.

I think that too often, we rush through the history of education technology to get to the part about computers. Somehow, we’ve decided that computers are the pinnacle of technological (and even intellectual) achievement, that they’re the most innovative and influential and important thing we can consider, perhaps the only thing we need to consider. We talk about how these machines might augment human intelligence, might even display an intelligence of their own, without really considering the history – the ugly history – of intelligence quotients and intelligence testing, for starters.

And when I say we rush through history to get to the part about computers, I really mean we rush to get to the bit about personal computers. We spend little time on the history of PLATO, for example, the educational computing system built on a mainframe at the University of Illinois Urbana-Champaign in the late 1960s – and that’s even though PLATO provided a template of sorts for an incredible amount of tech and ed-tech that followed. (You should all read Brian Dear’s book The Friendly Orange Glow for a long and loving look at PLATO’s contribution to computer culture.)

It is almost as though, according to how some folks tell it, computers suddenly emerged in the classroom, unencumbered by the past or by ideology – like Athena springing out of Zeus’s skull, fully armored, supernaturally intelligent, a god of wisdom and warfare.

That’s another key piece to remember about the history of computing technology and the history of education technology: they are deeply intertwined with the military and with technologies of war. You know this, of course, because you read an article I wrote about the military’s role in developing the Link Trainer and learning objects, and you also read Donna Haraway’s “Cyborg Manifesto.” “Modern production seems like a dream of cyborg colonization work,” Haraway writes, “a dream that makes the nightmare of Taylorism seem idyllic. And modern war is a cyborg orgy, coded by C3I, command-control-communication-intelligence, an $84 billion item in 1984’s defense budget.” 1984 – clearly this is an essay penned at a particular moment in American politics and American culture, the mid–1980s: the time of Star Wars (the movie and President Reagan’s plans for a missile defense system); War Games; Terminator; the personal computer; that famous Macintosh “1984” Super Bowl ad; and of course one of the most important reports in the last 35 years, A Nation at Risk, a survey of various studies that concluded that the American school system was failing and in desperate need of reform. (A particular kind of reform.)

“The educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people,” its opening paragraph pronounced. “If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war. As it stands, we have allowed this to happen to ourselves,” the report continues, underscoring how our education system had become a national security risk. “We have even squandered the gains in student achievement made in the wake of the Sputnik challenge.”

Sputnik, another key event in twentieth century education technology, another time when education and national security were overtly linked: when in 1957 the Soviet Union beat the US in launching the first artificial satellite into space, resulting in (among other things) the passage of the National Defense Education Act of 1958 and over a billion dollars in funding for science education and for the development of various education technologies, including but not limited to teaching machines. Haraway, with her PhD in biology, describes herself as a “Sputnik Catholic,” incidentally – her enormous contribution to feminism and to science studies in the 1980s (and since) stemming from this moment in the history of the American education system. The Sputnik moment – just one event that demonstrates a much longer shared history of the military, science, engineering, gender, and school.

…Which brings us back to the blackboard.

The use of writing slates dates back centuries; their origin, unclear. Larger slates, “black boards,” have also been in use for hundreds of years. But historian Christopher Phillips argues that the US Military Academy at West Point played a crucial role in establishing the blackboard for classroom use in the early nineteenth century.

West Point was formally established in 1802 by Thomas Jefferson, who’d initially opposed George Washington’s call for a military academy on the grounds it would be elitist, anti-democratic. Nevertheless Jefferson signed legislation decreeing a “Corps of Engineers,” and thus West Point was to become “one of the premier science and engineering schools, producing generations of technically skilled graduates who populated the growing ranks of military and civilian engineers.”

West Point was to be modeled on the recently developed French officer training and engineering schools – the École Spéciale Militaire and the École Polytechnique, respectively; and Phillips argues it was in the geometry courses of Claudius Crozet, who had trained at the École Polytechnique and had come to the US after the restoration of Louis XVIII, that the blackboard and its particular pedagogical practices first came to West Point.

Crozet taught descriptive geometry, believing, as did his own instructors in France, that mathematical understanding was central to military education, that military education was about engineering, and that descriptive geometry in particular was practical, socially and politically useful, and as such deeply republican. (Lower case r republican.)

Geometry is, of course, also a subject well suited for work on a blackboard: you’re representing three-dimensional objects in two dimensions. But the use of the blackboard at West Point was not about facilitating large-scale lecture-based math instruction. Rather, the blackboard was used for recitation – cadets were taught in small group settings (about 8 to 12 young (white) men) and they stood at the board to answer questions from the instructor, something that became known as the “West Point Method.” The short period in which Crozet taught at the academy was one of many reforms in its curriculum, instruction, and rules, and by the 1830s, math education was a core part of that. The academy’s section rooms were furnished with blackboards on all the walls, not just at the front of the class. “It is on the blackboard where the workings of [a cadet’s] mind are chiefly exhibited,” one observer wrote in 1863.

It’s not that they didn’t have paper at West Point. It’s not that they didn’t have textbooks or preprinted diagrams. It’s not that they didn’t hold lectures. But the blackboard was a crucial part of how students were drilled and examined – assessed not just on how well they could solve geometry problems, but how well they could perform this skill orally and visually, how clearly and confidently they could demonstrate their knowledge.

(All this predates the establishment of psychology and educational psychology as a field, of course, but it’s worth pointing out that this idea of assessing behavioral comportment as a way to gain insight into the brain and into “the mind” certainly carried forward in behaviorism and is at the root of the development of teaching machines in the twentieth century. Like I said, new technologies are built on old technologies; new practices are built on older practices. Everything has a history.)

At West Point,

Regulations required that students stand at attention on the side of the board farthest from the central line of the room; hold the pointer in the hand nearest the board with point downward unless in active use; face the instructor when speaking; and refrain from unnecessary motions or “nervous habits.” Rules specified every movement, with students at the front boards working on the main lesson for the day as students at the side boards demonstrated applications or particular solutions using those lessons. According to the regulations, each student began his board work by writing his name on the upper-right-hand corner of the blackboard and then proceeded to write his answer while the instructor orally quizzed any students not currently scribbling. Once a student at the board finished writing, he then stood at attention with his pointer in hand, awaiting the command to explain his work. … No extraneous writing was allowed on the board and the eraser could only be used with permission of the instructor.

Blackboards purported to make visible what the cadets knew, what and how they were thinking; but they were also, Phillips argues, “tools for revealing cadets’ characters.” Blackboards demonstrated cadets’ intellectual and physical and moral discipline; they were disciplinary technologies connected to disciplinary practices – an example of what Michel Foucault talks about as disciplinary institutions (like the Panopticon, the prison) and the development, in the early nineteenth century, of a “disciplinary society.”

The blackboard and the “West Point method” attracted a fair amount of attention, as influential educators, including Horace Mann, visited and observed the instructional practices at the academy – and it’s worth noting that the method puts both the teacher and the student on display. West Point graduate Nicholas Tillinghast became the principal of the Bridgewater Normal School in Massachusetts, a school founded by Mann in 1840. (This was the key lesson Mann learned from the Prussians, by the way: train school teachers.) Tillinghast promoted the use of the blackboard, and the technology and associated pedagogy spread throughout New England. “If West Point had done nothing else,” one Massachusetts Board of Education member said in 1860, “it would not be easy to estimate the value to the cause of public instruction of the blackboard, the cheapest and most used and the most useful of all educational apparatus, and also of the West Point method.” Only a few decades later, reports from the region indicate that the blackboard was common in almost every classroom, used in almost every grade – from elementary school through college – and in almost every subject.

No doubt, the pedagogical practices associated with the blackboard have shifted over the course of the past two hundred years. Now it’s more likely to be a device used by a teacher (a female teacher, a shift facilitated by Horace Mann’s normal schools) and not the student. Increasingly, I suppose, it’s a whiteboard, perhaps one with a touchscreen computer attached. But it is still worth thinking about the blackboard as a disciplinary technology – one that molds and constrains what happens in the classroom, one that (ostensibly) makes visible the mind and the character of the person at the board, whether that’s a student or a teacher.

Indeed, the history of the teaching profession suggests we have long been obsessed with the morals of the latter. But obviously “character education,” as popular as it is with today’s education reformers and education psychologists, also has a long history – a history bound up in the technologies of the classroom. Grit, mindsets, behavior management – this push for disciplinary practices and disciplinary technologies is not new. Framing this in terms of engineering – behavioral engineering, social engineering, educational engineering, learning engineering – is also centuries old.

Again, I don’t say this to suggest that “nothing has changed.” I don’t say this to suggest that ClassDojo is inevitable.

So why then does the history of ed-tech matter? It matters because it helps us think about beliefs and practices and systems and institutions and ideology. It helps make visible, I’d hope, some of the things that time and familiarity have made invisible. It helps us think about context. It helps us think about continuity as much as change. And I think it helps us be more attuned to the storytelling and the myth-making that happens so frequently in technology and reform circles.

Remember: “He who controls the past controls the future,” George Orwell famously cautioned. “He who controls the present controls the past.” The stakes are pretty high here – this is education, knowledge, democracy. Recognize, I’d say too, that the stakes always have been.

Works Cited:

Christopher Phillips. “An Officer and a Scholar: Nineteenth-Century West Point and the Invention of the Blackboard.” History of Education Quarterly. Vol. 55, No. 1, February 2015.