IT'S a question that's bothered cultural critics for decades: while we know more than ever, are we getting dumber as a result of the increasing amount of technology at our disposal? Reading historical debates, and hearing of the attention paid to them by a thoughtful populace, certainly makes one wonder. Writing of the mechanical Difference Engine he had devised in the 1820s for computing polynomial functions, Charles Babbage, the father of the programmable computer and our web-log's namesake, recalled in his memoirs:

On two occasions I have been asked [by Members of Parliament], “Pray, Mr Babbage, if you put into the machine wrong figures, will the right answers come out?” I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

Incisive eloquence—in Latin and Greek as well as their mother tongue—was common fare among Georgians and Victorians lucky enough to have had at least a dozen years of schooling. One wonders how the founders of Facebook, Twitter or YouTube might respond to similarly banal queries tossed at them during congressional testimony.

The current debate about intelligence, sparked by Nicholas Carr's recent and eminently readable “The Shallows”, asks what the internet is doing to our brains. Like Susan Jacoby's “The Age of American Unreason” and Mark Bauerlein's “The Dumbest Generation” earlier in the decade, Mr Carr taps into the sense of despair among American intellectuals about the country's poor educational showing when compared with other countries.

In reading, mathematics and science, American 15-year-olds languish in the lower half of the OECD rankings for the 30 wealthiest countries. Other English-speaking nations such as Canada, New Zealand, Australia and even Britain are all in the upper quartile. South Korea and Japan are in the top decile.

Such indisputable facts are rightly a concern for policy-makers and parents throughout the United States. But the reasons for the abject failure of American education—especially at middle- and high-school levels—are well understood, and the corrective measures widely accepted. Implementing them, however, remains as politically intractable as ever.

But it is not just the chagrin of seeing a nation's youth so poorly served. More than that, an unspoken nostalgia for an age when book-learning was the noblest of pursuits has invigorated the debate about the dumbing down of America. Tellingly, the most astringent critics are invariably middle-aged or older.

Among other things, Ms Jacoby blames a rising tide of anti-intellectualism. She notes that the reading of books, newspapers and magazines has declined across the board. The proportion of 17-year-olds who read nothing whatsoever (unless required to do so for school) more than doubled between 1984 and 2004—a period that saw the rise of personal computers, the internet and video games. She bemoans the way electronic media, with their demand for spectacle and brevity, have shortened our attention spans. Sound bites by presidential candidates, she points out, dropped from 42 seconds in 1968 to less than eight seconds by 2000.

But things are rarely as they seem. For one thing, e-books barely existed a decade ago, but have exploded in popularity since Amazon introduced its Kindle a few years ago and a host of rivals rushed in with copycat versions. For many readers, the ability to interact with e-books digitally—searching them automatically, inserting digital bookmarks and annotations, zooming in on the small type—has rendered hardcovers and paperbacks obsolete. So much so that e-books are now outselling hardcovers. Perhaps we are witnessing not a decline in book reading but a renaissance. The irony is that had computers been invented before books, we would now be wringing our hands over the loss of multi-media, multi-tasking, computer-gaming skills as our children frittered away their time by burying their noses in single-topic paper tomes.

To the specific question that Mr Carr asks about what the internet is doing to our brains, the simple answer is that it is making us think and behave differently. Of that, there is no doubt. But that does not mean we are getting dumber in the process. What makes people intelligent is their ability to learn and reason—in short, to adapt and thrive within their environment. That fundamental capacity has not changed in thousands of years, and is unlikely to do so merely because some new technology comes along, whether television, mobile phones or the internet.

Adaptation to one's changing surroundings is a different matter. Every new medium introduced since the invention of the printing press has moulded our minds in different ways. It would be alarming if it had not. Today, confronted with the ubiquity of the internet, we need a whole new set of skills to navigate the information-laden environment we inhabit. In other words, each new set of skills we learn and memories we create builds on our existing mental capacities without changing them in any fundamental way.

Still, the Jeremiahs have a point. Their concern is that prolonged use of the internet—with its smorgasbord of tantalising titbits of information—is producing a generation of magpie minds, as users hop from one bright trinket to another, rarely focussing long enough on any one topic to comprehend it thoroughly. According to this view of the brain, the lack of “deep thinking” lies at the heart of the present generation's inability to sweat the hard stuff. Google, with its instant access to factoids of dubious veracity, is singled out as a primary source of the malaise.

The problem, says Mr Carr, is that most of us with access to the web spend at least a couple of hours a day online—and sometimes much more. During that time, we tend to repeat the same or similar actions over and over again. As we go through these motions, the net delivers a steady stream of inputs to our visual, somatosensory and auditory cortices. “The net's cacophony of stimuli short-circuits both conscious and unconscious thought, preventing people from thinking either deeply or creatively.” There is evidence, the author affirms, that the internet is damaging people's long-term memory consolidation, which he singles out as the true basis of intelligence.

As plausible as it may sound, such an explanation is markedly different from anything your correspondent has experienced. Perhaps that's because he, like so many other computer users, spends far less time online than social critics imagine. According to Nielsen, a media research company, Americans with access to the internet devote around 26 hours a month to online activity—in other words, just 5% of their waking hours. Even then, half that time is taken up with proactive, even creative, activities—social networking, playing games, e-mailing, visiting portals and instant messaging. Pecking at the despised low-hanging fruit found on Google and other search engines accounts for a minuscule 3.5% of the average user's online time.

What seems to be forgotten in the rush to judgment about the internet making us dumber is that the brain's basic architecture is created by genetic programs and biochemical interactions that do their job long before a child starts tapping away at a keyboard. “There is simply no experimental evidence to show that living with new technologies fundamentally changes brain organisation in a way that affects one's ability to focus,” say Christopher Chabris and Daniel Simons, psychologists at Union College, New York, and the University of Illinois, respectively.

The danger, if there is one, is that the easy, on-demand access to reams of information from the internet may delude us into mistaking the data we download for genuine wisdom worth acting upon. The internet is just another reference source, albeit one on steroids that sucks up content so fast that little of it ever gets peer reviewed. Only fools would venture into such a forest with anything less than their eyes wide open and their brains fully engaged. Fortunately, there are fewer fools around than some of the scaremongers like to think.