Prior to the invention of writing — which is to say, for more than 90 percent of the time *Homo sapiens* have existed — people learned mainly by interacting with things. Spoken words helped, of course, but to a considerable degree our distant ancestors must have learned how to hunt and fish, and how to make axes and baskets, by watching their elders do it and trying it for themselves. In short, they learned by doing.

Writing and printing changed that. Books made it possible to learn a great deal without physically doing much of anything.

A new class emerged — the intellectuals.

Being an intellectual had more to do with fashioning fresh ideas than with finding fresh facts. Facts were thin on the ground back then anyway, so it was easy to skirt or ignore them while constructing an argument. The wildly popular 18th-century thinker Jean-Jacques Rousseau, whose disciples range from Robespierre and Hitler to the anti-vaccination crusaders currently bringing San Francisco to the brink of a public health crisis, built an entire philosophy (nature good, civilization bad) on almost no facts at all. Karl Marx studiously ignored the improving living standards of working-class Londoners — he visited no factories and interviewed not a single worker — while writing *Das Kapital*, which declared it an “iron law” that the lot of the proletariat must be getting worse. The 20th-century philosopher of science Paul Feyerabend boasted of having lectured on cosmology "without mentioning a single fact."

Eventually it became fashionable in intellectual circles to assert that there was no such thing as a fact, or at least not an objective fact. Instead, many intellectuals maintained, facts depend on the perspective from which they are adduced. Millions were taught as much in schools; many still believe it today.

Reform-minded intellectuals found the low-on-facts, high-on-ideas diet well suited to formulating the socially prescriptive systems that came to be called ideologies. The beauty of being an ideologue was (and is) that the real world with all its imperfections could be criticized by comparing it, not to what had actually happened or is happening, but to one’s utopian visions of future perfection. As perfection exists neither in human society nor anywhere else in the material universe, the ideologues were obliged to settle into postures of sustained indignation. "Blind resentment of things as they were was thereby given principle, reason, and eschatological force, and directed to definite political goals," as the sociologist Daniel Bell observed.

While the intellectuals were busy with all that, the world’s scientists and engineers took a very different path. They judged ideas ("hypotheses") not by their brilliance but by whether they survived experimental tests. Hypotheses that failed such tests were eventually discarded, no matter how wonderful they might have seemed to be. In this, the careers of scientists and engineers resemble those of batters in major-league baseball: Everybody fails most of the time; the great ones fail a little less often.

So it could be said that humanity ran an experiment, over the past couple of centuries, in two competing approaches — intellectualism and ideology on one hand, science and technology on the other. The stark differences between the two were the subject of C. P. Snow’s influential 1959 essay, "The Two Cultures."

Their outcomes proved starkly different, too.

When ideologies were put into action, the results were disastrous. During the twentieth century alone, ideologically inspired regimes — mainly Communism and its reactionary brother, Fascism — murdered more than thirty million of their own citizens, mostly through purges and in the state-sponsored famines that resulted when governments adopted reforms based on dogma rather than fact. That this is not more widely known and appreciated, but instead is so often brushed aside as somehow irrelevant to the argument at hand, demonstrates the extent to which the dead hand of ideology still grips many a mind.

Meanwhile the world’s grubby, error-prone scientists and engineers toiled away. And what did they produce? The greatest increases in knowledge, health, wealth, and happiness in all human history.

Since 1800, when scientific technology really got going, human life expectancy at birth has more than doubled, from 30 years of age to 67 and rising. During the same period, the average annual income per person soared, from around $700 in 1800 to over $10,000 in 2010, while the rate of global economic growth more than tripled. Education boomed: In 1800, the vast majority of people were illiterate; today, four out of every five adults can read and write.

As incomes rose and the cost of technology fell, billions of people gained access to tools originally enjoyed by only a few. Nearly a third of humanity can now get on the internet, and mobile phones (which among other things have proved effective at combating third-world joblessness) are selling at a rate of fifty per second. We are rapidly approaching the day when most of the world’s students will have access to most of the world’s knowledge — a tipping point that may turn out to mark the most important educational advance since printing.

So the experiment has been run, and the results are in. Science and technology win; ideology loses.

Needless to say, this verdict has not yet been taken to heart by all ideologues. Basing one’s opinions on facts is, after all, hard work, and less immediately gratifying than fuming with intellectual fervor. Hence the far left continues to attack free trade and the pharmaceutical industry, no matter how many people’s lives have been improved or saved thereby, while the far right rejects every scientific finding that trespasses on its presuppositions, from biological evolution to global warming.

Ideologues aside, many worthy thinkers fear that “the life of the mind” is being crowded out by the current explosion of scientific information and technological innovation. "We are living in an increasingly post-idea world,” warns Neal Gabler, in a New York Times op-ed essay mourning the loss of an era when “Marx pointed out the relationship between the means of production and our social and political systems [and] Freud taught us to explore our minds.”

But in what sense is this a loss? Freud discovered nothing and cured nobody. Marx was a hypocrite whose theories failed in about as hideously spectacular a way as can be imagined.

What is fading, it seems to me, is not the world of ideas but the celebration of big, pretentious ideas untethered to facts. That world has fallen out of favor because fact-starved ideas, when put into practice, produced indefensible amounts of human suffering, and because we today know a lot more facts than was the case back when a Freud could be ranked with an Einstein.

In a sense, science and technology are nudging humanity toward the old path of learning by interacting with things rather than with abstractions — as one can readily see by, say, putting an iPad in the hands of a child. Science may be new, but scientific experimentation is essentially a refinement of the preliterate practice of interrogating nature directly — of trying things out, getting your hands dirty, and discarding what doesn’t actually work.

A Neanderthal axe-maker might not make sense of a postmodernist lecture, but I doubt that he’d have much trouble getting comfortable with a laboratory lathe.

Timothy Ferris wrote the foreword to *The Big Idea: How Breakthroughs of the Past Shape the Future*, published by the National Geographic Society.