ANALYSIS/OPINION:

Winston Churchill once noted, “If you’re not a liberal at 20, you have no heart, and if you’re not a conservative at 40, you have no head.”

That might explain — at least in part — why the term “liberal” has often been shunned. As the American population grows older, it is becoming more conservative.

But other factors also help explain the negative connotation awarded the term:

• The extension of individual rights to more groups perceived by the public to be outside the pale of acceptability (criminals, for example) has been equated with liberalism.

• The perception of liberals as spenders and taxers has been popularized, and as higher education has reached more and more Americans (not just the affluent), young Americans have become geared less toward righting the nation’s and the world’s wrongs and more toward making it big for themselves.

Moreover, during the era of the early 20th century when social reform was in vogue, especially under Presidents Theodore Roosevelt and Woodrow Wilson, advocates used the term “progressive” to describe their efforts because they recognized that “liberalism” had a far different history and meaning.

Indeed, most presidents in the 19th century were liberals because it was defined as a philosophy that would now be deemed conservative. In other words, a liberal stressed the individual’s freedom of activity in society and marketplace and relegated government to modest police functions.

Under such a definition, President Thomas Jefferson was a liberal, and so were the Democratic Party presidents who followed him before the Civil War. They took issue with the Federalist Party of George Washington, John Adams and Alexander Hamilton, which favored an activist federal government. They anguished over the federal government’s interference in the rights of states and individuals. And although they sometimes supported businessmen in building a nation independent of foreign sources, or veterans who had served their country, such interference was considered a justifiable exception.

After the Civil War, liberalism was synonymous with laissez-faire politics (letting matters take their own course without government interference) and prevailed in both the Republican and Democratic parties. Influenced by the English philosopher Herbert Spencer’s Social Darwinism, which argued that human society witnessed a survival of the fittest, American politics permitted little intervention in the economic world, letting individuals and businessmen do their own thing. State and federal courts largely supported property rather than civil rights on the Social Darwinian grounds that what the fittest had amassed should not be threatened by criminals, unsuccessful entrepreneurs or government. Even the income tax would be declared null and void because no constitutional authority could be found for it.

What eventually became the undergirding for liberalism’s conversion to reform was the application of Social Darwinist thinking to the world of academe. Instead of academic fields being viewed as static, taught almost as they had been since the Middle Ages, Darwinian methodology gave rise to the theory that fields of study underwent a kind of survival of the fittest, with the only constant being change. No academic fields underwent more change as a result of the Darwinian application than sociology and law.

But sociology — the study of society — was a new and elusive field. Other social sciences such as history had more prestige, and reformers who thought the pathway to applying their views was sociology were disappointed. In fact, the subject wasn’t even taught at an American university until 1890 — and at a no-name institution, the University of Kansas. Harvard didn’t establish a sociology department until 1930, the University of California at Berkeley until 1950.

The field of law, however, became the fulcrum for liberalism because it was applied in courts and was largely unamenable to influence by the public. Justice Oliver Wendell Holmes Jr. would illustrate the legal field’s reformist view when he noted, “The life of the law has not been logic: it has been experience.” In Lochner v. New York (1905), involving a New York state law limiting bakery employees to not more than 10-hour workdays, Holmes dissented from the majority opinion, which had declared the law a violation of contractual rights. He argued that “a reasonable man might think it a proper measure on the score of health.”

And this thinking eventually became the majority opinion in court cases.

• Thomas V. DiBacco is professor emeritus at American University.


Copyright © 2020 The Washington Times, LLC.