It has been said (in the Gale Encyclopedia of U.S. History) of colonial America that:

In colonial America, feudalism began as an extension of the English manorial system. In addition to the Puritans and the Protestants, who came from England to the New World seeking religious freedom, some early colonists came to expand their estates by establishing feudal domains. While the Puritans and the Protestants established colonies in New England, the Anglicans established the proprietary colonies of Maryland, the Carolinas, and Delaware, and the Dutch brought similar systems to New Amsterdam (later New York) and New Jersey. Similar systems came to the Americas in the seigneurial system of New France (Canada) and the encomienda system of the Spanish colonies of Latin America.

“Feudalism” itself (so far as Western history is concerned) has been defined as:

The medieval model of government predating the birth of the modern nation-state. Feudal society is a military hierarchy in which a ruler or lord offers mounted fighters a fief (medieval beneficium), a unit of land to control in exchange for military service. The individual who accepted this land became a vassal, and the man who granted the land became known as his liege or his lord. The deal was often sealed by swearing oaths on the Bible or on the relics of saints. Often this military service amounted to forty days’ service each year in times of peace or indefinite service in times of war, but the actual terms of service and duties varied considerably on a case-by-case basis. Factors such as the quality of land, the skill of the fighter, local custom, and the financial status of the liege lord always played a part. For instance, in the late medieval period, this military service was often abandoned in preference for cash payment, or agreement to provide a certain number of men-at-arms or mounted knights for the lord’s use.

And as to the so-called American “Revolution,” some have argued that:

In substance the American Revolution was no more than a group of Englishmen fighting on distant shores for traditionally English political rights against a government that had sought to exploit and tyrannize. According to this argument, it was a war of restitution and liberation, not revolution; the outcome, one set of political governors replacing another.

Relative to American history, one might argue that whatever “progress” has been made in moving away from feudalism in this country, for decades now we have been moving in a feudalistic direction—to the point that we are now a full-blown feudalistic society once again. The irony here is that although many of the “vassals” in today’s United States may sense their lowly state, the “lords” who control our society have, through their promotion of “free market” ideology and the myth of “meritocracy,” gained such firm control over the minds of their vassals that the latter are unable fully to recognize, and thereby articulate, their subservient status.

What seems to have “aided and abetted” this development, however, was “religion” in the form of Christianity. The early years of the “Jesus Movement” were characterized by considerable diversity (discussed in Bart Ehrman’s Lost Christianities: The Battles for Scripture and the Faiths We Never Knew). “Christianity” arose, however, because Emperor Constantine I (“the Great”?!) [272 – 337 CE] needed a “religion” that would be politically useful to him, and began to favor the Christian “religion.” Then, later in the fourth century, Emperor Theodosius I (also “the Great”?!), via his Edict of Thessalonica (380 CE), made Nicene Christianity the official “religion” of the empire.

Two features of especial importance characterized Nicene Christianity:

A hierarchical structure, consisting of many “layers,” which—partially through frightening congregants with threats of “eternal damnation,” partially (especially after 380) through the authority granted to its “officials”—enabled “religious” leaders to exert (as spokesmen for the Emperor!) control over congregants.

An orientation to belief (orthodoxy, in fact), rather than behavior.

Jesus, as portrayed in the canonical gospels (Matthew, Mark, Luke, and John), would have approved of neither of these features—which is why I refer to “Christianity” as a parasitic development relative to the early Jesus Movement—as, in fact, a pseudo-“religion”!

When the Protestant Reformation occurred centuries later (with several denominations emerging, which, in turn, gave rise to still more denominations), the general tendency was toward a “looser” structure (with different denominations having varying degrees of structure, of course), but an orientation to belief (over behavior) was continued—with each denomination developing its own particular creed/dogma (the “Quakers” being one notable exception).

In addition, the Protestant denominations that emerged tended to emphasize one’s this-worldly activities of an economic nature as constituting preparation for an other-worldly existence—which fact was given notable attention by Max Weber in his The Protestant Ethic and the Spirit of Capitalism (1905 – 1906), and Richard H. Tawney in Religion and the Rise of Capitalism (1926). There is a good basis for arguing, then, that:

The “ethic” promulgated by certain of the early Protestant leaders (John Calvin, and to a lesser degree Martin Luther) promoted (if but inadvertently) such values as individualism, greed, materialism, and selfishness, thereby conducing to the development of capitalism.

As capitalism developed, because it gave the above-listed values “success value,” it helped intensify the development of those values in capitalistic societies.

Given the values (or lack of such!) associated with capitalism, along with the fact that the household is our basic societal unit, it is not surprising that our society has become increasingly inegalitarian. Some have argued, however, that the development and growth of the internet—because it has given a voice to all those who have computers, and have purchased internet service—has been a “leveling” factor in our society—an argument that does have some measure of merit, I will admit (but not pursue here).

Among the important points made by Astra Taylor, in her recent The People’s Platform: Taking Back Power and Culture in the Digital Age (2014), though, are the following:

After posing a series of questions, such as “is utopia on the horizon or dystopia around the bend?”, she states (p. 6): “These questions are important, but the way they are framed tends to make technology too central, granting an agency to tools while sidestepping the thorny issue of the larger social structures in which we and our technologies are embedded.” (emphasis added)

The (p. 6) “current obsession with the neurological repercussions of technology—what the Internet is doing to our brains, . . . whether Google is making us stupid . . . . This [sort of] focus ignores the business imperatives that accelerate media consumption and the market forces that encourage compulsive online engagement.” (emphasis added)

Social media and memes (p. 7) “will remake reality—for better or for worse.” However, in Ms. Taylor’s view, “there is as much continuity as change in our new world, for good and for ill.”

Many (p. 7) of the “problems that plagued our media system before the Internet was widely adopted have carried over into the digital domain—consolidation, centralization, and commercialization—and will continue to shape it.”

(Another extremely important subject addressed in her book is the loss of privacy, but I give that matter no attention here because of its lack of major relevance to the present essay.)

The economic—and political—power that accompanies the consolidation and centralization mentioned by Ms. Taylor has, in recent decades, been associated especially with the fossil fuel energy lobby, the Big Pharma lobby, and AIPAC. However, in recent years Big Internet Media has been emerging as a major player on the lobbying scene. Ms. Taylor points out (p. 156) that “Google is now one of the top ten spenders in Washington.”

In her “Conclusion” chapter (pp. 214 – 232) Taylor makes a number of recommendations, such as:

To (p. 216) “escape the cycle of churnalism [a term introduced by Nick Davies (p. 89)] and expendable content in favor of sustainable culture, we need to develop supports that allow for the prolonged immersion and engagement artistic and journalistic endeavors often require, nurturing projects that are timeless rather than timely.”

The (p. 218) “shift to sustainable culture is possible, but implementing the necessary changes cannot fall to individuals and the marketplace alone. The solutions we need require collective, political action.”

I have two problems with her various recommendations, however:

They are naïve from a political standpoint. That is, our society has become so inegalitarian—and is becoming ever more so—with wealthy individuals, corporations, and other organizations spending millions on “buying” politicians, that it is virtually certain that none of her suggestions will ever be implemented. If wealthy individuals and organizations had an orientation to the general welfare, their position of dominance in our society might at least be tolerable. But few individuals/organizations in that category “give a hoot” about the public interest; as one consequence, her proposals have as much chance of being implemented as a “snowball in Hell” has of surviving!

Like so many in our society, Ms. Taylor assumes, implicitly, that tomorrow will be much like today, the day after tomorrow likewise, etc. In making this (tacit) assumption, she fails to recognize—and address—one of the most important threats facing us humans at present, that of global warming (or “climate change,” as some put it). True, she is aware (e.g., pp. 179 – 181) of some of the environmental implications of the new technology, but her grasp of those implications is rather limited.

For example, she shows no awareness of the fact that:

One of the most frightening aspects of global warming, which few people realize, is that there’s a time delay for the consequences of our actions to show up. The really catastrophic effects won’t become obvious until it is too late to reverse them. The news media have been completely remiss in explaining this to the public. There are two key points about the science which everyone should know: lags, which delay and disguise the effects of our greenhouse gas emissions, and feedbacks, which magnify the effects when they do occur. Put these two effects together, and the science is clear that a “wait and see” approach to climate change is an invitation to disaster. It takes very roughly forty years from the time we increase CO2 levels for most of the warming to occur in response to that extra CO2.

Now if today’s weather conditions have their origins in what we humans did around 1974 (i.e., the 40-year lag referred to above), the question that I ask is: What will the weather situation be in, say, 2050?! One scientist who has provided an answer to this question is John Davies, who, in an article posted last year, stated in his very first line: “The world is probably at the start of a runaway Greenhouse Event which will end most human life on Earth before 2040.”

Professor Emeritus Guy McPherson, in commenting on that posting, has said: “He [Davies] considers only atmospheric carbon dioxide concentration, not the many self-reinforcing feedback loops described below” [in Guy’s posting]. Indeed, in an earlier posting Prof. McPherson had said: “A decade ago, as I was editing a book on climate change, I realized we had triggered events likely to cause human extinction by 2030.”

These statements by Davies and McPherson are sobering, to say the least! What they suggest, however, is that:

It may now be too late to do anything about global warming.

If geo-engineering measures are introduced,

This may occur too late to be effective.

There is the possibility that if such measures are introduced, they will exacerbate, rather than alleviate, the problem. (Recall that Al Gore has referred to such measures as “insane”!)

The above is not calculated to give the reader any degree of comfort, but that’s not my fault! We have known about global warming as a potential threat since the late 1930s, thanks to Guy Callendar, but our “leaders” (a misnomer if ever there was one!) have ignored the problem over the years—and it is likely now too late to “correct” this problem! Given this, a good investment today (were it legal!) would be “life-termination” clinics à la Soylent Green!

In conclusion (and to return to my title), not only is our society becoming (once again!) a feudal society, but that fact bodes ill not only for our future (as USans—i.e., citizens of the United States) but also for the future of the species of which we are members! Had Ms. Taylor recognized the fact that her book serves no useful purpose, it’s possible that she wouldn’t have written it. The same goes for most books/articles written these days!