WORDS FAIL; WORDS FAIL...



To clarify this screed’s rant, and the particularly annoying variety of left-wing “newspeak” that provoked it, it may help to explain what we aren’t ranting about–this time. The Internet is ablaze with lists of politically correct claptrap compiled by pathologically-officious arbiters of allowable speech. These lexical bullies infest faculty lounges across the land, united in their quest to denude American discourse of expressive scope by restricting it, in the name of social justice, to words they deem inoffensive, or, if offensive, offensive to practices, politics, or persons who are White, Christian, to the philosophical Right of Noam Chomsky, or otherwise deplorable. While we abhor these despots manqués, it is not our purpose here to catalog their verbal impositions. Anyone with a little nerve and a righteous abhorrence of gibberish can spot, list, and mock these lingual contaminants. And while ridicule is an entirely appropriate reaction to such taradiddle, we are pleased to note numerous lexicographic freedom fighters have taken up the work. Our business here involves a more refined grievance.

Our more refined grievance.

In this screed, we cavil about more than the Left’s enthusiasm for paralyzing our common tongue–we are particularly peeved by a substratum of usages so woefully misconstrued or wrongly etymologized as to give offense on two counts: First, as components of the balmy lexicon of tone-deaf PC idioms, and second (here’s the important part), by adding a layer of fatuity insofar as they imply meanings contrary to or starkly different from the ones intended. To grasp our point precisely, one need only consider the line famously uttered by Mandy Patinkin in the role of Inigo Montoya from the 1987 romantic comedy “The Princess Bride,” since immortalized as a ubiquitous meme: “You keep using that word, I do not think it means what you think it means.” If you’ve seen the movie, you know the context, and if you haven’t, suffice it that for our purposes, the line speaks for itself—(forgive the irresistible anthropomorphism). To illustrate by example, let’s turn to the not-too-distant past, or what we might call the dawn of modern feminism. Many contemporary feminists lack even a basic familiarity with the movement in that era, recalling it, if at all, with condescension–or consigning it to mothballs as “Second Wave Feminism.” But we recall it vividly, together with the now forsaken term “Fem Lib” (short, of course, for Female Liberation), a term no longer in use, and not currently at issue. Our displeasure is incurred by a different, far more persistent coinage emblematic of those fractious times–although, to be historically exact, as early as the dawn of the 20th century there was similar deviltry afoot.

The slur’s the word!

In 1901, the editors of the Springfield (Massachusetts) Republican, preoccupied with niceties that seem quaint at this remove, fretted that “To call a maiden Mrs. is only a shade worse than to insult a matron with the inferior title Miss. Yet it is not always easy to know the facts…” And in order that no married female be thenceforth addressed as “Miss,” the editors proposed a solution. “The abbreviation Ms is simple, it is easy to write,” they explained, and “for oral use it might be rendered as ‘Mizz,’ which would be a close parallel to the practice long universal in many bucolic regions, where a slurred Mis’ does duty for Miss and Mrs. alike.” Simple enough, one might suppose, but in an era when “progressivism” meant Teddy Roosevelt, the idea went nowhere. Indeed, the editors of the Springfield Republican probably went to their graves lamenting their failure to save married women from the ignominy of being mistaken for single.

There followed a dark age of indifference to such matters, during which women in every walk of life faced the daily risk of hymeneal mislabeling. Fortunately for humankind, during an otherwise lackluster interview on WBAI radio in 1969, feminist Sheila Michaels resurrected “Ms,” recommending it as a means of uniting the entire distaff sex within a single, collectivizing, Mao-jacket-ish sort of honorific (and causing anyone enunciating it to be instantly remindful of a character from Pogo). Michaels credited the idea to a pamphlet she’d been handed by some Marxist group now lost to time, as might the entire idea have been, had not a friend of Gloria Steinem’s been listening who afterwards mentioned Michaels’s suggestion to Steinem, who pronounced it inspired.

In 1969, Gloria Steinem was the most widely known champion of American feminism, largely—if somewhat ironically—because she was good-looking, a quality conspicuous by its absence among the majority of her confederates. Besides making her a favorite talk-show guest, Steinem’s looks sufficed to win her employment as a Playboy Bunny, a stunt she followed with a widely read exposé in Show magazine detailing the veritable hell on earth endured—apparently—by Playboy Bunnies. Her bunnyhood behind her, Steinem was eager to found a magazine devoted to the feminist cause and thought “Ms.” would make a perfect title—smacking of rebellion, liberation from prosaic sexual roles, and the kind of controversy that drives sales at the newsstands.

The 1970s stood in sharp contrast to the days when a stringent lexical conservatorship consigned silly ideas like “Ms” to the waste basket. The burgeoning influence of the radical Left in American media and higher education meant that Steinem’s advocacy of “Ms” became instantly in vogue. If this seems incomprehensible, remind yourself: this was the decade that popularized Disco, earth shoes, mood rings, plaid polyester leisure suits, and Jimmy Carter. Any doubt regarding the continued communist subversion of government and the arts was set to rest as the U.S. Government Printing Office raced to approve “Ms.” on all official government documents, while Marvel Comics announced a new superhero named Ms. Marvel, billing her as the “first feminist superhero.” Gloria Steinem was really the first one, we think, but she probably refused to wear another costume.

So, what’s our quibble with Steinem’s epochal choice of magazine titles? Let’s hear from Australia’s ABC News (not to be confused with its execrable American homograph) where contributor James Valentine sums up our objection, writing: “If you choose Ms as your honorific, others may think you mean more than you do…and it may not be a meaning that applies to you or any way related to why you choose to be a Ms.” In other words, Valentine might as well have written, “You keep using that word…”

Besides its ties to Marxism, which probably bother no one at this point except us, there is the inevitable problem of disambiguation that follows the forced introduction of most nonce-words like a faithful skunk. For example, current polls, showing “Ms” on the skids, indicate a sizable population of women believe the term applies exclusively to divorcees, particularly in the United Kingdom; speaking of which, Debrett’s Peerage and Baronetage disallows “Ms,” insisting that “The ugly-sounding Ms is problematic. Although many women have assumed this bland epithet, it remains incorrect to use when addressing a social letter.” The Queen’s English Society likewise dismisses Ms as “an abbreviation that is not short for anything,” which would be concerning, if true, which it isn’t, really, which begs an additional remonstrance.

Lacking any pedigree in popular usage, the term often confused even those determined to adopt it. People assumed the letters were separately pronounced, as in “Welcome to your interview, Em Ess Smith!” And the absence of formal standardization saw the term rendered in all caps as often as not. Capitalized, of course, the letters have long denoted Multiple Sclerosis, but in either format, as an exasperated (female) editor famously stormed at the height of the term’s popularity, “Ms means manuscript, look it up!” And so it did, and does, and will continue to—or are we unduly sanguine?

Using MS WORD…

Aside from the improbable existence of some editor at some publishing house, whose extraordinary skills at enhancing and correcting writers’ submissions led to her admiring coworkers calling her “Manuscript” Jones, it is impossible to imagine any woman, no matter her sociopolitical convictions, intentionally describing herself as a manuscript, or, for that matter, as somehow associated with an incurable neurological disease linked to double vision, psychiatric problems, loss of physical coordination, and death. More recently, we have Bill Gates’s patriarchally insensitive usurpation of the term in his marketing of MS Word, which may seem relatively inconsequential until you review the 2010 census and note 6,177 Americans surnamed Word, presumably half of whom are liable to being confused with the world’s first and most famous word processor, not to mention numerous iterations of its popular document format (now available as an Office 365 app, whatever that is). And if you don’t think this can spell trouble, consider the plight of the comely but demure young lady bearing this surname, obliged to work at a desk proximal to a bulletin board to which some heedless functionary has affixed a promotional poster emblazoned with: “Thank you for using MS WORD!” Very, very top of mind….

But before we leave “Ms” to its fate, or at least Webster’s Third, we should mention another source of mounting dissatisfaction with the embattled abbreviation, namely, contemporary liberalism. The problem with “Ms,” nowadays, is that people who identify with it are women, and people who apply it, apply it to women. Social Justice, meanwhile, has outgrown such callow paradigms, meaning that societal efforts to confine people to restrictive sexual categories, or to distinguish between such categories, are highly offensive. The Left’s current obsession with gender-fluidity engenders (sorry!) the corollary dogma that one can pretend one’s gender is whatever one prefers while requiring everyone else to behave as though identically deluded. Given this recent advance in progressive social doctrine, any honorific specifying an individual’s sex, no matter how radically chic in its day, is suddenly archaic and microaggressive.

“Sociolinguist” and Atlantic editor Ben Zimmer, for one, insists the lack of gender-neutral terminology in English “has caused a lot of headache over the years,” which we at WOOF confess we hadn’t realized. Fortunately, lexicographer Jane Solomon was more alert. “The need for a gender-neutral prefix seems to be very, very top of mind for people,” the language expert assured TIME magazine. To this end, Solomon is confident the uni-sexual prefix “Mx” will prevail where previous efforts like xe, thon, and zhe, inexplicably foundered.

Afire with the vision of Americans everywhere demanding a usable gender-neutral means of address, constituents of “Fourth Wave” feminism saw “Ms” for what it was–just another gender-specific tool of patriarchal oppression no better than Miss or Mrs. The way forward, they realized, entailed everyone using the prefix Mx, which Sociolinguist Zimmer helpfully instructs readers to pronounce “Mix,” rather than “Em Ex,” an emphasis meant, one assumes, to prevent anyone from accidentally self-identifying as a land-based intercontinental ballistic missile.

Blacklisting “Black!”

In a similar vein, the essayist Dallion Rew (himself African by ethnicity) writes that “Black” is now “becoming more and more disagreeable to people who read and study History!” We maintain the opposite. The Black and certifiably-liberal essayist Kimberly Alexander points out that the problem is not history, but rather “whiney sensitive snowflakes [who] chose ‘black’ as something they can no longer say.” Right! The problem is that Black political perceptions are influenced far more than anyone dares acknowledge by White, “whiney sensitive snowflakes” who are also media mavens, political leaders, and university professors. So we are now beset by a “fourth wave” (more or less) of Black militants protesting the descriptor “Black” as “racist” because a few benighted snowflakes in Birkenstocks miseducated them to believe the term was at some juncture foisted upon them by their White oppressors. Pamela Oliver, a professor of sociology at the University of Wisconsin–Madison, points out that ballooning numbers of graduates “…educated in predominantly-White schools…have been taught that Black is insulting or that the only correct term is African American.”

Calling Blacks “Black,” rather than Negroes (the preferred term throughout the ‘50s and early ‘60s, tarnished beyond retrieval by President Lyndon Johnson’s propensity for pronouncing it “Nigras”) is in fact ascribable entirely to Blacks—and radical Blacks, at that. Advocacy for the term ran the gamut from the erudite young SNCC spokesman, Stokely Carmichael, to the barely comprehensible H. Rap Brown. Malcolm X advocated “Black,” and the Black Panthers insisted on it. (Who, after all, would take militants seriously who styled themselves African American Panthers?) The entire Black Power movement demanded it—yet only a few decades later we find Blacks eschewing it because Whites (who are, after all, the tenants of radical chic) misremember it as an insult.

African Americans from Haiti, and beyond…

Which brings us to “African American,” the usage most often invoked in place of the spuriously-maligned “Black.” Jesse Jackson, in fact, began advocating the term in the 1980s, because it de-emphasized color. Today, however, the “woke” custodians of permitted verbiage apply it as synonymous with “Black,” defeating the logic that drove Jackson to recommend it, while conjuring a host of fresh inconsistencies. The decorum impelling us to call any dark-complected individual “African American” is now rampant in our culture—thus students submit essays littered with absurdities like “2.5 billion African Americans currently populate Africa”—or complaints that Haitian refugees are discriminated against because they are “African American.” But Gary Player, the White golfer who hails from Johannesburg, is African. In fact, numerous African American golf stars have gained fame, but with the notable exception of Tiger Woods, who isn’t really from Africa, they are all White. Someone should look into that.

South African President Nelson Mandela was Black, which may explain why a US News reporter notoriously eulogized him as “a famous African American.” And how does one politely strive to mitigate the outrage expressed by Jamaicans and other citizens of the West Indies who come here to study and find themselves enrolled as African Americans, when, in point of fact, they are neither?

Jesse Jackson exposed!

The panjandrums of political correctness are hastening even now to elasticize the definition of their clumsy surrogate, rationalizing its application to darkly complected peoples regardless of origin. The ramifications are groan-worthy, but we are pleased to note (excuse our lexicographical Schadenfreude) that “African American” is lately beset by baseless aspersions similar to those that previously overwhelmed “Black.” Take the blog maintained by the feminist “Sabriyya,” who identifies as a “Political Science and Public Policy Studies double major… interning at the Hate Crimes Working Group,” and who learnedly declares that “labels, like ‘African-American’…seek to establish…‘racial’ groups as second class citizens…” adding that “’African-American’ implies that black Americans are not full Americans.” It gets worse, because, “this makes more sense once the term is contextualized in the period of legal slavery in the U.S. These labels are the same mechanism that Apartheid leaders were able to strategically use to create divisions within the black population…and prevent their unification in an attempt to overhaul the oppressive Apartheid regime.” Take that, Jesse Jackson, you cracker! Sabriyya, it seems, would welcome a return to the word Black—placing her in agreement with us, however much in consequence of her erroneous assumptions, and whether she likes it or not.

Check the box, America!

And now, about Native Americans. We always check the box on this one—just like Elizabeth Warren, who has every right to claim she’s native American, having been born in Oklahoma; or Ward Churchill, whose real sin, beyond serial plagiarism, faking a combat record, and falsely claiming Creek and Cherokee ancestry, was emulating Gloria Steinem’s hair style, to disastrous effect. Ward is certifiably native American, though–he was born in Urbana, Illinois. In every ascertainable context prior to its engineered mutation, the adjective “native” referred to one’s place of birth, as in native Nebraskan, native Englishman, native Bostonian, and so on. Moreover, the distorted application is so painfully ad hoc, no attempt is ever made to extend it in keeping with semantic consistency. Tell any population of liberals–no matter how dogmatically liberal– that you are “a native New Yorker,” and not one–not a single one– will reply, “Oh, so you’re of Lenape descent?” By every standard of usage and precedent, then, those born in the USA are native Americans—including, of course, American Indians, but we see no reason to allow them to monopolize the term–and in point of fact, they don’t seem to want to.

When last surveyed by the Census Bureau, a majority of American Indians preferred the term Indian to Native American. And don’t look to the activists for support–an outspoken number of them reject “Native American” as an unbidden imposition confected by some hyperactive Ivy-League wasichu. In his essay “I Am An American Indian, Not a Native American!” Russell Means, the Lakota Sioux who founded the radical American Indian Movement (AIM), averred, “I abhor the term ‘Native American’…a generic government term used to describe all the indigenous prisoners of the United States. I prefer the term American Indian because I know its origins.” Right on! WOOF joins Russell Means in denouncing this flagrant offense against his people, not to mention elementary semantics, and encourages all U.S. citizens by birth to “check the box”– and reclaim their native American heritage!

And speaking of chauvinism…

Which reminds us, for no apparent reason, of our second and greater annoyance with feminism. Granted, one cannot expect a generation of radicalized women who sought to advance female equality by burning their bras, exhorting one another to hirsuteness, smoking Virginia Slims, and otherwise mimicking the rabidly bellicose politics of the (then) New Left, to concern themselves with the derivation of nouns and their attendant adjectives; but that said, the wanton misapplication of “chauvinism,” which today is misapplied almost universally, is particularly bizarre; and more aggravating (to us, at least) than generations of females insisting they are manuscripts or guided missiles.

Consult any “descriptive” dictionary currently preferred by the “usage-is-king” mob, and you will see pride of place given to the same definition of “chauvinism” blathered by every miseducated college graduate, feminist androphobe, and Oprah viewer, i.e., “An attitude of superiority toward members of the opposite sex.” The precise point at which chauvinism transmogrified into this current acceptation is lost in the radical murk, but as any dictionary published before 1970 will verify, the word derives from a Napoleonic soldier named Nicolas Chauvin. Reputedly, Chauvin maintained a fanatical loyalty to the Emperor, augmented by a patriotic zeal for all things French. Thus, “chauvinism” has always connoted what the Oxford English Dictionary persists in defining as “an exaggerated patriotism of a bellicose sort; blind enthusiasm for national glory or military ascendancy.”

To be denounced as a chauvinist pig, in other words, is simply to be called a fanatically patriotic pig—which leaves the porcine implications in limbo, given how few famously patriotic pigs are available to draw inferences from. The feminist application isn’t merely tenuous–it is impossible to justify by any means—there is simply no connotative linkage. The Feminist movement might as logically have re-purposed the word “federalist,” or “oboist,” with no less justification.

Ecce homophobe!

Homophobia (only because of its relative prominence) is the cardinal irritant among an array of artificially manufactured “phobias,” each of which truncates handily into the bumper-sticker-sized “phobe.” “Phobe” is a suffix assigned by Leftists to anyone they feel an impulse to debate, but who, on reflection, they deem it safer to merely label. Labeling in this context amounts to a sneaky means by which the labeler can resort to the ad hominem attack, dissemble his sneakiness as righteous indignation, and affect moral superiority while sidestepping eristical humiliation. The rule is simple: an offending sentiment is aired and must be countered with a suitable “phobe.” Obviously, selecting phobes pertinent to whatever viewpoint one wishes to suppress is vital. If one’s adversary says, “wall,” for instance, one may counter with “xenophobe!” whereas a viewpoint critical of, say, Sharia Law, can be negated with “Islamophobe!” Meanwhile, sufferers of actual phobias are afflicted with an unreasoning terror of specific objects, entities, or circumstances. Their symptomatology is free of malice or disdain, yet progressivism distorts this diagnosis to connote “bigoted person driven by whichever specific bigotry we are currently accusing him of.” True, “phobe” is the available opposite of “phile,” but it fails miserably as a true antonym. Miso is wanted, but only functions as a prefix, as in “misogynist,” a usage sometimes employed in Liberalese, meaning, e.g., anyone who thought Christine Blasey Ford was making things up. Similarly, “misosodomist” might serve where homophobe so conspicuously fails—not that we personally know any misosodomists, nor would we tolerate their company!

More problems arise from the fact that the prefix in homosexual does not derive from Latin, but rather from the Greek word homos, meaning “the same.” To be called homophobic, then, is to be outed for harboring a morbid and irrational fear of sameness by someone attempting to accuse you of hating Gay people. Of course, while we’re in the neighborhood, we might as well point out that Gay actually means lighthearted, vivacious, and carefree…but we’re an open-minded bunch here in the WOOF cave, and we gave up on that one a while back.

Alt what?

Alt-Right is an interesting latecomer to the PC lexicon. It bids fair to replace its threadbare predecessor, Neocon, long overdue for retirement. Properly understood, the Alt-Right is an anti-Semitic, totalitarian, collectivist, anti-Israeli, and (may we say) misosodomistic, cohort harboring scorn for races and religions it presumes inferior. Designating any conservative personality or website as “Alt-Right” is a current fad of the Left. But the “Right” in Alt-Right evokes the fascistic European specimen, which resembles strains of conservatism only slightly, and even then, only the European species that Robert Nisbet liked to call “throne and altar.” Aside from a mutual devotion to military preparedness, it bears no resemblance to American conservatism, which is anti-throne, anti-establishment, anti-totalitarian, individualistic, and socially Judeo-Christian. As a catchall for right-wingers who annoy left-wingers, however, the term seems to have retired “Neocon,” which has been misapplied by everyone from John McGowan to the conformistically progressive Rolling Stones (whose political forays always resonate about as compellingly as Mitch McConnell might, if he tried to perform Jumpin’ Jack Flash).

In fact, Neocon derives from “Neoconservative” and means something specific. A Neocon, properly understood, is a recovering liberal who typically renounced the brand during the late ‘70s or early ‘80s, repulsed by the lemming-like pro-disarmament posture that consumed the American Left until the collapse of the Soviet Union consigned the doctrine of accommodation to what Leon Trotsky liked to call the dustbin of history. Many Neocons remained Democrats, inspired by party luminaries like Senator Henry “Scoop” Jackson and the sporadically formidable Daniel Patrick Moynihan. It would be decades before Nancy Pelosi and Harry Reid hunted the Blue Dogs to extinction, permanently ridding the “party of JFK” of anyone suspected of thinking like—well—like JFK. By contorting “Neo” into a synonym for “ultra,” the Left conflated the likes of Norman Podhoretz and Irving Kristol with Pat Buchanan and Phyllis Schlafly. In any case, “Neocon” was so ruthlessly commandeered and so contemptuously applied in the final years of liberalism’s media monopoly that by the dawn of the 21st century, no matter who applied the term to whom, it was functionally an epithet.

Let’s beat up some fascists!

Speaking of Neocons and the Alt-Right, who is more stalwart in opposition to such tyrants than ANTIFA? True, their acronym sounds like an aid mission to West Africa, or a bad trade deal, but as readers are doubtless aware, ANTIFA is a compression of “Anti-Fascist,” a designation so willfully incongruous, it encroaches on Dadaism. Of course, the rank-and-file ANTIFA goon is clueless; he throws urine balloons and feces at Trump supporters and police officers and concusses stragglers with chains or crowbars, insensible of the irony. He, like his witless media supporters, takes the anti-fascist misnomer seriously—a quick lesson in the deficiencies of the American educational system, and one that explains how brick-hurling ANTIFA thugs can tear into a peaceful “anti-racist march” in Berkeley, while shouting “End fascism!”–all without a moment’s introspection.

Fascism, long cherished by liberals as the obvious antithesis of communism (a sophistry Ayn Rand once disintegrated by sneering, “heads collectivism, tails, collectivism”) is a word with a rich history. American communists thunderously denounced fascists in the ‘30s until the Hitler/Stalin pact made fascism okay–until Hitler invaded Russia, making fascism horrible again. You can call political opponents fascists and imagine a philosophical kinship with Picasso, Gandhi, or Woody Guthrie. You don’t have to know what you’re saying. ANTIFA, of course, exemplifies fascism, but the word has morphed into another one of those epithets liberals hurl at anyone whose ideas, if widely circulated, might threaten liberalism–an ideology liberals never hesitate to defend fascistically.

Common Sense Adjective Control?

Gun control is an interesting concept. To many of us it suggests the use of both hands from a sturdy isosceles stance, but to liberals it has always meant the abolition of firearms in civilian hands, except for civilians retained by liberals to protect them. Overturning an amendment vouchsafing Americans the right to keep and bear arms is hard work–and harder still when one approaches the task fortified by one’s principled resolve to know as little as possible about the disgusting things one is trying to get rid of. Long before AR-15s became emblematic of the NRA’s satanic grip on the polity, Americans awoke to the threat of “Saturday Night Specials”–cheap, small pistols said to be ubiquitous on our city streets and prone to killing anyone who happened within their limited range. Intent on halting the slaughter, Congress banned these infernal engines of death in 1968, without pausing to effectively define them. This resulted in the usual slew of unintended consequences. For example, James Bond’s expensively crafted Walther PPK became, by congressional decree, a “Saturday Night Special,” having flunked the arbitrary weight standard by one point. Walther promptly removed itself from the Saturday Night Special business by placing the PPK’s slide and barrel on their heavier PP model’s frame, adding enough weight to un-Saturday-Night it, while creating space for an extra round in the magazine, thus making the gun less deadly by congressional standards. But this was long ago, and but a single battle in liberalism’s endless war on guns. A trail of mangled nouns and adjectives marks its advance.

“Assault Rifles” transpired to be even more dangerous than Saturday Night Specials, and capable of dealing out death in greater doses because they were said to be fully automatic machine guns meant for soldiers, placed by mercenary gun store owners in the hands of unpredictable rednecks, homicidal white supremacists, and sadistic hunters intent on slaughtering the nation’s wildlife en masse. Actually, even Wikipedia admits that an assault rifle is “a selective-fire rifle that uses an intermediate cartridge and a detachable magazine”–“selective-fire” meaning the weapon can be set to produce fully automatic streams of fire. If you want such a weapon, you should consider joining the military. Oblivious of such details, the Left became so frantic to make assault rifles illegal, it never paused to notice they effectively already were–and had been since the passage of the National Firearms Act in 1934. Eventually, at least at the legislative level, liberalism grasped that “assault rifles” were a non-issue. “Assault weapons,” on the other hand, turned out to be everywhere.

To assure our rights shall not be unlimited….

But what on earth are Assault Weapons? Or, more to the point, what aren’t they? In 2004, the Public Safety and Recreational Firearms Use Protection Act of 1994 expired without any discernible effect on shootings or crime in general. The tsunami of mass slayings predicted as a result of the act’s expiration failed to materialize. Never one to take chances, Representative David N. Cicilline (D-RI) and a posse of 123 co-sponsors submitted the Assault Weapons Ban of 2015, which, if adopted, would ban all semi-automatic guns, including handguns and shotguns, and anything scary-looking that attaches to them, which seems academic since there would be nothing left to attach them to. In a virtuoso display of the “common sense” always said to inhere in such efforts, Cicilline explained that his bill was intended “to regulate assault weapons, to ensure that the right to keep and bear arms is not unlimited, and for other purposes.”

Polls from 2018 suggested the public was finally “woke,” or as Wikipedia puts it, “most Americans supported a ban on assault weapons.” We suggest a poll asking “most Americans” what an “assault weapon” is, as opposed to, say, non-assault weapons. The phrase is indefensibly absurd. The Oxford English Dictionary defines a weapon as “A thing designed or used for inflicting bodily harm or physical damage.” In other words, all weapons are assault weapons—not only guns, but also kitchen knives, tasers, pepper spray, rolling pins, bricks, sticks, and vases. Given the ambient nature of this fabricated redundancy, Americans who affirm support for abolishing assault weapons may react negatively to the term, but it doesn’t mean what they think it does.

Weather or not….

But sneakily replacing insupportable phraseology with ridiculously latitudinous substitutes is not unique to gun banners. Consider the weather. As the 1970s drew to a welcome close, scientists, futurologists, and, of course, political hacks abandoned their decade-long insistence that earth was doomed owing to the onrushing second ice age (which followed a decade of insisting it was doomed because of the onrushing population explosion), and agreed that in fact, the opposite was now the case, and earth was doomed because of onrushing planetary warming. Science aside, this was vastly preferable from a political point of view, because warming could be blamed on industry, automobiles, hairspray, air conditioning, and cows–or, in other words, on America’s failure to accept the tenets of centralized regulation and radical environmentalism.

But the validity or invalidity of what Al Gore likes to call “settled science” (except insofar as the phrase itself is unscientific) is not our concern. We are less affrighted by ostensible shifts in climate than shifts in the idioms by which those ostensible shifts are described. Almost everyone noticed the failure of various cataclysmic events and life-threatening transformations to materialize, let alone materialize by their expertly predicted “point(s) of no return.” Add to this a general sense, at least among Americans, that things are getting colder rather than hotter, and the problem becomes one of perceptions. Our complaint relates exclusively to how the climate experts set about managing those perceptions.

More warming = more cold snaps….

Of course, they said, states were experiencing record low temperatures. Of course the Arctic wasn’t melting on schedule and cities remained above water, but this was easily explained. Warming expert Michael Mann took point, declaring that plunging temperatures and record snowfalls were “precisely the sort of extreme winter weather we expect” because, he explained, “As the planet warms, we’ll see more cold snaps.” As expected, then, global warming was raising ocean temperatures off the West Coast, which affected the jet stream, causing a “bulge.” In fact, Jeffrey Dukes, director of Climate Change Research at Purdue University, warned that the jet stream now looked “less like a skull cap on the planet and more like a wavy snake,” which, he assured journalists, induced “blobs of cold air to migrate south.”

The Union of Concerned Scientists (whose name is intended, one assumes, to distinguish them from unconcerned scientists) hastened to correct the public misconception that colder winters, more snow, and a lack of discernible warming suggested global warming wasn’t happening. “Such misinformation,” huffed the UCS, “obscures the work scientists are doing to figure out just how climate change is affecting weather patterns year-round.” One thing the scientists definitely figured out was that global warming caused polar vortexes to go rogue, spreading Arctic cold southward, resulting in the record cold winters of 2013/2014 and 2014/2015. Global warming even brought unprecedented snowfall and plunging temperatures as far southward as Texas, Mississippi, and Alabama.

Loath to leave any detail unattended, scientists also explained average temperatures. Even here, our planetary fever had a hand insofar as weather that seemed perfectly ordinary was really an illusion created when frigid vortexes resulting from global warming collided with warming fronts caused by global warming, cancelling each other out and producing an ironic sense of normality—further evidence of global warming. But settled science here encountered an unanticipated problem: Science. Some in the scientific community (where critics are usually muted by fears of academic persecution) began to voice dissent. It wasn’t that they necessarily rejected global warming—the unifying objection was entirely procedural. Science demands certain proofs of legitimacy, and one of the most basic is “falsifiability,” a simple application of logic first introduced by the Austrian-born philosopher Karl Popper, now deemed a vital component of any testable theory or hypothesis.

Science deniers…

Falsifiability is simply the inbuilt capacity of any proposition, statement, theory, or hypothesis to be proven wrong. The classic illustration is the black swan. If you hypothesize that all swans are white, you have a falsifiable hypothesis, because the discovery of a single black (or any other non-white) swan suffices to disprove it. Scientific theories, to qualify as testable, must include falsifiability. If, on the extreme other hand, you insist that absolutely anything that happens somehow substantiates your hypothesis, your approach lacks a falsifiable component—and that’s called pseudoscience. In their desperation to retain popular support, the climatologists rationalized data contradictory to their forecasts, pretended they’d predicted such data, and cited every measurable climate event (or, as in the case of massive serial hurricanes, every non-event) as further proof of their hypothesis. In doing so, they abandoned falsifiability and transformed themselves into the villains they had sworn to oppose: Science deniers.

But “climate change” is real!

Rather than admit their excesses, climatologists got semantical. The problem, it turned out, was not warming, per se, but Climate Change—and it had been, all along. The obedient media chimed in, and one was made to feel dopily unhip if one uttered the laughable misnomer “global warming,” where any intelligent, scientifically educated person would, of course, know to say, “climate change!” Thus, falsifiability was instantly finessed. Changing climate, after all, is glaringly, almost embarrassingly, falsifiable.

The overcompensation begat efforts to prove “climate” and “weather” distinct categories of inquiry—like, say, gender and sex; but that’s an issue for a different screed. For now, bear in mind: when earnest environmentalists prevail upon you to help fight climate change, they don’t really mean it. Climate change is undeniable, but it is also vital, natural, and desirable. For now, because its very obviousness makes climate change wonderfully falsifiable, it suffices (however awkwardly) as a synonym for imminent global devastation. On the bright side, once climatologists compile unequivocal evidence supporting their original claims, we can all go back to saying “global warming.” Good science takes time.

———————————————

You get the idea—even more than politically correct babble in general, we’ve grown weary of drivel that gains currency despite its blatant lack of semantic credibility—words that, when uttered by cretins affecting sociopolitical urbanity, have so little to do with whatever concept was intended as to conjure the image of Inigo Montoya, intoning, “You keep using that word…I do not think it means what you think it means.”