Eliminating the filibuster for Supreme Court nominees was the natural culmination of a tit-for-tat escalation by both parties. The brinksmanship is all symptomatic of a much larger problem.

Speculation abounds regarding what the Senate’s having gone “nuclear” means for those of us living in the brave new morning after. For every commentator who rues that our justices will now decline in quality, there’s one who explains that this moment actually breaks the fever of our toxic judicial politics.

Given that judges are now primarily selected for jurisprudential correctness (and on the Left for demographic correctness) rather than party loyalty and cronyism, I can’t imagine that nominees will be substantially different. Opportunities for obstruction will continue, too—pushed down to the “blue slip” and other arcane steps—even as control of the Senate remains the most important aspect of the whole endeavor.

Eliminating the filibuster for Supreme Court nominees was the natural culmination of a tit-for-tat escalation by both parties, with partisan disagreements over when it all began. The Gorsuch denouement was retaliation for the Garland blockade, which in turn followed Sen. Harry Reid’s nuking of filibusters for lower-court (and executive-branch) nominees in 2013, which came a decade after Reid used the tactic to block George W. Bush’s nominations (most notably Miguel Estrada). But Democrats justified that unprecedented move on the ground that Bush was an “illegitimate” president, having lost the popular vote and been “selected” by the Supreme Court in Bush v. Gore.

Ending the Filibuster Is Symptomatic of a Larger Problem

At a certain point, it doesn’t really matter who started it. The senatorial brinksmanship is all symptomatic of a much larger problem that began long before Sen. Ted Kennedy smeared Supreme Court nominee Robert Bork: the Supreme Court’s own self-corruption, aiding and abetting Congress and the executive branch in warping federal power. Living constitutionalists and their judicial-restraint handmaidens have politicized the law such that judges quail at enforcing the Constitution’s structural limits and face attacks for not interpreting statutes in a way that favors “the little guy.”

As government has grown, judges now declare what laws Congress can pass, what regulations administrative agencies can promulgate, and the scope of new rights. Because the country went down the wrong jurisprudential track after the New Deal, the judiciary now affects the direction of public policy more than it ever did, and its decisions increasingly turn on the party of the president who nominated the judge or justice. So of course confirmations will be fraught.

This is a new development. Historically, our political parties were rarely so ideologically polarized, so jurists nominated by presidents of different parties rarely held notably different views on legal interpretation.

Congress Used to Take Care for Constitutionality

Under the Founders’ Constitution, the Supreme Court hardly ever had to strike down a law. The records of congressional debate in the eighteenth and nineteenth centuries show Congress discussing whether laws were constitutional far more than whether they were a good idea. Debates focused on whether something was genuinely for the general welfare. “Do we have the power to do this?” was the central question in any aspect of public policy.

My two favorite examples came right around the turn of the twentieth century. President Grover Cleveland vetoed an appropriation of $10,000 for seeds to drought-stricken Texas farmers in 1887 because he could find no constitutional warrant for such action. Then, in the course of a water-rights dispute between Kansas and Colorado, the Supreme Court explained in 1907 that “the proposition that there are legislative powers affecting the nation as a whole although not expressed in the specific grant of powers is in direct conflict with the doctrine that this is a government of enumerated powers.”

Bad judges played their part in changing all that. The idea that the General Welfare Clause allows the government to legislate on any issue so long as its action fits the majority’s conception of what’s “good” emerged in the Progressive Era and was soon judicially codified. After the high court in 1937 began approving expansive legislation of the sort it had previously rejected, no federal law would be struck down on enumerated-powers grounds until 1995. The New Deal Court is the one that politicized the Constitution, and therefore the confirmation process, by laying the foundation for judicial mischief of every stripe, both letting laws stand that should be struck down and striking down laws that should be upheld.

As President Roosevelt wrote to House Ways and Means Committee chairman Robert Doughton in 1935, “I hope your committee will not permit doubts as to constitutionality, however reasonable, to block the suggested legislation.” New Deal architect Rexford Tugwell later explained that “to the extent that these [policies] developed they were tortured interpretations of a document intended to prevent them.” During the 1930s and ’40s, we thus had a perverse expansion of the Commerce Clause that returned to center stage during this decade’s Obamacare litigation.

Courts Are Politicized Because the Left Rejects Rule of Law

In that light, modern confirmation battles are all a logical response to political incentives. When judges act as super-legislators, it’s only natural that senators, the media, and the public scrutinize their ideologies and treat them as super-politicians with lifetime tenure.

Moreover, those who view the Constitution as a living, breathing document that evolves with the times and who demand that judges read laws to maximize “justice”—never mind statutory text—will always be suspicious of originalists and textualists. (I took down this bit of “progsplaining” during the Gorsuch hearings, but truly you should read Georgetown law professor Lawrence Solum’s pithy takedown of originalism myths in his testimony.)

The principal benefit of a written constitution is that it subjects all government officials to rules and principles they can’t unilaterally change. To be sure, judges of good will can read the same words and reach different conclusions—see the dueling originalist opinions of justices Scalia and Stevens in the Second Amendment case District of Columbia v. Heller—but it’s impossible to conceive of a better method for producing consistent results or a credible judiciary. If we value the rule of law, there is no substitute for a good-faith effort to apply the meaning of the Constitution, especially in light of changing circumstances and exigencies.

In a country ruled by law, and not men, the proper response to an unpopular legal decision is to change the law or amend the Constitution. Any other perspective on the judicial role leads to judicial abdication and the loss of those very rights that can only be vindicated through the judicial process. Or to government by black-robed philosopher-kings—and, as Justice Scalia was fond of saying, why would we want to be ruled by nine lawyers?

In remarks at his nomination ceremony on January 31, then-Judge Neil Gorsuch struck a similar tone. “It is the role of judges to apply, not alter, the work of the people’s representatives. A judge who likes every outcome he reaches is very likely a bad judge—stretching for results he prefers rather than those the law demands.”

De-politicizing the judiciary is a laudable goal, but that will happen only when judges go back to judging rather than either ratifying the excesses of the other branches or rewriting laws as they see fit for whatever external reason.