As U.S. politics becomes less democratic, we need primaries even more.

The start of a presidential primary season occasions a lot of sanctimonious blather about the glories of our democratic system, but one aspect of the nominating process is actually underappreciated. In an era when two pillars of government are visibly thwarting democracy—defined literally as rule by majority or plurality—the manner in which political parties choose nominees is, refreshingly, growing more democratic.

The first pillar is the U.S. Senate, which is of course unrepresentative by design. Until the 1913 ratification of the Seventeenth Amendment, senators were chosen by state legislatures. Even today, each state gets two senators, giving small states power disproportionate to their populations; political scientists call this “small-state bias.” If the two-senators-per-state allocation weren’t spelled out in the Constitution, the Supreme Court would rule it unconstitutional under the “one person, one vote” standard imposed since the 1960s on state legislatures (upper and lower chambers both) and the House of Representatives. The Senate’s small-state bias these days translates to “one Republican, many votes,” because states with small (i.e., largely rural) populations tend to vote Republican.

The Senate has also become less democratic in recent years through growing use of the filibuster. Between 1917 and 1970, the number of cloture motions filed (a proxy for the number of filibusters) never exceeded seven in any Congress; typically, it was fewer than five. But, starting in the ’70s, various procedural changes and a gradual loss of inhibition made it increasingly easy to block legislation by filibustering. By the last Congress, the number of cloture motions filed had risen to three digits. Since you need a 60-vote supermajority to break a filibuster, 41 senators representing as little as 11 percent of the combined U.S. population can block any bill.

The electoral college, unlike the Senate, hasn’t become less hospitable to democratic principles; rather, it has remained as glaringly inhospitable as it ever was. Like the Senate, the electoral college is unrepresentative, though in more complicated ways. It has a small-state bias because the number of electors each state gets equals the number of its representatives (which is proportionate to its population) plus the number of its senators (which is not). But it also has a big-state bias because, in every state except Nebraska and Maine, electors are awarded on a “winner-take-all” basis. This translates into a disadvantage for Goldilocks states—neither small nor big—like Arizona and Indiana.

The electoral college has no consistent partisan bias. In 2008, it slightly favored Democrats; in 2000, it slightly favored Republicans. But, in 2000, you may recall, that slight bias enabled George W. Bush to become president even though he lost the popular vote. Some would say it was only the third presidential election in U.S. history in which the popular will was thwarted (the other two were 1876 and 1888). I would say that was three times too many. And, because the third instance came at a stage in the country’s political evolution when suffrage had been extended to a lot more people—women, young people aged 18 to 20, and Native Americans and African Americans theoretically able to vote but previously blocked from doing so—the flouting of the popular vote seemed (to me, anyway) that much more jarring.