Not all bills on the table take an omnibus approach. Some appear to be highly specific swipes at Facebook. For example, a social media privacy bill introduced by Senators Amy Klobuchar and John Kennedy does not add very much to consumer privacy, but each of its provisions — like one that forbids a change to a product that “overrides the privacy preferences of a user” — seems to be a reference to something Facebook has done in the past. Senators Mark Warner and Deb Fischer have introduced a bill circumscribing experimentation on users without their consent. It might seem shocking that any company would do such a thing, but, in fact, Facebook tinkered with its News Feed in 2014 to test whether it could alter its users’ emotions. (The bill also bars designing sites targeted at children under the age of 13 “with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user” — a provision aimed at YouTube and its effect on children.)

Where the Warner/Fischer bill looks to alleviate the harmful effects of data collection on consumers, Senator Josh Hawley’s Do Not Track Act seeks to stop the problem much closer to the source, by creating a Do Not Track system administered by the Federal Trade Commission. Commercial websites would be required by law not to harvest unnecessary data from consumers who have Do Not Track turned on.

A similar idea appeared in a more comprehensive draft bill circulated last year by Senator Ron Wyden, but Mr. Wyden has yet to introduce that bill this session. Instead, like Mr. Warner, he seems to have turned his attention to downstream effects — for the time being, at least. This year, he is sponsoring a bill for algorithmic accountability, requiring the largest tech companies to test their artificial intelligence systems for biases, such as racial discrimination, and to fix those biases that are found.

A grand bargain privacy bill is said to be in the works, with a handful of lawmakers from both parties haggling privately over the details. Forward-thinking legislation — and the public hearings that would inform its passage — are urgently needed. Americans deserve a robust discussion of what privacy rights they are entitled to and strong privacy laws to protect them.

Congress’s earliest attempts to regulate computing in the 1980s and 1990s were embarrassing. The Congressional Record shows that the Computer Fraud and Abuse Act of 1984, for instance, was prompted by a fantastical Hollywood film about a boy hacker. The Communications Decency Act of 1996 — many sections of which were deemed unconstitutional by the Supreme Court in the following year — had its origins in a moral panic about internet pornography touched off by questionable research. All this lent support to the received wisdom that the tech industry is best left to its own devices without the interference of a clueless legislature. More recent attempts, like the abortive Stop Online Piracy Act, an overbroad piece of copyright enforcement legislation that was killed in 2012 after furious backlash from internet users, have not instilled much confidence in Capitol Hill’s understanding of technology. But encouragingly, many of the privacy bills introduced this session show a sophisticated understanding of the market for personal information, the nation’s woefully inadequate cybersecurity and the many dangers posed by a sector of the economy that has proved itself incapable of self-regulation. Legislators have stepped up their game.

A single bill is of course not the end of government’s responsibilities to its citizens. Any regulation must evolve alongside technology to safeguard fundamental freedoms. But a strong law would be a welcome start. The California privacy law will go into effect in less than seven months. Congress should seize the moment and the public momentum to enshrine digital privacy rights into federal law.
