The latest privacy framework introduced on Capitol Hill would offer consumers sweeping control of their data while forcing companies to be more transparent about how they use and secure that information.

The Privacy Bill of Rights Act, introduced Friday by Sen. Ed Markey, D-Mass., also takes aim at a number of related big data issues, including algorithmic bias, biometric information sharing and data breach notifications.

Like most of the privacy bills introduced in recent months, Markey’s legislation would grant the Federal Trade Commission greater authority to write and enforce rules for how companies can collect and use consumer data. But while some previous proposals leave the FTC to chart its own path, this latest bill would create a number of specific policies for commissioners to enforce.

Under the bill, every company that collects consumer data would be required to publish a “short-form notice” disclosing the types of data being gathered, who it’s shared with, how it’s used and how long it’s stored. Notices would be written in “clear, concise, well-organized and understandabl[e]” language and regularly updated to reflect changes in company practices.

Users would need to opt in before companies could collect their data, and they would always have the opportunity to access, change and delete their personal information. Companies would be barred from denying services or changing prices for users who opt out of data collection.

Facebook, Google and other data collectors would be prohibited from sharing anyone’s biometric data without their explicit consent or using data for any service that could potentially lead to discrimination. Companies would also need to publicly outline the steps they’re taking to protect consumer data and inform consumers when their information is compromised.

The bill would also place tight constraints on third-party data brokers, which today buy and sell personal information with little legal oversight. Under the legislation, third parties would only be allowed to share data among themselves when it’s necessary for performing specific services, and companies would be required to audit the security and privacy practices of their third-party collaborators every two years.

The proposal also upholds the principle of data minimization, which holds that companies shouldn’t collect personal data “beyond what is adequate, relevant and necessary” for the product or service they’re providing. For example, a pizza delivery app may need to access your credit card information but not your entire photo library.

Limiting the scope of the data companies can collect will be one of the most challenging topics facing lawmakers as they hammer out a national privacy framework, according to Michelle Richardson, director of the privacy and data project at the Center for Democracy and Technology. While allowing consumers to access, change and delete data would be relatively easy for companies to do, data minimization and use limitations require “systemic change to corporate behavior,” and would likely face more backlash from industry, she told Nextgov.

“There’s still a lot of debate to come about the scope [of those limitations] … but it’s going to be one of the biggest fights and core to whether a federal privacy bill is able to pass at all,” Richardson said. “If you don’t deal with these issues, it’s not worth doing a bill.”

Markey, who recently spearheaded the upper chamber’s efforts to restore net neutrality regulations, has yet to attract any co-sponsors for the bill. However, Richardson said, his proposal and others that came before it are giving lawmakers more ground to stand on when discussing abstract concepts like privacy.

“The more [proposals] that come out, I think the more people will have to be making some concrete decisions,” she said. “I think you’re seeing more corporate representatives acknowledge you’re going to have to get into data use issues. These are all good signs about the debate picking up and us moving in a legislative direction.”