The ACLU, along with our client Communications Workers of America and other civil rights groups, announced a historic settlement agreement with Facebook that will result in major changes to Facebook’s advertising platform. Advertisers will no longer be able to exclude users from learning about opportunities for housing, employment, or credit based on gender, age, or other protected characteristics.

This policy change follows years of work by civil rights advocates — including a legal challenge from the ACLU, the Communications Workers of America, and the civil rights law firm Outten & Golden LLP. In September, we collectively filed charges with the Equal Employment Opportunity Commission on behalf of CWA and individual job seekers against Facebook and a number of companies that targeted certain ads for jobs to younger male Facebook users. These charges joined other litigation asserting race discrimination in job, housing, and credit ads and age discrimination in job ads.

Most Facebook users were likely not even aware that this type of exclusionary ad targeting was happening. Some 30 years into the digitization of our daily lives, we’re still coming to grips with the fact that the vast trove of data we hand over with each and every “like,” search, post, or click — often without our knowledge or consent — will be used to target advertisements to us.

This kind of data mining is ubiquitous on Facebook, which attracts advertisers by touting its targeting tool’s power to show users only the ads Facebook or advertisers think they’d be interested in, based on what their individual data reveals about them. But there’s a discriminatory flip side to this practice. Ad-targeting platforms can be used to exclude users on the basis of race, gender, or age, as well as on interests or groups that can serve as proxies for those categories (think “soccer moms” or “Kwanzaa celebrators”).

As more people turn to the internet to find jobs, apartments, and loans, there is a real risk that ad targeting will replicate and even exacerbate existing racial and gender biases in society. Imagine if an employer chooses to display ads for engineering jobs only to men — not only will users who aren’t identified as men never see those ads, they’ll also never know what they missed. After all, we seldom have a way to identify the ads we’re not seeing online. That this discrimination is invisible to the excluded user makes it all the more difficult to stop.

Whether you call it weblining, algorithmic discrimination, or automated inequality, it’s now clear that the rise of big data and the highly personalized marketing it enables has led to these new forms of discrimination. This targeting undermines longstanding civil rights laws — including Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, the Fair Housing Act, and the Equal Credit Opportunity Act — which prohibit discrimination on the basis of protected characteristics, such as race, gender, and age, in advertising housing, employment, and credit opportunities. It was only after the passage of Title VII in 1964 that job ads stopped specifying whether employers were seeking male or female applicants. It’s imperative that online platforms act to stop these archaic forms of discrimination from taking on new life in the 21st century.

In the first-of-its-kind settlement announced today, Facebook has agreed to create a separate place on its platform for advertisers to create ads for jobs, housing, and credit. Within the separate space, Facebook will eliminate age- and gender-based targeting as well as options for targeting associated with protected characteristics or groups. Targeting based on ZIP code or a geographic area that is less than a 15-mile radius will not be allowed. And Facebook will stop considering users’ age, gender, ZIP code, or membership in Facebook “groups” when creating “Lookalike” Audiences for advertisers.

Facebook will also require advertisers for employment, housing, and credit to certify compliance with anti-discrimination laws, and it will institute a system of automated and human review to ensure that such ads are properly identified and channeled into the separate flow. The agreement also includes a three-year monitoring period, during which we’ll be watching Facebook’s progress closely to ensure that it implements these changes fully.

The ACLU and partner civil rights groups have been advocating for changes like these for close to three years, and Facebook had already agreed to remove some targeting options that could serve as a proxy for race after investigative journalism exposed the practice. Per today’s settlement, Facebook will also remove targeting options based on gender, age, and other protected characteristics while committing itself to ensuring that advertisers using its targeting tools comply with the law.

These are important steps toward addressing the problem of weblining, and we expect other platforms to follow suit.

But there’s still much more to do. Detecting discrimination online requires robust independent auditing, including the kind of investigative journalism that exposed some of these practices in the first place. Yet researchers and journalists who use common testing methods to uncover online discrimination may face liability for violating website terms of service.

Facebook has allowed some of the parties to the settlement to run tests on its ad platform to verify that Facebook complies with the agreement. But this kind of testing — along with the monitoring of published ads that journalists and researchers have done — must be able to take place unimpeded across all platforms, just as audit testing has long been used in the offline world to enforce civil rights laws.

Because Facebook is such a dominant player in online advertising, today’s settlement marks a significant step toward ensuring that we don’t lose our civil rights when we go online to find a house, job, or loan. But we’ll keep working to ensure that those rights remain intact no matter where we click.