The ACLU filed comments today with the FTC urging it and the Consumer Financial Protection Bureau to investigate whether big data is being used in online marketing in ways that are racially discriminatory. If companies are in fact engaging in this kind of racial discrimination, we urge the agencies to take enforcement action against them.

In the pre-digital era, advertisers could only target particular kinds of consumers in fairly general terms. Ads placed in the New York Times, for example, would reach a different audience than ads placed in the Amsterdam News, New York’s African-American newspaper. But this form of market segmentation was limited—advertisers would have no way of keeping News readers from seeing, and responding to, the Times ad.

The situation is very different in the era of big data. Behavioral targeting allows advertisers to decide which ad to show a specific person based on the data known about that person. So, for example, advertisers may opt to show users different prices for products based on data functioning in the background, such that Mac users, or those who live in more affluent zip codes, or those who live farther from a competitor’s store see higher prices. Although many of us find this more precise targeting troubling, and the collection and aggregation of the personal data that enables it certainly raises privacy concerns, the consensus view is that this kind of price discrimination isn’t illegal.
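To make the mechanism concrete, here is a minimal sketch of what behavioral price targeting might look like in code. Everything here is invented for illustration: the attribute names, the markups, and the rules are hypothetical, not drawn from any real advertiser's system.

```python
# Hypothetical sketch of behavioral price targeting, for illustration only.
# Attribute names and markup rules are invented, not from any real system.

def targeted_price(base_price, user):
    """Return the price shown to a user, based on background data."""
    price = base_price
    if user.get("os") == "mac":                  # device type as an affluence proxy
        price *= 1.10
    if user.get("zip_affluence") == "high":      # more affluent zip code
        price *= 1.05
    if user.get("miles_to_competitor", 0) > 20:  # far from a rival's store
        price *= 1.08
    return round(price, 2)

# Two users see different prices for the same product:
print(targeted_price(100.00, {"os": "mac", "zip_affluence": "high"}))     # 115.5
print(targeted_price(100.00, {"os": "windows", "miles_to_competitor": 5}))  # 100.0
```

The point of the sketch is that the user never sees the rule: the price simply differs depending on data operating in the background.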

Except that price discrimination can also be race discrimination, if it’s based on data about race or factors closely linked to race (it can also be sex discrimination, or disability discrimination, or some other form of discrimination, depending on which data an algorithm relies upon). So, if an advertiser were to display and charge one price to Asian users and another, higher price to Latino users, that would be straightforward racial discrimination.
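The "factors closely linked to race" point can be illustrated with a toy example. The pricing rule below never mentions race, yet because its input (an invented zip-code grouping) correlates with race in the invented data, the outcomes differ by race anyway. All names and numbers are hypothetical.

```python
# Illustrative only: a facially race-neutral pricing rule can still produce
# racially disparate outcomes when its input correlates with race.
# All data below is invented for this example.

def price_by_zip(zip_group):
    # The rule keys only on zip-code group -- race never appears.
    return 120.00 if zip_group == "A" else 100.00

# Hypothetical users: in this invented data, zip group "A" happens to be
# composed disproportionately of one racial group.
users = [
    {"race": "latino", "zip_group": "A"},
    {"race": "latino", "zip_group": "A"},
    {"race": "asian",  "zip_group": "B"},
    {"race": "asian",  "zip_group": "B"},
]

def avg_price(group):
    prices = [price_by_zip(u["zip_group"]) for u in users if u["race"] == group]
    return sum(prices) / len(prices)

print(avg_price("latino"))  # 120.0
print(avg_price("asian"))   # 100.0
```

This is the proxy problem in miniature: an algorithm can discriminate by race without ever receiving race as an input.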

And, at least in some areas, existing anti-discrimination law makes clear that such practices are illegal. The Equal Credit Opportunity Act (ECOA) prohibits discrimination in any aspect of a credit transaction by any entity that extends credit. If lenders are intentionally using behavioral targeting to advertise more expensive credit products to people of color, or if they are marketing credit products in ways that cause borrowers of color to end up receiving credit on less favorable terms than equally creditworthy white borrowers, those practices violate the ECOA.

This is not a far-fetched idea: as the Wall Street Journal revealed more than four years ago, big data is already being harnessed to decide which credit card offer to show particular users, and it could just as easily be used to decide which auto loan or mortgage product to display. We believe that the online marketplace has incredible potential to render obsolete the discrimination that has all too often infected lending and consumer transactions in this country. But the unregulated use of big data could spoil that potential by transposing offline biases into the algorithms that shape our digital experiences. It’s our hope that the FTC and CFPB will take decisive action to preserve a bright future for all consumers on the web.