Illustration: Jim Cooke

The next time you go to buy toilet paper online, an algorithm may decide to charge you $5 more than your neighbor. You’d probably never know. But even if you did, there’s no way for you to find out why.

For at least a half-decade, researchers have documented and studied the use of so-called “surveillance scoring,” the shadowy but widely adopted practice of using computer algorithms that, in commerce, result in customers automatically paying different prices for the same product. The term also encompasses tactics used by employers and landlords to deny applicants jobs and housing, respectively, based on suggestions an algorithm spits out. Now experts allege that much of this surveillance scoring is illegal, and they’re asking the Federal Trade Commission (FTC) to investigate.

In a 38-page petition filed Tuesday, the Consumer Education Foundation, a California nonprofit with close ties to the group Consumer Watchdog, asked the FTC to explore whether the use of surveillance scores constitutes “unfair or deceptive practices” under the Federal Trade Commission Act.


Its case, built on years of research and documentation by academics, journalists, and other consumer advocacy groups, is a compelling one.

The group’s petition submits that Americans aghast at reports of China’s use of “social credit scores” to determine a citizen’s “trustworthiness” are mostly unaware that many U.S. businesses already use strikingly similar creations: pernicious systems that are even less perceptible than China’s monstrosity, thanks to U.S. laws that protect the inner workings of commercial algorithms as “trade secrets.”

“The fact that American corporations are mimicking the actions of an authoritarian government to score and treat consumers differently is disturbing,” Laura Antonini, policy director at #REPRESENT, a Consumer Education Foundation project, told Gizmodo. U.S. companies, she argued, should not be given license to hide behind the “sham argument” that algorithmically derived consumer scores are trade secrets to “evade accountability.”


A box of ballpoint pens: $9.69 when Walmart had access to the customer’s personal data; $4.15 when it didn’t.

The petition, which identifies dozens of major U.S. businesses known to use surveillance scores, as well as 11 U.S.-based firms known to provide them, describes how as many as 121 analytics companies across the country “categorize, grade, or assign a numerical value to a consumer based on the consumer’s estimated predicted behavior.” These scores may determine which customers are “treated poorly”—refused the opportunity to return items, for example—and which receive “preferential treatment.”


Customer value scores, such as those used by major retailers like Walmart, enable retailers to render “instantaneous, automated judgments about a consumer that may result in consumers paying different prices for the same product based on how much profit the algorithm decides a particular consumer will produce,” according to the petition.

Just as retailers conceal their use of customer value scores from consumers, the analytics firms generating these scores keep their inner processes secret from the businesses that contract them. The origins of their data are also secret. That means even a customer who somehow learned a bad score was based on inaccurate or outdated information would have no way to appeal it.


On the matter of unreliable data, the petition points to a 2013 FTC study that found roughly five percent of consumers had errors on their credit reports—documents generated under a far more transparent system—that could lead to their “paying more for products such as auto loans and insurance.” Howard Shelanski, then head of the FTC’s Bureau of Economics, called the first-of-its-kind study “eye-opening,” saying it made clear that “consumers should check their credit reports regularly.” Otherwise, “they are potentially putting their pocketbooks at risk.”

Whereas flaws in credit scores can be remedied through a tightly regulated process available to all consumers, consumers have no way to verify that companies aren’t charging them more money—or denying them access to a job or housing opportunity—based purely on erroneous data.


The petition cites a 2014 study by researchers at Northeastern University examining the pricing practices of online stores. The study found that consumers faced higher prices on a number of sites than those offered to an automated browser with no personal history.

The same researchers subsequently released a Chrome extension that lets users search across five online stores to learn whether they are being charged a different price based on their personal data. The Consumer Education Foundation writes that it recently used the tool to conduct its own research. In several instances, the browser with no links to a real person was offered products at cheaper prices than its researchers were.


The difference in price for a toilet-paper-roll holder on Walmart’s website was more than $10. A box of ballpoint pens: $9.69 when Walmart had access to the customer’s personal data; $4.15 when it didn’t. Walmart did not respond to a request for comment.

The petition also cites price differences between items on Home Depot’s website, such as a bucket of paint that petitioners say was priced at $62.96 for them and $59.87 for an anonymous user. In a call with Gizmodo, however, Home Depot denied using any kind of customer score whatsoever to set the price of its products. Customers on its website are quoted the cost of products as provided by their nearest Home Depot store, the company said, or else given a nationally listed price. Those prices can vary, it said.


The petition goes on to argue that the true purpose of algorithmically generated customer value scores is to assign consumers a monetary value “based on the predicted profit that they will generate for the company.” That is to say, they exist neither to confer any benefit on consumers nor to make the company’s prices more competitive.

“Companies should not be permitted to hide behind the sham argument that these scores are trade secrets in order to evade accountability.”


Companies may consider customers less “valuable,” for example, if they make frequent use of customer service or routinely buy items on sale. Yet repeated calls to customer service, the petitioners argue, may simply indicate that a product or service is bad. “Indeed, it appears that the purpose of the customer value score is to punish a savvy consumer who shops for the best deal or knows how to assert her rights,” the petition says.

Surveillance scores, Antonini told Gizmodo, also offer companies a quiet way to discriminate against consumers based on wealth. “Any scores based on income or zip code—characteristics that often serve as surrogates for race—will weed out consumers that they do not want to do business with, do not want to hire, or do not want to provide housing to,” she said.


So how do these methods violate existing federal law? The petitioners argue that surveillance scores (and specifically customer value scores) meet the definition of “unfair or deceptive acts or practices” under Section 5 of the Federal Trade Commission Act.

The FTC Act deems a commercial practice “unfair” when it meets three qualifications: The practice causes, or is likely to cause, substantial injury to consumers; consumers cannot reasonably avoid it; and benefits to consumers or competition do not outweigh the impact of the injury itself.


Whether the individual differences in price—$10 more for a piece of metal that holds toilet paper—are significant or not doesn’t matter, the petitioners argue. They are likely inflicting a “relatively small harm” on a “large number of consumers,” to cite the FTC’s standard.

Any pricing algorithm that relies on scores involving factors such as age, race, gender, ethnicity, or religion—demographic details that can easily be inferred from a slew of secondary factors, such as what neighborhood a person lives in, what schools they attend, or what charities they give to, not to mention which Facebook groups they’ve joined—would automatically be in express violation of the FTC Act.


“Providing poor customer service to consumers based on secret customer value scores causes, or is likely to cause, substantial injury to consumers,” the petition argues, adding that labeling a customer a “fraudster” based on inaccurate or irrelevant information is likely to do the same.

The petitioners point to research described last year in the Wall Street Journal showing that high “fraud scores” may be attached to customers who purchase items “without checking the return policy” and those who pay extra for the fastest shipping option. Companies such as Macy’s and Finish Line did not deny using such metrics, which were generated by Riskified, a Tel Aviv-based firm.


Ultimately, the point is that it’s impossible for consumers to avoid the harm inflicted when an algorithm decides to charge them a different price or leaves them with lousy customer service. All the evidence that this is happening is stamped “Confidential and Proprietary,” shielded from the eyes of consumers by the very agency tasked with protecting them.

The FTC did not respond to a request for comment.

You can read the full petition filed by #REPRESENT and the Consumer Education Foundation here.


Update, June 25: A previous version of this article stated, citing research by the Consumer Education Foundation, that Home Depot relied on customer value scores. Home Depot did not initially respond to a request for comment but was later adamant that it does not rely on any scores to determine how much customers pay for items. The article has been updated to include Home Depot’s denial.