We all know that ride-share companies like Uber and Lyft operate dynamic, or “surge”, pricing: they change their prices in real time, according to supply and demand. But is there something else behind these fluctuations in fees? Is your taxi fare actually being personalized according to how much the company thinks you are willing to pay?



I’ve long had the hunch that this might be the case. Partly because I’ve occasionally noticed that I’m being quoted a different price than a friend who happens to call an Uber to the same place at the same time. Other people have had similar observations.



They’ve noticed that they get sent 50% off Uber discounts every week, for example, while other people don’t. And one of my friends, Dan, says that when he switches from his personal credit card to his corporate credit card in the Uber app, his quoted price often decreases. “It might have something to do with my kid vomiting in some cabs,” he hypothesizes. There may well be an algorithm that has figured out that dad-Dan is not as desirable a passenger as corporate-Dan, and charges him accordingly.

Of course, anecdata doesn’t prove anything and can easily be explained away. Two people being quoted different prices for the same Uber ride might simply reflect the fact that Uber’s dynamic pricing algorithm is highly sensitive and updates from moment to moment. Still, I find it hard to believe that the likes of Uber and Lyft haven’t experimented with personalizing their prices by analyzing your personal data and working out how price-sensitive you are.



Personalized pricing, which is also known as price discrimination or price optimization, depending on whether you’re an economist or an online marketer, is a growing trend. According to a recent Deloitte and Salesforce report, 40% of brands that currently use AI to personalize the customer experience have used it to tailor pricing and promotions in real time.



More companies are taking advantage of the wealth of customer data at their disposal to set prices on an individual level. If someone seems willing to pay more than the going rate, it makes sense to charge them more than a customer who watches what they spend.



How many companies are currently engaging in practices such as these? “It’s extremely hard to detect so no one really knows,” explains Maurice Stucke, professor of law at the University of Tennessee and co-author of Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy. Information about personalized pricing practices typically “only comes out when there’s a leak, when someone from the inside divulges it”.

Companies keep these practices under wraps for a reason: customers don’t want to feel like they’re being charged more than other people. It seems unfair. However, despite all the secrecy there have been a number of well-documented instances of personalized pricing in action.



In 2012, for example, a Wall Street Journal investigation found that Staples.com was quoting people different prices based on where they were located. If they lived near a Staples competitor they would get a cheaper price than someone who had no other options near them.

Last year, Uber went some way toward admitting that it had implemented personalized pricing. In an interview with Bloomberg, the company acknowledged that it had a new fare system called “route-based pricing”: it estimates people’s propensity to pay more to travel a certain route at a certain time of day, and charges more for that route. Traveling between a fancy neighborhood and a city center during peak commuting hours, for example, might cost a premium rate, because the company expects people will pay for it.
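
To make the idea concrete, here is a minimal sketch of what route-based pricing could look like: a multiplier keyed on the route and the time of day rather than on the individual rider. Every route name, time band, and number below is invented for illustration; none of it comes from Uber.

```python
# Hypothetical sketch of "route-based pricing". The multiplier depends
# on (origin, destination, time band), not on who is riding.
# All values are invented for illustration.

BASE_RATE_PER_MILE = 1.50  # illustrative base fare per mile

# Invented demand multipliers: routes the model expects riders to pay
# a premium for (e.g. fancy neighborhood -> city center at rush hour).
ROUTE_MULTIPLIERS = {
    ("nob_hill", "financial_district", "am_peak"): 1.4,
    ("nob_hill", "financial_district", "off_peak"): 1.0,
    ("outer_suburb", "financial_district", "am_peak"): 1.1,
}

def quote(origin: str, destination: str, time_band: str, miles: float) -> float:
    """Return a fare quote using the route/time multiplier (default 1.0)."""
    multiplier = ROUTE_MULTIPLIERS.get((origin, destination, time_band), 1.0)
    return round(BASE_RATE_PER_MILE * miles * multiplier, 2)
```

The point of the sketch is that two riders on the same route at the same time get the same quote, which is what makes this “third-degree” rather than fully individual discrimination: the premium attaches to the route, not the person.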



While this may not be the purest form of personalized pricing, as Stucke notes, it is “third-degree price discrimination. You’re discriminating against people who fall into certain groups.” When I asked Uber if they practiced personalized pricing, referencing this example, they flatly denied it.



The only statement an Uber spokesman would offer was: “We may price routes differently based on our understanding of demand patterns, not individual riders.” This carefully worded answer was, at least, better than Lyft, who didn’t respond to multiple requests for comment.



There are other clues that Uber et al might be examining your personal data in order to toggle their prices based on your propensity to pay more at a particular moment. In 2016, for example, a behavioral scientist at Uber divulged that the company knew that people were more willing to pay a higher fare when their phone batteries were low. While they said they “absolutely don’t use that” information, one has to wonder why the company has a behavioral economist on staff in the first place, if it isn’t to manipulate prices based on people’s behavior.

What other data points might Uber be looking at to gauge your price sensitivity? Obvious candidates include the sort of credit card you use, where you live, the make of phone you’re using, and your ride history. But, really, we have no idea how much the likes of Uber know about us or how they use that data; their privacy policies are incredibly broad.
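
For illustration only, here is one way a rider-level “sensitivity score” could be assembled from signals like these. The features, weights, and thresholds are all invented for the sake of the sketch; nothing here reflects any company’s actual model.

```python
# Hypothetical sketch: score a rider's price sensitivity from profile
# signals, then mark up the fare as sensitivity drops. Entirely invented.

def sensitivity_score(profile: dict) -> float:
    """Lower score = assumed less price-sensitive."""
    score = 1.0
    if profile.get("card_type") == "corporate":
        score -= 0.2  # expensing the ride: assumed less price-sensitive
    if profile.get("phone_make") == "premium":
        score -= 0.1  # pricier phone taken as a (crude) wealth proxy
    if profile.get("rides_last_month", 0) > 20:
        score -= 0.1  # heavy user, assumed habituated to the price
    return max(score, 0.5)

def personalized_quote(base_fare: float, profile: dict) -> float:
    # Invented rule: up to a 25% markup as the sensitivity score falls.
    markup = 1.0 + (1.0 - sensitivity_score(profile)) * 0.5
    return round(base_fare * markup, 2)
```

Even this toy version shows why the practice is so hard to detect from the outside: each rider sees only their own quote, and the inputs that moved it are invisible to them.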



That there is so much mystery around pricing in the ride-share industry is deeply troubling, not least because the rise of ride-sharing companies is hurting public transit and deepening inequality. Uber et al are increasingly being used as an alternative to public transit by higher-income riders, meaning those systems lose fare revenue.



We’re in effect seeing the privatization of mass transit. In this scenario it’s incredibly important to monitor price discrimination and ensure it doesn’t turn into other forms of discrimination. A pricing algorithm could very easily start penalizing people who live in poorer neighborhoods, with fewer transport options, for example.



“There has to be greater transparency and accountability on the part of the ones who are setting the prices,” Stucke says. “These algorithms really are black boxes. You don’t know how they’re arriving at those prices and whether you’re being discriminated against. Consumers should have the ability to avoid such forms of discrimination if they so choose and have greater ability to protect their anonymity.”

With Facebook’s Cambridge Analytica scandal dominating headlines over the last few weeks, it seems like data collection and manipulation by tech companies has finally roused mainstream interest. So let’s not let the story stop at Facebook. Let’s ensure we’re constantly holding all of these tech companies accountable and making sure they’re not shamelessly taking us all for a ride.

