On my way back from London a few weeks ago, I slipped down a rabbit hole that I haven’t tunnelled out of since. I knew what I had paid for my seat, and how many miles I had spent for the pleasure of an upgrade. But I had no idea whether the woman across the aisle had spent just a few points, as I had, or paid more than the $10,000 the airline might charge for the same trip. Booking a flight has long meant playing a game where only the airline knows the rules, with countless booking codes, loyalty programs, and fare changes turning your own data against your wallet. But after I got off the plane, I kept seeing the same rigged game everywhere: on every Uber ride, every Amazon order, every trip to the supermarket. All these businesses now know so much about me that they can see a number flashing over my head: the exact price I’d be willing to pay at any given moment. Your number is flashing above your head right now.
In the algorithmic age, prices in digital commerce are increasingly volatile, rising and falling in real time.
What’s far more concerning is the rise of personalized pricing: the practice of digital retailers using your data to charge the exact price you’re willing to pay, which may be different from what the person next door would pay. Personalized pricing is not only unfair and inflationary; it creates a world where you never know when your apps are ripping you off.
Now, when I’m about to pay for anything on my phone or laptop, I wonder if I’d pay less if I used someone else’s account.
I still remember the low-grade shock I felt a decade ago when I learned that price discrimination is often perfectly legal in the United States. In law school, my antitrust professor introduced us to the obscure Depression-era Robinson-Patman Anti-Discrimination Act, quickly pointing out that the law fails to live up to its title. Under this longstanding statute, companies can face devastating penalties for price discrimination only when they discriminate against other businesses. If a wholesaler overcharged one store relative to another, the store could take it to court, but there was nothing then (or now) to stop the store from doing the same to its own customers. In other words, store owners enjoy more price protection than their shoppers do. If a store charges some customers more than others because of gender, race, or another legally protected characteristic, that is certainly illegal. But when companies want to squeeze each customer for the maximum that individual is willing to pay, they are free to engage in highway robbery.
I say low-grade shock because, at the time, personalized price discrimination was much less widespread and harmful than it is today. Sure, coupon culture let companies sell the same product in the same store at the same time at different prices, but it gave customers agency. Price-sensitive shoppers took the time to hunt for savings, and the less thrifty paid full freight. Coupons, loyalty cards, seasonal discounts: many traditional forms of price discrimination let individual shoppers choose which price bracket they fell into.
But algorithmic price discrimination removes that choice. And the data-mining methods used to sort people into price groups are more invasive than you might realize. Take your last Uber ride. When you ordered that car, you probably knew that the distance you were traveling and the time of day were pricing factors; we’ve become wearily accustomed to the cold efficiency of surge pricing. But did you think to plug in your phone before ordering the ride? If you did, it could have saved you a few bucks, because battery level is allegedly one of the factors Uber uses to price your ride, a charge Uber strongly denies. If the allegations against Uber are true, the rationale is easy to see: those with less battery left are more desperate, and those whose phones are minutes from dying won’t hesitate to pay almost any price to get a car before the screen goes dark.
As The American Prospect recently detailed, this type of individualized pricing is spreading across nearly every sector of the economy (streaming, fast food, even dating apps), and it can be surprising which variables cost you the most. In the 2010s, retailers relied on relatively crude data to fine-tune prices. Consumers might have paid more for a flight booked on a Mac (versus a PC), or paid higher fees for test prep in ZIP codes with larger Asian communities. But in recent years, companies have moved from price discrimination at the neighborhood level to pricing aimed at the individual.
A retailer like Amazon knows an enormous amount about what you buy, both on and off its platform. And you have no way of knowing when your choices are changing what you pay. In 2018, it was headline news that Amazon adjusted prices 2.5 million times a day. Given Amazon’s growth and the rise of AI, that number is likely an order of magnitude higher today. For retailers like Walmart, it’s not enough to use our purchase history. In February, the retail giant agreed to buy smart TV maker Vizio for more than $2 billion, giving Walmart an unexpected trove of intimate consumer data. Smart TVs not only monitor what we watch with Orwellian precision; they track other nearby devices with ultrasonic beacons, and can even listen to what we say in the privacy of our own homes. Vizio itself has been fined millions of dollars for allegedly spying on customers illegally.
Retailers not only know what you’ve bought and how much money you make; they often know where you are, how your day is going, and what your mood is, all of which can be neatly synthesized by AI to calculate how much you’ll pay for a given item at a given moment.
No area of commerce is too personal to be off limits. Dating apps are harvesting our romantic lives for data, and some openly boast that they do it to boost profitability. Many of those that don’t disclose their use of personalized pricing practice it anyway. Tinder rarely talks about its pricing technology, but Mozilla and Consumers International recently revealed that the dating app used dozens of variables to radically adjust prices for individual users. Your age, gender, and sexual orientation can determine what AI decides you should pay for love.
If left unchecked, personalized pricing will have detrimental effects on society as a whole. Nikolas Guggenberger, an assistant professor at the University of Houston Law Center, says that “hidden algorithmic price discrimination can undermine public confidence in price-making mechanisms and thus undermine the market.” AI pricing also means that those who are most desperate and vulnerable will often pay the most. Worse still, people can be penalized because of their race, age, or class. Take the phone-battery claim. Older people are more than twice as likely as younger users to have a phone that is at least three years old. Since older smartphones tend to have weaker batteries, older people may end up paying more than younger people for the same Uber rides.
“Algorithmic price discrimination can essentially automate usury,” says Guggenberger. “If your battery is about to die and you’re out and about, a ride-sharing app can drastically increase your ‘custom price.’”
Much of AI pricing acts as a regressive tax, charging those who have the least the most. For people in underserved areas, with fewer stores and fewer options, there’s often no choice but to click “buy now,” even when it hurts. As law professor and consumer advocate Zephyr Teachout told The American Prospect, we shouldn’t think of this practice as something as innocuous-sounding as personalized pricing; she calls it surveillance pricing instead.
We know how to prove human discrimination. If a store in a predominantly Black neighborhood charges more than its counterpart in a predominantly white neighborhood, testers can visit each store, record the prices, and file a lawsuit. This type of testing has been at the core of consumer protection for the better part of a century. But how do you prove that an algorithm discriminates? There are no stores to visit, no shelf prices to compare, just millions of screens locked away in people’s pockets. The result can be a Catch-22: you can only get enough data to prove discrimination by suing a company, but you can’t sue the company without first having the data. We could see the rise of a perverse legal world in which companies that use bias-prone AI to set prices in secret face less legal scrutiny than brick-and-mortar stores.
My hope is that this situation is so grim, the potential for abuse so clear, that even our dysfunctional democracy won’t accept it. Our lawmakers have been slow to curb the harms of new technology, even when, as with threats to our democracy itself, those harms become obvious. But even in these polarized times, AI pickpocketing may be one of those rare issues that can unite us in outrage.
Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project, or STOP, a civil rights and privacy group based in New York.