AI is quietly being used to pick your pocket
Retailers have figured out how to set prices based on your age, mood, and sexual orientation
Personalized pricing not only bakes in bias and can drive inflation but creates a world where you never know when your apps are ripping you off.
When I was flying back from London a few weeks ago, I slipped into a rabbit hole I haven’t tunneled out of since. I knew what I had paid for my seat, how many miles I had used for the indulgence of an upgrade. But I had no idea if the woman across the aisle had spent only a few points, as I had, or paid the more than $10,000 the airline could charge for the same trip. To book a flight has long been to play a game where only the airline knows the rules, with countless booking codes, loyalty programs, and fare changes that weaponize your data against your wallet. But after I landed, I kept seeing the same rigged game everywhere: in every Uber ride, every Amazon order, every trip to the supermarket. All these businesses now know so much about me that they can see a number blinking above my head: the exact price I’d be willing to pay in a given moment. Your own number is blinking above your head right now.
In the algorithmic age, dynamic pricing is creeping deeper into digital commerce, with charges rising and falling in real time.
What’s far more disturbing is the rise of personalized pricing: digital retailers’ practice of exploiting your own data to charge the precise price you’re willing to pay, which might be different from what the guy next to you would pay. Personalized pricing not only bakes in bias and can drive inflation, but it also creates a world where you never know when your apps are ripping you off.
Now, when I’m on the verge of paying for anything on my phone or laptop, I second-guess whether I’d be paying less if I were using someone else’s account.
I still remember the low-grade shock I felt a decade ago when I learned that price discrimination is often perfectly legal in the United States. In law school, my antitrust professor introduced us to the obscure Depression-era Robinson-Patman Antidiscrimination Act by quickly highlighting that the law very much failed to live up to its title. Under the long-standing law, companies can face ruinous penalties for price discrimination only if they’re discriminating against other businesses. If a wholesaler overcharged a store, the store could take it to court, but there was nothing then (or now) to stop the store from doing the same thing to its customers. That is, store owners have more price protections than their customers do. If a store charges some customers more than others because of their gender, race, or other legally protected characteristics, that’s certainly illegal. But when companies want to shake down each customer for the most they’re individually willing to pay, they’re free to engage in highway robbery.
Even in polarized times, AI pickpocketing may be one of those rare issues that can unite us in outrage
I say low-grade shock because, at the time, personalized price discrimination was far less widespread and harmful than it is today. Sure, coupon culture let companies sell the same product in the same store at the same time at different prices, but it gave customers agency. Price-sensitive shoppers took the time to scour for clippings, and less thrifty ones paid full freight. Coupons, loyalty cards, seasonal discounts: a lot of traditional price discrimination lets individual shoppers choose which price group they want to fall into.
But algorithmic price discrimination takes away that choice. And the methods used to extract data and sort people into pricing groups are more invasive than you may realize. Take your latest Uber trip. When you ordered that car, you probably knew that the distance you were traveling and the time of day were price factors; we’ve grown grudgingly accustomed to the cold, extractive efficiency of surge pricing. But did you think about plugging in your phone before ordering the ride? If you did, it might have saved you a few bucks: your battery level is allegedly one of the factors Uber uses to price your trip, a charge Uber vigorously denies. If the allegations are true, it’s easy to see the rationale: Riders with less battery left are more desperate, and those whose phones are minutes from dying won’t hesitate to pay nearly any price to get a car before they’re stranded.
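To make the mechanics concrete, here is a minimal sketch, in Python, of how a ride-hailing app could fold a signal like battery level into a quoted fare. Everything here is hypothetical: the feature names, weights, and thresholds are invented for illustration, and none of it reflects Uber’s actual model or anyone else’s.

```python
# Hypothetical illustration of personalized ride pricing.
# Every feature name, weight, and threshold is invented for this
# sketch; it does not reflect Uber's model or any real company's.

def personalized_fare(distance_miles: float,
                      surge_multiplier: float,
                      battery_pct: float,
                      base_rate: float = 2.50,
                      per_mile: float = 1.75) -> float:
    """Quote a fare that rises as the rider looks more desperate."""
    fare = (base_rate + per_mile * distance_miles) * surge_multiplier

    # The troubling part: a "desperation premium" keyed to the rider's
    # phone battery. Someone at 5% can't afford to comparison-shop.
    if battery_pct < 0.15:
        fare *= 1.20   # 20% markup on a nearly dead phone
    elif battery_pct < 0.30:
        fare *= 1.08   # smaller markup on a merely low battery

    return round(fare, 2)

# Same trip, same surge, different phones:
print(personalized_fare(5.0, 1.5, battery_pct=0.90))  # about $16.88
print(personalized_fare(5.0, 1.5, battery_pct=0.05))  # about $20.25
```

The unsettling part isn’t the arithmetic; it’s the asymmetry. The rider never sees the markup, only the final number.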
As The American Prospect recently detailed, this type of individualized pricing is proliferating across nearly every sector of the economy (streaming, fast food, and even dating apps), and it can be surprising which variables will get you charged more. In the 2010s, retailers relied on somewhat crude data to perfect pricing. Customers might’ve paid more for a flight they booked on a Mac (versus a PC) or paid a higher rate for test prep in ZIP codes with larger Asian communities. But in recent years companies have moved from neighborhood-level price discrimination to individualized pricing.
Retailers like Amazon know so, so much about what you buy, both on their platforms and off. And you have no way of knowing when your choices are changing what you pay. In 2018, it was headline news that Amazon adjusted prices 2.5 million times a day. Given Amazon’s growth and the growth of AI, the number is likely an order of magnitude higher today. For retailers like Walmart, it’s not enough to use our shopping history. In February, the retail behemoth agreed to buy the smart-TV maker Vizio for more than $2 billion, potentially giving Walmart a windfall of intimate consumer data. Smart TVs not only monitor what we watch with Orwellian precision but also track other nearby devices with ultrasonic beacons and can even listen in on what we say in the privacy of our own homes. Vizio specifically has been fined millions of dollars over allegations that it illegally spied on customers.
Retailers not only know what you’ve bought and how much money you make; they often know where you are, how your day is going, and what your mood is like, all of which can be neatly synthesized by AI neural networks to calculate how much you’d pay for a given item in a given moment.
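In pricing research, the number blinking above your head has a name: willingness to pay. As a rough illustration of the idea, and not any retailer’s actual system, the toy Python model below guesses how likely a shopper is to buy at each candidate price, then quotes the price that maximizes expected revenue. The features and coefficients are entirely made up.

```python
import math

# Toy willingness-to-pay model: estimate how likely one shopper is
# to buy at each candidate price, then charge the revenue-maximizing
# price. Features and coefficients are invented for illustration.

def buy_probability(price: float, income: float, mood_score: float) -> float:
    """Logistic demand curve: in this toy model, richer and happier
    shoppers tolerate higher prices before walking away."""
    utility = 2.0 + 0.00003 * income + 0.5 * mood_score - 0.08 * price
    return 1.0 / (1.0 + math.exp(-utility))

def best_price(income: float, mood_score: float) -> float:
    """Scan candidate prices and keep the one with the highest
    expected revenue (price times probability of purchase)."""
    candidates = [cents / 100 for cents in range(1000, 10001, 50)]  # $10-$100
    return max(candidates, key=lambda p: p * buy_probability(p, income, mood_score))

# Two shoppers, one product, two very different quotes:
print(best_price(income=40_000, mood_score=-1.0))   # roughly $30
print(best_price(income=150_000, mood_score=1.0))   # roughly $70
```

Feed the same product to two different shoppers and the model happily quotes them two different prices; that gap is the whole business model.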
No area of commerce is personal enough to be off-limits. Dating apps are harvesting our romantic lives for data, and some openly brag about doing so to increase profitability. Many of those that don’t disclose using personalized pricing still engage in it. Tinder rarely talks about its pricing technology, but Mozilla and Consumers International recently found that the dating app used dozens of variables to radically adjust pricing for individual users. Your age, gender, and sexual orientation might determine what the AI decides you need to pay for love.
Left unchecked, personalized pricing will have pernicious effects across society. Nikolas Guggenberger, an assistant professor at the University of Houston Law Center, says that “hidden algorithmic price discrimination can undermine public trust in price-building mechanisms and thus undermine the marketplace.” AI pricing also means that the most desperate and most vulnerable will often pay the most. Even worse, people could be penalized because of their race, age, or class. Take the phone-battery allegation. Older people are more than twice as likely as younger users to have a phone that’s at least three years old. Since older smartphones tend to have shorter battery life, older people could end up paying more than younger people for the same Uber rides.
“Algorithmic price discrimination can basically automate usury,” Guggenberger says. “If your battery is about to die and you are out in the country, a ride-sharing app may drastically raise your ‘personalized price.'”
So much of AI pricing acts as a regressive tax, charging those who have the least the most. For people in underserved areas, where there are fewer stores and fewer alternatives, there’s often no choice but to click “buy now,” even when it hurts. As the law professor and consumer watchdog Zephyr Teachout told The American Prospect, we shouldn’t think of this practice by a name as innocuous-sounding as personalized pricing. She calls it surveillance pricing.
We know how to prove human discrimination. If a store in a majority-Black neighborhood charges more than its counterpart in a majority-white neighborhood, testers can go to each store, record the prices, and bring a lawsuit. This sort of testing has been at the core of consumer protections for most of a century. But how do you prove when an algorithm discriminates? There are no stores to visit, no price tags to compare, just millions of screens siloed in people’s pockets. The result can be a Catch-22, where you can get enough data to prove the discrimination only by suing a company, but you can’t sue the company without first having the data. We could see the rise of a perverse, bizarro legal world where companies using bias-prone AI to adjust prices in secret face less legal scrutiny than brick-and-mortar stores.
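One approach researchers and regulators could take is a digital version of paired testing: send a pricing system synthetic profiles that differ in a single attribute and compare the quotes that come back. The Python sketch below shows the idea; get_quote() is a hypothetical stand-in for a real pricing endpoint, simulated here with an injected bias so the audit has something to detect.

```python
import random
import statistics

# Sketch of "digital paired testing": probe a pricing system with
# profiles that differ in a single attribute and compare the quotes.
# get_quote() is a hypothetical stand-in for a real pricing endpoint;
# it is simulated here with an injected bias so the audit finds one.

def get_quote(profile: dict) -> float:
    base = 20.0 + random.gauss(0, 1)
    if profile["age"] >= 60:
        base *= 1.10  # the hidden disparity the audit should surface
    return base

def audit(trials: int = 1000) -> float:
    """Quote otherwise-identical young and old profiles many times
    and return the average gap between the two groups."""
    young, old = [], []
    for _ in range(trials):
        shared = {"city": "Houston", "distance_miles": 5.0}
        young.append(get_quote({**shared, "age": 30}))
        old.append(get_quote({**shared, "age": 65}))
    return statistics.mean(old) - statistics.mean(young)

print(f"Average quote gap (older minus younger): ${audit():.2f}")
```

The hard part, of course, is access: without a way to query the real endpoint at scale, the audit never runs, which is exactly the Catch-22 described above.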
My hope is that this situation is so bleak, the potential for abuse so clear, that not even our dysfunctional democracy will accept it. Our lawmakers have been slow to rein in the harms of novel technology, even when it becomes clear, for example, that it’s undermining our democracy. But even in these polarized times, AI pickpocketing may be one of those rare issues that can unite us in outrage.