Federal Trade Commission (FTC) Chair Lina Khan is once again stoking unsubstantiated fears about algorithmic pricing—the practice of using algorithms to offer customers different prices based on dynamic market conditions—but this time, her examples are so outlandish that it is hard to tell whether she is making a serious policy argument or auditioning to be a guest writer for a low-budget reboot of Black Mirror.
In a recent speech, Chair Khan warned about “the possibility of each of us being charged a different price based on what firms know about us.” She cited two examples: first, “somebody being charged more for an airplane ticket because the company knows that they just had a death in the family and need to fly across the country,” and second, “a family where a kid has a nut allergy being charged more for the granola bars without nuts.”
First, neither example withstands serious scrutiny. Take the airline example: many U.S. airlines offer bereavement policies that provide lower fares or greater flexibility for those traveling because of a death in the family. Airlines could charge these travelers more, but they have instead chosen the more compassionate course. It may come as a surprise to the FTC chair, but not all big businesses are run by heartless cartoon villains scheming to exploit their customers.
The granola bar example fares no better. Businesses already know that many consumers who buy allergen-free products have allergies; they do not need detailed consumer data to figure that out. As one BBC article put it, “Manufacturers do not appear to gouge special-diet consumers simply because they can.” Rather, these businesses charge more for allergen-free products because of the costs of maintaining allergen-free production environments, the smaller economies of scale of niche products, and the liability risks if their products cause harm.
Second, the FTC abandons any pretense of engaging in an honest discussion about algorithmic pricing by labeling it “surveillance pricing.” This choice of language is a deliberate tactic used by anti-tech activists to misrepresent data-driven technologies as “surveillance,” insinuating that they violate privacy, limit consumer control, and lack transparency. Critics of the tech industry often use terms such as “surveillance capitalism,” “surveillance advertising,” and “face surveillance” to describe otherwise innocuous activities, such as ad-supported businesses, personalized advertising, and facial recognition.
Third, Chair Khan disregards the benefits that algorithmic pricing can offer consumers. By adjusting prices based on consumers’ ability to pay, algorithmic pricing can actually enhance overall welfare. Price differentiation is already a vital part of the global economy; for instance, companies often sell medicines or video games at reduced prices in low-income countries to increase accessibility while charging higher prices in high-income countries. The same principle applies domestically: if a seller must set a single price, many low-income consumers could be priced out entirely. By offering lower prices to low-income consumers and higher prices to high-income consumers, sellers can instead broaden access, benefiting both consumers and producers. Consider discounts for seniors or students: would the FTC chair advocate for eliminating those as well?
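To see the welfare logic in concrete terms, here is a minimal back-of-the-envelope sketch in Python. The numbers (two consumer groups, their willingness to pay, a unit cost) are entirely hypothetical and chosen only for illustration: they show how a seller restricted to a single profit-maximizing price ends up excluding lower-income buyers, while differentiated prices serve everyone and leave both the seller and consumers better off than the single-price outcome.

```python
# Hypothetical illustration only: two consumer groups, one product, a fixed unit cost.
# (willingness_to_pay, number_of_consumers) per group -- all figures are invented.
groups = [(10.0, 100),   # higher-income consumers
          (4.0, 300)]    # lower-income consumers
unit_cost = 3.0

def outcome(prices):
    """Return (units sold, producer profit, consumer surplus) for a given price per group."""
    units = profit = surplus = 0
    for (wtp, n), price in zip(groups, prices):
        if wtp >= price:                     # a group buys only if the price is at or below its willingness to pay
            units += n
            profit += n * (price - unit_cost)
            surplus += n * (wtp - price)
    return units, profit, surplus

# Uniform pricing: a profit-maximizing seller forced to pick one price chooses $10
# (profit 700 vs. 400), which prices the lower-income group out entirely.
print("uniform $10:", outcome([10.0, 10.0]))   # (100, 700.0, 0.0)
print("uniform $4: ", outcome([4.0, 4.0]))     # (400, 400.0, 600.0)

# Differentiated pricing: both groups are served, the seller earns more than at the
# uniform $10 price (750 vs. 700), and consumers gain surplus (250 vs. 0).
print("differentiated:", outcome([9.0, 3.5]))  # (400, 750.0, 250.0)
```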
Bashing algorithmic pricing in public speeches may generate headlines, but it does little to advance the public discourse around the legitimate uses of technology. As the FTC continues to study this issue, it should stick to the facts and evidence, rather than get caught up in dystopian rhetoric disconnected from reality.
Image Credits: Tom Williams/CQ Roll Call via AP Images