The concept of equal pay for equal work is fundamental to Americans’ sense of fairness in the workplace. But the rise of artificial intelligence allows some big companies to pay workers different amounts for doing the same job, a new research report has alleged.
The study found that companies like Uber and Amazon, which rely on independent contractors for labor, use artificial intelligence to engage in what it calls “algorithmic wage discrimination,” a practice similar to consumer price discrimination.
Retailers and advertisers charge consumers different prices for the same product based on how much they believe a person is willing to pay, gleaning details like which web browser they are using. Similarly, companies that use independent contractors collect detailed information about where workers live, when and where they work, how much money they want to make, and what types of work they will accept or reject, according to the report’s author, University of California Hastings law professor Veena Dubal.
For rideshare drivers, pay based on these metrics is unpredictable and variable, according to Dubal, who has conducted hundreds of interviews with gig workers.
Some ride-hailing drivers say the companies they work for are “gamifying” jobs, manipulating them and forcing them to gamble just to make a living.
“Algorithmic wage discrimination allows firms to personalize and differentiate wages for workers in ways unknown to them, paying them to behave as the firm wishes, perhaps [paying] as little as the system determines they may be willing to accept,” the report reads in part.
Workers have to “guess” their wages
And while companies have a wealth of information on the workers, the workers themselves have little insight into how their pay is determined.
“Given the information asymmetry between workers and firms, companies can calculate the exact wage rates needed to encourage desired behavior, while workers can only guess why they make what they do,” the report said.
Dubal added that workers cannot rely on these jobs for economic stability or security and called the companies’ pay practices “deeply predatory.”
“It’s like gambling! The house always wins,” said Ben, a rideshare driver Dubal interviewed.
Another Uber driver, Domingo, said he had completed 95 of the 96 trips required for a $100 bonus. Despite being in a busy part of town, he had to wait 45 minutes to secure his final ride and earn the $100 he was counting on to pay for groceries. He believes Uber was pushing him to work longer hours.
“It seemed like the algorithm turned against you. There was one night at the end of the week when it felt like the algorithm was punishing me. I had 95 out of 96 rides for the $100 bonus… It was ten o’clock at night in a popular area, and that last ride still took me 45 minutes to get,” he told Professor Dubal. “The algorithm was passing me over for people who weren’t close to their bonus. There’s no way to verify that, but that’s what seemed to be happening.”
Getting your boss “inside your head”
This close workplace monitoring effectively removes a worker’s most powerful bargaining tool: the fact that, typically, only they know what wage or salary they are willing to accept for a job.
According to Dubal, that’s the scariest thing about the practice.
“One source of my power is that I know what I’m willing to accept, and my employer doesn’t,” she told CBS MoneyWatch. “These practices take that away, because they learn what a worker is willing to accept in a particular context. They get inside your head.”
She said that this kind of insight into how workers think, combined with the availability of other information such as credit data or how much a given worker might owe in rent, “could lead to an extraordinarily controlled economy where the people in control are the companies and no one else.”
It could undo decades of social and labor movement progress toward equal pay for equal work, she added.
“It’s really scary, and that’s how you get retrenchment, a rolling back of workers’ rights, through a new cultural sense of what’s right and what’s not right,” she said.