Gig workers under ‘unprecedented surveillance’ in 2021


Photo: Siegfried Modola (Getty Images)

Pa Edrissa Manjang, an Uber driver in the UK, worked for the ride-hailing company for a year before being abruptly fired by an algorithm. “It was not good at all,” Manjang said in an interview with Worker Info Exchange. “It felt like you were dealing with robots most of the time.”

In a newly released report, Manjang claims he was fired after Uber’s facial recognition system failed to recognize the photos he submitted to the app. Uber introduced the verification system as a security measure to assure customers that their drivers are who they say they are, but in this case, and others like it, the detection system got it wrong. Manjang, who is Black and was aware that facial recognition systems often struggle to identify non-white users, appealed the decision and asked to have his photos reviewed by a human, but says he was unsuccessful.

“It’s not what we’re used to,” Manjang said in the report. “I have worked with the government and public companies in this country. You have that access to your employer, but that is not the case with Uber. You have the feeling that you are working in front of a computer.”

Manjang’s story is indicative of a wider dilemma plaguing gig workers around the world, detailed in a new 82-page report released Monday by Worker Info Exchange, Privacy International and the App Drivers and Couriers Union, titled Managed by Bots: Data-Driven Exploitation in the Gig Economy. The report details the many ways gig workers are routinely subjected to “unprecedented surveillance” techniques as they go about their jobs throughout the day. Worse, many of these workers remain on the receiving end of surveillance systems even when they are off the clock, waiting for their next job.

While the specific monitoring techniques described vary widely, the report delves deepest into fraud detection software and facial recognition verification systems, both of which are growing in popularity. Facial recognition systems, in particular, are often billed by app makers as a way to strengthen security, but the report claims there are relatively few real-life examples of gig workers actually trying to circumvent the rules.

“The industry’s adoption of facial recognition technology is completely disproportionate to the perceived risk,” the report’s authors say.

The report also details how apps increasingly use AI systems to perform roles once handled by a human manager, going as far as firing workers in some cases. The report questions gig companies’ use of algorithms to carry out performance management and dictate prices, drawing on digital driver-monitoring techniques such as GPS, customer ratings and job completion rates. In the case of Uber, past driver preferences and behavior may also play a role in whether the app matches a driver with a customer.

The researchers also found workers whose accounts were improperly terminated due to geolocation checks that falsely accused drivers of fraudulently sharing their accounts. These examples point both to the closely monitored nature of the apps and to the real-world consequences of AI-driven management decisions.

“Platform companies operate in a lawless space where they think they can make the rules,” said Open Society Foundations fellow Bama Athreya. “Unfortunately, this is not a game; virtual reality has harsh implications for gig workers in real life.”

Aside from contributing to an environment where workers feel increasingly like unappreciated machines, the continued outsourcing of key management decisions to AI systems could also violate certain European legal protections.

In particular, the report alleges that the gig industry has seen an increase in AI-driven firings, which the authors believe may violate Article 22 of the European Union’s General Data Protection Regulation (GDPR). Under that provision, individuals cannot be subjected to decisions with legal or similarly significant effects that are based solely on automated data processing.

Article 20 of the GDPR, meanwhile, gives data subjects (in this case the gig workers) the right to receive the data they have provided. And while most gig work apps do provide their workers with some data, the report’s authors argue that they often fall short of providing the data drivers need to meaningfully challenge their pay or other benefits. In other cases, workers must navigate a maze of complex websites to access the data they are supposedly guaranteed. The result, the report states, is an “information asymmetry” in which app makers hold all the necessary data while the drivers themselves are often left in the dark.

While this may all sound rather bleak for gig workers subjected to digital monitoring, some encouraging legal actions and changes are on the way.

Earlier this year, the Italian data protection authority took action against the delivery company Deliveroo, fining it $2.5 million for alleged breaches of GDPR protections. In its ruling, the agency said the company had not been transparent about how its algorithms were used to allocate orders and book workers’ shifts.

In Spain, lawmakers recently approved a groundbreaking law that forces delivery platforms to hire about 30,000 couriers previously classified as independent contractors and to provide greater transparency about how algorithms are used in management. As part of the new law, companies are required to give workers or their legal representatives information about how algorithms are used to assess their job performance. Meanwhile, in the UK, the country’s Supreme Court upheld a ruling earlier this year that forced Uber to classify its drivers as “workers” rather than independent contractors, a distinction that gives them additional employment protections.

There is also some movement around algorithmic transparency in the US. Just last month, the New York City Council passed a first-of-its-kind bill banning employers from using AI screening tools on job candidates unless those tools have undergone a bias audit.

The gig worker report makes clear that these issues of worker surveillance and AI-driven management are now entrenched in an unbalanced ecosystem, especially as more and more traditional employers eye the gig work model as an attractive business opportunity. In that context, the authors argue that labor rights are “inextricably linked to the exercise of data rights.”
