Gig Workers: Regulating Work Platforms, Exposing Algorithms

By A.S. Josie

Worldwide, gig and platform workers are struggling to be considered “workers” and not “independent contractors”, so as to gain the protection of labour laws and social safety nets.

Gig workers have been recognized as workers by courts in the UK, Canada, Spain, the Netherlands, France, Denmark, some U.S. states, and a growing number of other countries. A few employers and unions, such as in domestic work, have signed collective bargaining agreements for platform workers.

The new Rajasthan Platform Based Gig Workers Act (2023) is the first legislation worldwide to recognize gig workers as workers for the purpose of obtaining social security benefits.

Earlier, the Indian Federation of App-based Transport Workers (IFAT) had approached the Supreme Court through a public interest litigation (PIL) to obtain social security protections for “unorganized workers” under the Unorganized Workers’ Social Security Act, 2008. This case is not yet decided.

The next step in the struggle for gig workers’ rights must be to address the algorithms through which working hours, wages, incentives, leave, quotas, etc. are determined. For this, not just legislation but also the regulation of AI is necessary.

This article will explain the role of algorithms in controlling workers’ lives and the importance of exposing them.

Algorithms

Algorithms are formulas that shape the new workplace and work culture, as well as consumer behaviour. They are secret, heavily protected by employers, and so are called “black boxes”. This opacity is partly intentional and partly due to the complex processes within “machine thinking”. Their effects, however, can be measured.

It is known that algorithms can have harmful effects by promoting competitiveness between workers, causing new types of discrimination, recommending hate speech, polarizing, increasing workload and intensity, and otherwise negatively impacting citizens’ digital rights, workers’ rights and consumer protection.

For example, UK Post Office workers lost their jobs due to mistakes in an automated system that falsely determined that they had stolen money from the company. In the Netherlands, drivers sued Uber after an algorithm suspended their accounts for allegedly committing fraud. In the U.S., AI is used in the criminal justice system to determine whether a prisoner should get parole, but the tool was found to be biased and inaccurate in its predictions. In Canada, pre-removal risk assessments for refugee applications are to be conducted by AI systems. These examples show the wide variety of extremely sensitive decisions that are being handed over to AI.

Workers everywhere report that their take-home pay is lowered by the algorithms of their platform-employers.

The PIL filed in the Supreme Court of India also notes “the lack of information and transparency available with the driver partners’ in the manner of working of the App based companies as regards the fixation of fares, promotional costs, surge pricing, incentives, penalties and bonuses. There is little or no information available as to how exactly rides are allocated.”

There are also now massive databases containing full employee profiles. In India, a huge reserve pool of labour, which can be electronically tracked, is a complete boon for employers.

Data protection legislation (even a good law, not the one passed recently) will not be able to address the types of discrimination and omission that flow from algorithmic preferencing and price-setting.

For example, when an Italian court ordered the platform Deliveroo to disclose its algorithm, it was found that the algorithm had not taken into account employment law granting sick leave and the right to strike, putting it in breach of the country’s laws.

The next horizon of citizens’ and workers’ struggles may be to unpack or open up the algorithms that control working life.

Controlling workers’ lives through new technology

Cutting-edge research in management studies states that “employers … use algorithms to help direct workers by restricting and recommending, evaluate workers by recording and rating, and discipline workers by replacing and rewarding.” (“Algorithms at Work: The New Contested Terrain of Control”, Kellogg, Valentine and Christin, Academy of Management Annals 2020, Vol. 14, No. 1, p. 369.)

While these six mechanisms of control are not new to the workplace, the data upon which they draw are more comprehensive than ever before.

Workers’ identities, movements, emotions, conversations, expressions, text data and consumer reactions are captured for instantaneous processing and feedback by chatbots.

When the recommendation (or direction/instruction) to the worker makes no sense, or does not conform with the worker’s own observations, habits, decisions, choices or intuitions, it becomes a point of frustration, leading workers to feel powerless. A simple example is how rider apps do not inform the driver of the final destination for the ride, which entirely removes the worker’s choice or negotiation leverage. This alienates the worker from the job and raises the level of dissatisfaction.

Restrictions appear in the limited range of options available to workers, for instance when their offline interactions with clients are monitored and actively discouraged.

Feedback mechanisms are designed not to capture workers’ actual opinions but only what fits predetermined criteria.

Workers are prevented from speaking to each other; when they file complaints, there is no human recipient.

Surveys of Indian gig workers show extreme dissatisfaction with the existing complaint redressal mechanisms at their jobs.

Workers’ profiles are also changed without their knowledge, which can render them entirely invisible.

The evaluation of workers has historically relied on observation, feedback, and performance. A much wider range of behaviours is now being monitored, including communication between workers, their work patterns, and how their time is spent minute by minute.

Workers are not told what is being observed and where, with many worrying that their home life is also being monitored. At the same time, they receive selective real-time feedback, especially around targets, which keeps them constantly on edge and drives them towards injury.

While the premise of gig work is flexibility, workers’ constant availability and engagement with the platform is a key aspect of evaluation. Workers’ experiences include a feeling of invaded privacy and vulnerability. When customer feedback is decisive, customers’ own biases come to further influence the outcome.

Instant penalization for refusing or delaying to follow orders is severely disciplining and demoralizing. Workers are not informed about pay changes, which is again destabilizing.

The ease with which workers, especially low-skilled workers, can be replaced is a qualitative difference from earlier forms and obligations of employment.

Finally, new “personality types” are being generated in young men and women by the experience of gamified work: insecurity, isolation, competitiveness, unhealthy dependence, and powerlessness. This is extremely worrying for the health of any democracy.

A few of these issues have now been addressed in the new Rajasthan Act, which, firstly, requires mandatory registration of all gig workers.

Secondly, the Act promises a grievance redressal mechanism, which will hopefully channel grievances appropriately to actual humans, without silent AI-based repercussions. However, the majority of the effects of algorithms remain unchallenged.


Regulations, with Input from Workers’ Organizations

Every government in the world must therefore develop regulatory frameworks for AI and algorithms, given their huge control over working people’s lives.

Canada, New Zealand, the UK and the EU are ahead on regulation. The 2021 EU Proposal for a Regulation of the European Parliament and of the Council laying down Harmonized Rules on Artificial Intelligence and Amending certain Union Legislative Acts specifically classifies employment-related AI systems as “high risk”.

However, like contemporary soft law on labour in value chains, the Proposal requires only self-monitoring by the platform owners. The Proposal also does not specify when and how the negative outcomes of surveillance and off-boarding etc. are to be reduced.

As well, the EU Proposal does not require the participation of trade unions and workers’ bodies in drafting or overseeing any regulations to manage the algorithms. This undermines the tripartite structure – government, employer, employee – that the ILO has sought to normalize across the world.

Spain is ahead of even the EU. After a social dialogue process, a Royal Decree was passed for the transparency of the algorithms which control and manage “riders”. Trade unions have been given the right of access to the algorithms. This ensures that violations will be handled collectively and not as individual grievances. This is a pioneering law attempting to regulate tech-management.

The Decree too has its limitations – it does not transfer any control over the algorithm to Workers’ Councils. Nor does it give real-time access, but only periodic access. Further, no decision-making or veto power is given to the Councils.

In India, the Ministry of Electronics and IT has recently stated that it has assessed the ethical concerns and risks of bias and discrimination associated with AI and that it does not intend to introduce legislation to regulate it.


Back-End of the Algorithm

A complementary approach to regulation targets not the algorithm directly, but the effects of automated surveillance and decision-making on workers.

In September 2021, California passed a law, AB 701, to protect workers from unfair assessments of productivity in warehouse distribution centers (e.g. Amazon). This law emerged after a legal case which revealed that Amazon used automated surveillance to fire workers for lack of productivity based on actions like the use of the bathroom and the exercise of other basic rights like meal and rest periods. It also revealed the high rate of injury due to the extreme “work quota” targets placed on employees.

Jeff Bezos, founder of Amazon, does not at all deny that algorithms target workers’ bodies. He has stated that complicated algorithms that micromanage workers’ bodies are needed because “many” new employees “might be working in a physical role for the first time.”

The law now requires employers with large warehouse distribution centers to disclose quotas and pace-of-work standards to each employee upon hire.


Workers’ Struggles

Remedial laws and better regulations have always emerged from protest and publicity about the harms generated by new technologies. In India, protests have, at times, produced better immediate results than legal struggles and policy debates.

In 2021-22, professional women workers of Urban Company protested outside the company’s Gurugram office against high commissions, the assignment of night bookings to women, the high costs of necessary purchases, and being penalized for not accepting all bookings. Urban Company subsequently surveyed its employee base and then issued a 12-point program of reform. It promised to reduce the blocks placed by algorithms by 80% and to cap penalty amounts.

This suggests that checks and balances in real time could be found through empowered workers’ organizations.

However, in the longer run, Urban Company continues to set unrealistic productivity targets, including very long-distance travel, which has forced many of its employees out of work, all without human mediation.

Long-term gains will, therefore, require regulation of AI based wage-setting, job security, and working conditions.

This article has shown that the struggle for “good safe jobs” must tackle many dimensions of the new tech-management.


