In a recent article from MIT’s Technology Review, the director of West Virginia University’s human-computer interaction lab, Saiph Savage, spoke about the problem of “invisible workers” in the AI industry. Many large, enterprise-scale deep learning applications require massive amounts of training data to be reliable, and the data labeling is typically done by low-paid workers distributed across the world.
The biggest and most successful machine learning models are often trained on data labeled by gig workers, frequently through platforms like Amazon’s Mechanical Turk. Mechanical Turk workers take on microtasks that involve labeling data. For example, workers might label objects in images so that a computer vision system can recognize them, or transcribe dialogue so that a voice recognition system can power a digital assistant.
Some estimates put the number of Mechanical Turk workers in the US at more than half a million people, and more than half of them earn three-quarters or more of their income through the platform. The number of gig workers on platforms like Mechanical Turk has grown in recent months as the Covid-19 pandemic has put many people out of work.
Savage spoke about how, while crowd work isn’t inherently bad, it can be exploitative. The majority of these workers earn below minimum wage, and the positions tend to be stagnant, offering workers little chance to upskill or do work they could easily list on a resume. Other tech companies like Microsoft and Google may recruit workers through their own platforms, but the process is often much the same.
Savage believes that large tech companies employing distributed workers aren’t underpaying them intentionally. More likely, she argues, these companies don’t understand how involved and skilled the work they are requesting actually is, and expect it to take less time than it does.
Savage argues that a number of changes could improve the working conditions and career trajectories of the invisible workers who make AI models possible. It’s possible to build systems that help task workers estimate how long a task will take, letting them decide whether taking it on is worth their time. In fact, Savage is attempting to create an AI model that helps workers predict which tasks are most worth their time and which will help them build desired skills. The proposed model would learn what type of advice is most effective for its current user, taking feedback and improving over time. A worker who wanted to increase their earnings could use the tool to determine which tasks to focus on.
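Savage’s proposed model would be far more sophisticated, but the core idea of helping a worker judge whether a task is worth their time can be sketched with a simple heuristic: convert each task’s flat pay and estimated duration into an effective hourly rate and rank the options. This is a minimal illustration, not Savage’s actual system; the task names, pay amounts, and time estimates below are all hypothetical.

```python
# Hypothetical sketch of a task-evaluation aid: rank microtasks by
# estimated effective hourly pay. A real tool would predict the time
# estimates per-worker and learn from feedback; here they are given.

def effective_hourly_rate(pay_usd, estimated_minutes):
    """Convert a task's flat pay and estimated duration into $/hour."""
    return pay_usd / (estimated_minutes / 60)

def rank_tasks(tasks):
    """Sort tasks so the highest-paying-per-hour ones come first."""
    return sorted(
        tasks,
        key=lambda t: effective_hourly_rate(t["pay_usd"], t["estimated_minutes"]),
        reverse=True,
    )

# Example task listings (hypothetical values).
tasks = [
    {"name": "image labeling", "pay_usd": 0.10, "estimated_minutes": 2},
    {"name": "audio transcription", "pay_usd": 1.50, "estimated_minutes": 12},
    {"name": "survey", "pay_usd": 0.50, "estimated_minutes": 10},
]

for t in rank_tasks(tasks):
    rate = effective_hourly_rate(t["pay_usd"], t["estimated_minutes"])
    print(f'{t["name"]}: ${rate:.2f}/hr')
```

Even this crude ranking makes the trade-off visible: a task paying $1.50 can be a better use of time than a quicker $0.10 task, which is exactly the kind of judgment such a tool would automate.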
To help invisible workers improve their career options, workers could be guided toward tasks that build new skills. Companies posting tasks on these microwork platforms could also offer internships, classes, and training sessions. Ultimately, Savage argues, gig workers in the tech space need to be granted agency and treated with respect, just like workers in any other part of the tech sector. As Savage was quoted via MIT Technology Review:
“It’s about changing the narrative, too. I recently met with two crowdworkers that I’ve been talking to and they actually call themselves tech workers, which—I mean, they are tech workers in a certain way because they are powering our tech. When we talk about crowdworkers they are typically presented as having these horrible jobs. But it can be helpful to change the way we think about who these people are. It’s just another tech job.”
Savage’s interview comes as more attention is being paid to the rights of gig workers in the tech sector. Just recently, the German Federal Labour Court recognized a crowd worker as having the legal status of an employee, a ruling that may have implications for the future treatment of crowdworkers in Germany.