Inside Scale AI: Gig Workers Power Meta’s Social Media AI


How Scale AI Uses Gig Workers to Train Social Media AI

Scale AI gig workers have become the backbone of a growing industry, powering advanced artificial intelligence systems for tech giants like Meta. As the demand for labelled data surges, tens of thousands of people are recruited to comb through social media content, transcribe sensitive materials, and tag images—all tasks crucial for refining AI models.

The Outlier Platform: Where Experts Meet Gig Work

Scale AI, in which Meta holds a 49% stake, operates a platform called Outlier that advertises flexible work for individuals with expertise in fields such as medicine, physics, and economics. The platform boasts, “Become the expert that AI learns from,” attracting highly qualified contributors. However, many workers report that their actual assignments bear little resemblance to expert-level input—instead, they’re often scraping personal data from Instagram and Facebook accounts, labelling explicit or disturbing content, and transcribing audio from a range of sources.

Behind the Scenes: The Work of Scale AI Gig Workers

According to interviews with a diverse group of Outlier contractors—including journalists, teachers, graduate students, and artists—the reality of this gig work is far from glamorous. Many are driven to these tasks by economic necessity amid growing automation threats. “A lot of us were really desperate,” one worker explained. “Many people really needed this job, myself included, and really tried to make the best of a bad situation.”

Some taskers describe feelings of guilt and discomfort, especially when handling personal images or content belonging to children. There are reports of assignments requiring the ordering of Facebook photos by the age of the user, or tagging individuals and their locations, sometimes from the accounts of minors. While Scale AI insists that only public data is used and that no accounts set to “private” are accessed, many workers remain ethically conflicted about the nature of their contributions.

Constant Monitoring and Unstable Pay

Scale AI gig workers operate in a tightly monitored environment. Tracking software such as Hubstaff logs workers’ activity, including taking periodic screenshots of their screens. While Scale AI says this is to ensure accurate payment, workers often feel surveilled. The pay structure itself is unstable, with some reporting a “bait-and-switch” approach—high pay promised during recruitment that drops once they’re onboard. To qualify for certain assignments, workers must undergo repeated, unpaid AI-conducted interviews, fueling suspicions that the interviews themselves are recycled as training data for AI models.

Unsettling Tasks and Sensitive Content

Taskers recount being assigned to transcribe explicit audio, label images of dog feces, or sort through violent scenarios, despite initial assurances that such content would not be included. One doctoral student shared, “We had already been told before that there would be no nudity in this mission… But then I would get an audio transcript thing for porn or there would be just random clips of people throwing up for some reason.”

In addition to social media scraping, some assignments involve harvesting images of copyrighted artwork to train generative AI models. Despite guidelines instructing workers not to use AI-generated images, the pressure to provide new content often leads them to scour the social accounts of artists and creators, raising questions about copyright and ethical standards.

The Broader Impact and Controversies

Scale AI has contracts with major technology companies, including Google, Meta, and Anthropic, as well as government entities like the US Department of Defense. The company’s need for fresh, labelled data is only increasing as AI models grow more sophisticated. Yet, the process relies heavily on the invisible labor of gig workers, many of whom feel they are training the very systems that may one day replace them.

Legal challenges are emerging, with lawsuits alleging exploitative practices and lack of transparency. Law firms representing AI gig workers estimate that hundreds of thousands now participate in this type of work globally. Despite the unsteady pay and occasional mass layoffs, most taskers interviewed continue to accept assignments, seeing few better alternatives in an AI-driven job market.

The Future for Scale AI Gig Workers

The story of Scale AI gig workers highlights the complex ethical, economic, and social challenges at the heart of the AI industry. As companies race to develop more advanced models, the demand for human-labelled data grows, but so do concerns about worker exploitation and data privacy. While Scale AI maintains that its platform offers flexibility and valued opportunities, many workers remain uneasy about their role in shaping the future of technology.

“I have to be positive about AI because the alternative is not great,” said one worker. “So I think eventually things will get figured out.”


