All the way back in 1968, writer Arthur C. Clarke claimed that “Any sufficiently advanced technology is indistinguishable from magic”. Clarke hadn’t tried ChatGPT.
To most of us, the rapid advancement of artificial intelligence (AI), particularly generative AI, can seem to work like magic. Its uses have already impacted, if not directly taken over, many parts of our day-to-day lives. For many people, it’s their accountant, their assistant, their therapist. To its regular users, generative AI seems to offer instantaneous solutions to their most complex problems, often passably and usually for free.
However, this comes at an immense cost. The illusion of ubiquity and effortlessness conceals an economic model built on environmental and human exploitation. The massive data centres built to house these models run on immense volumes of non-renewable energy and drain entire communities’ drinking water. And beyond the planet’s resources, generative AI is powered by human labour. It is not magic but workers, predominantly from the Global South, who power this technology.
Who are data labourers?
Dr Milagros Miceli, a sociologist and computer scientist at the Weizenbaum Institute, researches the human labour behind AI. As she recently pointed out in an interview with Netzpolitik, “Before image recognition can identify a photo of a cat, for example, humans have to label numerous images of cats.” Before each and every task executed by AI, immense datasets must be curated, cleaned, and labelled by human hands. “There is no AI without the work that goes into data collection, cleaning, and commenting, and without algorithmic review.” Magic? Maybe not. Human effort is essential to generating the industry’s profits. But the model explicitly demands that such effort “be available and cheap.”
The number of data workers globally has grown exponentially in recent years and continues to rise rapidly. The World Bank estimates this workforce, which is responsible for curating and labelling the vast datasets needed to train AI, to be between 154 million and 435 million people worldwide. These data labourers toil tirelessly for global platforms like Meta, Scale AI and the unbelievably named Amazon Mechanical Turk. The latter, which provides large numbers of workers available worldwide at any time and at low prices, is named after the 18th-century “Mechanical Turk”: a famous chess-playing “automaton” that secretly concealed a human operator. Like its namesake, the platform hides human labour behind the pretence of an automated system. Wonders never cease, it seems.
While the highest concentration of data workers in a single nation is still found in the United States, the overwhelming majority of this labour force resides in the Global South. Significant populations of data workers are located in countries like Brazil, India, Kenya, the Philippines, Syria and Venezuela. Many of these people are internally or externally displaced due to war or economic strain.
The human cost of our AI obsession
Their labour may take place on a computer, instead of the brutal cobalt mines we’re accustomed to associating with the production of our digital goods (and then conveniently forgetting). But it’s far from easy work. Richard Mathenge, a Kenyan worker who labelled text passages for OpenAI’s safety systems, spent gruelling nine-hour shifts reviewing graphic descriptions of “child sexual abuse, torture, and murder”.
Workers are tracked “to the second”. Extreme productivity pressure, combined with the often distressing nature of the work, results in—abracadabra—severe psychological issues. Workers are developing PTSD and lasting trauma from constant exposure to graphic content. Yet, the corporate structures they toil for offer at best minimal support. As one worker, Nathan Nkunzimana, testified to CBS News, “It was proven by a psychiatrist that… we are all sick, thoroughly sick.” However, Dr Miceli has “seen cases where workers have not sought psychological or legal help. They were told that doing so would violate the non-disclosure agreements they had signed with their employers.”
You don’t pay wages to magic, right?
And what is their compensation for such gruelling work? According to Dr Miceli, “Most data workers…are not paid for their time, but only for completed tasks. They usually receive meagre hourly wages of [up to] just $2 in Kenya or $1.70 in Argentina.” Their counterparts in the US, meanwhile, start at $18 an hour.
To compound this issue, a huge portion of these meagre wages doesn’t even end up in the pockets of data labourers. Uchechukwu Ajuzieogu, founder of Aylgorith, an investigative publication covering AI economics, has conducted extensive research into AI data labour. We spoke to Ajuzieogu directly about what he calls this “infrastructure apartheid”. In his investigation, The Hidden Cost of AI: How Africa Fuels Global AI While Being Left Behind, he found that “Kenyan data workers training ChatGPT earn $2/hour while paying $40-60/month for reliable internet connections. That’s 20-30 hours of labour just to access the infrastructure that lets them work.”
This infuriating injustice is obvious to everyone, workers included. “For too long we, the workers powering the AI revolution, [have been] treated as different and less than,” said Mathenge, a former ChatGPT content moderator. These workers, so essential to the AI that has rapidly soaked into the fabric of our daily lives, are not only underpaid and underappreciated; their treatment also sets, according to Ajuzieogu, “precedents for a new form of technological colonialism that could entrench global inequalities for generations.”
Fighting for justice in the face of the impossible
Like the workers who toiled in harsh conditions under industrial colonialism, data labourers face truly bleak working conditions. And structural dependencies leave them with no choice but to accept those conditions. Ajuzieogu explained to RESET that “With millions of people desperate for any income, speaking out means instant replacement. I documented cases where workers who attempted to organise were blacklisted across all platforms within 48 hours.”
The companies in charge intentionally obscure responsibility in insidious ways. “When Kenyan workers sued over working conditions, companies argued the employment relationship existed in California, not Kenya. When Californians sued, companies claimed the work happened in Kenya.” So, even after making the brave decision to speak out against these giant corporations, workers end up with not only no employment, but also “in legal limbo with no jurisdiction to hear their claims.”
But that hasn’t stopped workers from trying to fight for justice. In a landmark moment, Africa’s first Content Moderators Union formed in May 2023. It represented 150 workers across six countries and pursued legal action for better standards. This collective bargaining offers a potential route to accountability and, with the $1.6 billion in compensation it is seeking, hopefully to economic parity, too.
That said, we must be wary of painting this as a narrative of victimhood. Many in the Global South are leading the charge for a fairer digital world. Alongside labour efforts, a movement for sovereign, African-led AI is emerging and hoping to break dependence on Western platforms. Entrepreneurs are building alternative solutions that aim to retain value and relevance locally. InstaDeep, for example, founded in Tunis with minimal seed capital, achieved a $682 million acquisition by BioNTech while maintaining African operations. We’ve reported in depth about how AI can be made more sustainable, and there are several things to be hopeful about here.
The workers are speaking out
Meanwhile, moves are being made in the West to understand and highlight the plight of data labourers. The Data Workers’ Inquiry, for example, aims to amplify the voices of workers and their political demands. Co-initiated by Dr Miceli, it invites data workers from countries ranging from Syria to Germany to take the lead in sharing their stories. These stories are published in accessible formats such as podcasts, documentaries, animations, comics, zines and essays to reach as broad an audience as possible.
Data worker Bothwokwa Ranta created a zine about African women in content moderation for the Data Workers’ Inquiry: “It was not easy. I share a sisterhood with some of the women I wrote about, which was key for them to trust me with their stories. Having been depressed and in a dark place myself, working on this project has been a tough yet important step in my healing process.”
Ajuzieogu told RESET that the increased coverage of the topic gave him a reason to be optimistic. “Five years ago, nobody knew [data workers] existed. Now, investigations like mine (and yours) are making the human cost of AI visible. Visibility precedes accountability.” He’s cautiously optimistic that change is possible. “I’ve watched Kenyan content moderators build mutual aid networks, Filipino annotators create skill-sharing collectives, and Nigerian engineers develop tools to help workers negotiate better rates. Resistance is happening, even under extreme power imbalances.”
However, time is of the essence. AI companies would replace data workers with a magic wand the second they could, according to Ajuzieogu. Rather than improving working conditions, paying fair wages and following legal and moral procedures, “they’re interested in eliminating workers entirely.” He’s concerned that, at the rate we’re going, “the window for reform might close before meaningful change happens.”
Fighting some of the largest, richest companies in the world for basic fairness is not an easy task. But, as Ajuzieogu puts it, “sunlight remains the best disinfectant. Every story that reveals how these systems actually work makes it harder for companies to claim ignorance or inevitability. We just need to force the conversation before the window closes.”
How can we ensure a green digital future?
Growing e-waste, carbon emissions from AI, data centre water usage—is rampant digitalisation compatible with a healthy planet? Our latest project explores how digital tools and services can be developed with sustainability in mind.