James Muldoon is a lecturer in management at the University of Essex, Mark Graham is a professor at the Oxford Internet Institute and Callum Cant is a lecturer at the University of Essex Business School. They work together on Fairwork, a project assessing working conditions in digital workplaces, and are co-authors of Feeding the Machine: The Hidden Human Work That Powers AI.
Why did you write this book?
James Muldoon: The idea for this book came from some fieldwork we did in Kenya and Uganda on the data annotation sector. We spoke to a number of data annotators and the working conditions were just appalling. We thought this was a story that everyone needed to hear. People are working for less than $2 an hour on precarious contracts, work that is mostly outsourced to the Global South because it is so difficult and dangerous.
Why East Africa?
Mark Graham: I started doing research in East Africa in 2009, actually on the first of many undersea fiber optic cables that were going to connect East Africa to the rest of the world. And the goal of my research was to find out what this new connectivity meant for the lives of workers in East Africa.
How did you gain access to these workplaces?
Mark Graham: At Fairwork, the basic idea is to set decent work principles and rate companies against those principles. We give them a score out of 10. That's how companies in Nairobi and Uganda came to us: we were going to give them a score, and they wanted a better one. We gave them a zero out of 10 and said, "Look, there's work to be done to improve."
And are companies responsive? Do they challenge your low ratings?
Mark Graham: There are a range of responses. Some people say that what we’re asking them to do is just not possible. They say things like, “It’s not our responsibility to do these things.” The beauty of scores is that we can point to other companies that are doing it. We can say, “Look, this company is doing this. What’s wrong with you? Why can’t you provide this to your employees?”
Can you talk about the echoes of colonialism that you found in this data work?
Mark Graham: The old East African railway line ran from Uganda to the port of Mombasa. It was funded by the British government and was essentially used to extract resources from East Africa. What's interesting about fiber optic connectivity in East Africa is that it follows a very similar route to the old railway line, and it's also an extractive technology.
Could you explain your concept of “extraction machine”?
Callum Cant: When we see an AI product, we tend to think of it as having been created relatively spontaneously and we don’t think about the human work, the resource requirements and everything that goes on behind the scenes.
For us, the extraction machine is a metaphor that allows us to think more about who has put work, resources, energy, and time into this process. The book is an attempt to move from the superficial appearance of a clean web page or images of neural networks to a real analysis of the embodied reality of the workplace: what does AI look like, and how does it interact with people?
James Muldoon: I think a lot of people would be surprised to learn that 80% of the work behind AI products is actually data annotation, not machine learning engineering. And if you look at an autonomous vehicle, one hour of video data requires 800 human hours of data annotation. So that’s an incredibly intensive form of work.
How does this concept differ from Shoshana Zuboff’s idea of surveillance capitalism?
James Muldoon: Surveillance capitalism is a great description of companies like Google and Facebook that make money primarily through targeted advertising. It’s an apt description of a data-to-advertising pipeline, but it doesn’t really capture the broader infrastructural role that big tech now plays. The extraction machine is an idea we developed to talk more broadly about how big tech feeds off the physical and intellectual labor of human beings, whether they’re Amazon employees, creatives, data annotators, or content moderators. It’s actually a much more visceral, political, and global concept that shows how all of our labor is being exploited and extracted by these companies.
Concerns about AI often focus on existential risks, or how the technology can reinforce inequalities and biases in the data it’s trained on. But you’re arguing that simply introducing AI into the economy creates a whole new set of inequalities?
Callum Cant: We can see this very clearly in a workplace like Amazon. Amazon's AI system, the technology that organizes its supply chain, has automated the decision-making, and what humans are left to do in an Amazon warehouse is a brutal, repetitive, high-stress work process. You end up with technology that is supposed to automate menial work and create freedom and time, but in fact people are forced to do more routine, boring, lower-skilled tasks because of the inclusion of algorithmic management systems in their workplace.
In one chapter of the book, you talk about Chloe, an Irish actress, who discovered that someone was using an AI-generated copy of her voice. It’s similar to the recent conflict between Scarlett Johansson and OpenAI. She has a platform and the financial means to challenge this situation, which most people don’t.
Callum Cant: Most of the solutions aren’t really individual, they’re about collective power. Because, like everyone else, we don’t have the ability to tell OpenAI what to do. They don’t care whether some authors think they’re running an extractive regime that’s taking information. These companies are funded by billions and billions of pounds of capital and they don’t really need to care what we think of them.
But collectively, we’ve identified a number of ways in which we could turn back the clock and start trying to transform the way this technology is deployed. I think we all recognize that there’s emancipatory potential here, but to get there is going to take a lot of collective work and conflict in a lot of places, because there are people who are getting immensely rich off of this and there are decisions made by a very, very small handful of people in Silicon Valley that are making our lives worse. And until we force them to change the way they do things, I don’t think we’re going to get a better form of technology.
What would you say to readers? What action could they take?
Callum Cant: People are all in such different situations that it's hard to give one universal piece of advice. If someone works in an Amazon warehouse, then organize your colleagues and use that influence with your boss. If someone works as a voice actor, then you need to organize with other voice actors. Everyone is going to have to respond to this on their own terms, and it's impossible to give a single prescription.
We are all customers of big tech companies. Should we, for example, boycott Amazon?
Callum Cant: I think organizing at work is more effective, but organizing as consumers also has a role to play. If there are clear differences between companies and opportunities to use your consumption effectively, then by all means, especially if the workers involved are calling for it. If Amazon workers call for a boycott, for example on Black Friday, we encourage people to listen to that. Absolutely. But any action people take, wherever they are, should be guided by a set of principles, and the main one is that collective action is the way forward.