In 1945, one in three inhabitants of the planet lived under colonial rule. Today, one in two people uses a product from Meta (owner of Facebook, Instagram and WhatsApp) and lives by its rules and regulations.
It may seem like an unfair comparison. Colonialism implemented a brutal extractive system that cost millions of lives over the span of five hundred years. Colonial subjects were forced into that system and saw their land, labor, and resources forcibly taken from them. We, on the other hand, participate willingly in platforms and networks and obtain many benefits, apparently without paying anything. So even though the world made possible by Big Tech is as vast as that of colonialism, it can give the impression of being nowhere near as violent.
However, as Nick Couldry and I argue in our book Data Grab, there is something disturbingly colonial about that world. It’s a phenomenon we call data colonialism, and we certainly do pay something when we participate in it. Where the old colonialism took over land, the new colonialism takes over us, through data: our lives are restructured so that data can be endlessly extracted from them. The objective of this system is, obviously, to generate profits for companies. But its social impact is greater than the profit margins would suggest, and nowhere is that clearer than in the field of artificial intelligence (AI).
The latest advances in AI have been made possible by neural networks that can simulate something resembling human thought, but that are in reality just sophisticated programs that excel at calculating the most probable answer to a problem. Unlike human brains, which can learn and infer from very little information, AI needs massive volumes of data to learn to do anything: the more data used to train an AI model, the more representative or accurate it will be. In other words, if you want to teach an AI to draw a cat, you need to show it thousands and thousands of images of cats.
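To make that data hunger concrete, here is a minimal sketch (my own illustration, not an example from the book) using Python and the scikit-learn library’s bundled handwritten-digit images: the same simple model, trained on progressively larger slices of the data, typically becomes more accurate.

```python
# A toy demonstration that, all else equal, a model trained on more
# examples tends to score higher on unseen data. This is an
# illustrative sketch, not a claim about any particular AI system.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)  # 1,797 small 8x8 digit images
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

for n in (50, 200, 800):  # progressively larger training sets
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])
    acc = model.score(X_test, y_test)
    print(f"trained on {n:>3} examples -> test accuracy {acc:.2f}")
```

The exact numbers will vary, but the trend is the point: appetite for training data is built into how these systems improve.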
However, those huge amounts of training data have to come from somewhere, and that’s where data colonialism becomes relevant. The development of AI involves us, the public, in three neocolonial processes: how data is extracted from us, how we become dependent on those who control the technology, and how that technology is used to control us through surveillance and prediction.
Although the old and the new colonialism differ in scale, intensity and modality, they share one essential function: extraction. The extraction of resources and labor that characterized earlier forms of colonialism has today been replaced by the extraction of data from our lives. Then as now, dispossession lies at the heart of the colonial process. One of the main ways this dispossession has unfolded is through the enclosure, or privatization, of the commons.
Before colonialism, large areas of the planet consisted of common territories and resources, managed by communities and without individual owners. The gradual enclosure of these commons in Europe pushed surplus populations to resettle in new lands. These settlers, intent on claiming those territories for themselves, saw the new lands as cheap: they were abundant, free, they had no owners (at least no civilized owners, it was believed), and there was nothing to do but take them.
The extractivism that fuels AI represents a new enclosure of the commons. Data is considered cheap just as land was considered cheap in earlier forms of colonialism: it is said to be abundant, unowned, and freely available to whoever finds it. But data is not a natural resource. We have generated it through millions of hours of collective labor. We are the ones who produced and shared all that data, and who gave permission for our activities to be tracked to generate even more of it.
As we spent our time creating data and being tracked, we felt we were contributing to a new kind of collective resource, a data commons. Because data was supposed to be a non-rival resource, it could be used endlessly without degrading or becoming scarce. If someone took a photo of a cat and posted it online, anyone could view it, or even download, reshare, or modify it, without any loss to the original.
Most of those cat photos, along with the rest of our data about who we are and what we do, ended up on corporate platforms. We were encouraged to participate in them because they were free: no one paid to use Facebook or Google. But the reason they were free was that we ourselves were the raw material generating value for those platforms. And so Big Tech extracted vast amounts of data from us, which it used, first, to build detailed marketing profiles it could sell to advertisers and, more recently, to train artificial intelligence systems that will make us even more dependent on its platforms.
Colonialism entailed relations of dependency between the colonizer and the colonized, relations that continue to characterize interactions between the global north and the global south today. At first, it was enough for the colonizers to occupy land and extract raw materials to ship back home and generate wealth. Think of all the minerals and natural goods that flowed from the New World to the Old. Eventually, however, colonialism came to coexist with a new system, capitalism, which invented new forms of dispossession.
Capitalists (often financed by the wealth of their colonizing ancestors) realized that once raw materials were converted into manufactured products, those products could be sold back to the colonies. Thus, the colonies paid twice for the same material: once during extraction and again as a manufactured product. As industrial technology made goods cheaper to produce, those goods could undercut local alternatives. Consider cotton, which was harvested on plantations in the global south, woven on power looms in the industrialized north, and then exported back to the colonies or former colonies, a process that wiped out local weavers and left the colonies dependent on cheap textiles, stripped of their capacity to manufacture.
Something very similar happens with our data and AI. We are told that our cheaply extracted data is essentially useless to us as individuals, which is why we should allow its extraction and processing. We are told that companies can turn this raw material into something supposedly useful, like AI models that make our lives and work easier. This is a technology-intensive process, meaning only a few companies can carry it out. Our processed data is then sold back to us as a new manufactured product, AI: a product that erodes our ability to do certain things for ourselves while making us more dependent on the companies that hold the exclusive power to manufacture it, in what amounts to a replica of colonial relations.
We are also told that AI can solve many of humanity’s most pressing problems. We expect certain applications to improve our lives by detecting diseases, creating new vaccines, or modeling solutions to environmental problems. However, it is already clear that many other applications, as well as the environmental cost of AI itself, will have a detrimental impact on our lives.
One of the most important applications of AI, then, will be to keep populations under control through surveillance and prediction. The violence and injustice unleashed by colonialism invites resistance, and that resistance must be kept in check. Along with managing extracted resources, colonialism has always sought to manage discontent through surveillance, and there is no reason to believe things will be any different with data colonialism.
We have already seen data technologies applied to the surveillance and tracking of people: surveillance cameras, facial recognition, algorithmic profiling, predictive policing… AI opens new frontiers for surveillance, creating novel mechanisms for detecting and even predicting threats to the status quo.
It is important to note that the victims of these new forms of surveillance will continue to be, disproportionately and along lines of race, class and gender, the traditional victims of colonialism. We’ve already seen examples of that kind of bias: from facial recognition algorithms that produce false positives on non-white faces (which can lead to wrongful imprisonment), to AI models that progressively drive down the wages of vulnerable workers, to systems that have produced racist outcomes when determining access to health care, housing, education or financial services, to content moderation systems that fail to protect women and non-binary people, to systems that violate the privacy of migrants and refugees.
The impact of these systems is very real, resulting in loss of privacy, loss of opportunity, loss of dignity, and in some cases, loss of life. In other words, data colonialism extends forms of racial, gender, and class violence (as well as environmental violence) that have developed in the colonial context for hundreds of years.
Resisting AI’s colonial modes of extraction, dependency, and control might seem impossible, as impossible as dismantling the legacy of colonialism itself might seem. But wherever there has been colonialism, there has been resistance, and we have a rich history of decolonial movements to draw inspiration from. If it was not always possible to resist colonialism with the body, it could always be resisted with the mind, and it is essential that we find collective and creative ways of imagining alternative futures beyond data colonialism.
Ulises A. Mejías is a professor of Communication Studies at the State University of New York (SUNY). His next book (with Nick Couldry) is ‘Data Grab: The new colonialism of big tech and how to fight back’. He is co-founder of Tierra Común (tierracomun.net), a network of activists, educators and academics for the decolonization of data, and is on the board of directors of Humanities New York.