That society advances scientifically and socially while the regulation of those advances lags behind is an indisputable truth of our times. Reality is quicker to accept and adapt to novelty than our legislation is to regulate it. This has happened on many occasions and it is happening again now, with the spread of digitization, supported by new technologies, into practically every area of our lives, and with an aggravating circumstance: the change has been extraordinarily accelerated, driven ultimately by a pandemic that forced almost all of us to turn to that digital reality to manage our daily lives. Work, household management, leisure, education and our personal relationships found an open space in digital connections when practically everything else was closed by government order.
The result has been a very rapid acceptance of new technological solutions without the time, or to a large extent the desire, to give much thought to their ethical dimension. The ease of use and the many solutions provided by the new applications have led to a very high level of almost blind acceptance. Almost, because with each new leap forward in technology (as has happened now with ChatGPT, the most innovative of artificial intelligence applications) critical voices are raised, demanding that we be aware of the extent to which we allow networks to intrude on our privacy, or blur it.
Critics argue that in our capitalist society nothing is free. The networks provide us with daily information without our having to pay the price of the traditional newspaper, but we do pay. As long as we are part of an economic structure based on consumption, our profiles are gold for marketing, and, like Tom Thumb, in our walks through the networks we leave a trail of who we are, what we buy, what we think, what our aspirations are… DOMO, a cloud software firm, estimates that each person generates 1.7 MB of data per second, more than enough material for brands to draw up their strategies, with increasingly personalized campaigns aimed ever more precisely at each target.
But the interest we generate is not limited to our capacity to consume. Brexit, and the associated scandal involving Cambridge Analytica (CA) and Facebook, made it clear that our value is twofold: as consumers and as voters. Our trail also defines us as citizens who may, or may not, support political proposals. More gold for the managers of the data that we gladly dump onto the network every day. Christopher Wylie, former director of CA, claimed that if the data from the social network had not been used to plan the electoral campaign, the United Kingdom would not have voted to leave the European Union.
The data generated by our movements in the digital world (what we see, what we search for, what spaces we visit, whom we follow) turns us into an open book for those who can access the information produced, or pay for it. Our portrait as digital beings is thus fertile ground for an increasingly polarized society, in which each of us listens to, watches and reads only what matches our own discourse (and what our social networks suggest), often without hearing other opinions, points of view, spaces and ways of living. Because if we let the algorithm decide for us, we enter a self-reinforcing loop in which the critical spirit has little room, and that undermines the very essence of democracy.
A critical spirit called for by Idoia Salazar, founder and president of the Observatory for the Social and Ethical Impact of Artificial Intelligence (OdiseIA), when she took part last September in the debate “The ethics of algorithms”, organized by the Social Observatory of the Fundació ‘la Caixa’ at CaixaForum Macaya: “I am aware that we are all going very fast and that we are taking the easiest path, but we need more awareness (…), a critical spirit that will be important in the world we are heading toward”. To the question of whether algorithms have a soul, Salazar’s response was emphatic: “It is only software, a program to which human qualities cannot be attributed. Ethics is a human ability to discern between good and evil, so the one who must be ethical is the developer of the algorithms. It is the data scientist’s job to ensure that the algorithm takes diversity into account”.
A position shared by Simona Levi. This Italian based in Barcelona is a technopolitical strategist and founder of Xnet, a network of specialists and activists that proposes advanced solutions in different fields related to digital rights and the updating of democracy for the 21st century. For her, “the Internet is not the cause of the misuse of its possibilities.” She adds that “neither the algorithm nor the technology is responsible. The responsibility belongs to those who program them”. In her opinion, if an algorithm is programmed to polarize, because divergent positions serve certain political parties or because hate is addictive and keeps people on digital platforms, and therefore sustains their profits, then the problem lies in algorithms designed to favor those who benefit from them.
Examples of the current mismanagement of the digital space include the drift of Twitter, which has become a battlefield of divergent opinions, and the content offered by Google Discover, a robot that aggregates news content based on user preferences. But it is a robot, and its choices respond to premises defined by its managers. Quality and veracity of content do not seem to be among them, so the result is a strange mix in which fake news has also found a place. Herein lies the paradox: just when technology manages to create a channel to democratize knowledge and facilitate the flow of information, we run the risk of it becoming, in part, a loudspeaker for those who make the most noise. However, to return to the opinion of Xnet founder Simona Levi, “it is entirely possible to legislate so that human rights prevail over private interests; yet two circumstances combine to prevent it: on the one hand, the legislator places itself at a disadvantage by limiting those rights, and on the other, that legislator is often too sensitive to the corporate interests of those who benefit from the spurious use of these algorithms”.
Nuccio Ordine, the Italian philosopher and winner of the 2023 Princess of Asturias Award for Communication and Humanities, spoke along the same lines. In statements to the media when the award was announced, on May 4, Ordine said: “For me, technology is like a pharmakon, a term that in Greek means both remedy and poison. Technology can give life, but at the same time it can kill; it all depends on the dose. That is why we must be very alert in this new environment”.
The present has shown that regulating this space, and educating consumers to maintain a critical spirit, is already essential to prevent the information industry, key to a society's freedom, from being treated as just another consumer good, and our data as material to be sold to the highest bidder. It will not be an easy task. Europe, once again, has the pending task of establishing itself as a guarantor of our democratic values, even though doing so may further reduce its chances in a world that has become a single large digital market that knows no borders. Reports that TikTok differentiates the more educational content it broadcasts in China from what it offers in the West, with the intention of dulling the new generations in the rest of the world, would be, if confirmed, very worrying.