This has only just begun. If 2023 was the year of amazement, 2024 will be the year of the massive incorporation of AI into a growing number of activities.
A few examples: the $3 billion that Novartis and Eli Lilly will pay a Google DeepMind spinoff to apply AI to drug development; Apple’s initiative to run AI directly on iPhones, instead of accessing it via the internet; the December launch of Gemini, Google’s AI built to compete with ChatGPT-4; OpenAI, for its part, plans to launch ChatGPT-5 this year…
It is not yet known how the emergence of AI will change the world. What is known, because it has been a constant throughout human history, is that every disruptive technological advance carries a risk of inequality, domination and exploitation. Recall the domestication of the horse, which gave an advantage to warriors; agriculture, which divided societies into rich and poor; writing, which divided them into educated and illiterate; the weapons and ships with which Europeans enslaved the people of Africa; or, more recently, rockets and satellites, developed in pursuit of world domination; or the digital technologies that facilitate the surveillance of citizens.
The more disruptive a technology is, the greater the risk that it will exacerbate inequalities and exploitation. With AI, the risk is extreme. But we now hold different values than the slave traders of old. We have a Universal Declaration of Human Rights, not always respected but presumably recognized by everyone, and regulatory mechanisms that have contained the harm from other dangerous technologies, such as nuclear arsenals and biological weapons.
It will not be possible to regulate AI to everyone’s liking, because those who have already begun to benefit have different priorities than those who fear being harmed. But this is no reason to let it develop under the law of the jungle. Even a suboptimal regulation, if it protects human rights without restricting technological progress, will be better than no regulation at all.