The balance between the costs and benefits of artificial intelligence is difficult to assess. If we ask ourselves whether there is a price for finding a cure for a certain cancer, we are probably willing to pay a high one; but if we conclude that much of the primary use of AI will be creating fake images, we may not want to foot a large bill. The energy cost of AI is high and has the potential to become much higher over time, although the few studies carried out so far offer some context that tempers the image of an energy black hole devouring all the electricity within its reach.
It has been just over a year since we learned of the existence of ChatGPT, which today has more than 100 million weekly users. Artificial intelligence, a term coined in 1956, has only recently become popular. One of the best-known reports on its energy consumption was published in October by Alex de Vries, a researcher at the School of Business and Economics at the Free University of Amsterdam, who estimates that the electricity consumption of AI systems worldwide could reach 15 gigawatts of continuous demand. His calculation is based on sales and forecasts of Nvidia chips for artificial intelligence servers, a market in which the company accounts for about 90% of world production.
That estimated consumption represents 6% of what all of Spain consumed in 2022. De Vries points out that meeting his projection would require the entire output of about 15 medium-sized nuclear power plants running around the clock to power AI systems. In three years, the researcher estimates, the figure could increase by 50%. All this leaves out the cost of cooling the chips in data centers, for which he has no reliable figure; his rough guess is that it would add between 10% and 100% to the amount consumed.
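As a rough sanity check, these figures can be combined with simple arithmetic (the 8,760 hours per year and the idea of treating the cooling range as a multiplier are our assumptions, not De Vries's):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours, ignoring leap years

continuous_demand_gw = 15  # De Vries's projected continuous AI demand

# Continuous power sustained for a year, converted from GWh to TWh
annual_twh = continuous_demand_gw * HOURS_PER_YEAR / 1000
print(f"Annual energy: {annual_twh:.1f} TWh")  # 131.4 TWh

# Cooling overhead is uncertain: De Vries suggests adding 10% to 100%
low, high = annual_twh * 1.10, annual_twh * 2.00
print(f"With cooling: {low:.0f}-{high:.0f} TWh")  # 145-263 TWh

# His estimated 50% growth over three years, before cooling
print(f"In three years: {annual_twh * 1.5:.0f} TWh")  # 197 TWh
```

The exercise mainly shows how wide the uncertainty band is: the unknown cooling overhead alone can nearly double the headline figure.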
Calculating the energy demand of a technology like this is much more complicated than it seems, because many factors are beyond the control of those who study it. Sasha Luccioni is a researcher at the AI startup Hugging Face, the company behind the large Bloom language model, and she has created a free software method to measure the environmental impact of these systems.
Using it, she calculated that Bloom's training produced 25 metric tons of CO2, a figure that doubled once the manufacturing of the computer equipment needed to run it was taken into account. Those 50 tons of CO2 are equivalent to about 60 flights between London and New York, yet still far less than what is emitted to train other AI models, because Bloom was powered by electricity generated in France by nuclear power plants, which do not emit carbon dioxide. De Vries observes that, unlike internet searches, which cache the results a user requests in order to serve them to another user asking for the same thing, generative AI performs fresh, complex computation for each result, which increases consumption (see the attached graph).
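The flight comparison can be reproduced from the article's own numbers (the per-flight emission factor below is implied by those numbers, not an independently sourced figure):

```python
training_tco2 = 25              # Bloom training emissions (metric tons of CO2)
total_tco2 = training_tco2 * 2  # doubled when hardware manufacturing is included
flights = 60                    # London-New York flights cited as equivalent

per_flight_tco2 = total_tco2 / flights
print(f"Implied emissions per flight: {per_flight_tco2:.2f} t CO2")  # 0.83 t
```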
The last two bars of this infographic are estimates. If Google integrated generative AI into each of its 9 billion daily searches, its energy demand could skyrocket. Under that scenario, the company SemiAnalysis estimated that Google would need 512,821 Nvidia A100 HGX servers, which would mean a daily consumption of 80 gigawatt hours and 29.2 terawatt hours annually. Another firm, New Street Research, arrived at slightly higher estimates, but in the same range.
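The SemiAnalysis figures are internally consistent, as a quick check shows (the average power draw per server is derived by us from their numbers, not stated by SemiAnalysis):

```python
servers = 512_821  # Nvidia A100 HGX servers in the SemiAnalysis scenario
daily_gwh = 80     # estimated daily consumption

# Daily consumption scaled to a year, GWh -> TWh
annual_twh = daily_gwh * 365 / 1000
print(f"Annual: {annual_twh:.1f} TWh")  # 29.2 TWh, matching the article

# Implied average power per server: GWh -> kWh, over 24 h, per server
per_server_kw = daily_gwh * 1e6 / 24 / servers
print(f"Per server: {per_server_kw:.1f} kW")  # about 6.5 kW
```

An average draw of roughly 6.5 kW is plausible for an 8-GPU A100 server, which lends the scenario's arithmetic some credibility, whatever one thinks of its premise.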
Neither Nvidia nor other manufacturers can produce that many servers in a short time. If they could manufacture them and sell them to Google, the bill would come to about $100 billion. The search engine generated $162.5 billion in revenue in 2022, so such a purchase does not seem imminent. Nvidia holds roughly 95% of the global AI server market in 2023. Running all of those servers at full capacity would consume between 5.7 and 8.9 terawatt hours of electricity, a small fraction of the 205 terawatt hours consumed each year by data centers around the world.
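Putting those last two figures side by side shows just how small that fraction is today (the percentages are our own division of the article's numbers):

```python
ai_twh_low, ai_twh_high = 5.7, 8.9  # all current Nvidia AI servers at full capacity
datacenters_twh = 205               # annual global data-center consumption

share_low = ai_twh_low / datacenters_twh
share_high = ai_twh_high / datacenters_twh
print(f"AI share of data-center electricity: {share_low:.1%}-{share_high:.1%}")
# roughly 2.8% to 4.3%
```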
Researchers warn that the large language models behind generative AI are being built with ever more parameters. That is, they are larger and demand more computation, which will mean higher energy consumption unless more efficient chips, or software that better optimizes the energy bill, offset the growth.
It is not clear whether most of our future use of AI will run on servers over the internet or on the devices themselves, such as mobile phones or computers. Google and Microsoft are betting on these reduced, on-device versions, which, besides saving energy on servers, preserve more privacy while still learning the user's preferences. The world will need to optimize AI's consumption to take advantage of its potential.