The balance between the costs and benefits of artificial intelligence is difficult to weigh. If we ask whether finding a cure for a certain cancer is worth it, we are probably willing to pay a very high price; if we conclude instead that the primary use of AI will be to create fake images, we may not want to foot a large bill. The energy cost of AI is high and could climb much higher over time, although the few studies carried out so far offer some context that softens the image of an energy black hole devouring all the electricity within reach.
It has been just over a year since we learned of the existence of ChatGPT, which today has more than 100 million weekly users. Artificial intelligence, a term coined in 1956, has only recently become popular. One of the best-known reports on its energy consumption was published in October by Alex de Vries, a researcher at the School of Business and Economics at the Free University of Amsterdam, who estimates that the electricity consumption of AI systems worldwide could reach 15 gigawatts of continuous demand. His calculation is based on sales and sales forecasts of Nvidia chips for artificial intelligence servers, a market in which the company accounts for around 90% of world production.
The estimated consumption represents 6% of what the whole of Spain consumed in 2022. De Vries points out that meeting his prediction would require the total output of about 15 medium-sized nuclear power plants dedicated around the clock to powering AI systems. In three years, the researcher estimates, the figure could grow by 50%. All of this excludes the cost of cooling the chips in data centers, for which he has no reliable figure; his guess is that cooling adds between 10% and 100% on top of the amount consumed.
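As a back-of-the-envelope check, the continuous-demand figure can be converted into annual terms. The sketch below uses only the numbers quoted above, including De Vries' 10% to 100% cooling margin:

```python
# Converting De Vries' figures to annual consumption (arithmetic only;
# the inputs are the article's numbers, not independent estimates).
continuous_demand_gw = 15        # continuous demand attributed to AI servers
hours_per_year = 365 * 24        # 8,760 hours

annual_twh = continuous_demand_gw * hours_per_year / 1000   # GWh -> TWh
print(f"~{annual_twh:.0f} TWh per year")                    # ~131 TWh

# Adding the 10%-100% cooling overhead he suggests:
print(f"with cooling: {annual_twh * 1.1:.0f}-{annual_twh * 2:.0f} TWh")  # ~145-263 TWh
```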
Calculating the energy demand of a technology like this is much harder than it seems, because many factors lie beyond the control of those who study it. Sasha Luccioni is a researcher at the AI startup Hugging Face, responsible for the BLOOM large language model, and has created an open-source method to measure the environmental impact of these systems.
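The article does not name Luccioni's tool, but the open-source codecarbon package is a representative example of this kind of measurement. A minimal sketch, where train_model() is a hypothetical placeholder for a real workload:

```python
from codecarbon import EmissionsTracker
import time

def train_model():
    # Hypothetical placeholder: a real training loop would run here.
    time.sleep(1)

tracker = EmissionsTracker()   # estimates energy use and local grid carbon intensity
tracker.start()
train_model()
emissions_kg = tracker.stop()  # returns estimated kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```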
With that methodology, she calculated that training BLOOM produced 25 metric tons of CO2, a figure that doubled once the manufacture of the computer hardware needed to run it was taken into account. Those 50 tons of CO2 are equivalent to about 60 flights between London and New York, yet they are still far less than what is emitted to train other AI models, because BLOOM was powered by electricity generated in France by nuclear power plants, which do not emit carbon dioxide.
De Vries observes that, unlike internet searches, which cache the results of a user's query so they can be served again to anyone who asks the same thing, generative AI performs new, complex work for every result, which increases consumption (see the attached graph).
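A toy sketch of the distinction, under a deliberately simplified model in which caching is the only difference between the two services:

```python
from functools import lru_cache

def expensive_index_lookup(query: str) -> str:
    # Stand-in for the costly part of classic search (indexing, ranking).
    return f"results for {query!r}"

@lru_cache(maxsize=None)
def cached_search(query: str) -> str:
    # Repeated identical queries hit the cache: the work is done once.
    return expensive_index_lookup(query)

def generative_answer(query: str) -> str:
    # No cache: every call is a fresh, compute-heavy model pass.
    return f"generated answer for {query!r}"  # stand-in for an LLM inference
```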
The last two bars of that infographic are estimates. If Google integrated generative AI into each of its 9 billion daily searches, its energy demand could soar. Under that scenario, the firm SemiAnalysis estimated that Google would need 512,821 Nvidia A100 HGX servers, implying a consumption of 80 gigawatt-hours per day and 29.2 terawatt-hours per year. Another firm, New Street Research, arrived at slightly higher estimates, but in the same range.
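The daily and annual figures are mutually consistent, and they also imply an energy cost per search, as a quick check with the quoted numbers shows:

```python
# Sanity check of SemiAnalysis' estimates as quoted in the article.
daily_gwh = 80
searches_per_day = 9e9

wh_per_search = daily_gwh * 1e9 / searches_per_day   # GWh -> Wh
print(f"~{wh_per_search:.1f} Wh per search")         # ~8.9 Wh each

annual_twh = daily_gwh * 365 / 1000                  # GWh/day -> TWh/year
print(f"~{annual_twh:.1f} TWh per year")             # ~29.2 TWh, matching the article
```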
Neither Nvidia nor other manufacturers can produce that many servers in a short time. Even if they could build them and sell them to Google, the bill would come to around $100 billion. The search engine generated $162.5 billion in revenue in 2022, so such a purchase does not look imminent.
Nvidia holds a 95% share of the global AI server market in 2023. Running all of those servers at full capacity would consume between 5.7 and 8.9 terawatt-hours of electricity a year, a small fraction of the 205 terawatt-hours that data centers around the world consume annually.
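Set against the article's own data-center total, that works out to a share of a few percent:

```python
# AI servers as a fraction of worldwide data-center electricity use,
# using only the figures quoted above.
ai_low, ai_high = 5.7, 8.9   # TWh per year, all AI servers at full load
datacenters_total = 205      # TWh per year, data centers worldwide

print(f"{ai_low / datacenters_total:.1%} to {ai_high / datacenters_total:.1%}")  # ~2.8% to 4.3%
```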
Researchers warn that the large language models behind generative AI are being built with ever more parameters. In other words, they are larger and demand more computing, which will translate into greater energy consumption unless more efficient chips, or software that better optimizes the energy bill, are deployed.
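To see why size matters, one can use the common heuristic (not from the article) that training a model with N parameters on D tokens costs roughly 6·N·D floating-point operations. The hardware figures below are illustrative assumptions, not measured values:

```python
# Rough sketch: training energy grows with parameter count, under the
# ~6*N*D FLOPs heuristic. Chip throughput and power draw are assumptions.
def training_energy_mwh(params: float, tokens: float,
                        chip_flops: float = 3e14,    # assumed sustained FLOP/s per chip
                        chip_power_w: float = 500):  # assumed power draw per chip, watts
    flops = 6 * params * tokens
    chip_seconds = flops / chip_flops
    joules = chip_seconds * chip_power_w
    return joules / 3.6e9                            # joules -> MWh

# Doubling parameters (at fixed training data) roughly doubles the energy bill:
print(training_energy_mwh(70e9, 1e12))    # ~194 MWh for a 70B-parameter model
print(training_energy_mwh(140e9, 1e12))   # twice the parameters -> ~2x the energy
```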
It is not clear whether most of our future use of AI will run on servers over the internet or on the devices themselves, such as mobile phones and computers. Google and Microsoft are betting on reduced versions that, besides saving energy on the server side, also preserve more privacy while still learning the user's preferences. The world will need to optimize AI's consumption in order to take advantage of its potential.