The rapid expansion of the artificial intelligence (AI) industry could have significant environmental consequences in the short term, according to a recent study by Alex de Vries, a doctoral candidate at the VU Amsterdam School of Business and Economics. According to the report, published in the journal Joule, the AI industry could consume as much energy as a country the size of the Netherlands, Ireland or Sweden by 2027.

The study points out that the mass adoption of AI by large technology companies, spurred by the success of ChatGPT and similar language models, is driving a significant increase in energy consumption. AI not only requires more powerful hardware than traditional computing tasks, but also data centers full of specialized computers that draw substantial amounts of electricity. Yet companies rarely quantify their specific energy or water consumption, which underscores the need for greater transparency in the sector.

The environmental impact of AI could be even more significant than previously thought, and the numbers are striking. De Vries estimates that Nvidia, the leading maker of AI chips, could supply around 95% of the hardware the industry needs by 2027. Combined with growing demand, this could push AI's electricity consumption to between 85 and 134 terawatt-hours (TWh) per year, roughly what a country like the Netherlands consumes annually.
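A quick back-of-the-envelope check puts that range in context. The Netherlands' annual electricity consumption of roughly 110 TWh is an approximate public figure used here as a benchmark, not a number from the study:

```python
# Compare De Vries' projected AI electricity range (85-134 TWh/year)
# with a country-scale benchmark.
ai_low_twh, ai_high_twh = 85, 134
netherlands_twh = 110  # assumed approximate annual consumption of the Netherlands

midpoint = (ai_low_twh + ai_high_twh) / 2  # 109.5 TWh/year
share = midpoint / netherlands_twh         # close to 100%

print(f"Projected AI midpoint: {midpoint} TWh/year")
print(f"Share of Netherlands-scale demand: {share:.0%}")
```

The midpoint of the projected range lands almost exactly on the benchmark, which is why the country comparison is apt.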

More and more technology companies are investing enormous resources in building data centers and specialized hardware to support this technology, driving greater use of graphics processing units (GPUs) to run the powerful algorithms behind AI. De Vries' report notes that GPT-3, the model underlying ChatGPT, consumed about 1,287 MWh of electricity during its training phase.

The problem is not limited to training: in operation, these AI systems respond to approximately 195 million requests per day, which translates to an average consumption of 564 MWh per day. Factoring in the large amounts of water the systems need for cooling, the situation is worrying. De Vries' research also highlights the lack of transparency from large technology companies about their real energy consumption.
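The article's two operational figures imply a per-request energy cost and an annualized total; a minimal sketch of that arithmetic, using only the numbers quoted above:

```python
# Per-request energy and annualized total implied by the article's
# figures: 195 million requests/day drawing 564 MWh/day.
requests_per_day = 195e6
daily_mwh = 564

wh_per_request = daily_mwh * 1e6 / requests_per_day  # MWh -> Wh
annual_gwh = daily_mwh * 365 / 1000                  # MWh/day -> GWh/year

print(f"~{wh_per_request:.1f} Wh per request")  # ~2.9 Wh
print(f"~{annual_gwh:.0f} GWh per year")        # ~206 GWh
```

At roughly 3 Wh per request, inference alone already amounts to hundreds of GWh per year, which is why the problem extends well beyond training.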

As big tech companies like Microsoft and Google continue to invest in the development and expansion of AI, the environmental impact becomes even more evident. Microsoft revealed in its latest sustainability report that its water consumption increased by 34% between 2021 and 2022, reaching 6.4 million cubic meters, the equivalent of roughly 2,500 Olympic swimming pools.
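The swimming-pool comparison can be verified directly. An Olympic pool holds about 2,500 cubic meters (50 m × 25 m × 2 m), a standard approximation not taken from the report:

```python
# Sanity check on Microsoft's reported water use (6.4 million m^3)
# against the ~2,500-Olympic-pool comparison.
water_m3 = 6.4e6
pool_m3 = 50 * 25 * 2  # 2,500 m^3, assumed standard Olympic pool volume

pools = water_m3 / pool_m3
prior_year_m3 = water_m3 / 1.34  # implied 2021 baseline before the 34% rise

print(f"~{pools:,.0f} Olympic pools")          # ~2,560 pools
print(f"Implied 2021 use: ~{prior_year_m3/1e6:.1f} million m^3")
```

The division yields about 2,560 pools, consistent with the "2,500 Olympic swimming pools" figure cited in the report.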

This situation raises questions about the cost of using this technology in everyday life. Alphabet chairman John Hennessy has said that interacting with a large language model could cost up to ten times more than a standard keyword search. These figures underscore the need to consider not only the efficiency of AI, but also whether its use is truly necessary and whether the benefits outweigh the costs.

Some argue that AI could help solve environmental challenges, such as reducing the formation of aircraft vapor trails or accelerating research into nuclear fusion energy, although this does not obviate the need for responsible use of the technology.

De Vries raises a fundamental question: how much AI do we really need and when will its energy consumption exceed its benefits? The expansion of this technology is rapidly transforming the industry, but a more balanced approach is needed to ensure it does not become a threat to the environment.