• The growing demand for hosting capacity for advanced AI processing is now a major environmental issue, notably with regard to power and water resources.
• In response to this challenge, players in the sector will need to optimize the efficiency of cooling technologies and develop smart grids and smart energy storage systems.
According to the International Energy Agency, soaring demand for artificial intelligence has driven a massive upsurge in energy use at data centers, which is increasingly the cause of significant environmental problems. A key factor is the surge in requests to tools like ChatGPT, each of which requires ten times more power to process than a simple Google search. As the head of corporate green strategy for data centers at Orange, Guillaume Gérard, explains, “the large-scale deployment of artificial intelligence has led to a phenomenal increase in the need for processing capacity and the power requirements of IT infrastructure required to support it.” Worse still, this problem will likely be compounded by rapid development in this fast-growing sector. According to a recent McKinsey report, global demand for AI-dedicated data centers is set to increase by 20% per year, in tandem with demand for data center hosting capacity overall, which is expected to rise at a rate of between 19% and 22% per year to reach 171 to 219 gigawatts (GW) by 2030. About 70% of this demand will be for data centers equipped to host advanced-AI workloads.
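The McKinsey range can be checked with a back-of-envelope compound-growth calculation. The sketch below is purely illustrative: it assumes annual compounding over seven years (a 2023 base year is an assumption, not something the report excerpt states) and backs out the implied starting demand from the cited 2030 endpoints.

```python
def project(base_gw, annual_growth, years):
    """Compound data center hosting demand forward by `years` at a fixed annual rate."""
    return base_gw * (1 + annual_growth) ** years

# Back out the implied base-year demand from the article's 2030 endpoints
# (assumption: seven years of compounding, i.e. a 2023 base year).
low_base = 171 / 1.19 ** 7    # roughly 51 GW implied at 19% per year
high_base = 219 / 1.22 ** 7   # roughly 54 GW implied at 22% per year

# Projecting those bases forward recovers the cited 171-219 GW range.
print(round(project(low_base, 0.19, 7)))   # 171
print(round(project(high_base, 0.22, 7)))  # 219
```

The point of the exercise is that a three-percentage-point difference in the growth rate (19% versus 22%) compounds into a gap of nearly 50 GW by 2030, which is why the projected range is so wide.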
The massive use of AI could result in a scarcity of energy and water resources
Unprecedented demand
Demand on this scale has also had a game-changing impact on energy industry players. “Among other phenomena, we are now seeing ‘green’ energy suppliers who opt to sell all the energy from one production facility to a single AI-dedicated data center,” points out Guillaume Gérard. The Orange specialist further adds that “the presence of water-cooled data centers is increasingly linked to local water shortages. The implication is that the massive use of AI can result in a scarcity of energy and water resources, which will not only contribute to more legal reporting obligations, but also to growing public pressure to halt the construction of AI infrastructure.”
This scenario recently emerged in Ireland, where the national Commission for the Regulation of Utilities (CRU) imposed a moratorium on the connection of new data centers in certain regions in 2023 — a measure largely prompted by the fact that data centers already account for 20% of the electricity used in the country. With the massive integration of AI in this infrastructure, there is now a risk of unpredictable spikes in the demand for power that could put a huge strain on local grids. In major economies such as the United States, China and the European Union, data centers account for around 2% to 4% of total electricity consumption. However, their local impact can be more pronounced, which is notably the case in five American states, where they are responsible for more than 10% of electricity consumption.
Smarter power management and storage
The rollout of AI involves the use of high-power-density equipment. “Current AI equipment that takes up less space than a large server consumes four times as much power. The cooling systems required for this kind of equipment are complex because standard technologies like liquid direct-to-chip (D2C) and immersion cooling need to be specially adapted to ensure its efficient operation,” points out Guillaume Gérard. Scientists are recommending that smart grids and smart energy storage systems be used to alleviate the pressure that AI puts on infrastructure, and work on solutions of this kind is underway. Earlier this year, an Uptime Intelligence survey reported that AI is now driving innovation and change in power distribution, cooling, and IT workload management, and predicted that we may soon see data centers that provide or store power, and possibly even shed loads, to support grids. As Guillaume Gérard explains, “The demand for real-time interaction with AI systems will continue to weigh on energy distribution systems, but the overall impact of artificial intelligence can be mitigated by training models at off-peak times and by training them with electricity from local power generation facilities that can supply energy to the grid when they are not in use for this purpose.”
