AI is increasing energy consumption in data centers, putting strain on the grid and hindering sustainability efforts.

The rise of AI has driven a surge in big tech companies' energy consumption and carbon emissions, fueled by the success of large language models like ChatGPT.

July 13, 2024

Boston: The rise of artificial intelligence has had a monumental impact on major tech companies, driving a significant increase in their energy consumption and carbon emissions. Much of this can be attributed to the success of large language models like ChatGPT, which have fueled demand for ever more computing. According to the Electric Power Research Institute, processing a single ChatGPT request takes 2.9 watt-hours of electricity, roughly 10 times as much as a traditional Google query. As AI advances into capabilities such as audio and video generation, the demand for energy is expected to grow even further.
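
To put that figure in perspective, here is a back-of-the-envelope calculation in Python. The 2.9 Wh per request comes from the EPRI estimate above; the daily request volume is purely hypothetical, chosen only to show the scale involved.

```python
# Back-of-the-envelope energy comparison based on the EPRI figure of 2.9 Wh per
# ChatGPT request (~10x a traditional Google query). The daily request volume is
# a hypothetical assumption for illustration, not a reported number.

WH_PER_CHATGPT_REQUEST = 2.9                          # watt-hours (EPRI estimate)
WH_PER_GOOGLE_QUERY = WH_PER_CHATGPT_REQUEST / 10     # ~0.29 Wh, implied by the 10x ratio

HYPOTHETICAL_REQUESTS_PER_DAY = 100_000_000           # assumed volume, illustration only

chatgpt_mwh = WH_PER_CHATGPT_REQUEST * HYPOTHETICAL_REQUESTS_PER_DAY / 1_000_000
google_mwh = WH_PER_GOOGLE_QUERY * HYPOTHETICAL_REQUESTS_PER_DAY / 1_000_000
print(f"ChatGPT requests: ~{chatgpt_mwh:.0f} MWh/day")   # ~290 MWh/day
print(f"Google queries:   ~{google_mwh:.0f} MWh/day")    # ~29 MWh/day
```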

This surge in energy needs has caused energy companies to rethink their approach. In some cases, they are considering options previously deemed unviable, such as restarting a nuclear reactor at Three Mile Island, the plant best known for its 1979 accident. Data centers have been growing steadily for decades, but their expansion in the era of large language models has been unprecedented: AI requires enormous computational power and data storage, far outpacing the pre-AI rate of data center growth.

This increase in energy demand is also putting pressure on the electrical grid, which in many places is already nearing capacity and prone to stability issues. Compounding the problem, there is a considerable lag between the growth of computing and the expansion of the grid: a data center can be built in one to two years, while adding new power to the grid can take more than four.

A recent report from the Electric Power Research Institute finds that 80% of US data centers are concentrated in just 15 states. In some states, such as Virginia, data centers account for more than a quarter of electricity consumption. Similar trends can be seen elsewhere in the world, including Ireland, which has seen a sharp rise in data center development.

In addition to needing more power generation to keep up with this growth, many countries also aim to decarbonize their energy supply, which means adding more renewables to the grid. However, wind and solar output is intermittent, and the lack of affordable, scalable energy storage makes it difficult to match supply with demand.

Water cooling, which data centers use for efficiency, is also straining limited freshwater supplies, and some communities are pushing back against new data center investments as a result. To address these issues, the industry is working to improve energy efficiency. Hardware has become more energy-efficient over the years, and data centers have brought their average power usage effectiveness (PUE), the ratio of total facility energy to the energy delivered to computing equipment, down to 1.5, with some advanced facilities achieving an impressive 1.2. New data centers also employ more efficient cooling methods, such as water cooling and the use of cool outside air when available.
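
For readers unfamiliar with the metric, the sketch below shows how PUE is computed; the kWh figures are invented for illustration and the function name is ours, not an industry API.

```python
# Power usage effectiveness (PUE): total facility energy divided by the energy
# delivered to IT equipment. A PUE of 1.0 would mean zero overhead for cooling,
# power conversion, lighting, and so on. All numbers below are illustrative.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE ratio for a given measurement period."""
    return total_facility_kwh / it_equipment_kwh

# A facility that draws 1,500 kWh to deliver 1,000 kWh to its servers has a PUE of
# 1.5, roughly the industry average cited above; 1,200 kWh total corresponds to the
# 1.2 achieved by the most advanced facilities.
print(pue(1_500, 1_000))  # 1.5
print(pue(1_200, 1_000))  # 1.2
```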

However, efficiency alone will not solve the sustainability problem. The Jevons paradox suggests that efficiency gains can increase total energy consumption in the long run, because cheaper computing invites more use. Hardware efficiency gains have also slowed as the industry approaches the limits of chip technology scaling. To keep improving efficiency, researchers are exploring specialized hardware, new integration technologies, and chip cooling techniques.
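
A toy calculation makes the Jevons paradox concrete. The efficiency gain and demand growth below are invented for illustration; only the 2.9 Wh starting point comes from the figures above.

```python
# Jevons paradox in miniature: per-request efficiency improves, but if demand grows
# faster than efficiency, total consumption still rises. Figures are hypothetical.

energy_per_request_wh = 2.9
requests_per_day = 1_000_000

# Suppose hardware and software gains cut energy per request in half...
improved_energy_wh = energy_per_request_wh / 2
# ...but cheaper, faster AI triples the number of requests served.
increased_requests = requests_per_day * 3

before_kwh = energy_per_request_wh * requests_per_day / 1_000
after_kwh = improved_energy_wh * increased_requests / 1_000
print(before_kwh, after_kwh)  # 2900.0 -> 4350.0 kWh/day: total use still grows
```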

There is also a focus on developing better data center cooling methods. The Electric Power Research Institute report recommends air-assisted liquid cooling and immersion cooling as potential solutions. While liquid cooling is already being used in data centers, immersion cooling is still in the development stage and has only been implemented in a few new data centers.

One potential solution to the energy crunch is flexible computing, in which data centers adjust their power consumption based on factors such as electricity availability, cost, and environmental impact. This would require innovation in hardware, software, and coordination between the grid and data centers. For AI specifically, there is room to better manage computational loads and energy consumption, for example by relaxing accuracy targets when training models in exchange for lower energy use.
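
One way to picture flexible computing is a scheduler that shifts deferrable batch jobs toward hours when grid electricity is cleaner or cheaper. The sketch below is a minimal illustration; the job, the carbon-intensity forecast, and all names are hypothetical, and a real system would pull forecasts from the grid operator and coordinate with the cluster scheduler.

```python
# Minimal sketch of carbon-aware scheduling, one way a data center could behave as a
# flexible load. All names and the forecast data are hypothetical.

from dataclasses import dataclass

@dataclass
class BatchJob:
    name: str
    duration_hours: int
    deadline_hour: int  # latest hour (0-23) by which the job must finish

# Hypothetical forecast of grid carbon intensity (gCO2/kWh) for the next 24 hours.
carbon_forecast = [450, 430, 410, 390, 380, 370, 350, 320,
                   280, 230, 190, 170, 160, 165, 180, 220,
                   300, 380, 430, 460, 470, 465, 455, 450]

def best_start_hour(job: BatchJob, forecast: list[int]) -> int:
    """Pick the start hour that minimizes average carbon intensity over the job's run."""
    candidates = range(job.deadline_hour - job.duration_hours + 1)
    return min(candidates,
               key=lambda h: sum(forecast[h:h + job.duration_hours]) / job.duration_hours)

job = BatchJob(name="nightly-model-eval", duration_hours=3, deadline_hour=20)
print(best_start_hour(job, carbon_forecast))  # 11 -> run during the midday solar peak
```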

However, implementing this vision requires accurate modeling and forecasting. Data centers need a better understanding of their loads and operating conditions, and it is crucial to predict grid load and its growth. The Electric Power Research Institute has launched a load forecasting initiative to aid grid planning and operations. Comprehensive monitoring and intelligent analytics, possibly powered by AI, will be essential for accurate forecasting.
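
As a toy illustration of what load forecasting involves, the snippet below fits a linear trend to a year of monthly load figures and extrapolates one quarter ahead. The data are invented, and real forecasting, including EPRI's initiative, relies on far richer models and telemetry.

```python
# Toy load forecast: fit a least-squares linear trend to hypothetical monthly
# data-center load and extrapolate three months ahead. Illustration only.

import numpy as np

monthly_load_mw = np.array([310, 318, 327, 340, 355, 368, 384, 401, 415, 432, 450, 470])
months = np.arange(len(monthly_load_mw))

slope, intercept = np.polyfit(months, monthly_load_mw, deg=1)  # linear fit
projection = intercept + slope * np.arange(12, 15)             # the next three months
print(np.round(projection))  # projected load (MW) for the next quarter
```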

In the US, there is a pressing need to find sustainable solutions for the explosive growth of AI. It may be time to rethink how data centers are built. One potential solution is to focus on building more edge data centers, which are smaller and more widely distributed. These centers can bring computing power to local communities and alleviate strain on the grid. While they currently make up only 10% of data centers in the US, analysts project significant growth in the edge data center market in the next five years. Along with converting data centers to flexible and controllable loads, investing in edge data centers may offer a more sustainable solution to AI's growing energy demands.
