
As AI and ML technology advances, data centers will see a significant surge in energy consumption for both data processing and server cooling, at a time when electric grids are already struggling to keep up with demand. AI and ML data centers train models on advanced processors called GPUs, which require about five times more electricity than conventional CPUs. Training a large language model requires tens of thousands of GPUs running 24/7 to keep the machine learning process uninterrupted, and once deployed, applications like ChatGPT must remain available around the clock. To guarantee uninterrupted power, data centers frequently depend on redundant systems, adding capital costs that will also affect electricity pricing.
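The scale involved can be sketched with rough arithmetic. The wattages and cluster size below are illustrative assumptions (a ~700 W training GPU versus a ~150 W server CPU, a 25,000-GPU cluster), not measured figures:

```python
# Back-of-envelope estimate of AI training-cluster electricity use.
# All figures below are illustrative assumptions, not measured values.

GPU_WATTS = 700        # assumed draw of one training-class GPU
CPU_WATTS = 150        # assumed draw of one conventional server CPU
NUM_GPUS = 25_000      # "tens of thousands" of GPUs, running 24/7
HOURS_PER_YEAR = 24 * 365

# Ratio of GPU to CPU power draw (roughly the ~5x cited above)
ratio = GPU_WATTS / CPU_WATTS

# Annual energy for the whole cluster, in gigawatt-hours
cluster_gwh = NUM_GPUS * GPU_WATTS * HOURS_PER_YEAR / 1e9

print(f"GPU/CPU power ratio: {ratio:.1f}x")
print(f"Cluster energy use: {cluster_gwh:.0f} GWh/year")
```

Under these assumptions the cluster draws 17.5 MW continuously, around 153 GWh per year, before counting cooling and other overhead.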
This increased electrical demand from AI and ML will arrive at the same time as comparable new demand from EVs, compounding the need for significant investment in electrical generation and in the grid.

Here is an estimate from Goldman Sachs that breaks out the projected increased demand for electricity just from AI:

Another challenge to expanding uninterruptible electrical generation for AI and ML is that most conventional base load power comes from fossil fuels, which release greenhouse gases and promote global warming. According to the Energy Information Administration, about 60% of U.S. electricity generation in 2023 came from fossil fuels (43% from natural gas and 16% from coal). Renewable energy from wind and solar made up 21%, and nuclear power totaled just under 19%.
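The EIA shares cited above can be tallied directly; the remainder after the named sources is hydro, biomass, and other minor sources:

```python
# U.S. electricity generation mix, 2023, using the EIA shares cited
# above; "other" fills the remainder (hydro, biomass, etc.) to 100%.
mix = {
    "natural gas": 43.0,
    "coal": 16.0,
    "wind and solar": 21.0,
    "nuclear": 19.0,
}
mix["other"] = 100.0 - sum(mix.values())

# Fossil share is gas plus coal -- roughly the ~60% cited
fossil = mix["natural gas"] + mix["coal"]
print(f"Fossil share: {fossil:.0f}%")
print(f"Other sources: {mix['other']:.0f}%")
```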
AI’s constant energy demand also makes it difficult to address the intermittency of renewables: powering a data center with wind and solar alone requires significant excess generation capacity and large amounts of battery storage, which drives up capital costs. This has many people worried that increased electrical demand from AI and EVs will cause utilities to extend the life of coal plants and build more natural gas plants that were expected to be displaced by wind and solar.
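A minimal sizing sketch shows why serving a constant load from solar demands so much excess capacity and storage. The load, capacity factor, and storage autonomy below are assumed parameters for illustration, not utility data:

```python
# Illustrative sizing of solar + battery to serve a constant load.
# All parameters are assumptions for the sketch, not utility data.

LOAD_MW = 100            # assumed constant data-center demand
SOLAR_CF = 0.25          # assumed annual solar capacity factor
AUTONOMY_HOURS = 14      # assumed storage hours to ride through the night

# Nameplate solar needed so average output matches the constant load:
solar_mw = LOAD_MW / SOLAR_CF

# Battery energy needed to carry the load when the sun is down:
battery_mwh = LOAD_MW * AUTONOMY_HOURS

print(f"Solar nameplate: {solar_mw:.0f} MW")
print(f"Battery storage: {battery_mwh:.0f} MWh")
```

Under these assumptions a 100 MW data center needs roughly 400 MW of solar nameplate capacity, four times its load, plus 1,400 MWh of batteries, before accounting for cloudy stretches or seasonal variation.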
AI’s need for clean, uninterruptible power has led many companies to consider new designs for smaller nuclear power plants that can be located near the data center, a strategy that also reduces the need to expand the grid. Many of the nuclear deals under discussion involve purchasing power from existing nuclear providers or deploying small modular reactors (SMRs), with the larger company operating the reactor. Here is a map of where the expected power demand will come from.

Microsoft co-founder Bill Gates, a longtime advocate for nuclear innovation, co-founded TerraPower, which broke ground this summer in Kemmerer, Wyo., on a new nuclear plant dubbed Natrium that uses liquid sodium for cooling and is intended to operate as a commercial power plant. But neither this plant nor the others are expected to be online before 2030. In the meantime, there will be constant tension between the need to reduce fossil fuel emissions and the power needs of AI and EVs.
Data centers are also increasing their environmental impact through water demand, both for generating the electricity the centers need to operate and as a liquid coolant to dissipate the heat generated by servers and other equipment. Collectively, data centers rank in the top 10 of “water-consuming industrial or commercial industries” in the U.S., according to a study led by Landon Marston, Ph.D., P.E., M.ASCE, an assistant professor of civil and environmental engineering at Virginia Tech. His study estimated that the data center industry “directly or indirectly draws water from 90% of U.S. watersheds” and that 20% of those data centers “draw water from moderately to highly stressed watersheds in the western U.S.”
In West Des Moines, Iowa, a giant data-center cluster serves OpenAI’s most advanced model, GPT-4. A lawsuit by local residents revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district’s water. As Google and Microsoft prepared their Bard and Bing large language models, both saw major spikes in water use, of 20% and 34%, respectively.
While data centers can use reclaimed/recycled water to reduce the effect on people living in adjacent communities, the quality of the cooling water affects the equipment’s useful life. Reclaimed water, for example, can cause more corrosion, scaling, and microbiological growth in the equipment than potable water. So, there will be water supply issues for the data centers and for the communities where these centers want to locate.
Some expect the growth in demand for electricity and water to be offset by improving efficiency, something that is already starting to happen as the technology evolves. But historically, when improved efficiency reduces cost, it leads to more demand for whatever is being produced and does not end up saving much energy in the end (the Jevons paradox). AI and ML are also expected to change the balance of labor and capital in existing parts of the economy and create entirely new products and businesses that will add to energy and resource demand.
The 4th REV’s conclusion is that the greatest initial barrier to rapidly expanding the number and size of data centers needed to run AI and ML applications will be environmental concerns over burning fossil fuels, with the supply of water a secondary constraint. It will take several more years before clean alternatives for generating base load power emerge. Most people recognize the need to reduce carbon emissions and will oppose slowing or stopping the decommissioning of coal-fired power plants, or expanding the share of generation fueled by natural gas, both of which contribute to global warming. The need for water to cool servers will likely shift new data center construction to parts of the country with lower average annual temperatures and ample water supplies.