AI drives tech companies to take the nuclear option

With increased AI usage driving up power consumption at data centers, tech firms are looking to alternative energy sources such as nuclear power. 

Cloud providers such as Amazon Web Services, or AWS, Microsoft Azure and Google Cloud Platform are driving investments in new power generation sources for data centers. Although hyperscalers Amazon, Google and Microsoft have been investing in solar and wind technologies, they are going further in their search for clean electricity.

In September, Microsoft signed a deal with Constellation Energy to reactivate a reactor at the notorious Three Mile Island nuclear power site in Pennsylvania, which would mark the first time a U.S. nuclear reactor is recommissioned after closure.

One nuclear technology that could provide a solution to data center power needs is small modular reactors, or SMRs, which are cheaper than conventional nuclear power plants and equipped with safer cooling methods. They can be sited at locations not suited to larger nuclear plants, and can be dedicated to data centers so that hyperscalers are not competing for power with users of public power grids. SMRs have been endorsed by the Department of Energy as a way of providing safe, clean and affordable nuclear power.

An important selling point of SMRs is their standardized, modular design, which allows for volume production and lower costs. However, development costs are high, the eventual price tag per SMR unit is unknown, and so far only China and Russia have built operational SMRs.

Bill Gates is investing $1 billion in the construction of an SMR in Wyoming, while Amazon and Google have signed deals with power utilities to purchase nuclear energy from SMRs, with Google expected to start getting SMR-generated power by 2030. Separately, Ontario Power Generation is building an SMR in Ontario, Canada, with backing from the Ontario provincial government.

Another limiting factor for SMRs is the shortage of the necessary enriched uranium, as the U.S. doesn't produce enough fuel for the reactors, said James Walker, CEO of U.S.-based Nano Nuclear Energy.

AI drives demand for energy

The International Energy Agency estimates that a ChatGPT query needs 10 times as much electricity to process as a Google search. The result, according to Goldman Sachs, is a forecast 165% surge in data center power demand between 2023 and 2030, not counting the demands of cryptocurrency miners. The investment bank estimates that data centers alone will add about 0.9 percentage points to the compound annual growth rate of overall U.S. power demand through 2030, that U.S. data centers' share of total U.S. power demand will increase from 3% to 8%, and that AI will represent 21% of data center power demand in 2028.
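As a rough illustration of what those forecasts imply, the short Python sketch below converts the projected 165% total increase into a compound annual growth rate. The 2023 baseline and seven-year horizon are assumptions drawn from the forecast period above, and the arithmetic is illustrative rather than part of Goldman Sachs' methodology.

# Illustrative arithmetic based on the forecast cited above: a 165% rise in
# data center power demand between 2023 and 2030 (a seven-year span).

def annualized_growth(total_increase: float, years: int) -> float:
    # Convert a total percentage increase over a period into a compound annual rate.
    return (1 + total_increase) ** (1 / years) - 1

rate = annualized_growth(total_increase=1.65, years=7)
print(f"Implied annual growth in data center power demand: {rate:.1%}")  # roughly 14.9%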

U.S. data demand has been growing at 20% to 25% a year over the last decade, said Brian Singer, Goldman Sachs' global head of GS Sustain for Global Investment Research. While efficiency gains at data centers have offset some of that growth, the pace of improvement has slowed, leading to a pickup in data center power demand over the last three years, he said.

Singer sees nuclear as a promising source of reliable, low-carbon power that can offset the intermittency of renewables such as solar and wind. But, given that nuclear plants take longer to build than natural gas- or solar-powered plants, the agreements and investments need to start this decade if nuclear reactors are to be available to power data centers in the 2030s, he said.

Shorter-term solutions

In the meantime, major hyperscalers are seeking out existing sources of power generation, such as natural gas plants, and colocating data centers nearby. Meta is building a data center adjacent to a new gas-fired power plant operated by Louisiana-based Entergy, and the Canadian province of Alberta is seeking to attract data center operators with its huge natural gas resources and fiber-optic network.

Karen Brennan-Holton, PwC's energy, utilities and resources advisory leader, expects grid power and power purchase agreements to remain the primary option for data center energy, with colocation and local microgrids providing a solution to wide-area grid congestion and interconnection queues. Before data centers can fully source power from alternative energy, long-duration power storage and lower-cost SMRs need to be available, and they aren't there yet, she said.

Singer thinks existing power capacity, plus new investment in existing technologies, will be sufficient to meet the data center power demand Goldman Sachs forecasts for 2030. But preplanning is needed to ensure that, when data centers open, there is sufficient power to run them without impacting grid reliability. Singer expects to see some SMRs come online by the end of this decade, with the process accelerating in the 2030s. He sees potential for de-mothballing recently retired nuclear plants and bringing them back online, although only a few such plants are available. He also expects utilities to consider new larger-scale reactors, potentially leading to a significant expansion in U.S. nuclear capacity in the 2030s.

Aneesh Prabhu, S&P Global Ratings' power and LNG infrastructure managing director, believes nuclear power is a viable solution for hyperscalers aiming to meet clean and renewable energy goals in 2030-2035. "Given the high construction costs of large nuclear units in the U.S., SMRs could be a part of the solution," he said. "If successful, SMRs could significantly impact the energy landscape for data centers, potentially replacing natural gas generation, just as natural gas displaced coal-fired power generation."

Gilles Ubaghs, strategic advisor for commercial banking and payments at Datos Insights, advises banks to plan for their data center needs now. He said he gets the impression that banks consider energy consumption to be an issue for their technology partners, the cloud providers. "The hyperscalers are the ones who will be managing the energy issue, and for banks, it's a bit 'out of sight, out of mind,'" he said. "Banks don't view this as a primary concern. But, as part of their due diligence, banks should be talking to their data center providers about their AI energy strategies."

One option banks can consider for lowering their data center energy consumption is to use small language models for internal AI applications such as summarizing documents or data analysis. "These tasks don't need the full knowledge base of a public-facing large language model and can use small language models, which have fewer parameters and limited capacity to process and generate text compared with large language models," Ubaghs said. "There's also a broad expectation by banks that the technologies used for AI, such as NVIDIA chips, will become more energy-efficient in the near term and reduce the energy challenges they might otherwise face."
