Microsoft’s Three Mile Island Deal Signals a Broader Nuclear Comeback

For five years, reactor one at Three Mile Island nuclear power station in Pennsylvania has lain dormant. Now, thanks to a deal with Microsoft, the reactor will start running again in 2028—this time to exclusively supply the tech firm with oodles of low-carbon electricity.

It’s all part of an ongoing flirtation between Big Tech and nuclear power. In March, Amazon Web Services agreed to buy a data center powered by Susquehanna nuclear power station in Pennsylvania. At an event at Carnegie Mellon University on September 18, Alphabet CEO Sundar Pichai mentioned small modular nuclear reactors as one potential source of energy for data centers. The links don’t stop there either: OpenAI CEO Sam Altman chairs the boards of nuclear startups Oklo and Helion Energy.

The AI boom has left technology companies scrambling for low-carbon sources of energy to power their data centers. The International Energy Agency estimates that electricity demand from AI, data centers, and crypto could more than double by 2026. At the low end of its estimates, that added demand would be equivalent to all the electricity used in Sweden; at the high end, to all the electricity used in Germany.

This surge in energy demand is music to the ears of the nuclear power industry. Electricity demand in the US has been fairly flat for decades, but the sheer scale and intensity of the AI boom is changing that dynamic. One December 2023 report from a power industry consultancy declared the era of flat power demand over, thanks to growing demand from data centers and industrial facilities. The report forecasts that peak electricity demand in the US will grow by 38 gigawatts by 2028, roughly equivalent to 46 times the output of reactor one at Three Mile Island.

“[AI] is really taking off, and it’s garnering a lot of attention in the energy industry,” says John Kotek, senior vice president for policy development and public affairs at the Nuclear Energy Institute, a nuclear industry trade association. Kotek says there’s also a national security angle. “People legitimately see AI as a field of competition between the US and our global competitors.” The US falling behind in the AI race because it doesn’t have enough power “is something that’s really causing people to focus attention,” he says.

Part of the enthusiasm from power plant operators is due to government incentives to keep low-carbon power online. The Inflation Reduction Act includes tax credits tied to electricity production at existing nuclear power plants, but Kotek says that the industry will also have to get busy building new reactors if it wants to capture that projected energy demand. The number of operating nuclear reactors in the US peaked at 112 in 1990 and declined to 92 by 2022, and the most recently built reactors in the US—at Vogtle power plant in Georgia—took more than 14 years to build and came in at more than double the expected budget.

“The US showed at Vogtle that we’re not very good at building plants,” says Todd Allen, chair of nuclear engineering and radiological sciences at the University of Michigan. But Allen points out that China seems to build nuclear power plants much more quickly than the US, so speeding up is possible, and that if energy demand from data centers continues to grow, then building entirely new plants will increasingly look like an attractive option.

These potentially lengthy timescales are part of the reason why Microsoft is interested in small modular reactors, which should be quicker and cheaper to build. But tech firms have tended to emphasize searching for new sources of energy rather than improving the efficiency of their artificial intelligence operations, says Sasha Luccioni, AI and climate leader at Hugging Face, a company that develops tools for building applications using machine learning. “Regulation could be one way to incentivize [greater efficiency], starting with mandatory reporting and transparency for companies providing AI tools and services,” she says.

At the Carnegie Mellon University event, Pichai said that work on improving the consumption side of AI’s energy usage was still in its “early phases.” “We are all inefficiently pretraining these models, absolutely,” he said, but added that inference—actually asking an AI model to perform a task—could become “dramatically more efficient over time.” Google’s emissions in 2023 were 48 percent higher than its 2019 baseline, primarily due to increases in data center energy consumption and supply chain emissions, putting the company’s goal of reaching net zero emissions by 2030 under increasing threat. “The energy demands of AI are rising right now,” says Luccioni, but the supply of renewable and low-carbon energy to fuel it isn’t keeping pace.

The prospect of the site of the US’s most notorious nuclear accident being used to power the AI revolution may sit uneasily with some. But Allen points out that the 1979 accident involved the plant’s other reactor, and that reactor one did not shut down because of operational issues; it was closed in 2019 for economic reasons. Restarting the reactor, he says, will mostly be a question of making sure it is still in good operating condition and that there are enough trained staff to run it smoothly.

Source: Wired