How much power is enough for artificial intelligence? Even OpenAI’s Sam Altman and Microsoft’s Satya Nadella don’t have a clear answer.
As AI models grow larger and data centres expand at breakneck speed, both companies are realising that computing power isn’t the only constraint — electricity is fast becoming the real bottleneck. Microsoft and OpenAI have been aggressively securing GPUs to meet demand, but their power infrastructure has not kept pace. Nadella admitted on the BG2 podcast that Microsoft has ordered more chips than it currently has the power capacity to use.
“The cycles of demand and supply are impossible to predict,” Nadella said. “The biggest issue now is not a shortage of chips, but power — and the ability to get data centres built fast enough near energy sources. We’ve got chips sitting in inventory because I don’t have warm shells to plug into.” In data-centre parlance, a “warm shell” is a building with power and cooling already in place, waiting for hardware.
The sudden shift from software and silicon to steel and substations has forced tech companies to navigate an unfamiliar world of long permitting cycles and slow infrastructure timelines. After decades of flat electricity demand in the U.S., data centre growth — driven largely by AI — is now outpacing utility expansion plans. Some developers are even bypassing the grid altogether through direct “behind-the-meter” power setups.
Altman warned that this surge could backfire. “If a very cheap form of energy comes online soon at mass scale, a lot of people will get burned with the contracts they’ve already signed,” he said. He added that the infrastructure buildout is under immense pressure from AI’s steep cost curve: “If we keep cutting the cost per unit of intelligence at this pace — about 40x a year — that’s a terrifying exponent from an infrastructure standpoint.”
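To see why Altman calls that a “terrifying exponent”, a back-of-the-envelope sketch helps. The 40x annual rate comes from his quote; the starting cost below is a placeholder for illustration, not a real OpenAI or Microsoft figure.

```python
# Illustrative arithmetic only: how a ~40x-per-year fall in the cost per
# "unit of intelligence" compounds. The 40x rate is Altman's rough figure;
# the $1.00 starting cost is a placeholder, not a real price.
ANNUAL_COST_REDUCTION = 40
START_COST = 1.00  # hypothetical cost per unit of intelligence today

for year in range(4):
    cost = START_COST / (ANNUAL_COST_REDUCTION ** year)
    print(f"Year {year}: cost ≈ ${cost:.2e} per unit "
          f"({ANNUAL_COST_REDUCTION ** year:,}x lower than today)")
```

After three years at that pace, the same unit of output costs roughly 64,000 times less; any power plant or data-centre contract priced against today’s economics has to survive that swing.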
Altman has already backed several energy ventures, including nuclear startups Oklo and Helion, and solar-focused Exowatt. But none of these technologies are ready for mass deployment yet. Meanwhile, traditional gas-fired plants take years to build and face regulatory delays, pushing many tech firms toward solar energy for its speed, scalability, and lower emissions.
Solar also appeals to Silicon Valley’s instincts. Like chips, photovoltaic cells are silicon-based, modular, and easy to scale, a familiar model for companies used to building data centres in repeatable units. Yet both AI facilities and solar farms take years to complete, while AI demand can spike in months.
Altman acknowledged the risk that some companies may end up with idle power plants if AI becomes more efficient than expected. But he doesn’t see that happening anytime soon. Instead, he leans toward Jevons Paradox — the idea that greater efficiency only drives higher consumption.
“If the price of compute fell by a factor of 100 tomorrow,” he said, “usage would rise by far more than that. There are countless things people want to do with AI that simply aren’t economical at today’s costs.”
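Altman’s argument maps onto a simple constant-elasticity demand model: if demand for compute is elastic, a price cut raises total spending rather than lowering it. The sketch below is illustrative only; the 100x price drop comes from his quote, but the elasticity values are assumptions, not measured figures for AI demand.

```python
# Jevons Paradox sketch with a constant-elasticity demand curve:
# quantity scales like price**(-elasticity). If elasticity > 1, cheaper
# compute means MORE total spending on compute (and hence more power demand).
# The elasticity values below are hypothetical, chosen for illustration.
def spend_multiplier(price_drop_factor: float, elasticity: float) -> float:
    """Multiplier on total spend after the unit price falls by price_drop_factor."""
    quantity_multiplier = price_drop_factor ** elasticity  # usage rises
    price_multiplier = 1.0 / price_drop_factor             # unit price falls
    return quantity_multiplier * price_multiplier          # spend = price * quantity

for elasticity in (0.8, 1.0, 1.5):
    m = spend_multiplier(price_drop_factor=100, elasticity=elasticity)
    print(f"elasticity {elasticity}: total spend becomes {m:.2f}x the original")
```

Only the elastic case (elasticity above 1) matches the behaviour Altman describes: a 100x price cut leads to far more total compute being bought, and therefore far more electricity being consumed.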