The United States’ largest technology companies are consolidating artificial intelligence (AI) infrastructure within national borders, fuelling a capital-intensive shift that has amassed more than $500 billion in corporate and strategic commitments in recent weeks.
This redirection marks a notable break from the open, global networks that have long underpinned the digital economy. Instead, AI infrastructure, spanning GPUs, cloud capacity and hyperscale data centres, is being treated as a strategic asset, subject to increasing domestic concentration and federal alignment.
At the centre of this realignment is what one executive termed a new “compute arms race,” with processing power assuming a role akin to that of oil in the industrial era.
Oracle has reportedly struck a record $300 billion agreement with OpenAI for access to computing capacity, in what would be the largest deal of its kind. Nvidia is preparing to invest up to $100 billion in the ChatGPT maker while also committing billions to chipmaker Intel. Meta has agreed a $10 billion cloud services partnership with Google, and Tesla has signed a $16.5 billion semiconductor supply deal with Samsung.
These moves follow a broader trend of AI consolidation, as technology giants seek to secure long-term access to scarce computational resources and to hedge against geopolitical and supply chain risks.
The sheer scale of investment, now exceeding half a trillion dollars, highlights the degree to which AI infrastructure has become not only a corporate priority but also a national one. Washington’s recent export controls on advanced chips and the CHIPS and Science Act signal growing alignment between private sector strategy and federal industrial policy.
What emerges is an increasingly fortified AI ecosystem—one where competitive advantage is no longer just about algorithms or models, but about control of the physical infrastructure that underpins them.
Enter: US Government
The US government itself is writing big checks to keep AI capacity domestic.
Washington recently took a 9.9 percent stake in Intel for $8.9 billion, explicitly linking semiconductor self-sufficiency to national security and AI leadership. The move builds on the CHIPS and Science Act, which earmarks $52 billion to expand domestic chip manufacturing, ensuring that future breakthroughs in AI run on American-made silicon rather than offshore supply chains.
Analysts warn that this is creating a closed, costly, and centralized AI future.
Unlike the internet era, where standards and infrastructure spread globally, the AI boom is being walled off inside US-led ecosystems.
As Ray Wang of Constellation Research put it, this is a “war for compute” that is rapidly consolidating into duopolies and mutual dependencies. The Stargate project, a joint venture between OpenAI, Oracle, and SoftBank planning up to $500 billion for AI-focused data centers, epitomises this model: scale so large that only a handful of players can participate.
The implications are stark. AI innovation worldwide may increasingly depend on corridors of GPU and cloud power located in the US, raising barriers for foreign competitors and startups alike.
What was once the promise of an open internet could give way to a walled garden of compute, one where entry requires not millions, but tens of billions, and where policy and capital align to keep the crown jewels of AI firmly inside US borders.
Compute as the new oil
Analysts agree that the fundamental shift is that compute power has become the scarcest, most strategic resource in AI.
“There is a war for compute power and the top players are consolidating the generation, distribution, and applications of AI,” Wang said. Unlike the past, when software innovation could thrive on commodity hardware, today’s breakthroughs in large language models (LLMs) and generative AI hinge on massive GPU clusters, power-hungry data centres, and long-term access to electricity and grid interconnects.
This explains why companies are writing unprecedentedly large checks.
As Phil Fersht, CEO of global research and advisory firm HFS, observed, “Boards are writing very large checks now because leadership, market share, and unit economics in AI hinge on being first to secure multi-year GPU and power corridors. Otherwise, you pay more later and ship slower.”
Mutual dependencies or strategic alignment?
One defining feature of these megadeals is that they go beyond mere supplier contracts. They are mutual dependencies that double as strategy.
“OpenAI gets guaranteed scale and diversified cloud capacity via Oracle. Oracle secures multi-year, GPU-heavy workloads and brand relevance at the AI frontier. Nvidia locks demand visibility and positions its newest platforms as the default fabric for OpenAI’s next generation,” Fersht explained.
“That is dependency in service of strategy.”
For Nvidia, tying itself directly to OpenAI’s roadmap ensures that its chips remain the backbone of the most high-profile AI models in the world.
For Oracle, the $300 billion pact provides relevance in a cloud market otherwise dominated by Microsoft and Amazon. And for OpenAI, the deals secure not just GPUs, but entire gigawatts of compute at predictable costs, a critical hedge against supply chain bottlenecks.
“This means creating collaboration between US companies so that you can own the end-to-end stack of AI from application to chips to semiconductor manufacturing to packaging,” Satya Gupta, President of VLSI Design and also a Member of the National Committee on Electronics Manufacturing, told Moneycontrol.
The scale of capital, and risks
If the internet era was defined by millions of venture dollars, the AI era requires tens of billions in upfront capital just to participate.
The Stargate initiative has already announced five new US sites, bringing planned capacity close to 7 gigawatts. Nvidia’s letter of intent to invest up to $100 billion in OpenAI underlines the sheer intensity of capital flows.
But such a scale carries risks.
“You have to have money to play, but more importantly you need a mind shift in business models,” Wang cautioned. Saudi Arabia’s ambition to shift from being the world’s largest energy producer to the world’s biggest producer of compute power gives a glimpse of how geopolitics and industrial planning are converging with AI.
Fersht also warned of the danger of stranded assets. “The risk is not only capex bloat, it is stranded assets if model architectures or energy constraints shift faster than expected. The mitigation is staged buildouts tied to utilization triggers and vendor financing like the Nvidia-OpenAI structure,” he said.
A closed ecosystem versus the internet age
The architecture of this buildout stands in sharp contrast to the early internet. Then, systems were open, competitors were many, and prices steadily declined as scale expanded. AI is heading in the opposite direction.
“AI is closed, few winners, increasing prices, and centralised power,” Wang pointed out. That centralisation raises costs for smaller players and reduces the diversity of innovation.
Startups may get easier access to rented compute via cloud providers, but the premium GPU corridors are being locked in by hyperscalers and elite partnerships.
Mapping the AI stack
Part of the reason for such consolidation is the complexity of the AI stack itself.
As S. Anjani Kumar, Partner at Deloitte India, explained, different players excel at different levels of the ecosystem:
- At the base layer, Nvidia and others produce GPUs.
- Data centers, or “AI factories,” aggregate those GPUs at scale.
- On top sit large language models (LLMs), trained with vast datasets.
- These models then enable AI services for specific corporate needs.
- Finally, AI products like ChatGPT or Gemini reach end-users.
“No single company can truly excel across all layers,” Kumar said. “That’s why you see collaborations and partnerships across the stack. The GPUs are not easy to make. Data centres need power, cooling, cybersecurity. LLMs require enormous training and fine-tuning. Services and products require domain expertise. It’s an ecosystem where specialisation and scale go hand in hand.”
In effect, these announcements aim to bring together US companies across those five layers and get them collaborating, said Gupta, who is also a former chairman of the India Electronics and Semiconductor Association (IESA).
Policy as industrial strategy
Overlaying this corporate race is a clear policy hand from Washington.
The government’s Intel stake, the CHIPS Act subsidies, and a broader push to tie AI infrastructure to US jobs and security signal that compute is no longer just a business matter. It’s industrial policy.
As Fersht put it, “OpenAI is stitching together a national AI buildout with Oracle and SoftBank under Stargate. This is industrial policy by other means, aimed at anchoring next-gen AI compute inside the US and creating long-term supply certainty for model training and inference at scale.”
The global fallout
The US fortification of AI infrastructure has ripple effects for the rest of the world.
Rivals like China are accelerating domestic chipmaking to reduce reliance on American GPUs. The EU has floated state-backed AI funds to counterbalance US dominance.
But with the bulk of GPU supply and data centre capacity locked into US-based players, global challengers face steep barriers.
For startups worldwide, the implications are twofold: cheaper baseline access to compute through cloud programs, but also tougher differentiation as US incumbents internalise more of the premium capacity.
The walled garden of compute
In the end, analysts converge on a simple but unsettling conclusion: the AI economy is being constructed as a walled garden of compute, rather than a shared, open frontier.
As Wang summed it up, “These AI giants are creating AI duopolies and in some cases monopolies.”
That concentration may accelerate the march toward artificial general intelligence (AGI), but it also means that the levers of the future economy will rest in fewer hands, inside carefully fortified US borders.