
OpenAI CEO Sam Altman has said that artificial intelligence may eventually be delivered as a basic utility similar to electricity or water. Speaking at the BlackRock Infrastructure Summit in Washington, DC, Altman said the future of AI could involve users paying for intelligence based on how much they use it.
According to Altman, companies developing AI models are moving toward a system where computing power and AI services are provided on demand. This approach could make AI accessible across industries while also linking its availability to infrastructure and compute capacity.
AI as a metered utility
Altman said the long-term business model for AI providers may resemble the utility model used for services such as electricity. In this structure, users would pay according to their usage rather than buying a fixed product or subscription.
He noted that many AI companies, including OpenAI, already charge based on tokens — the units used to measure how much text or data an AI system processes. This token-based system effectively meters the use of AI, much as electricity consumption is measured by a meter.
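To illustrate the metering analogy, the sketch below sums per-request token charges into a bill, the way kilowatt-hours accumulate on a power meter. The per-token rates, request sizes, and function names are hypothetical illustrations, not OpenAI's actual pricing.

```python
# Sketch of token-metered billing, analogous to an electricity meter.
# All rates and usage figures here are hypothetical examples.

def metered_cost(input_tokens: int, output_tokens: int,
                 input_rate_per_million: float = 2.00,
                 output_rate_per_million: float = 8.00) -> float:
    """Return the cost in dollars for one metered request.

    Providers typically price input (prompt) and output (completion)
    tokens at different per-million-token rates.
    """
    cost = (input_tokens / 1_000_000) * input_rate_per_million
    cost += (output_tokens / 1_000_000) * output_rate_per_million
    return round(cost, 6)

# A month of usage is just the sum of per-request meter readings,
# like summing kilowatt-hours on an electricity bill.
monthly_usage = [(12_000, 3_000), (50_000, 10_000), (8_000, 2_000)]
monthly_bill = sum(metered_cost(i, o) for i, o in monthly_usage)
print(f"Monthly bill: ${monthly_bill:.2f}")  # prints "Monthly bill: $0.26"
```

The key property of this model is the one Altman describes: there is no fixed product to buy, so cost scales directly with consumption, and the unit price is set by the underlying compute capacity.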
Altman said that in the future, intelligence could be treated as a widely available resource that individuals and businesses access whenever needed.
Altman also highlighted the role of computing infrastructure in determining the availability and price of AI services. The processing power needed to train and run large AI models relies heavily on advanced chips, data centres and large-scale electricity supply.
He said that if companies fail to build enough computing capacity, the price of AI could increase or access could become limited. In such a situation, only organisations with significant financial resources may be able to use large-scale AI systems.
Technology companies are investing heavily in expanding compute infrastructure to keep pace with rising demand for AI. Industry leaders have warned that electricity supply and power grid capacity could become a major constraint for future AI expansion.
Demand for AI computing resources is already growing inside technology companies. Engineers and researchers increasingly compete for access to GPUs and compute budgets needed to train and test models.
Altman said the long-term goal is to expand infrastructure so that the industry is no longer limited by computing capacity. If that happens, AI could eventually become a standard service used in daily life, delivered in the same way as other essential utilities.