IBM chief Arvind Krishna has said there are no rules dictating that artificial intelligence (AI) models must remain big and expensive, adding that the cost of training is just another challenge to be solved.
The comments come at a time when Chinese startup DeepSeek’s cost-effective AI models have threatened the Silicon Valley technology giants that have been pouring huge sums of money into designing and training their models.
“For too long, the AI race has been a game of scale where bigger models meant better outcomes. But there is no law of physics that dictates AI models must remain big and expensive. The cost of training and inference is just another technology challenge to be solved,” Krishna wrote in a LinkedIn post a few days back, following the company’s earnings announcement.
“We’ve seen this play out before. In the early days of computing, storage and processing power were prohibitively expensive. Yet, through technological advancements and economies of scale, these costs plummeted. AI will follow the same path.”
Speaking during the company’s Q4 earnings call, he said IBM’s AI portfolio is aimed at meeting the diverse needs of enterprise clients, enabling them to leverage a mix of models: IBM’s, their own, and open models from Hugging Face, Meta and Mistral.
IBM’s Granite models, designed for specific purposes, are 90 percent more cost-efficient than larger models, Krishna said. IBM’s generative AI book of business now stands at more than $5 billion inception-to-date, up nearly $2 billion quarter over quarter.
For the full year 2024, IBM’s revenue stood at $62.8 billion, growing 3 percent in constant currency.
“This is promising for businesses. Technology becomes truly transformative when it becomes more affordable. As more companies embrace this shift, we’ll see an AI landscape that is both more powerful and accessible,” Krishna added in the post.