In a computing world obsessed with graphics processing units (GPUs) and ever-larger machines, the work of Professor Sushant Sachdeva is a reminder of an older, more powerful truth: hardware makes computers faster, but algorithms decide what computers can do.
That distinction sits at the heart of why Sachdeva, associate professor of mathematical and computational sciences at the University of Toronto, was awarded the Infosys Prize 2025 in Engineering and Computer Science.
“Hardware makes things faster, but algorithms make new things possible,” Sachdeva told Moneycontrol after winning the prize for building absurdly fast algorithms.
Cracking a problem stuck since 1998
Sachdeva’s breakthrough challenges one of the most studied problems in computer science, known as maximum flow, which supports how information, traffic, and resources move through networks such as the internet, transportation systems, and communication grids.
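To see what "maximum flow" means in code, here is the classical textbook method (Edmonds-Karp, which repeatedly pushes flow along shortest augmenting paths). This is an illustrative sketch of the problem itself, not Sachdeva's almost-linear-time algorithm, and the toy network below is invented for the example:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly push flow along the shortest augmenting
    path found by BFS. capacity[u][v] is the capacity of edge u -> v."""
    # Build the residual graph, including zero-capacity reverse edges.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u in capacity:
        for v in capacity[u]:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest source -> sink path with spare capacity.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path remains; flow is maximum
        # Find the bottleneck capacity along the path.
        bottleneck = float("inf")
        v = sink
        while parent[v] is not None:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        # Push the bottleneck amount, updating residual capacities.
        v = sink
        while parent[v] is not None:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        flow += bottleneck

# Tiny example network: two routes from s to t plus a cross edge.
net = {"s": {"a": 3, "b": 2}, "a": {"t": 2, "b": 1}, "b": {"t": 3}, "t": {}}
print(max_flow(net, "s", "t"))  # 5
```

On dense networks this classical approach is far slower than linear in the network size, which is precisely the gap the prize-winning work closes.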
Despite decades of effort, the fastest known algorithms for this problem had remained largely unchanged since 1998.
His work shows that these problems can, in theory, be solved in almost-linear time, meaning computation scales roughly in proportion to the size of the network rather than ballooning far faster.
Explaining what “fast” truly means, Sachdeva said that when a network grows by a factor of 100, “we want the algorithm to slow down only by a factor of 100, not by 10,000 or a million”.
For example, if an algorithm can analyse traffic flow across Bengaluru’s road network in one minute, a truly fast, linear-time algorithm should be able to analyse a network 100 times larger, such as the entire country, in about 100 minutes.
Slower algorithms would take disproportionately longer, extending that computation into days or even months and making them unusable at scale.
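The arithmetic behind that comparison can be sketched in a few lines. The one-minute baseline and 100× scale factor come from the article's Bengaluru example; the quadratic alternative is a hypothetical stand-in for a slower algorithm:

```python
def projected_minutes(base_minutes, scale_factor, exponent):
    """Runtime of an algorithm whose cost grows as n**exponent,
    when the input grows by scale_factor."""
    return base_minutes * scale_factor ** exponent

# 1 minute on Bengaluru's network, scaled to a network 100x larger:
linear = projected_minutes(1, 100, 1)     # linear time: 100 minutes
quadratic = projected_minutes(1, 100, 2)  # quadratic time: 10,000 minutes (about a week)
print(linear, quadratic)
```

The linear algorithm slows down by exactly the factor the input grew; the quadratic one by that factor squared, which is what pushes large-scale computations from minutes into days.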
Why algorithms matter more than hardware
The importance of this advance lies not in immediate deployment but in what it establishes as possible.
Sachdeva said that over the last 50 years, algorithmic improvements have far outpaced hardware speedups.
GPUs and faster chips can accelerate existing methods, but algorithmic insights create entirely new capabilities.
Even in artificial intelligence (AI), he added, the hardware existed long before recent breakthroughs; it was algorithmic innovation that unlocked its potential.
A benchmark for decades, not quarters
Professor Jayathi Y Murthy, president of Oregon State University and chair of the Infosys Prize jury for Engineering and Computer Science, placed the work firmly in the category of foundational science.
As computational systems scale, she said, the central question is whether computation grows at a manageable rate or becomes unusable.
“If a network grows 100-fold but the computation grows a billion-fold, that algorithm isn’t useful,” Murthy told Moneycontrol.
What this means for AI
The work also challenges the assumption that advances in computing will primarily come from applied AI systems or larger datasets.
Sachdeva pointed out that many machine learning models operate under constraints that prevent them from seeing global structure in large networks, limiting the kinds of problems they can solve.
Foundational algorithms, by contrast, address these global questions directly, even if they do not fit neatly into today’s AI workflows.
Why the Infosys Prize backed this work
For the Infosys Prize jury, this long-term relevance was decisive. Murthy described the research as not just globally relevant but indefinitely so, because it reshapes understanding of what is computationally possible.