By Shivanghi Sukumar
India is betting big on artificial intelligence. The proposed IndiaAI Mission - designed to build a comprehensive Indian AI ecosystem that fosters innovation - signals the government's ambition to harness AI for growth. But whether this ecosystem flourishes or falters will depend less on the size of the bet, and more on how India chooses to regulate.
The compliance trap
Take a startup building an AI assistant that not only answers emails but also pays bills, dims lights, and schedules doctor appointments. A rule mandating full interoperability from day one may sound pro-competitive on paper, but in practice it forces the startup to open its integrations before it has gained traction. Faced with high costs and compliance hurdles, such projects may never make it to market.
The same goes for small businesses. Imagine an MSME using integrated AI tools on digital platforms to handle tasks that would otherwise require a workforce affordable only to large businesses. If regulations were to prohibit such integration, the MSME would lose the ability to leverage AI and compete effectively with larger players.
Structuring AI regulation
This highlights a broader choice: per se vs. effects-based approaches, and ex ante vs. ex post regulation. Ex ante rules offer predictability but may push companies to design for compliance rather than innovation. By contrast, an effects-based ex post framework intervenes only once real and verified harms occur - reducing certainty but allowing greater scope for ambition and experimentation.
Form matters too. Rule-based mandates - quotas or interoperability requirements - create certainty but can ossify quickly in fast-moving fields. Principle-based duties (“ensure fairness”) are more flexible but risk uneven enforcement. The European Union’s draft AI Act reflects this trade-off: its risk-based framework is thoughtful, yet its detailed ex ante obligations could unintentionally narrow the space for innovation.
Funding winter raises the stakes for India
For India, the stakes are especially high. Startup funding has already plunged - down by about 72% in 2023, from $25 billion in 2022 to just $7 billion. Layering heavy compliance on top of that risks driving away the capital the ecosystem needs. Earlier this year, the Supreme Court cautioned that enforcement detached from effects can deter the long-term investment and expertise the economy urgently needs. That warning is just as relevant to AI today.
India wants AI to power financial inclusion, health access, and smarter public services. But if rules force young firms to spend scarce resources on audits and legal reviews, the broader economy loses out on potentially transformative applications.
Interoperability and data-sharing mandates are often framed as pro-competitive, but if introduced too soon, they can undercut the incentives for young firms to build and grow. India can sidestep this risk by focusing first on clear principles, applied with targeted enforcement once actual harms are evident.
Another path is to begin with informal, non-binding guidance, giving companies room to test and scale AI applications responsibly while keeping open the option of firmer rules once the ecosystem has matured.
AI will shape India’s national competitiveness in the decade ahead. Whether it becomes a platform for broad-based growth will depend less on how much regulation is written, and more on how it is designed. Regulate too early, and India may never see the breakthroughs it needs.
(Shivanghi Sukumar is Partner, Axiom5 Law Chambers.)
Views are personal and do not represent the stand of this publication.