
Perplexity CEO Aravind Srinivas has stirred up a fresh debate about the future of AI and the giant data centers powering it today. In a video clip shared on X on January 1, 2026, by user @slow_developer, Srinivas calls local, on-device AI the biggest threat to the massive data centers being built across the world.
To understand his point, you first need to know how most AI works right now. When you use a chatbot, generate an image, or ask AI to summarise something, your request usually travels to huge servers sitting in data centers. These centers are filled with powerful GPUs, consume a lot of electricity, and cost billions to build and maintain. Companies like Meta, Google, OpenAI and others rely on them to run AI models because those models are too heavy for phones or laptops today.
But Srinivas says this could change. He believes that if AI models become small and efficient enough to run directly on a chip inside your phone or computer, people won't need to send their data to faraway servers anymore. His point in the video is simple: the moment intelligence can be packed locally on a chip running on the device, data centers lose their importance.
He explains that local AI would be faster because the processing happens on your device itself. It would also be more private, because your personal data wouldn’t leave your phone or laptop to be stored in the cloud. Srinivas even compares it to having your own digital brain that lives with you and learns from you.
He also talks about something called test-time training. In Srinivas' words, this means AI observing the tasks you repeat on your system—like editing photos, writing emails, designing slides, or sorting files—and slowly learning your workflow. Over time, it begins automating those tasks without you having to upload anything to a server. He says this makes AI feel less like a tool you access and more like a system that becomes a part of you. "That way, you own it. It's your brain," he says in the clip.
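To make the idea concrete, here is a minimal toy sketch of the behaviour Srinivas describes: software that watches repeated action sequences on a device and, once a pattern recurs often enough, offers to automate it. This is purely illustrative—all class and action names are hypothetical, nothing here is Perplexity's code, and real test-time training would adapt a model's weights rather than count frequencies.

```python
from collections import Counter

class WorkflowLearner:
    """Toy on-device pattern spotter (hypothetical, for illustration only).

    Watches sequences of user actions kept entirely in local memory;
    nothing is uploaded anywhere, echoing the privacy argument.
    """

    def __init__(self, threshold=3):
        self.threshold = threshold   # repetitions before suggesting automation
        self.counts = Counter()      # observed action sequences, stored locally

    def observe(self, actions):
        """Record one session as a tuple of actions."""
        self.counts[tuple(actions)] += 1

    def suggest_automations(self):
        """Return sequences seen at least `threshold` times."""
        return [list(seq) for seq, n in self.counts.items()
                if n >= self.threshold]

# Example: the user resizes and exports photos three days in a row.
learner = WorkflowLearner(threshold=3)
for _ in range(3):
    learner.observe(["open_photo", "resize", "export"])
learner.observe(["write_email", "send"])  # seen once, so not suggested

print(learner.suggest_automations())
# [['open_photo', 'resize', 'export']]
```

The point of the sketch is the data flow, not the algorithm: every observation stays on the device, which is exactly the property Srinivas argues makes local AI faster and more private than routing requests through a data center.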
Srinivas isn't dismissing the power of cloud AI entirely, but he questions the logic of pouring massive sums into centralized data centers if devices can run intelligence locally. Tech giants are currently betting anywhere from Rs 500 billion to Rs 5 trillion on building GPU-packed facilities to lead the AI race. But if edge AI grows fast, Srinivas says, spending on that scale simply stops making sense.