Jay Shooster, the Democratic nominee for Florida’s State House District 91 in the US, took to X to share a chilling personal encounter with an AI-powered voice-cloning scam that nearly deceived his family. Shooster revealed that his father had received a call claiming to be from him, saying that he had been in a serious car accident, was injured, was under arrest for driving under the influence, and needed $30,000 (over Rs 25 lakh) for bail. None of it was true: it was a scam, powered by AI voice-cloning technology.
The twist? The voice was almost indistinguishable from Shooster’s own, convincing his family to the point where they nearly fell victim to the fraud. “I'm not sure it was a coincidence that this happened just days after my voice went up on television. Fifteen seconds of me talking. More than enough to make a decent AI clone,” Shooster tweeted.
As a consumer protection lawyer who has long warned about such scams, Shooster was particularly shaken by how close his own family came to falling prey. Despite his professional expertise and repeated warnings to his family, the sophistication of the scam made it extremely difficult to detect.
Shooster’s story shines a light on the increasingly alarming trend of AI-driven scams. AI voice-cloning technology allows scammers to mimic the voices of loved ones, making their fraudulent claims all the more convincing. Shooster highlighted how one red flag, the caller’s insistence on cash payment, was what ultimately made his family suspicious and realise it was a scam.
Today, my dad got a phone call no parent ever wants to get. He heard me tell him I was in a serious car accident, injured, and under arrest for a DUI and I needed $30,000 to be bailed out of jail. But it wasn't me. There was no accident. It was an AI scam.
Jay Shooster (@JayShooster) September 28, 2024
One commenter shared a similar experience: “My dad didn’t fall for the scam though because he called him ‘grandpa,’ which isn’t what my son calls him.”
Shooster said that such scams are only likely to rise with advancements in AI technology. “A very sad side-effect of this voice-cloning tech is that now people in real emergencies will have to prove their identities to their loved ones with passwords etc. Can you imagine your parent doubting whether they're actually talking to you when you really need help?” he wrote.
Shooster’s post comes with a call to action: better regulation of AI technology to prevent its misuse. He stressed the need for lawmakers to urgently address the evolving risks posed by AI, warning that failure to do so could lead to further exploitation of vulnerable individuals.
As these scams become more prevalent, Shooster and others are urging people to spread the word and adopt extra safety measures, such as using secret passwords with loved ones to confirm identities during emergencies. “We’re all going to need a secret passphrase that identifies we’re real,” another X user added.
Shooster's tweet has nearly 5 million views so far.