A United States (US) man shared how his parents nearly fell victim to an AI voice-cloning fraud that could have cost them US$30,000.
Florida State House candidate Jay Shooster described how con artists impersonated him using artificial intelligence (AI), leading his parents to believe he had been in an automobile accident, was being detained, and urgently needed bail money.
“My dad got a phone call no parent ever wants to get. He heard me tell him I was in a serious car accident, injured, and under arrest for a DUI and I needed $30,000 to be bailed out of jail,” Jay Shooster posted on X.
Jay Shooster explained that the con artists had cloned his voice convincingly using only 15 seconds of audio from a recent TV appearance.
“It sounded just like me but it was not me,” he said.
“That’s how effective these scams are. Please spread the word to your friends and family,” he added.
Jay Shooster urged readers to prepare for such situations. “Can you imagine your parents doubting whether they’re actually talking to you when you really need help?” he asked, advising families to adopt identity-verification measures, such as agreeing on a secret passphrase to use in an emergency.