AI Voice Cloning: The Newest Cyber Scam
"Grandma, can I borrow $100?"
If your grandchild has ever called with a request like this, take note: Cybercriminals can now use artificial intelligence (AI) to impersonate nearly anyone.
"In the past, fraudsters had to hope they could successfully imitate your friend or loved one," says Peter Campbell, director of Schwab's Financial Crimes Risk Management division. "Now, they can use AI to get the voice just right."
Impersonation scams have been a problem for years, particularly for older individuals. But now that fraudsters need only a small audio sample—easily acquired from videos posted to social media—virtually anyone could be deceived. "One downside of the AI revolution is that there's often no perceptible difference between a real and a cloned voice," Peter says.
The antidote? "A healthy dose of skepticism," says Andrew Witt, a senior manager at Schwab's Financial Crimes Risk Management division. "If you get a call and something seems off, trust your gut and call the person back at a number you know is theirs."
If you're worried criminals could use AI-generated voices to pass voice verification and gain access to your financial or other accounts, there's good news: "While AI may be good enough to trick the human ear, the technology and tools available to large organizations like Schwab aren't so easily fooled, since they analyze discrete speech markers," Peter says. "Nevertheless, you should report any unusual calls or account activity as soon as possible. Schwab's Security Guarantee offers to 'cover losses in any of your Schwab accounts due to unauthorized activity.'"1
1Read the terms of the Security Guarantee at schwab.com/guarantee.