Cybercriminals are using artificial intelligence to clone people’s voices, impersonating a friend or family member to swindle victims out of money.
Answering calls from unknown numbers can already be risky, since it’s often a salesperson or just a recorded voice trying to persuade you to buy a service or product.
Voice cloning is now easily accessible. Cybercriminals are taking advantage of this by adding AI-powered voice cloning to their arsenal of tricks to scam people into handing over their money or personal information.
The Federal Trade Commission warns consumers not to send money to callers claiming to be friends or relatives in urgent need, especially through payment methods that offer no recourse.
“Scammers ask you to pay or send money in ways that make it hard to get your money back,” the FTC wrote in a blog post. “If the caller tells you to send money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, those could be signs of a scam.”
Cloning someone’s voice, whether it’s a famous musician, politician, CEO, or even your best friend, has become much easier thanks to advances in artificial intelligence technology.
AI is making it possible to clone a person’s voice and produce “authentic-sounding statements and conversations,” Chris Pierson, CEO of BlackCloak, an Orlando, Florida-based digital executive protection firm, told TheStreet.
Fraudsters create the clones by capturing a sample of a person’s voice, which could be accomplished by pulling a video from YouTube or TikTok, he said.
Even a few seconds of a person’s voice is enough for AI tools to capture the “essence of that person’s voice and then create completely original statements and conversations with the same frequency, intensity, harmonic structure, pitch, and inflection,” Pierson said.
A fragment of someone’s voice from a conversation is enough for a criminal to generate “very realistic conversations that are ultimately false,” he said.
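How low is the barrier? As a rough illustration, a minimal sketch along the lines of the one below, using the open-source Coqui TTS library and its XTTS v2 voice-cloning model, can mimic a speaker from a few seconds of reference audio. The file paths and the spoken sentence are hypothetical placeholders, and the same tooling also powers legitimate narration and accessibility work.

```python
# A minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). Paths and text below are hypothetical placeholders.
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate brand-new speech in the voice heard in the reference clip.
# A few seconds of clean audio is enough to carry over pitch, pacing,
# and inflection.
tts.tts_to_file(
    text="This is a completely original sentence the speaker never said.",
    speaker_wav="reference_clip.wav",  # short sample, e.g. pulled from a video
    language="en",
    file_path="cloned_output.wav",
)
```

That a workable clone takes roughly a dozen lines of code is exactly what makes this kind of fraud cheap to scale.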
Voice cloning will be popular
Scammers are always after the money, and any tool that is easy to scale to generate more profit is attractive to them, Pierson said.
The leap in AI capability exemplified by ChatGPT will prove popular with cybercriminals, as these technological advances help them earn higher profits, he said.
“This technology can be used for malicious purposes and we’re starting to see this happen,” Pierson said. “Right now it looks like more of a fancy attack method, but in the coming months and years it will most likely be applied en masse and really create a cyber cyclone of fraud.”
The alarming part of the new technology is that artificial intelligence “needs very little content to train,” unlike older methods such as those used for automated customer service agents, Alex Hamerstone, advisory solutions director at TrustedSec, an ethical hacking and cyber incident response company based in Fairlawn, Ohio, told TheStreet.
AI doesn’t need a recording of every word you’ll use; it only needs a “handful of words spoken by someone and can create a very real sounding version of almost any word, and can put these words together into sentences that sound like the person that was recorded,” he said.
“What’s really important here is that it’s not just the individual words that ring authentic, but the person’s entire style of speaking,” Hamerstone said. “Not only does it sound like the person word for word, it also sounds like the person when speaking in longer sentences. It also picks up speech patterns like pauses, mouth noises, etc. and is very convincing.”
Voice cloning is already becoming popular and is “likely to be widely used by criminal groups, especially the more sophisticated gangs,” he said.
The cloned voices are realistic and make it easy to fool someone.
“As these tools evolve in the coming months and years, it will be extremely difficult to tell the difference between a real person’s voice and their AI clone,” Hamerstone said.
“This will not only help direct phone scams, but also work in combination with other social engineering attacks such as email phishing and text phishing. It is likely that scammers will continue to take full advantage of this technology because of how convincing it is.”
AI’s ability to create “credible content through video, audio and text has upped the malware game,” Timothy Morris, chief security advisor at Tanium, a converged endpoint management provider based in Kirkland, Washington, told TheStreet.
Using AI voice tools makes attacks and scams more credible and makes it easier to fool people, because “the request sounds like it’s coming from someone you know,” he said.
Common Voice Cloning Scams
Scammers will try to quickly gain the trust of the other person on the phone.
Since people are used to receiving calls with poor reception, cloned voices don’t have to be perfect.
Scammers are after money, especially in the form of gift cards, since they are difficult to trace. Fraudsters will also try to gain access to a computer, confirm banking information or passwords, or take a bolder step and request funds via wire transfer, Zelle or another instant payment method, Pierson said.
The number of consumer scams will increase: grandparent scams are likely to proliferate and donation/charity scams are likely to benefit from this as well, he said.
“Voice cloning, ChatGPT, image creators and deepfakes are incredibly powerful tools in their own right, but when used in combination, they are likely to overwhelm even the most security-conscious person,” Hamerstone said.
Voice cloning will not be used only for one-off “vishing” scams. Expect to see it combined with other types of attacks, such as email and text phishing.
Businesses will be big targets because people are more likely to open an email, click a link or download an attachment, especially if they receive a call “urging them to do so shortly after the email arrives,” he said.
Fraudsters can use voice cloning to harvest an executive’s voice and use it to attack employees or vice versa.
“This will make phishing attacks targeting corporate entities much more effective, especially when it comes to wire fraud schemes,” Pierson said. “When you combine this with the ease with which phone numbers can be spoofed and the scripts that can be created using ChatGPT, you can create a perfect storm.”
Companies cannot afford to be complacent and must train their employees to use only trusted communication methods and to be “extremely careful when an unexpected and urgent request arrives,” Zane Bond, product manager at Keeper Security, a Chicago-based provider of zero-trust and zero-knowledge cybersecurity software, told TheStreet.
“Artificial intelligence in the hands of adversaries has the potential to exponentially amplify social engineering, which is currently one of the most successful scam tactics available,” he said.