AI is entering our lives in many positive ways, but scammers can also use it to manipulate us out of crypto or money. Here's what to look out for.
Artificial intelligence is transforming the way scammers operate, with AI-powered voice cloning emerging as one of the most worrying new tactics. These scams cut across all industries, from cryptocurrency exchanges to banks, and target everyday people too.
Using short audio samples, fraudsters can now create convincing replicas of someone’s voice and use them to trick friends, family, or colleagues into sending money or revealing sensitive information.
Voice cloning tools (many of them free or very low cost) can recreate a person’s voice with just a few seconds of recorded audio. Scammers often source these samples from social media videos, podcasts, or other online recordings. Once the voice is cloned, they’ll typically call the target and play a short, pre-recorded message designed to create panic or urgency.
These scams are often an evolution of older cons, such as the “Hi Mum” text scam, but now they add the realism of hearing a familiar voice in distress.
Victims may be told a loved one has been in an accident, is in jail, or needs urgent bail money, and that funds must be sent immediately via gift cards, cryptocurrency, or bank transfer. The scammer offers excuses for why the person can't speak further, limiting the victim's chance to notice anything suspicious.
AI voice scams have already caused significant harm overseas and are starting to surface in Australia. In the US, victims have paid hundreds of thousands of dollars in fake ransom demands after hearing what they believed were their kidnapped relatives.
One US-based man lost $25,000 after scammers used a cloned version of his son’s voice to request bail money.
Closer to home, an Australian reported receiving a call from someone using a cloned voice of former Queensland Premier Steven Miles to promote a fake Bitcoin investment.
While the voice sounded slightly robotic, it was convincing enough to be alarming. The National Australia Bank now lists AI voice cloning among its top scam threats.
Spotting these scams can be challenging, but experts recommend focusing on what is being said, not just who you think is saying it. Warning signs include:
- Urgency and pressure to send money quickly.
- Unusual greetings or missing social cues.
- Accents slipping in and out.
- Short, pre-prepared messages that avoid back-and-forth conversation.
- Reluctance to answer personal questions or explain details.
- Calls from unknown or blocked numbers.
While it’s impossible to stop scammers from accessing public audio entirely, you can make it harder for them to succeed:

- Agree on a family codeword that must be used during any emergency call.
- Hang up and call back using a known, verified number.
- Avoid giving out sensitive information over the phone.
- Educate vulnerable relatives, especially older family members, about how these scams work.
- Review your privacy settings on social media to limit access to videos and voice recordings.
- Let unknown calls go to voicemail to buy time for verification.
Experts warn that AI voice cloning is only going to get more accessible and convincing. Some tools can create realistic voices for less than $2 a month, and adding more audio samples increases their accuracy.
While not every scam currently involves voice cloning, the technology’s rapid adoption makes it a growing risk.
The best defense is a combination of awareness, healthy skepticism, and clear verification processes. If you receive a call asking for urgent money or personal information, even from someone you think you know, pause, verify, and act only when you're certain it's legitimate.
Copyright © 2025 CoinJar, Inc. All rights reserved.