If a member of your family or a close friend called you, seemingly in distress and asking for money, would you question whether it was really them? Or would you leap into action to help in any way you can?
What is AI voice cloning?
Scammers can record a sample of someone's voice, then feed it into AI tools that generate new speech in that voice, saying whatever the scammer wants. Since most scam attempts rely on instilling panic in their intended victims, the cloned voice will likely sound distressed and ask for money – a technological step up from texts pretending to be from family members stuck in other countries, only far harder to spot when you think you can hear a loved one in distress.
Does it actually happen?
Although it sounds like something 'from the future', in a recent email sent to customers, Starling Bank said that AI voice scams are on the rise, with 28% of Brits believing they've been targeted within the last 12 months. Even if you pride yourself on being un-scammable, anyone can become a victim, especially as the technology becomes ever more advanced. Scammers need only three seconds of audio to clone someone's voice and trick their friends and family.
How to identify an AI voice cloning call
It can be tricky – just take a look at this slightly terrifying video of actor James Nesbitt taking a phone call… with himself!
One way to keep yourself safe from this kind of thing is to have an agreed-upon safe phrase shared with family and friends, used to confirm that the person you're talking to is actually them. Then, during any suspicious conversation, you can ask them to verify the safe phrase, and you'll know whether you're speaking to them or a clever clone.
Safe phrases should be short, sweet, and easy to remember – but also random. Don't use easily guessable things like pet names, children's names, or favourite football teams. Something like 'Flamingo Iced Tea' is easy to remember, but no scammer is going to guess it off the top of their head!
What about businesses?
Although this technology is in its early days, like any scam it's likely to be used to extract money. In a professional setting, it could be used to impersonate bosses asking employees to move money – so in businesses where large sums are involved, like accountancy or solicitors' firms, creating a safe phrase for your accounts team in particular wouldn't be overkill.
It could also be used to impersonate clients, so having tight verification protocols in place – for example, calling a client back on a known number before acting on their instructions – might also be a good idea.
This isn't about spreading fear – it's about knowing what's possible with very real technology, and getting ahead of the game before it's too late.
If you'd like to talk through ways of protecting against this kind of scam in your security protocols, please do get in touch with us. We can talk through what might be right for you and your business, so you don't get stung.
Further Reading: Still Using Phones That Only Support 3G?