What is Deepfake Audio Fraud and Why We All Need to Be Aware of It

You may not have heard of deepfake audio fraud, but it's becoming something you definitely need to be aware of - especially if you work for a large company that could be defrauded of money.

One company lost over £200,000 after one of their employees was targeted with this type of fraud.

 

What is deepfake audio fraud and how does it work?

The term ‘deepfake’ refers to the use of machine learning to create audio or visual impersonations of real people.

The technology analyses recordings of the way a particular person speaks, including their accent, tone and phrasing, and can then recreate new sentences based on what it has learnt.

This can then be used over the phone to trick people into thinking they are talking to somebody they know - usually their boss or someone high up in the company they work for.

Scammers will often use deepfake audio fraud to target employees who have the authority to transfer company money. Having usually obtained the employee's details from a data breach, they will phone them and play manufactured audio of the company's CEO (or another senior figure) asking them to transfer money to a supplier or another company. The bank details given, however, belong to the scammer, and once the money has been transferred it is usually lost for good. The call may also be accompanied by an email making the same request, sent from a spoofed version of the CEO's email address.

The worry now is that scammers will begin to use this type of fraud to target consumers, tricking them into transferring money by pretending to be from their bank or HMRC.

 

How to detect and prevent deepfake audio fraud

Deepfake audio fraud can be tricky to detect if you are targeted. It's important to ask yourself whether this person would really ask you to do something like this, or whether the request is a bit out of the ordinary.

If you are unsure about what they're asking you to do, here are a few steps you can take:

  1. Focus on the person’s voice and consider whether it sounds normal and flows naturally. Although deepfake audio can be very convincing, it’s still not a real person speaking, so you should be able to detect some abnormalities if you listen carefully.

  2. Ask questions about why they’re asking you to do this and when the task should be completed. This should reveal whether you are talking to the real person, as the technology won’t be able to converse as effectively as a human.

  3. Hang up the call and contact the person directly using the contact details you already have. This way you definitely know that you’re talking to the real person and you can check whether the request is genuine.

  4. If you’re still unsure, ask for the request to be followed up in writing.

 
