You may not have heard of deepfake audio fraud, but it's fast becoming something you should be aware of, especially if you work for a large company that could be defrauded out of money.
One company lost over £200,000 after one of its employees was targeted by this type of fraud.
The term ‘deepfake’ refers to the use of machine learning to create audio or visual impersonations of real people.
The technology analyses the way a particular person speaks, including their accent, tone and grammar, and can then recreate sentences based on what it has learnt.
This can then be used over the phone to trick people into thinking they are talking to someone they know, usually their boss or someone senior in the company they work for.
Scammers often use deepfake audio fraud to target employees who have the authority to transfer company money. Having obtained the employee's contact details (often from a data breach), they phone them and play manufactured audio of the company's CEO, or another senior figure, asking them to transfer money to a supplier or another company. The bank details given, however, belong to the scammer, and once the money is transferred it is lost for good. The call may also be accompanied by an email making the same request, sent from a spoofed version of the CEO's email address.
The worry now is that scammers will begin to use this type of fraud to target consumers, tricking them into transferring money by pretending to be calling from their bank or HMRC.
Deepfake audio fraud can be tricky to detect if you are targeted. It's important to ask yourself whether this person would really ask you to do something like this, or whether the request is a bit out of the ordinary.
If you are unsure about what you're being asked to do, the following list gives a few examples of steps you can take: