Deepfakes: A sophisticated new approach to cyber fraud

by Bailey Amber

Cyber attackers have always drawn on a large repertoire of tools to further their criminal enterprises.

While companies are getting better at defending themselves against traditional types of attack thanks to growing investments in IT security, the element that is probably easiest for attackers to outsmart is often overlooked: humans.

Humans are precisely the vulnerability that fraudsters most often exploit through phishing emails, in which they disguise malware in a message from a seemingly trustworthy source. The reason for the popularity of phishing is obvious: this type of attack requires neither in-depth technical understanding nor a large investment of money or effort. That makes for an attractive cost-benefit ratio for criminals. The naivety and carelessness of employees are mercilessly exploited to scam money and cause damage at the company’s expense.

Similar to phishing, we are increasingly encountering a relatively new type of fraud that likewise targets people rather than machines as an extremely effective gateway: deepfakes.

But what are deepfakes?

Deepfakes are videos that deceive their audience through technical manipulation.

The name of these recordings, which are often used in scams, is a portmanteau of the two English terms “deep learning” and “fake”. The latter refers to the fact that the video’s image and/or audio track is a fake, in which a person’s face, facial expressions and voice are deceptively imitated. The first part of the portmanteau, deep learning, sheds light on how these fake recordings are created.

Deep learning is a subset of machine learning in which an artificial intelligence (AI) is trained to produce fake videos. A large amount of information is fed into the system, and over time the AI learns on its own how to process new information most effectively. The end result, in the case of deepfakes, is fake footage, some of which is almost indistinguishable from reality.
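To make the learning principle concrete, here is a deliberately tiny, self-contained sketch in Python. It is not a deepfake pipeline; real systems use deep neural networks trained on thousands of face images. This toy linear autoencoder simply shows the same core mechanic: a model is fed many examples and repeatedly adjusts its internal parameters until its reconstructions of the data improve. All names and numbers below are illustrative assumptions.

```python
# Toy illustration of the deep-learning training loop behind deepfakes:
# feed in data, measure how wrong the model's output is, and nudge the
# model's parameters to reduce that error. NOT a real deepfake system.
import numpy as np

rng = np.random.default_rng(0)

# "Training data": 200 samples of 8-dimensional vectors that actually
# lie on a 2-dimensional subspace (a stand-in for face images).
latent = rng.normal(size=(200, 2))
basis = rng.normal(size=(2, 8))
data = latent @ basis

# Encoder/decoder weights, learned by plain gradient descent.
W_enc = rng.normal(scale=0.1, size=(8, 2))
W_dec = rng.normal(scale=0.1, size=(2, 8))

def loss(X, W_enc, W_dec):
    """Mean-squared error between the data and its reconstruction."""
    recon = X @ W_enc @ W_dec
    return float(np.mean((recon - X) ** 2))

initial = loss(data, W_enc, W_dec)
lr = 0.01
for _ in range(500):
    code = data @ W_enc          # compress each sample
    recon = code @ W_dec         # reconstruct it
    err = recon - data           # how wrong is the reconstruction?
    # Gradients of the reconstruction error with respect to the weights.
    grad_dec = code.T @ err / len(data)
    grad_enc = data.T @ (err @ W_dec.T) / len(data)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final = loss(data, W_enc, W_dec)
print(f"reconstruction error: {initial:.3f} -> {final:.3f}")
```

The error shrinks as training progresses, which is the whole trick: given enough data and capacity, the same mechanism learns to reproduce a person’s face or voice convincingly.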

What impact can they have on your business? 

As with so many other technologies, deepfakes harbour both opportunities and dangers for businesses. 

The negative side of the technology is that both audio and video deepfakes can be used by criminals to lure employees into doing things they normally wouldn’t. This is because deepfake technology allows audio and/or video elements to be captured from legitimate sources and then used to create a ‘deepfake’ of someone within the organisation.

For example, a deepfake in a business setting could be a video or audio clip of someone impersonating the CEO and asking employees to transfer money. Using readily available software, cybercriminals can take recordings or genuine audio from individuals and then create conversations that sound as if they are coming from the original person, encouraging employees to do something that could later damage the business.

However, as much as deepfakes come with dangers, they can also be used as a very useful tool for businesses, especially when it comes to learning and development. 

Deepfake technology can be used to create synthetic data sets for training machine learning models and for simulations in environments which may be challenging to represent in real life.
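As a minimal sketch of the synthetic-data idea, the snippet below fits a simple statistical model to “real” records and samples brand-new artificial records from it. This is an assumption-laden simplification: production systems use generative deep networks rather than per-feature Gaussians, and the data here is invented for illustration.

```python
# Hedged sketch: one simple way synthetic training data can be produced.
# Fit a distribution to real records, then sample new artificial records
# with similar statistics. Real generators use deep networks; this
# per-feature Gaussian approach only illustrates the principle.
import numpy as np

rng = np.random.default_rng(1)

# Pretend these are real (sensitive) measurements we cannot share:
# e.g. 1,000 records of [age-like value, height-like value].
real = rng.normal(loc=[50.0, 1.7], scale=[10.0, 0.1], size=(1000, 2))

# Fit per-feature mean and standard deviation...
mean, std = real.mean(axis=0), real.std(axis=0)

# ...and sample a brand-new synthetic data set with similar statistics,
# containing no actual record from the original set.
synthetic = rng.normal(loc=mean, scale=std, size=(1000, 2))

print("real means:     ", real.mean(axis=0))
print("synthetic means:", synthetic.mean(axis=0))
```

A model trained on the synthetic set sees data with the same broad statistics as the original, which is useful when the real data is scarce, sensitive, or hard to capture.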

Remain vigilant to protect your business

Deepfakes currently make up only a small portion of all cyber fraud. The technology is not yet fully mature, and it remains complex and costly. However, that will change. Deepfakes are becoming more accessible and easier to use, making it possible for less tech-savvy cyber crooks to deploy the technology successfully.

The guiding principle is to remain vigilant and to review new information and sources with a critical eye, so as to avoid falling for scams or hoaxes. Security awareness training can help employees deal with such dangers and thus protect themselves and their company from social engineering attacks.

Keep up to date with our stories on LinkedIn, Twitter, Facebook and Instagram.
