The use of deepfakes in remote job interviews is becoming increasingly frequent. The objective? Gaining access to sensitive data


There are many things that HR managers at large companies weigh during a job interview when deciding whether or not to hire a candidate. But from now on, when the interview takes place by videoconference, they will also have to learn to scrutinize even the smallest facial gesture… to rule out the possibility that what they are seeing does not actually exist.

The FBI's Internet Crime Complaint Center issued a statement yesterday saying that the US agency is seeing a wave of complaints about the use of deepfakes. The illicit use of this technology is something we tend to associate with the spread of fake news or revenge porn, but in this case the goal is very different: they use it to apply for remote jobs.


It’s not that they really want to work, no

Indeed, cybercriminals are beginning to use AI-generated images, video and audio, in combination with stolen personal data, to present themselves as viable candidates for jobs (especially in the technology sector itself) that, once they are hired, open the door to confidential information about the company or its customers.

Audio deepfakes are sadly no longer foreign to the corporate sector: one of their earliest uses, three years ago, was to run scams by 'impersonating' a company's CEO. Now they are being used to impersonate employees.


According to the FBI statement:

“The remote job positions identified in these complaints include fields such as ICT, programming, databases, and software-related job functions. In some of the cases, the position provided access to financial data, corporate databases, or proprietary information.”

Detecting a deepfake video, especially when you are not expecting one, is harder than it might seem. It helps to keep an eye out for inaccurate skin textures or shadows that don't behave as they should.

The quality of these attempts is, admittedly, variable (although, of course, the most convincing cases may simply not have been detected yet):

“The complaints report the use of voice deepfakes during online interviews with potential applicants. In these interviews, the actions and lip movements of the person seen on camera are not fully coordinated with the audio of the person speaking. At times, actions such as coughing, sneezing, or other audible cues are inconsistent with what is presented visually.”
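The FBI does not describe any automated detection method, but the audio-visual mismatch it mentions can be roughly quantified. The sketch below is a minimal illustration in Python, assuming the OpenCV, MediaPipe and librosa libraries and using hypothetical file names and a hypothetical threshold: it compares the vertical lip opening detected in each video frame with the loudness of the audio track, and flags recordings where the two barely move together.

```python
# Illustrative sketch: compare mouth opening (video) with audio loudness over time.
# File names, the correlation threshold and the landmark choice are assumptions for
# this example, not part of the FBI guidance.
import cv2
import librosa
import numpy as np
import mediapipe as mp

VIDEO_PATH = "interview.mp4"   # hypothetical recording of the video call
AUDIO_PATH = "interview.wav"   # audio track extracted from the same recording

# 1) Per-frame mouth opening, using MediaPipe FaceMesh landmarks 13/14 (inner lips).
face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1)
cap = cv2.VideoCapture(VIDEO_PATH)
fps = cap.get(cv2.CAP_PROP_FPS)
mouth_opening = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_face_landmarks:
        lm = result.multi_face_landmarks[0].landmark
        mouth_opening.append(abs(lm[13].y - lm[14].y))  # normalized vertical lip gap
    else:
        mouth_opening.append(0.0)  # no face detected in this frame
cap.release()
mouth_opening = np.array(mouth_opening)

# 2) Audio loudness (RMS), resampled to roughly one value per video frame.
audio, sr = librosa.load(AUDIO_PATH, sr=None, mono=True)
rms = librosa.feature.rms(y=audio, hop_length=int(sr / fps))[0]
n = min(len(rms), len(mouth_opening))
rms, mouth_opening = rms[:n], mouth_opening[:n]

# 3) Crude consistency score: a real speaker's mouth opening should correlate
#    (weakly but positively) with speech energy. A near-zero or negative value
#    is a reason to review the interview manually, not proof of a deepfake.
corr = np.corrcoef(mouth_opening, rms)[0, 1]
print(f"lip/audio correlation: {corr:.2f}")
if corr < 0.2:  # hypothetical threshold
    print("Low audio-visual coherence: review this interview manually.")
```

A simple correlation like this can obviously be fooled by better lip-synced fakes, and it will also flag legitimate calls with poor audio or an off-screen speaker; it is only meant to show why the mismatches the FBI describes are, in principle, measurable.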

The FBI had already issued a warning to the US private sector last May, stating that North Korea has agents applying for jobs in the tech industry (especially in Web3 and cryptocurrency companies). In those cases, the fake workers often bid for vacancies on sites like Upwork or Fiverr… having previously gathered false documentation and references.
