08/07/2024
10 Steps to Protect Your Business from the Latest AI Workplace Scam
Source: Fisher Phillips, August 1, 2024
A prominent cybersecurity training company just fell victim to an increasingly common scam when it hired a remote worker who turned out to be a North Korean cybercriminal who used AI deepfake tools to fake his identity and infiltrate the organization. The Florida-based company caught the would-be thief before he was able to steal any data, but he did attempt to load malware and execute unauthorized programs on firmwide systems in what could have been a damaging attack. “If something like this can happen to us, it can happen to almost anyone,” the CEO said in the wake of the cyberattack. What are the 10 things you can do to ensure you don’t fall for the same scam?
What Really Happened?
The following information comes straight from the company CEO himself, who explained exactly what happened in a series of blog posts.
- The cybercriminal was using a valid but stolen identity of a U.S. citizen. He used AI tools to enhance a stock photo so that it appeared to be a brand-new person.
- He most likely used AI tools to alter his voice, and may have also used AI technology to change his image (as was successfully done in the recent $25M Hong Kong heist).
- The fake worker had his company computer sent to a physical address somewhere in the U.S. that turned out to be an “IT mule laptop farm.” He then used a VPN to mask where he actually was – which turned out to be in North Korea or right over the border in China.
- He worked in the middle of the night in Asia to make it seem like he was working in the daytime here in the States.
- Other details about the attack are not yet available because the matter is part of an active FBI investigation, but the point remains – if it can happen to a cybersecurity company, it can happen to you.
10 Steps to Protect Your Organization in the Remote Work Era
1. Foster a culture of skepticism when it comes to hiring remote workers, similar to the way employees are now on guard for phishing emails.
2. Train your hiring team on the social engineering tactics now being employed by malicious cyberattackers.
3. Conduct all video interviews with the camera on and train your hiring team to look for deepfake signatures (blurry details, irregular lighting, unnatural eye or facial movements, mismatched audio, absence of emotion, etc.). As the technology improves, you should also consider investing in threat-detection tools that can identify and flag potential deepfakes.
4. Consider the feasibility of conducting in-person interviews, even for remote positions. Even mentioning that the process includes an in-person interview may dissuade scammers from continuing with the interview process.
5. Make sure any laptops provided to new hires are completely wiped of any residual company data, including data stored in web browsers.
6. Only ship laptops to the physical address where the employee lives, or send them to a trusted third party (such as a reputable delivery service office) where new employees are required to provide a valid picture ID to obtain them.
7. Start new employees in a highly restricted environment where they have access only to the systems necessary to perform their work. Ensure they don’t have immediate access to production systems or sensitive data.
8. Ensure your IT security monitoring systems are robust, up to date, and configured to look for attempts to access unauthorized systems or download improper files.
9. Audit all of your hiring practices to ensure your hiring team consistently follows best practices on background checks, references, resume review, interviews, and more.
10. Conduct regular security awareness training sessions for all employees on the latest cybersecurity threats and how to recognize and report them.
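The restricted-environment recommendation above boils down to deny-by-default, role-based access: a new hire's account should be able to reach only an explicit shortlist of systems. Here is a minimal sketch of that idea in Python; all of the role and system names (new_hire, production, etc.) are illustrative assumptions, not a reference to any particular product.

```python
# Deny-by-default access check for onboarding: a new hire starts in a
# "new_hire" role that exposes only the systems needed to get started.
# Role names and system names below are hypothetical examples.

ROLE_ALLOWLISTS = {
    "new_hire": {"email", "hr_portal", "training_env"},
    "developer": {"email", "hr_portal", "source_control", "staging"},
    "sre": {"email", "hr_portal", "source_control", "staging", "production"},
}

def can_access(role: str, system: str) -> bool:
    """Grant access only if the system is on the role's explicit
    allowlist; unknown roles and unlisted systems are denied."""
    return system in ROLE_ALLOWLISTS.get(role, set())

# A brand-new remote hire cannot reach production systems or sensitive data:
assert can_access("new_hire", "training_env") is True
assert can_access("new_hire", "production") is False
```

The key design choice is that absence from the allowlist means denial, so a fraudulent hire gains nothing by default and must be deliberately promoted into broader roles over time.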
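The monitoring recommendation above amounts to a simple detection rule: alert when an account repeatedly tries to reach systems outside its allowlist, as the fake worker did when he attempted to run unauthorized programs. A minimal sketch, assuming a simplified (user, system) event log and an illustrative alert threshold:

```python
# Flag accounts that repeatedly attempt to access systems they are not
# authorized for. The event format and threshold are assumptions made
# for illustration, not a real SIEM interface.
from collections import Counter

def flag_unauthorized_attempts(events, allowlist, threshold=3):
    """events: iterable of (user, system) access attempts.
    allowlist: dict mapping each user to their permitted systems.
    Returns the set of users with `threshold` or more attempts
    on systems outside their allowlist."""
    denied = Counter(
        user for user, system in events
        if system not in allowlist.get(user, set())
    )
    return {user for user, count in denied.items() if count >= threshold}

events = [
    ("new_hire_1", "email"),          # permitted
    ("new_hire_1", "production_db"),  # denied
    ("new_hire_1", "production_db"),  # denied
    ("new_hire_1", "payroll"),        # denied
]
allowlist = {"new_hire_1": {"email", "hr_portal"}}
print(flag_unauthorized_attempts(events, allowlist))  # {'new_hire_1'}
```

In a real deployment this rule would run inside your existing monitoring stack over authentication and endpoint logs, but the logic is the same: combine the least-privilege allowlists from onboarding with alerting on repeated violations.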

