The article discusses a growing issue where fraudsters use artificial intelligence (AI) to pose as job candidates, particularly for remote positions at U.S. companies.

These impostors employ AI tools to create fake photo IDs, fabricate employment histories, and generate responses during interviews. According to Gartner, by 2028, one in four job applicants globally could be fake.

The article highlights a case at Pindrop Security, where a candidate named “Ivan X” used deepfake technology to disguise his appearance during a video interview. Once hired, such impostors might install malware, steal data, or simply collect a salary.

[Image: Fake remote worker]

Who is being affected?

The problem is especially prevalent at cybersecurity and cryptocurrency firms, where some CEOs report a sharp rise in fraudulent applications. The U.S. Justice Department has documented cases in which more than 300 companies hired impostors linked to North Korea, who used stolen identities to secure remote IT roles and funneled their earnings back to the country's weapons programs. To counter this, companies are turning to identity-verification services to screen applicants.

View full article at CNBC