A Singapore businessman lost at least S$4.9 million, or US$3.8 million, in an AI-powered deepfake scam [1].

The incident highlights the increasing sophistication of generative AI, which now allows criminals to impersonate high-ranking officials with high-fidelity video and audio to bypass traditional security instincts.

According to the Singapore Police Force, the scammers used AI-generated deepfake videos to impersonate senior Singapore government officials [1]. The fraudsters targeted the businessman by offering a fraudulent funding assistance scheme related to the Strait of Hormuz [1].

Investigators found that the scammers targeted professionals who had previously interacted with government officials to make the deception more convincing [1]. By leveraging these existing professional connections, the perpetrators were able to lure the victim into transferring the funds [1].

The Singapore Police Force issued a statement regarding the incident on May 14, 2024 [1]. The agency said there is a need for extreme caution when dealing with unexpected financial requests, even those appearing to come from trusted sources.

"The police advise members of the public to be vigilant against scams involving the impersonation of senior government officials," a Singapore Police Force spokesperson said [2].

Authorities continue to monitor the use of synthetic media in financial crimes. This case marks one of the more significant losses attributed to deepfake technology in the region, as the attackers moved beyond simple phishing emails to real-time visual deception [1].

This scam demonstrates a shift toward "hyper-personalized" social engineering, in which attackers use AI to bridge the trust gap. By combining deepfake visuals with a specific geopolitical pretext (in this case, the Strait of Hormuz) and targeting individuals with known government ties, scammers can create a level of authenticity that traditional verification methods may fail to detect.