Deepfake CFO Video Call: Hong Kong Finance Worker Pays $25 Million

A finance employee at a multinational company was tricked into transferring $25 million after attending a video conference call populated entirely by AI-generated deepfakes of the company's CFO and other colleagues.

Unnamed Multinational (Hong Kong) · 2024 · 2 min read

Attack Chain

  1. Video and voice deepfakes of the CFO and colleagues generated
  2. Finance worker joins video call
  3. Wire transfer instructions given
  4. $25M transferred to attacker accounts
  5. Fraud discovered days later

Background

AI-generated video deepfake technology crossed a quality threshold in 2023-2024: real-time generation of convincing video became possible on modest, commercially available hardware. The cost of producing a convincing deepfake fell from studio-scale budgets to effectively nothing.

The Attack

An employee at a Hong Kong branch of an unnamed multinational received an email — apparently from the UK head office — requesting a confidential wire transfer. Suspicious, the employee joined a video conference call with what appeared to be the CFO and several known colleagues to verify the request. In fact, every participant except the employee was an AI-generated deepfake, their likenesses and voices synthesised from publicly available video footage. Convinced by the video verification, the employee authorised 15 transfers totalling $25.6 million to five different bank accounts.

Response

The employee later raised concerns with headquarters and discovered the fraud. Hong Kong police investigated and arrested six people connected to the scheme. The technology used was determined to be commercially available deepfake software. No recovery of the funds was reported.

Outcome

The attack marked a qualitative shift in social engineering: for the first time, visual and voice verification — previously considered reliable confirmation methods — were successfully faked. The FBI issued guidance stating that deepfakes are now a primary business email compromise technique.

Key Takeaways

  1. Video and voice verification are no longer reliable authentication methods — use out-of-band code words
  2. Any wire transfer above a threshold should require a physical callback to a verified, pre-registered number
  3. Train finance staff on deepfake video fraud — the faces and voices of executives can now be faked in real time
  4. Implement dual authorisation for all large outbound wire transfers, regardless of who appears to approve them
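The threshold-callback and dual-authorisation controls in the takeaways above can be sketched as a simple approval gate. This is an illustrative assumption, not code from the incident: the threshold, field names, and `approve` function are all hypothetical, and the key design point is that video or voice confirmation alone never satisfies the check.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the controls above: transfers over a threshold
# require a callback to a pre-registered number plus two distinct human
# authorisers. Threshold and names are illustrative assumptions.

CALLBACK_THRESHOLD = 10_000  # amounts above this trigger extra controls


@dataclass
class TransferRequest:
    amount: float
    destination_account: str
    authorisers: set = field(default_factory=set)
    # Set True only after a phone callback to a verified, pre-registered
    # number -- never on the basis of a video or voice call alone.
    callback_verified: bool = False


def approve(req: TransferRequest) -> tuple:
    """Return (approved, reason) for a wire transfer request."""
    if req.amount <= CALLBACK_THRESHOLD:
        return True, "below threshold"
    if not req.callback_verified:
        return False, "callback to pre-registered number required"
    if len(req.authorisers) < 2:
        return False, "dual authorisation required"
    return True, "callback and dual authorisation satisfied"
```

Under this sketch, the $25.6M request in the incident would have been blocked twice over: a single employee authorised it, and no out-of-band callback was made.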
Tags: deepfake, video call, BEC, AI fraud, wire transfer