First Documented AI Voice Clone Fraud: CEO's Voice Transfers €220,000

The CEO of a UK energy company received a phone call from what sounded exactly like his parent company's German CEO and, trusting the familiar voice, transferred €220,000 to a Hungarian bank account. The voice was an AI-generated clone: the first documented case of AI voice cloning used for financial fraud.

UK Energy Company (Unnamed)·2019·2 min read

Background

In 2019, AI-powered voice synthesis had reached a quality threshold where a convincing clone could be generated from a few minutes of public audio. Corporate executives who appeared in press releases, investor calls, or media interviews had sufficient public audio available for cloning.

The Attack

The UK energy company's CEO received a call that appeared to originate from Germany and sounded exactly like the voice of his parent company's CEO — someone he knew and regularly spoke with. The voice explained that an urgent acquisition required an immediate €220,000 transfer to a Hungarian supplier, to be reimbursed within the hour. The CEO complied. When he called back to confirm, the real German CEO had no knowledge of the call. The money moved from Hungary to Mexico before it could be recovered.

Response

The company's insurer, Euler Hermes, reported the case publicly while withholding names. The incident was documented by the Wall Street Journal. Law enforcement investigated but no arrests were reported. The case prompted financial services organisations to begin reviewing voice-based verification procedures.

Outcome

The €220,000 loss was the first documented case where AI voice synthesis — not a human impersonator — was used to commit financial fraud. It established that AI-generated voices had surpassed the quality threshold for telephone fraud and foreshadowed the explosive growth of voice cloning fraud in subsequent years.

Key Takeaways

  1. Establish a company-wide protocol: pre-agreed code words for any out-of-ordinary financial request, regardless of how recognisable the voice sounds
  2. Wire transfers must always be authorised via a second independent communication channel — call back on a known number
  3. By 2019, AI voice synthesis could convincingly imitate a speaker to a listener who knew the voice well; do not rely on voice recognition for financial authorisation
  4. Limit public audio exposure of executives who have budget authority — be selective about recordings posted online

How to Prevent This


Establish a company-wide code word for verifying unusual executive requests

AI voice cloning and deepfake audio have reached a level of realism at which a CEO's voice on a phone call is no longer a reliable identity signal. The first documented AI voice clone fraud cost €220,000 because the recipient trusted the voice. Establish a company-wide protocol for any unusual financial or sensitive request: a pre-agreed code word or phrase, changed monthly and distributed only through internal secure channels. Any request from a senior executive that does not include the current code word, no matter how convincing the voice sounds or how urgent the matter seems, requires independent verification before action.
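One way the monthly rotation could work in practice is to derive each month's code word deterministically from a shared secret, so nothing new needs to be distributed once the secret is in place. This is a minimal sketch, not a vetted design; the secret, word list, and function names are all hypothetical illustrations, and a real deployment would manage the secret in a proper vault.

```python
import hashlib
import hmac
from datetime import datetime, timezone

# Hypothetical shared secret, distributed once via an internal secure channel.
# The spoken code word derived from it rotates automatically each calendar month.
SECRET = b"replace-with-a-real-shared-secret"

# Illustrative word list; any agreed vocabulary works.
WORDLIST = [
    "granite", "harbour", "juniper", "lantern", "meridian", "nimbus",
    "obsidian", "paladin", "quartz", "rampart", "sequoia", "timberline",
    "umbra", "vanguard", "willow", "zenith",
]


def current_code_word(now=None):
    """Return this month's code word, derived deterministically from SECRET."""
    now = now or datetime.now(timezone.utc)
    period = f"{now.year}-{now.month:02d}".encode()
    digest = hmac.new(SECRET, period, hashlib.sha256).digest()
    return WORDLIST[digest[0] % len(WORDLIST)]


def verify_request(spoken_word):
    """Act on an unusual request only if it carries the current code word."""
    return hmac.compare_digest(spoken_word.lower(), current_code_word())
```

Because the word is a pure function of the secret and the month, everyone who holds the secret agrees on the current word without any further communication, and a caller who only sounds like the CEO cannot produce it.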

See: AI Voice Clone CEO Fraud · Social Engineering Defence

Do not rely on voice recognition for financial authorisation — any voice can be cloned

AI voice synthesis has surpassed the quality threshold for telephone fraud. The first documented AI voice clone fraud transferred €220,000 in 2019. Garmin's CEO, financial executives at multiple Fortune 500 companies, and even LastPass's CEO have had their voices cloned for fraud attempts. For any financial transaction, the voice heard on a phone call is no longer sufficient authorisation. Require independent channel verification (a separate text or email from a known internal system, a coded authorisation number from your financial controls platform) for any wire transfer, regardless of whether the requesting voice sounds exactly like your CEO.
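The independent-channel requirement can be made structural rather than a matter of discipline: the wire desk never releases a transfer on a phone request alone, but holds it pending a confirmation code that travels over a separate channel. The sketch below is a hypothetical illustration of that workflow, assuming the code is delivered out of band (e.g. through an internal ticketing system); the class and method names are not from any real financial controls platform.

```python
import secrets
from dataclasses import dataclass, field


@dataclass
class TransferRequest:
    """A wire request that stays pending until confirmed out of band."""
    amount_eur: int
    beneficiary: str
    confirmation_code: str = field(default_factory=lambda: secrets.token_hex(4))
    confirmed: bool = False


class WireDesk:
    def __init__(self):
        self.pending = {}  # confirmation_code -> TransferRequest

    def request(self, amount_eur, beneficiary):
        """Register a request; return the code, which is then delivered to the
        approver via an independent channel, never read out over the phone."""
        req = TransferRequest(amount_eur, beneficiary)
        self.pending[req.confirmation_code] = req
        return req.confirmation_code

    def confirm(self, code):
        """Release the transfer only when the out-of-band code is presented."""
        req = self.pending.get(code)
        if req is None:
            return False
        req.confirmed = True
        return True
```

Under this structure, the voice on the call can initiate a request but can never complete one: release requires a code the caller has no way to know unless they also control the second channel.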

See: AI Voice Clone CEO Fraud · AI & Emerging Threats
voice cloning · CEO impersonation · BEC · AI synthesis · wire fraud