Biden Robocall Deepfake: Synthetic Voice Suppresses New Hampshire Primary Voters

Thousands of New Hampshire voters received robocalls using a synthetic AI clone of President Biden's voice telling them not to vote in the primary — the first documented use of deepfake audio for voter suppression.

US Political System / New Hampshire · 2024 · 2 min read

Background

The 2024 New Hampshire Democratic presidential primary included Biden as a write-in candidate. Days before the primary, an AI-generated voice mimicking Biden was used in a coordinated robocall campaign.

The Attack

Political consultant Steven Kramer commissioned deepfake audio of Biden's voice, which he later claimed was a demonstration project. The calls used phrases Biden had actually spoken, edited and synthesised to deliver the message: "It's important that you save your vote for the November election... Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again." Thousands of New Hampshire residents received the calls, which featured a voice convincingly similar to Biden's and were distributed by a Texas-based political robocalling company.

Response

The robocalls were reported to the FCC, state attorneys general, and the FBI, and New Hampshire's Attorney General opened an investigation. The FCC subsequently issued a declaratory ruling that AI-generated voices in robocalls count as "artificial" under the Telephone Consumer Protection Act, making such calls illegal without prior consent. The FCC fined Steven Kramer $6 million, and the robocalling company that transmitted the calls was also fined.

Outcome

The Biden robocall was the first election-related audio deepfake incident to result in regulatory action and fines. The FCC's ban on AI-voiced robocalls, issued within weeks of the incident, was an unusually rapid regulatory response. The incident also raised broader concerns about synthetic audio in electoral contexts that remain the subject of active policy debate.

Key Takeaways

  1. AI audio in political communications demands disclosure requirements — regulation is emerging but still lags behind the technology's capabilities
  2. Voter education about deepfake robocalls is essential before major elections
  3. Telecommunications carriers must implement deepfake audio detection for robocall screening
  4. The ease of producing deepfake audio for mass distribution makes electoral influence operations dramatically cheaper
Tags: deepfake audio · voter suppression · election interference · Biden · FCC