
AI Biden Robocall Mastermind Steve Kramer Insists He “Did the Right Thing” to Spotlight AI Ethics

Last Updated February 27, 2024 7:54 PM
Samantha Dunn

Key Takeaways

  • The mastermind behind an AI-generated voice recording of President Biden has been identified.
  • A Democratic consultant hired a New Orleans magician to create the AI voice recording.
  • With the US election season well underway, the use of deepfakes to spread misinformation is a threat to democracy.

Steve Kramer, a seasoned Democratic consultant, has been identified as the architect behind a controversial robocall campaign that mimicked the voice of President Joe Biden, stirring a nationwide conversation on the ethics of artificial intelligence in political campaigning. The incident, which unfolded just days before the New Hampshire primary on January 23, 2024, was intended as a demonstration of AI ethics rather than an attempt to sway the election outcome, according to Kramer.

“Maybe I’m a villain today, but I think in the end we get a better country and better democracy because of what I’ve done, deliberately,” Kramer claimed.

The AI Biden Robocall

Misleading robocalls targeted New Hampshire residents on January 21, 2024, ahead of the state’s Presidential Primary Election. These calls, featuring an AI-generated voice clone of President Joe Biden, falsely instructed voters to abstain from participating in the January 23 primary.

The message stated: “It’s important that you save your vote for the November election,” as well as “Your vote makes a difference in November, not this Tuesday.”

A subsequent investigation swiftly traced the deceptive calls to Life Corporation, a Texas-based entity, and an individual named Walter Monk.

Since then, a New Orleans magician has come forward to claim he was paid to make the false recordings.

Multiple Individuals Involved

NBC News reported that Paul David Carpenter, a New Orleans magician, was hired by Steve Kramer in January to create an imitation of the president’s voice.

“I was in a situation where someone offered me some money to do something, and I did it. There was no malicious intent. I didn’t know how it was going to be distributed,” he told NBC News.

Kramer has responded to allegations surrounding his involvement in the Biden robocalls, stating:

“I wrestled in college, I’m ready for the fight. If they want to throw me in jail, good luck. Good luck, and I mean that. If they want to fine me for doing the right thing when they didn’t do the right thing, even though it’s been their job and they went to a fancy law school? Well, you’ve proven a point,” he added.

Presidential Candidate Dean Phillips Breaks His Silence

Presidential candidate Dean Phillips, who had previously hired Kramer, broke his silence on February 23, sharing a statement on X condemning the behavior.

One X user, responding to Phillips’ post, criticized him for his choice in hiring Kramer, noting: “Your consultant – Steven Kramer – was a Kanye 2020 staffer who was sued in 2021 for sabotaging a New York mayoral candidate’s ballot signature collection. What impressed you about Kramer’s resume that you decided to hire him? Do you plan on keeping your hiring manager?”

In a follow-up post, Phillips revealed that Kramer is no longer part of his campaign.

AI Ethics in Elections

Concerns surrounding the use of AI in elections have led legislators in many states to propose bills that specifically target AI-generated media in election campaigns.

Kansas legislators have proposed a bill that would prohibit political campaigns from using AI-generated representations of electoral candidates. The state previously passed a related generative artificial intelligence policy that took effect in July 2023.

Recent examples of AI misuse have heightened concerns specifically around deepfakes. Manipulated videos of public figures appearing to say things they never said, or altered images designed to mislead viewers, have the potential to cause serious harm to democratic practices.
