
The Federal Trade Commission has chosen four winning submissions for its Voice Cloning Challenge to promote the development of ideas to protect consumers from the misuse of artificial intelligence-enabled voice cloning for fraud and other harms.

“Tapping American ingenuity is critical to solving big abuses like deceptive voice cloning,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “When it comes to AI-driven fraud, the FTC will continue using every tool to deter harmful practices, shut down bad actors, and spur innovative proposals to help protect consumers.”

“We’re recognizing people who are pushing science forward and proposing different options to ensure a robust landscape of solutions,” said Stephanie T. Nguyen, the FTC’s Chief Technologist. “These exciting solutions show that a multi-disciplinary approach is necessary to prevent the harms posed by voice cloning.”

The panel of judges (Arvind Narayanan, professor of computer science at Princeton University; Britt Paris, assistant professor at Rutgers University’s School of Communication & Information; and Beau Woods, CEO of Stratigos Security and a Cyber Safety Innovation Fellow with the Atlantic Council) chose three top submissions from individuals and small organizations; the winners will split a total of $35,000 in prize money. They are:

  • AI Detect: Submitted by David Przygoda and Dr. Carol Espy-Wilson of the small organization OmniSpeech, this proposal would use AI to detect AI: algorithms trained to distinguish the subtle differences between genuine and synthetic voice patterns. It is aimed at consumer and enterprise apps and devices (a generic sketch of this feature-and-classify approach follows this list).
  • DeFake: Submitted by Dr. Ning Zhang, an Assistant Professor in the Department of Computer Science and Engineering at Washington University in St. Louis, this proposal uses a form of watermarking. Because voice cloning relies on pre-existing speech samples, which are generally collected from social media and other platforms, the proposal calls for adding carefully crafted distortions to those samples that are imperceptible to the human ear but that make the samples much harder to clone accurately.
  • OriginStory: Submitted by Dr. Visar Berisha, Drena Kusari, Dr. Daniel W. Bliss, and Dr. Julie M. Liss of the small organization OriginStory, this technology aims to authenticate the human origin of voice recordings at the point of creation. It uses off-the-shelf sensors already integrated into many devices to measure speech acoustics alongside the co-occurring biosignals in the throat and mouth that a person produces when speaking, validating the recording’s human origin and embedding that authentication as a type of watermark in the audio stream.
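
The detection-style entries share a common pattern: extract features from an audio clip and score them with a trained model. The sketch below is a rough, hypothetical illustration of that general idea only, not any contestant’s actual system; the spectral feature extraction, sample rate, and training data are all assumptions made for the example.

```python
# Illustrative only: a generic "AI to detect AI" voice classifier.
# NOT any contestant's actual system; features and data are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

RATE = 16_000  # assumed sample rate (Hz)

def spectral_features(waveform: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Summarize a waveform as average log-energy in coarse frequency bands.

    Real detectors use far richer features (e.g. MFCCs, articulatory cues);
    this stands in for that step so the example stays self-contained.
    """
    spectrum = np.abs(np.fft.rfft(waveform)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.log1p(np.array([b.mean() for b in bands]))

# Hypothetical training data: rows of features labeled 1 = genuine, 0 = synthetic.
rng = np.random.default_rng(0)
genuine = [spectral_features(rng.normal(size=RATE)) for _ in range(50)]
synthetic = [spectral_features(0.5 * rng.normal(size=RATE)) for _ in range(50)]
X = np.vstack(genuine + synthetic)
y = np.array([1] * 50 + [0] * 50)

clf = LogisticRegression(max_iter=1000).fit(X, y)

def genuine_probability(waveform: np.ndarray) -> float:
    """Score an unseen clip: estimated probability that the voice is genuine."""
    return float(clf.predict_proba(spectral_features(waveform)[None, :])[0, 1])

print(f"Score for a new clip: {genuine_probability(rng.normal(size=RATE)):.2f}")
```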

The fourth winning submission is from a large organization, Pindrop Security, which received the Recognition Award. The Pindrop team comprised Dr. Elie Khoury, Anthony Stankus, Ketuman Sardesai, and Amanda Braun. Pindrop’s Voice Cloning Detection technology detects voice clones and audio deepfakes in real time: it evaluates incoming phone calls or digital audio in two-second chunks and flags those that are potential deepfakes. (Large organizations were not eligible for monetary prizes.)
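
The snippet below is a minimal sketch of that chunked-scoring pattern, not Pindrop’s actual implementation; the detector, threshold, and sample rate are assumptions chosen only to make the example runnable.

```python
# Illustrative only: splitting audio into two-second chunks and flagging
# suspicious ones. The detector, threshold, and sample rate are assumptions.
import numpy as np
from typing import Callable, List

RATE = 8_000          # assumed telephone sample rate (Hz)
CHUNK_SECONDS = 2.0   # evaluation window described in the announcement
THRESHOLD = 0.5       # assumed decision threshold on the "genuine" score

def flag_chunks(audio: np.ndarray,
                score_fn: Callable[[np.ndarray], float]) -> List[int]:
    """Return indices of two-second chunks whose score suggests a deepfake."""
    chunk_len = int(RATE * CHUNK_SECONDS)
    flagged = []
    for start in range(0, len(audio) - chunk_len + 1, chunk_len):
        chunk = audio[start:start + chunk_len]
        if score_fn(chunk) < THRESHOLD:   # low "genuine" score -> flag
            flagged.append(start // chunk_len)
    return flagged

# Usage with a placeholder scorer (a real system would call its trained model).
rng = np.random.default_rng(1)
call_audio = rng.normal(size=RATE * 10)             # 10 seconds of dummy audio
suspicious = flag_chunks(call_audio, lambda c: float(rng.uniform()))
print("Chunks flagged as potential deepfakes:", suspicious)
```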

The four winning submissions demonstrate the potential for cutting-edge technology to help mitigate the risks of voice cloning in the marketplace. They promote innovative approaches on which key consumer protections can be built. At the same time, the results of the challenge highlight that there is no single solution to this problem. Given this, in addition to the Voice Cloning Challenge, the FTC has proposed a comprehensive ban on impersonation fraud and has affirmed that the Telemarketing Sales Rule applies to AI-enabled scam calls.

This is the sixth challenge the FTC has launched under the America COMPETES Act aimed at spurring the development of innovative solutions to complex consumer protection issues. Voice cloning technology offers potential benefits, for example by giving people with impaired speech new ways to communicate in their own voice. But it also poses significant risks to consumers and has been used by scammers to impersonate others. For example, scammers have used voice cloning technology to impersonate business executives in order to fraudulently obtain money or valuable information.

The lead FTC staffers on this matter are James Evans and Christine Barker from the FTC’s Bureau of Consumer Protection and Amritha Jayanti and Ben Swartz from the FTC’s Office of Technology.

The Federal Trade Commission works to promote competition and protect and educate consumers.  The FTC will never demand money, make threats, tell you to transfer money, or promise you a prize. Learn more about consumer topics at consumer.ftc.gov, or report fraud, scams, and bad business practices at ReportFraud.ftc.gov. Follow the FTC on social media, read consumer alerts and the business blog, and sign up to get the latest FTC news and alerts.
