Tell the FTC to protect everyone from AI voice deepfakes
Sign The Petition
Consumer Reports’ latest investigation of six popular voice cloning apps found that it is easy to create deepfake voices of other people without their knowledge or consent. That opens the door to ‘imposter scams’ and other trickery that helped drain nearly $3 billion from Americans’ bank accounts in 2023.
Surprisingly, no federal law specifically prohibits someone from making an AI fake of your voice without your consent. The Federal Trade Commission has prohibited the impersonation of government officials and businesses, but not of individuals. Join us in urging the FTC to extend these protections to all Americans, and in calling on law enforcement to investigate voice cloning apps and hold companies accountable if they harm consumers!
Petition to the FTC and state Attorneys General
A new Consumer Reports investigation of six voice cloning apps found that it is easy to create deepfake voices of other people without their knowledge or consent. Considering that Americans lost nearly $3 billion to ‘imposter scams’ in 2023 alone, we urge the FTC to finalize a rule prohibiting the impersonation of individuals and to hold companies accountable when they knowingly facilitate fraud. The FTC should also ensure it has the staffing needed to investigate these voice cloning companies. We further urge our state Attorneys General to use their own laws and enforcement tools to investigate voice cloning apps, and to hold them accountable if they do not do enough to protect consumers.