Today, a group of senators introduced the NO FAKES Act, a law that would make it illegal to create digital recreations of a person’s voice or likeness without that person’s consent. It’s a bipartisan effort by Sens. Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.), and Thom Tillis (R-N.C.); the bill’s full name is the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024.
If passed, the NO FAKES Act would create an option for people to seek damages when their voice, face, or body is recreated by AI. Both individuals and companies would be liable for producing, hosting, or sharing unauthorized digital replicas, including those created by generative AI.
We've already seen plenty of cases of celebrities having AI impersonations of themselves spotted out in the world. Taylor Swift's likeness was used to scam people with a giveaway of fake Le Creuset cookware. A voice that sounded like Scarlett Johansson's appeared in a ChatGPT voice demo. AI can also be used to make political candidates appear to be making false statements. And it's not just celebrities who can be targeted.
“Everyone deserves the right to own and protect their voice and image, regardless of whether you are Taylor Swift or anyone else,” said Senator Coons. “Generative AI can be used as a tool to foster creativity, but that cannot come at the expense of unauthorized exploitation of anyone’s voice or image.”
New legislation notoriously lags behind the development of new technologies, so it is encouraging to see lawmakers getting serious about regulating AI. Today’s proposed law follows the Senate’s recent passage of the DEFIANCE Act, which would allow victims of sexually explicit deepfakes to sue for damages.
Several entertainment organizations have lent their support to the NO FAKES Act, including SAG-AFTRA, the RIAA, the Motion Picture Association, and the Recording Academy. Many of these groups have been taking their own actions to gain protection from unauthorized AI re-creations. Most recently, SAG-AFTRA went on strike in an effort to secure a union agreement covering the use of performers' voices and likenesses in video games.
Even OpenAI is listed among the bill’s backers. “OpenAI is pleased to support the NO FAKES Act, which would protect creators and artists from unauthorized digital replicas of their voices and likenesses,” said Anna Makanju, OpenAI’s vice president of global affairs. “Creators and artists must be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference.”