A bipartisan group of senators introduced a new bill to facilitate the authentication and detection of AI-generated content and to protect journalists and artists from having their work ingested by AI models without their permission.
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would direct the National Institute of Standards and Technology (NIST) to create standards and guidelines to help prove the origin of content and detect synthetic content, such as through watermarking. It also directs the agency to create safeguards against manipulation and requires AI tools for creative or journalistic content to let users attach provenance information to their work and prohibits that information from being removed. Under the bill, such content also could not be used to train AI models.
Content owners, including media outlets, artists and newspapers, could sue companies they believe used their materials without permission or altered authentication markers. State attorneys general and the Federal Trade Commission could also enforce the law, which its supporters say prohibits anyone from “removing, disabling or altering provenance information from content” outside of an exception for some security research purposes.
This is the latest in a series of AI-related bills as the Senate works to understand and regulate the technology. Senate Majority Leader Chuck Schumer (D-N.Y.) led an effort to create an AI roadmap for the chamber, but made clear that new laws would be crafted in individual committees. The COPIED Act has the advantage of a powerful committee leader as its sponsor, Senate Commerce Committee Chairwoman Maria Cantwell (D-Wash.). Senate AI Task Force member Martin Heinrich (D-N.M.) and Commerce Committee member Marsha Blackburn (R-Tenn.) are also leading the bill.
Several publishing and artist groups issued statements applauding the bill’s introduction, including SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance and the Artist Rights Alliance, among others.
“AI’s ability to produce incredibly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members,” SAG-AFTRA National Executive Director and Chief Negotiator Duncan Crabtree-Ireland said in a statement. “We need a fully transparent and accountable supply chain for generative AI and the content it creates in order to protect everyone’s basic right to control the use of their face, voice and persona.”