A bipartisan group of U.S. senators introduced a bill Thursday that aims to protect creatives and their content from unauthorized use by artificial intelligence developers.
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) requires developers or providers of AI tools to allow content owners to attach “content provenance information” to their work, which is defined as “machine-readable information documenting the origin and history” of a digital asset.
If the content has provenance information attached, the bill prohibits using it to train AI models or generate AI content without the creator’s approval. This gives creators the ability to protect and set terms of use for their work, including compensation. Internet platforms, search engines, and social media companies are also prohibited from interfering with content provenance information.
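To make the idea concrete, here is a rough sketch only (the bill does not prescribe a format) of what "machine-readable information documenting the origin and history" of a digital asset could look like. The field names, the SHA-256 hash, and the nod to existing provenance standards such as C2PA are assumptions for illustration, not anything the COPIED Act specifies.

```python
import hashlib
import json
from datetime import datetime, timezone


def build_provenance_record(asset_bytes: bytes, creator: str, license_terms: str) -> dict:
    """Build a minimal machine-readable provenance record for a digital asset.

    The field names are illustrative only; a real record would follow a
    published standard such as C2PA rather than this ad hoc layout.
    """
    return {
        # Hash ties the record to this exact sequence of bytes.
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "creator": creator,
        "created_at": datetime.now(timezone.utc).isoformat(),
        # Terms the owner attaches, e.g. "no AI training without written consent".
        "license_terms": license_terms,
        # Later edits to the asset would append entries here.
        "edit_history": [],
    }


if __name__ == "__main__":
    sample_asset = b"demo: raw bytes standing in for a song, photo, or article"
    record = build_provenance_record(
        sample_asset, "Example Artist", "No AI training or generation without consent"
    )
    print(json.dumps(record, indent=2))
```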
Under the COPIED Act, owners of covered content and state attorneys general can sue violators who use work without permission. The National Institute of Standards and Technology would develop guidelines and standards for content provenance information, watermarking, and synthetic content detection.
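Continuing the illustration above, one simple check a platform or rights holder could run is whether an asset still matches its attached provenance record. This is a hypothetical sketch, not a procedure specified in the bill or by NIST.

```python
import hashlib


def matches_provenance(asset_bytes: bytes, provenance_record: dict) -> bool:
    """Return True if the asset's bytes still match the hash in its provenance record.

    Hypothetical check: a mismatch would suggest the asset was altered or its
    provenance information was stripped or swapped.
    """
    recorded = provenance_record.get("asset_sha256")
    return recorded == hashlib.sha256(asset_bytes).hexdigest()


if __name__ == "__main__":
    asset = b"demo: raw bytes standing in for a song, photo, or article"
    record = {"asset_sha256": hashlib.sha256(asset).hexdigest()}
    print(matches_provenance(asset, record))          # True: asset untouched
    print(matches_provenance(asset + b"!", record))   # False: asset altered
```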
U.S. Sens. Maria Cantwell, D-Washington, Marsha Blackburn, R-Tennessee, and Martin Heinrich, D-New Mexico, introduced the bill. Sen. Heinrich believes the COPIED Act will help crack down on harmful AI-generated content.
Various groups representing the creative community have endorsed the bill, including SAG-AFTRA, Nashville Songwriters Association International, and the Recording Academy. These groups have also signed contracts protecting their members against AI infringement.