
The Black Music Action Coalition (BMAC) is defending the culture. Today, they officially threw their support behind the reintroduced NO FAKES Act — a bipartisan bill to protect artists from unauthorized AI-generated deepfakes and voice clones. With tech moving fast and AI mimicking voices and faces like never before, the bill sets out to give artists the legal power to protect their image, voice, and legacy in the digital age.
The NO FAKES Act — short for Nurture Originals, Foster Art, and Keep Entertainment Safe — was reintroduced by Senators Marsha Blackburn, Chris Coons, Thom Tillis, and Amy Klobuchar, along with Representatives María Elvira Salazar, Madeleine Dean, Nathaniel Moran, and Becca Balint. It lays down a new federal property right for artists, creators, and public figures to own and defend their voice and likeness — something never seen before at the national level.
BMAC is joining forces with the Human Artistry Campaign and a powerhouse lineup of industry players including the Recording Academy, ASCAP, RIAA, and Songwriters of North America, all calling on Congress to act.
“The NO FAKES Act is a significant first step towards safeguarding artists, journalists, performers, and all those whose professional recordings and videos are maliciously exploited in AI-generated deepfakes and voice clones,” said Willie “Prophet” Stiggers, Co-Founder, CEO & President of the Black Music Action Coalition. “Black Music Action Coalition extends its gratitude to the Human Artistry Campaign, Senators Blackburn, Coons, Tillis, Durbin, Hagerty, and Klobuchar; and Representatives Salazar, Dean, Lee, Morelle, Moran, and Whitman for their unwavering efforts in introducing this bipartisan bill.”
“Protecting the integrity of music creators has never been more imperative,” said Michelle Lewis, CEO of Songwriters of North America. “Songwriters of North America thanks the bipartisan efforts of Senators Coons, Tillis, Klobuchar, and Blackburn, Representatives Salazar, Dean, Moran, and Balint, as well as our advocacy partners from RIAA, BMAC, ARA, MAC, and SAG-AFTRA, and the Recording Academy for their efforts to ensure that the true creators of musical works are recognized for their contributions, preventing the exploitation and misrepresentation of their original art.”
The act is designed to clap back against AI abuse — scams, revenge porn, manipulated content, and unauthorized use of artists’ material — and to give creators new tools to remove deepfakes without lengthy court battles. It also makes sure platforms can’t just look the other way while their tech is used to spread fake content.
With clear rules and exceptions for parody, news, and commentary, the NO FAKES Act doesn’t kill creativity — it protects it. Backed by tech giants like Google, YouTube, OpenAI, and IBM, along with major talent agencies and advocacy groups, the bill is gaining serious momentum.
As AI starts to blur the lines between real and fake, BMAC is making it known — protecting artists in this next era isn’t optional. It’s necessary.