In May, country singer Martina McBride appeared before the Senate Judiciary Subcommittee on Privacy, Technology and the Law at a hearing to speak out against AI-generated deepfakes. Last week, she expanded on her testimony at the CNBC AI Summit in Nashville.
McBride, along with Recording Industry Association of America chief policy officer Morna Willens, spoke to CNBC’s Courtney Reagan about the NO FAKES Act, a proposed bipartisan bill focused on protecting individuals’ voices and likeness.
The four-time Academy of Country Music Awards top female vocalist said speaking out on deepfakes and the need for AI guardrails is something close to her heart. “The thing that I’m most proud of in my career is my reputation and the fact that when I say something, my fans trust that it’s the truth,” McBride told an audience of tech executives, CFOs and CEOs.
AI’s ability to fake her voice or image means there’s a real chance someone could take her song lyrics, some of which shine a light on the horrors of domestic violence, and change them to belittle or justify the abuse. “At some point you can’t discern what I say and what someone manipulates me saying and that’s terrifying,” she said.
Willens said that in her role at the RIAA she spends “100% of her time” on AI because the technology is moving so fast. “I’m talking to artists, managers and lawmakers, trying to figure out next steps,” she said. “Is it regulation? I don’t know, but we need some sort of guardrail around this technology.”
McBride told the Nashville audience that among the biggest deepfake dangers are scams. She said one of her fans nearly sold his home to raise money because an AI-generated “Martina McBride” said she needed cash. “AI is just going to make these kinds of scams even more dangerous,” she said.
The criticism that the music industry is somehow anti-AI or anti-technology is false, Willens said. “The music industry has been on the front edge of technology for a while,” she said. Labels and artists have worked with Apple Music and Spotify on licensing for years, she explained, so the idea that sorting through artists’ catalogs and output is too complex is simply not true.
The problem, said Willens, is that there’s no transparency among the big AI companies. “We can’t tell if they’re training on Martina’s music for instance,” she said. “And if she doesn’t know what they’re training on, she can’t enforce her rights.”
McBride was asked about the impact of deepfakes on the careers and livelihoods of young singers just starting out. “If someone can invade that artist-fan bond and distort the story a young artist tells the world about who they are, careers could be lost before they truly get started,” she said.
Another aspect of the danger posed by AI-generated deepfakes centers on retaliation, McBride said.
“If you lose your house because a deepfake of an artist says they need money and you never get that money back, that’s an angry situation,” she said. “I’m on stage in front of thousands of people and I don’t know how long it will be until I don’t feel safe doing that. That’s a real physical danger that we need to think about.”