Senators introduce bill to protect individuals against AI-generated deepfakes

Today, a group of senators introduced the NO FAKES Act, a bill that would make it illegal to create digital recreations of a person's voice or likeness without that individual's consent. It's a bipartisan effort from Senators Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.) and Thom Tillis (R-N.C.), fully titled the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024.

If it passes, the NO FAKES Act would create an option for people to seek damages when their voice, face or body is recreated by AI. Both individuals and companies would be held liable for producing, hosting or sharing unauthorized digital replicas, including ones made by generative AI.

We’ve already seen many instances of celebrities encountering AI imitations of themselves out in the world. Taylor Swift’s likeness was used to scam people with a fake Le Creuset cookware giveaway. A voice that sounded like Scarlett Johansson’s showed up in a ChatGPT voice demo. AI can also be used to make political candidates appear to make false statements. And it’s not only celebrities who can be impersonated.

“Everyone deserves the right to own and protect their voice and likeness, no matter if you’re Taylor Swift or anyone else,” Senator Coons said. “Generative AI can be used as a tool to foster creativity, but that can’t come at the expense of the unauthorized exploitation of anyone’s voice or likeness.”

The speed of new legislation notoriously lags behind the speed of new tech development, so it’s encouraging to see lawmakers taking AI regulation seriously. Today’s proposed act follows the Senate’s recent passage of the DEFIANCE Act, which would allow victims of sexual deepfakes to sue for damages.

Several entertainment organizations have lent their support to the NO FAKES Act, including SAG-AFTRA, the RIAA, the Motion Picture Association and the Recording Academy. Many of these groups have been pursuing their own actions to secure protection against unauthorized AI recreations. SAG-AFTRA recently went on strike to try to secure a union agreement covering performers’ likenesses in video games.

Even OpenAI is listed among the act’s backers. “OpenAI is pleased to support the NO FAKES Act, which would protect creators and artists from unauthorized digital replicas of their voices and likenesses,” said Anna Makanju, OpenAI’s vice president of global affairs. “Creators and artists should be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference.”

