California Gov. Gavin Newsom (D) signed into law a raft of artificial intelligence bills Tuesday, aimed at curbing the effects of deepfakes during elections and protecting Hollywood performers from their likenesses being replicated by AI without their consent.
There is growing worry about deepfakes circulating during the 2024 campaign, and concerns over Hollywood’s use of artificial intelligence were a prominent part of last year’s historic actors strike. California is home to “32 of the world’s 50 leading AI companies, high-impact research and education institutions,” according to Newsom’s office, forcing his government to balance the public’s welfare with the ambitions of a rapidly evolving industry.
“Safeguarding the integrity of elections is essential to democracy, and it’s critical that we ensure AI is not deployed to undermine the public’s trust through disinformation — especially in today’s fraught political climate,” Newsom said Tuesday in a statement.
Among the measures is A.B. 2655, which requires large online platforms to remove or label deceptive, digitally altered or digitally created election-related content during certain periods before and after an election. He also signed A.B. 2839 — which expands the time frame during which people and entities are prohibited from knowingly sharing election material containing deceptive AI-generated or manipulated content — and A.B. 2355, which requires election advertisements to disclose whether they use AI-generated or substantially altered content.
In July, after X owner Elon Musk retweeted an altered Kamala Harris campaign advertisement, Newsom wrote on social media that “manipulating a voice in an ‘ad’ like this one should be illegal” and committed to signing a bill “to make sure it is.”
Despite the bills signed Tuesday, it remains unclear whether Newsom will sign or veto S.B. 1047, a measure that would make AI companies liable if their technology is used for harm and that is fiercely opposed by much of the tech industry.
Venture capitalists and start-up founders say it would stifle innovation as developers worry about unforeseen uses of AI technology that they build. The bill’s author, Sen. Scott Wiener (D), says it simply seeks to formalize commitments that AI companies have made about trying to keep their tech from being used for ill.
The new laws also include two measures for actors and performers that Newsom said will ensure the industry “can continue thriving while strengthening protections for workers and how their likeness can or cannot be used.”
A.B. 2602 requires contracts to specify how AI-generated replicas of a performer’s voice or likeness will be used. A.B. 1836 prohibits commercial use of digital replicas of deceased performers without the consent of their estates.
The use of AI in entertainment — whether through the consensual replication of performances like James Earl Jones’s Darth Vader voice or the warnings from several celebrities about AI-altered images of them circulating online without their consent — is hotly debated.
Last year, the actors union SAG-AFTRA secured a contract with safeguards against AI, including a requirement that actors give studios “informed consent” and receive “fair compensation” for the creation of digital replicas, The Washington Post reported.
Union President Fran Drescher praised the bills in a Tuesday statement for expanding on AI protections that actors “fought so hard for last year” and thanked Newsom for “recognizing that performers matter, and their contributions have value.”
Duncan Crabtree-Ireland, SAG-AFTRA’s national executive director, added: “No one should live in fear of becoming someone else’s unpaid digital puppet.”