California Governor Gavin Newsom, seen speaking to reporters after September's presidential debate. Credit: Getty Images

California governor vetoes controversial AI safety bill

Newsom says SB-1047 ignored "smaller, specialized models" and curtailed innovation.

Ars Technica

California Governor Gavin Newsom has vetoed SB-1047, a controversial artificial intelligence regulation that would have required the makers of large AI models to implement safety testing and kill switches to prevent potential "critical harms."

In a statement announcing the veto on Sunday evening, Newsom suggested the bill's specific interest in model size was misplaced. "By focusing only on the most expensive and large-scale models, SB-1047 establishes a regulatory framework that could give the public a false sense of security about controlling this fast-moving technology," Newsom wrote. "Smaller, specialized models may emerge as equally or even more dangerous than the models targeted by SB-1047—at the potential expense of curtailing the very innovation that fuels advancement in favor of the public good."

Newsom mentioned specific "rapidly evolving risks" from AI models that could be regulated in a more targeted way, such as "threats to our democratic process, the spread of misinformation and deepfakes, risks to online privacy, threats to critical infrastructure, and disruptions in the workforce." California already has a number of AI laws on the books targeting some of these potential harms, and many other states have enacted similar laws.

"While well-intentioned, SB-1047 does not take into account whether an Al system is deployed in high-risk environments, involves critical decision-making or the use of sensitive data," Newsom continued in explaining the veto. "Instead, the bill applies stringent standards to even the most basic functions—so long as a large system deploys it. I do not believe this is the best approach to protecting the public from real threats posed by the technology."

In a social media post, State Senator Scott Wiener, who co-authored the bill, called Newsom's veto "a setback for everyone who believes in oversight of massive corporations that are making critical decisions that affect the safety and welfare of the public and the future of the planet." Voluntary commitments to safety from AI companies are not enough, Wiener argued, adding that the lack of effective government regulation means "we are all less safe as a result" of the veto.

A hard-fought lobbying battle

SB-1047, which passed the state Assembly in August, had the support of many luminaries in the AI field, including Geoffrey Hinton and Yoshua Bengio. But others in and around the industry criticized the bill's heavy-handed approach and worried about the legal liability it could have imposed on the makers of open-weight models that were used by others for harmful purposes.

Shortly after the bill was passed, a group of California business leaders sent an open letter to Governor Newsom urging him to veto what they called a "fundamentally flawed" bill that "regulates model development instead of misuse" and which they said would "introduce burdensome compliance costs." Lobbyists for major tech companies including Google and Meta also publicly opposed the bill, though a group of employees from those and other large tech companies came out in favor of its passage.

OpenAI Chief Strategy Officer Jason Kwon publicly urged an SB-1047 veto, saying in an open letter that federal regulation would be more appropriate and effective than "a patchwork of state laws." Early attempts to craft such federal legislation have stalled in Congress amid the release of some anodyne policy road maps and working group reports.

xAI leader Elon Musk advocated for the bill's passage, saying it was "a tough call" but that in the end, AI needs to be regulated "just as we regulate any product/technology that is a potential risk to the public." California's powerful actors' union, SAG-AFTRA, also came out in support of the bill, calling it a "first step" toward protecting its members against known dangers like deepfakes and the nonconsensual use of their voices and likenesses.

At the 2024 Dreamforce conference earlier this month, Newsom publicly addressed "the sort of outsized impact that legislation [like SB-1047] could have, and the chilling effect, particularly in the open source community... I can’t solve for everything. What can we solve for?"