California governor vetoes hotly contested AI safety bill
California governor Gavin Newsom has vetoed a hotly debated artificial intelligence (AI) bill, arguing it would hinder innovation while failing to protect the public from the “real” threats posed by the technology.
Newsom vetoed SB 1047 — known as the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act — on Sept. 30. The bill had garnered significant pushback from Silicon Valley.
It proposed mandatory safety testing of AI models, among other guardrails, which tech firms worried would stifle innovation.
In a Sept. 29 statement, Newsom said the bill focused too much on regulating existing top AI firms without protecting the public from the “real” threats posed by the new technology.
“Instead, the bill applies stringent standards to even the most basic functions — so long as a large system deploys it. I do not believe this is the best approach to protecting the public from real threats posed by the technology.”
Penned by San Francisco Democratic Senator Scott Wiener, SB 1047 would also require developers in California — including big names such as ChatGPT maker OpenAI, Meta, and Google — to implement a “kill switch” for their AI models and publish plans for mitigating extreme risks.
Had the bill been enacted, AI developers could also have been sued by the state attorney general over ongoing threats posed by their models, such as an AI takeover of the power grid.
Newsom said he had asked the world’s leading AI safety experts to help California “develop workable guardrails” focused on creating a “science-based trajectory analysis.” He added that he had ordered state agencies to expand their assessment of the risks of potential catastrophic events stemming from AI development.
Even though Newsom vetoed SB 1047, he said adequate safety protocols for AI must be adopted, adding that regulators can’t afford to “wait for a major catastrophe to occur before taking action to protect the public.”
Newsom added that his administration had signed more than 18 bills concerning AI regulation in the previous 30 days.
Politicians, big tech push back on AI safety bill
The bill was unpopular among lawmakers, advisors, and big technology firms in the lead-up to Newsom's decision.
Former House Speaker Nancy Pelosi and firms such as OpenAI said it would significantly hinder the growth of AI.
Neil Chilson, head of AI policy at the Abundance Institute, warned that while the bill primarily targeted large models costing more than $100 million, its scope could easily be expanded to crack down on smaller developers as well.
But some were open to the bill. Billionaire Elon Musk, who is developing his own AI model dubbed “Grok,” is among a select few tech leaders in favor of the bill and of sweeping AI regulation more broadly.
In an Aug. 26 post on X, Musk said, “California should probably pass the SB 1047 AI safety bill,” while conceding that standing behind the bill was a “tough call.”