
California Governor Newsom vetoes AI security bill dividing Silicon Valley


California Gov. Gavin Newsom on Sunday vetoed a bill that would have enacted the nation’s most far-reaching regulations on the burgeoning artificial intelligence industry.

California lawmakers overwhelmingly passed the bill, called SB 1047, seen as a potential blueprint for national AI legislation.

The measure would have made tech companies legally liable for harms caused by their artificial intelligence models. It also would have required companies to build a “kill switch” into AI systems in case they were misused or went rogue.

Newsom called the bill “well-intentioned” but said it would impose “stringent” regulations that would be burdensome for the state’s leading artificial intelligence companies.

In his veto message, Newsom said the bill focused too narrowly on the biggest and most powerful AI models, and that smaller startups could be just as disruptive.

“Smaller, private models could emerge as equally or even more dangerous than the models targeted by SB 1047, at the expense of potentially curtailing innovation that fuels progress for the public good,” Newsom wrote.

California state Sen. Scott Wiener, a co-author of the bill, criticized the veto, calling it a setback for AI accountability.

“This veto leaves us with the troubling reality that companies aiming to create an extremely powerful technology face no binding restrictions from U.S. policymakers, particularly given Congress’s continuing paralysis around regulating the tech industry in any meaningful way,” Wiener wrote on X.

The now-defunct bill would have required the industry to conduct safety tests on extremely powerful AI models. Wiener wrote Sunday that in the absence of such requirements, the industry will be left to police itself.

“While the large AI labs have made admirable commitments to monitor and mitigate these risks, the truth is that voluntary commitments from industry are not enforceable and rarely work out well for the public.”

Many powerful players in Silicon Valley, including venture capital firm Andreessen Horowitz, OpenAI, and trade groups representing Google and Meta, lobbied against the bill, arguing that it would slow the development of artificial intelligence and stifle growth for early-stage companies.

“SB 1047 would threaten that growth, slow the pace of innovation, and lead California’s world-class engineers and entrepreneurs to leave the state in search of greater opportunity elsewhere,” Jason Kwon, OpenAI’s chief strategy officer, wrote in a letter sent to Wiener last month.

Other tech leaders, however, backed the bill, including Elon Musk and prominent AI scientists such as Geoffrey Hinton and Yoshua Bengio, who signed a letter urging Newsom to sign it.

“We believe that the most powerful AI models may soon pose severe risks, such as expanded access to biological weapons and cyberattacks on critical infrastructure. It is both possible and appropriate for leading AI companies to test whether their most powerful models can cause severe harms, and to take reasonable precautions against such risks,” wrote Hinton and dozens of former and current employees of leading AI companies.


Other states, such as Colorado and Utah, have enacted laws more narrowly tailored to address how AI could perpetuate bias in employment and health care decisions, along with other AI-related consumer protection concerns.

Newsom recently signed other AI bills into law, including one to curb the spread of deepfakes during elections. Another protects performers from having their likenesses replicated by AI without their consent.

Even as billions of dollars pour into the development of artificial intelligence and the technology permeates more corners of daily life, lawmakers in Washington have yet to advance a single piece of federal legislation to protect people from AI’s potential harms or to provide oversight of its rapid development.

Copyright 2024 NPR