
California Governor Newsom Vetoes AI Regulation Bill SB 1047

While AI regulation is certainly needed, I agree with Governor Newsom that this bill was heavy-handed and not fully thought out. It’s important to give AI safety measures proper time and consideration without creating stagnation (ahem, EU’s AI Act!) or putting undue pressure on companies. That said, I’m glad that people are putting significant attention into exploring the balance between innovation and regulation.

California Governor Gavin Newsom recently vetoed SB 1047, a bill designed to prevent AI from being used to cause significant harm. While the bill was passed by the state assembly with a 41-9 vote, it faced opposition from various organizations, including the Chamber of Commerce. Newsom acknowledged the bill’s good intentions but argued it wasn’t the best approach to regulating AI.

SB 1047 aimed to hold AI developers accountable by requiring safety protocols, such as testing, external risk assessments, and an “emergency stop” feature. Violations would have resulted in hefty fines, starting at $10 million for the first offense and $30 million for subsequent ones. However, the bill was revised to remove the state attorney general’s ability to sue AI companies unless a catastrophic event occurred.

The bill targeted large-scale AI models costing at least $100 million to train and requiring more than 10^26 floating-point operations (FLOPs) of training compute. It also covered derivative models fine-tuned by third parties at a cost of $10 million or more. Newsom criticized this focus, suggesting it might give a false sense of security and overlook smaller, potentially dangerous models. He emphasized the need for a regulatory framework that evolves with the technology.
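For a sense of scale, here’s a rough back-of-envelope calculation (the throughput and cluster size are my own illustrative assumptions, not figures from the bill): if a modern accelerator sustains on the order of 10^15 FLOPs per second, the 10^26 threshold corresponds to months of wall-clock time even on a ten-thousand-GPU cluster.

```python
# Back-of-envelope: how big is 10^26 floating-point operations?
# The throughput and cluster size below are illustrative assumptions,
# not figures from SB 1047.
TRAINING_FLOPS = 1e26   # the bill's compute threshold (total operations)
FLOPS_PER_GPU = 1e15    # assumed sustained throughput per accelerator
GPU_COUNT = 10_000      # assumed cluster size

seconds = TRAINING_FLOPS / (FLOPS_PER_GPU * GPU_COUNT)
print(f"~{seconds / 86_400:,.0f} days on {GPU_COUNT:,} GPUs")  # ~116 days
```

In other words, the threshold was calibrated to catch only the very largest frontier training runs, which is exactly the narrowness Newsom objected to.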

Initially, SB 1047 proposed creating a new department, the Frontier Model Division, to oversee enforcement. This was later changed to a Board of Frontier Models within the Government Operations Agency, with nine members appointed by the governor and legislature.

Despite support from notable AI researchers like Geoffrey Hinton and Yoshua Bengio, the bill faced criticism from tech industry figures such as Fei-Fei Li and Meta’s Yann LeCun. They argued it could stifle innovation. Trade groups representing tech giants like Amazon, Apple, and Google also opposed the bill, citing potential financial burdens on AI innovators.

How It Works

The bill would have required AI developers to implement safety measures, including rigorous testing and external risk assessments. An “emergency stop” feature would have allowed for the immediate shutdown of an AI model if it posed a threat. These protocols aimed to prevent AI from causing critical harm to humans.
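The bill itself doesn’t prescribe an implementation, but conceptually an “emergency stop” is a kill switch wrapped around model serving. Here’s a minimal sketch of the idea; all names are hypothetical, and the inference call is a placeholder:

```python
import threading

class EmergencyStop:
    """Hypothetical kill switch: once triggered, inference calls are refused."""

    def __init__(self) -> None:
        self._stopped = threading.Event()  # thread-safe flag

    def trigger(self, reason: str) -> None:
        print(f"EMERGENCY STOP: {reason}")
        self._stopped.set()

    def check(self) -> None:
        if self._stopped.is_set():
            raise RuntimeError("model halted by emergency stop")


stop = EmergencyStop()

def generate(prompt: str) -> str:
    stop.check()  # refuse to run once the switch has been thrown
    return f"model output for: {prompt}"  # placeholder for real inference

print(generate("hello"))  # works normally
stop.trigger("external risk assessment flagged critical harm")
try:
    generate("hello again")
except RuntimeError as err:
    print(err)  # "model halted by emergency stop"
```

In practice something like this would sit at the serving layer, gating every request, so that throwing the switch halts the model everywhere at once rather than in a single process.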

Benefits

  • Enhanced safety measures to prevent AI misuse.
  • Accountability for AI developers, encouraging responsible innovation.
  • Potential to mitigate risks before they become catastrophic.

Concerns

  • Potential to stifle innovation by imposing heavy financial burdens on developers.
  • Focus on large-scale models might overlook smaller, equally dangerous AI systems.
  • False sense of security due to the bill’s narrow scope.

Possible Business Use Cases

  • AI Safety Consulting: Offer services to help companies comply with safety protocols and risk assessments.
  • Emergency Stop Solutions: Develop and sell “emergency stop” features for AI models to ensure immediate shutdown capabilities.
  • AI Risk Assessment Tools: Create software tools that assist in external risk assessments for AI systems.

As we navigate the complexities of AI regulation, how can we balance the need for safety with the imperative to foster innovation?

Read original article here.

Image Credit: DALL-E

—

I consult with clients on generative AI infused branding, web design and digital marketing to help them generate leads, boost sales, increase efficiency & spark creativity. You can learn more and book a call at https://www.projectfresh.com/consulting.
