Since February, a bill that would regulate artificial intelligence programs has been making its way through California's legislature. Governor Gavin Newsom has until September 30 to decide whether to sign the bill.
However, this is more than a single state's bill. California is home to many of the high-tech companies delivering AI services around the globe, and the bill's coverage extends to any company offering such services in the state, no matter where it is located. As a law, it could have large effects on the entire industry and on every person and business using the technology, including those in CRE.
The bill, called the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act (SB 1047), doesn't say exactly what type of AI it regulates. Given the range of technologies that have been swept into the category over the last 70 years, tying the law to one approach, such as the large language models behind ChatGPT, could quickly make the legislation out of date.
Instead, the bill's language targets two qualifying characteristics, each defined by computing power and cost. The first is a model trained on an enormous amount of computing power, more than 10^26 integer or floating-point operations, at a cost of at least $100 million; this essentially covers the creation and initial training of a frontier model. The second is a slightly smaller, but still vast, amount of computing power, more than three times 10^25 operations, with a training cost of at least $10 million; this covers later rounds of fine-tuning an existing model.
A review of the long list of supporters and detractors of the legislation reveals that a majority of the registered positions are against it. As CalMatters has reported, opponents cite reasons such as deterring AI development and the bill's "vague language," but the real objections more likely lie in what the bill would require.
Those requirements include developers testing whether their software could enable attacks on critical public infrastructure. The bill would also create a public cloud that smaller companies could use to help develop new products, challenging the proprietary services of the biggest tech companies. Whistleblowers at large tech companies who want to warn of dangerous practices would gain protection. And developers would have to install a "kill switch" to stop a program that was running out of control, as well as report safety violations and incidents to state authorities.
Those in the tech industry opposing the legislation said that the testing and safety requirements, and their compliance costs, would hamper startups. However, the market power of the largest companies tends to do that as well, and the restrictions would mostly rest on the giants; small companies wouldn't have the resources to cross the spending thresholds that trigger the requirements.