Last month, California lawmakers advanced about 30 new A.I. measures aimed at protecting consumers and jobs, one of the biggest efforts yet to regulate the new technology.
The bills seek the nation's strictest restrictions on A.I., which some technologists warn could wipe out entire categories of jobs, cause election chaos with disinformation and pose national security risks. California's proposals, many of which have won broad support, include rules to prevent artificial intelligence tools from discriminating in housing and health care services. They also aim to protect intellectual property and jobs.
The California legislature, which is expected to vote on the proposed laws by August 31, has already helped shape technology consumer protections in the United States. The state passed a privacy law in 2020 that curbed the collection of user data, and in 2022 it passed a child safety law that created safeguards for those under 18.
“As California has seen with privacy, the federal government is not going to act, so we think it's critical that we step up in California and protect our own citizens,” said Assembly Member Rebecca Bauer-Kahan, a Democrat who chairs the State Assembly's Privacy and Consumer Protection Committee.
While federal lawmakers drag their feet on regulating A.I., state lawmakers have filled the void with an avalanche of bills poised to become de facto regulations for all Americans. Tech laws like California's often set a precedent for the nation, largely because lawmakers across the country know it can be challenging for companies to comply with a patchwork of rules that crosses state lines.
State lawmakers across the country have proposed nearly 400 new A.I. laws in recent months, according to the lobbying group TechNet. California leads the states with a total of 50 proposed bills, although that number has shrunk as the legislative session has progressed.
Colorado recently enacted a comprehensive consumer protection law that requires A.I. companies to take “reasonable care” when developing the technology to avoid discrimination, among other issues. In March, the Tennessee legislature passed the ELVIS Act (Ensuring Likeness Voice and Image Security Act), which protects musicians from having their voice and likeness used in A.I.-generated content without their explicit consent.
It's easier to pass laws in many states than at the federal level, said Matt Perault, executive director of the Center on Technology Policy at the University of North Carolina at Chapel Hill. Forty states, the most since at least 1991, now have “trifecta” governments, in which both chambers of the legislature and the governor's office are controlled by the same party.
“We're still waiting to see which proposals actually become law, but the huge number of A.I. bills introduced in states like California shows how interested lawmakers are in this issue,” he said.
And the state proposals are having a ripple effect globally, said Victoria Espinel, executive director of the Business Software Alliance, a lobbying group representing large software companies.
“Countries around the world are analyzing these drafts for ideas that could influence their decisions on A.I. laws,” she said.