
California is testing new generative AI tools. Here’s what you need to know

SACRAMENTO, Calif. (AP) — Generative artificial intelligence tools will soon be used by the California government.

Democratic Gov. Gavin Newsom’s administration announced Thursday that the state will partner with five companies to develop and test generative AI tools that could improve public service.

California is one of the first states to roll out guidelines on when and how state agencies can purchase AI tools, as lawmakers across the country grapple with how to regulate the emerging technology.

Here’s a closer look at the details:

WHAT IS GENERATIVE AI?

Generative AI is a branch of artificial intelligence that can create new content such as text, audio and photos in response to prompts. This is the technology behind ChatGPT, the controversial writing tool launched by Microsoft-backed OpenAI. San Francisco-based Anthropic, with backing from Google and Amazon, is also getting into the generative AI game.

HOW CAN CALIFORNIA USE IT?

California plans to use this type of technology to help reduce customer call wait times at state agencies and improve traffic and road safety, among other things.

Initially, four state departments will test generative AI tools: the Department of Tax and Fee Administration, the California Department of Transportation, the Department of Public Health, and the Department of Health and Social Services.

The tax agency administers more than 40 programs and received more than 660,000 calls from businesses last year, Director Nick Maduros said. The state hopes to deploy AI to listen to these calls and extract key information about state tax codes in real time, allowing workers to answer questions more quickly because they don’t have to search for the information themselves.

In another example, the state wants to use the technology to provide residents with information about health and social services benefits in languages other than English.

WHO WILL USE THESE AI TOOLS?

The public does not yet have access to these tools, though that could change in the future. The state will launch a six-month trial during which the tools will be tested internally by state officials. In the tax example, the state is considering having the technology analyze recordings of companies’ calls after the fact to see how AI processes them — rather than having it run in real time, Maduros said.

However, not all tools are designed to interact with the public. For example, tools designed to help improve traffic congestion and road safety would only be used by state officials to analyze traffic data and brainstorm potential solutions.

Officials will test and evaluate their effectiveness and risks. If testing goes well, the state will consider deploying the technology more widely.

HOW MUCH DOES IT COST?

The final cost is unclear. For now, the state will pay each of the five companies $1 to begin a six-month internal trial. The state will then be able to assess whether it makes sense to sign new contracts for long-term use of the tools.

“If it turns out that this doesn’t serve the public better, then we’re down a dollar,” Maduros said. “And I think it’s a really good deal for the citizens of California.”

The state is currently running a huge budget deficit, which could make it harder for Newsom to make the case that such technology is worth deploying.

Administration officials said they did not have an estimate of what these tools would ultimately cost the state, and they did not immediately release copies of the agreements with the five companies that will test the technology on a trial basis. The companies are Deloitte Consulting LLP; INRIX Inc.; Accenture LLP; Ignyte Group LLC; and SymSoft Solutions LLC.

WHAT COULD GO WRONG?

The rapid growth of technology has also led to concerns about job losses, misinformation, privacy and automation bias.

State officials and academic experts say generative AI has significant potential to help government agencies become more efficient, but there is also an urgent need for safeguards and oversight.

Testing tools on a limited basis is one way to limit potential risks, said Meredith Lee, chief technical advisor for UC Berkeley’s College of Computing, Data Science, and Society.

But, she added, testing cannot stop after six months. If the state decides to deploy the tools on a larger scale, it must have a consistent process for testing them and understanding their potential risks.

Copyright 2024 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.