HARTFORD, Conn. (AP) — As state lawmakers rush to get a handle on fast-evolving artificial intelligence technology, they're often focusing first on their own state governments before imposing restrictions on the private sector.
Legislators are seeking ways to protect constituents from discrimination and other harms while not hindering cutting-edge advancements in medicine, science, business, education and more.
"We're starting with the government. We're trying to set a good example," Connecticut state Sen. James Maroney said during a floor debate in May.
Connecticut plans to inventory all of its government systems using artificial intelligence by the end of 2023, posting the information online. And starting next year, state officials must regularly review these systems to ensure they won't lead to unlawful discrimination.
Maroney, a Democrat who has become a go-to AI authority in the General Assembly, said Connecticut lawmakers will likely focus on private industry next year. He plans to work this fall on model AI legislation with lawmakers in Colorado, New York, Virginia, Minnesota and elsewhere that includes "broad guardrails" and focuses on matters like product liability and requiring impact assessments of AI systems.
"It's rapidly changing and there's a rapid adoption of people using it. So we need to get ahead of this," he said in a later interview. "We're actually already behind it, but we can't really wait too much longer to put in some form of accountability."
Overall, at least 25 states, Puerto Rico and the District of Columbia introduced artificial intelligence bills this year. As of late July, 14 states and Puerto Rico had adopted resolutions or enacted legislation, according to the National Conference of State Legislatures. The list doesn't include bills focused on specific AI technologies, such as facial recognition or autonomous cars, something NCSL is tracking separately.
Legislatures in Texas, North Dakota, West Virginia and Puerto Rico have created advisory bodies to study and monitor AI systems their respective state agencies are using, while Louisiana formed a new technology and cyber security committee to study AI鈥檚 impact on state operations, procurement and policy. Other states took a similar approach last year.
Lawmakers want to know "Who's using it? How are you using it? Just gathering that data to figure out what's out there, who's doing what," said Heather Morton, a legislative analyst at NCSL who tracks artificial intelligence, cybersecurity, privacy and internet issues in state legislatures. "That is something that the states are trying to figure out within their own state borders."
Connecticut's new law, which requires AI systems used by state agencies to be regularly scrutinized for possible unlawful discrimination, comes after an investigation by the Media Freedom and Information Access Clinic at Yale Law School determined AI is already being used to assign students to magnet schools, set bail and distribute welfare benefits, among other tasks. However, details of the algorithms are mostly unknown to the public.
AI technology, the group said, "has spread throughout Connecticut's government rapidly and largely unchecked, a development that's not unique to this state."
Richard Eppink, legal director of the American Civil Liberties Union of Idaho, testified before Congress in May about discovering, through a lawsuit, the "secret computerized algorithms" Idaho was using to assess people with developmental disabilities for federally funded health care services. The automated system, he said in written testimony, included corrupt data that relied on inputs the state hadn't validated.
AI can be shorthand for many different technologies, ranging from algorithms recommending what to watch next on Netflix to generative AI systems such as ChatGPT that can aid in writing or create new images or other media. The surge of commercial investment in generative AI tools has generated public fascination and concerns about their ability to trick people, among other dangers.
Some states haven't attempted to tackle the issue yet. In Hawaii, state Sen. Chris Lee, a Democrat, said lawmakers didn't pass any legislation this year governing AI "simply because I think at the time, we didn't know what to do."
Instead, the Hawaii House and Senate passed a resolution Lee proposed that urges Congress to adopt safety guidelines for the use of artificial intelligence and limit its application in the use of force by police and the military.
Lee, vice-chair of the Senate Labor and Technology Committee, said he hopes to introduce a bill in next year's session that is similar to Connecticut's new law. Lee also wants to create a permanent working group or department to address AI matters with the right expertise, something he admits is difficult to find.
"There aren't a lot of people right now working within state governments or traditional institutions that have this kind of experience," he said.
The European Union is further along in building guardrails around AI. There has been discussion of AI regulation in Congress, which Senate Majority Leader Chuck Schumer said in June would maximize the technology's benefits and mitigate significant risks.
Yet the New York senator did not commit to specific details. In July, the Biden administration secured voluntary commitments from seven U.S. companies meant to ensure their AI products are safe before they are released.
Maroney said ideally the federal government would lead the way in AI regulation. But he said the federal government can't act at the same speed as a state legislature.
"And as we've seen with the data privacy, it's really had to start at the state level," Maroney said.
Some state-level bills proposed this year have been narrowly tailored to address specific AI-related concerns. Proposals in Massachusetts would place limitations on mental health providers using AI and prevent "dystopian work environments" where workers don't have control over their personal data. A proposal in New York would place restrictions on employers using AI as an "automated employment decision tool" to filter job candidates.
North Dakota passed a bill defining what a person is, making it clear the term does not include artificial intelligence. Republican Gov. Doug Burgum, a long-shot presidential contender, has said such guardrails are needed for AI but the technology should still be embraced to make state government less redundant and more responsive to citizens.
In Arizona, Democratic Gov. Katie Hobbs vetoed legislation that would prohibit voting machines from having any artificial intelligence software. In her veto letter, Hobbs said the bill "attempts to solve challenges that do not currently face our state."
In Washington, Democratic Sen. Lisa Wellman, a former systems analyst and programmer, said state lawmakers need to prepare for a world in which machine systems become ever more prevalent in our daily lives.
She plans to roll out legislation next year that would require students to take computer science to graduate high school.
"AI and computer science are now, in my mind, a foundational part of education," Wellman said. "And we need to understand really how to incorporate it."
___
Associated Press Writers Audrey McAvoy in Honolulu, Ed Komenda in Seattle and Matt O'Brien in Providence, Rhode Island, contributed to this report.