
Advisory panel says Connecticut needs to further regulate AI used by the state

An advisory panel to the U.S. Commission on Civil Rights said Thursday that Connecticut needs to adopt safeguards on artificial intelligence technology used by the state.

Connecticut needs safeguards on the state government's use of artificial intelligence, including algorithms at child welfare and other agencies, to prevent discrimination and increase transparency, an advisory panel to the U.S. Commission on Civil Rights said Thursday.

The Connecticut Advisory Committee to the federal commission called on state lawmakers to pass laws regulating such systems, which have sparked concerns in other parts of the country.

The problem, critics say, is that algorithms can rely on flawed data that disproportionately flags minorities, low-income families, disabled people and other groups when agencies decide whether to remove children from homes, approve health, housing and other benefits, concentrate law enforcement in certain areas or assign children to schools, among other uses.

"The state of Connecticut makes thousands of decisions that impact the lives and civil rights of residents every day," said David McGuire, chair of the Connecticut Advisory Committee. "When the state uses an algorithm residents should know which agency is using the algorithm, the reason it is being used, and assurances that the algorithm is fair."

The committee did not identify any specific instances of discrimination and bias in Connecticut's use of algorithms, but said it would release a more comprehensive report within the next few months. The panel also pointed to a study that said some Connecticut agencies did not release full information on their use of algorithms when asked under public records laws.

Concerns about such use of artificial intelligence, or AI, led the Biden administration in October to issue its Blueprint for an AI Bill of Rights urging government action to safeguard digital and civil rights.

An investigation by The Associated Press last year revealed bias and transparency problems in the increasing use of algorithms within the country’s child welfare system.

McGuire said the Connecticut panel’s review of the issue is the first by the U.S. Commission on Civil Rights or any of its 56 advisory committees. The commission was established by the Civil Rights Act of 1957 as an independent, bipartisan federal fact-finding agency.

Supporters of using algorithms say they make government systems more thorough and efficient through the use of data.

The Connecticut advisory committee is urging state lawmakers to pass laws that would require independent audits of algorithms, including assessments of potential biases, and mandate under state records laws that information about agencies' use of algorithms be publicly available.

Democratic Gov. Ned Lamont's office did not immediately respond to a request for comment. Spokespeople for Democratic leaders in the legislature said they were looking into the issue. Democrats control both chambers of the General Assembly.

Senate Republican Leader Kevin Kelly and House Republican Leader Vincent Candelora welcomed a review of how the state uses algorithms.

"People might be surprised to realize that it’s not human beings behind a desk that are making some of these decisions, but it could be computer generated," Candelora said. "We need to know what goes into those programs that are making those decisions, because I believe it impacts policy."

The Connecticut advisory panel pointed to a Yale Law School report released last year that said certain Connecticut agencies did not release full information about their use of algorithms in response to its requests under the state's Freedom of Information Act.

"Responses to Freedom of Information (FOI) requests confirmed both that existing disclosure requirements are insufficient to allow meaningful public oversight of the use of algorithms, and that agencies do not adequately assess the effectiveness and reliability of algorithms," the report said.

"The FOIA responses generally revealed that agencies are insufficiently aware of the potential problems posed by their algorithms and unconcerned about the lack of transparency," it said.

The law school said it requested information on algorithms from the state departments of Children and Families, Education and Administrative Services.

The Department of Children and Families provided the only complete FOIA response, concerning its use of algorithms to identify at-risk children, the law school said. The agency disclosed basic information but not the source code, which it said it did not possess and which it asserted was protected as a trade secret.

The Education Department produced only partial information about an algorithm it uses to assign students to schools, while the Department of Administrative Services provided no information on an algorithm used to hire state employees and contractors, according to the law school.

Asked about the advisory panel's recommendations and the Yale study, a spokesperson for the Department of Children and Families said it was reviewing the matters. The Department of Administrative Services said it was working on a response, and officials at the Department of Education did not immediately return a message.

A spokesperson for state Attorney General William Tong declined to comment.
