California lawmakers are making significant strides in regulating artificial intelligence (AI), advancing a series of proposals in the legislature last week aimed at addressing discrimination, misinformation, and privacy concerns, and at prohibiting deepfakes in the contexts of elections and pornography.
These proposals must now gain approval from the other legislative chamber before being presented to Governor Gavin Newsom.
Experts and lawmakers warn that the United States is falling behind Europe in the race to regulate AI. The rapid development of AI technologies poses significant risks, including potential job losses, the spread of misinformation, privacy violations, and biases in automated systems.
Governor Newsom has championed California as a frontrunner in both the adoption and regulation of AI. He has outlined plans for the state to deploy generative AI tools to reduce highway congestion, enhance road safety, and provide tax guidance. Concurrently, his administration is exploring new regulations to prevent AI discrimination in hiring practices. Speaking at an AI summit in San Francisco on Wednesday, Newsom revealed that California is considering at least three additional AI tools, including one designed to address homelessness.
Tatiana Rice, deputy director of the Future of Privacy Forum, a nonprofit organization that advises lawmakers on technology and privacy issues, said that California’s strong priv
[…]