In a victory for transparency in police use of facial recognition, a New Jersey appellate court today ruled that state prosecutors, who charged a man with armed robbery after the technology identified him as a “possible match” for the suspect, must turn over to the defendant detailed information about the face scanning software used, including how it works, its source code, and its error rate.
Calling facial recognition “a novel and untested technology,” the court in State of New Jersey v. Francisco Arteaga held that the defendant would be deprived of due process rights unless he could access the raw materials police used to identify him and test the software’s reliability to build a defense. The inner workings of the facial recognition software are vital to impeaching witnesses’ identification of him, challenging the state’s investigation, and creating reasonable doubt, the court said.
The ruling is a clear win for justice, fairness, and transparency. Study after study shows that facial recognition algorithms are not always reliable, and that error rates spike significantly for the faces of people of color, especially Black women, as well as trans and nonbinary people. Yet this heightened inaccuracy for members of vulnerable communities, who are often targeted by the police, hasn’t stopped law enforcement from widely adopting this unreliable tool to identify suspects in criminal investigations.
EFF, along with the Electronic Privacy Information Center,
[…]