The Los Angeles Police Department has used facial-recognition software nearly 30,000 times since 2009 even while denying at times that it used the controversial technology at all, the Los Angeles Times reported Monday.
The LAPD doesn't have its own face-scanning platform but uses the face-comparison software provided through the Los Angeles County Regional Identification System, a database of about 9 million mugshots, the Times said.
Recent denials that the Police Department used facial recognition were mistakes, Assistant Chief Horace Frank told the paper. Frank said he told the city’s Police Commission about its use two years ago.
“We aren’t trying to hide anything,” he said.
The software has been used to help identify suspects in crimes without witnesses and in gang crimes where witnesses are afraid to come forward, and by a multiagency law enforcement task force investigating arsons, burglaries and other crimes that took place during summer protests over police shootings, LAPD spokesman Josh Rubenstein told the paper.
The software was used to compare images from crime scenes, witnesses and surveillance cameras with the regional database.
Rubenstein said he couldn't determine how many arrests have resulted from use of the software but said the system is “only used to develop investigative leads, not to solely identify a suspect in a crime."
“No individuals are arrested by the LAPD based solely on facial recognition results,” he said.
The LAPD also doesn't use the software to scan crowds or in any live-streaming capacity, Rubenstein said.
He said 330 LAPD personnel have access to the software.
“That is a lot of people with access to the system, and shows its widespread usage,” Mohammad Tajsar, a senior staff attorney at the American Civil Liberties Union of Southern California, told the Times.
The potential for mass use of facial-recognition technology has raised concerns about privacy and civil rights, especially since the software has shown problems with higher misidentification rates for women and people of color.
Last year, a federal study of algorithms provided by about 100 facial-recognition software providers showed higher error rates for women, the youngest and oldest people and for certain racial groups — including Blacks, Asians and American Indians — depending on which image database or software was used.
California has enacted a three-year ban on the use of facial recognition technology in police body cameras. The ban followed similar moves in New Hampshire and Oregon.
Last year, San Francisco and Oakland banned the use of any facial recognition by police and other city departments. The technology also has been banned in the Boston suburb of Somerville, Massachusetts.