Vallejo Police Chief Shawny Williams is sworn into office under the gaze of a city surveillance camera on Nov. 12, 2019. A BuzzFeed News investigation found that Vallejo was one of more than 1,800 public entities to use a facial recognition technology so powerful, it is the subject of numerous lawsuits and several government probes. A Vallejo police spokesperson said the department’s command staff were unaware of its use prior to being contacted for comment by Open Vallejo. Credit: Geoffrey King / Open Vallejo

Members of the Vallejo Police Department secretly used facial recognition technology developed by Clearview AI, a company embroiled in public controversy for allegedly violating personal privacy laws in several U.S. states and foreign countries, according to a database of search records published in connection with a yearlong investigation by BuzzFeed News.

The Vallejo Police Department was among more than 1,800 public entities across the country, many of them law enforcement agencies, whose employees used free trials of Clearview’s facial recognition service to run largely unregulated searches on suspects and test subjects alike, BuzzFeed reported. Some agencies rejected Clearview AI’s software after just one search. Others never tried it at all, including agencies in San Francisco, Oakland, and Berkeley, where local ordinances have prohibited government use of facial recognition technologies since 2019.

But Vallejo police used Clearview AI facial recognition software between 11 and 50 times from 2018 through February of last year, records show, raising unanswered questions about how the department used the service and what it did with any results the searches produced. Department spokesperson Brittany K. Jackson said in an initial April 7 statement that Vallejo police do not use Clearview AI and that command staff were unaware of any use during the three years covered by BuzzFeed’s report.

“Upon further review with Department members,” Jackson wrote about two hours after the initial exchange, “we have discovered that personnel of the Department exercised a free trial offered by Clearview AI to assess the potential use of the product.” The department stopped using the technology after the free trial ended, Jackson said. She did not immediately answer follow-up questions about how the department used Clearview’s technology, which officers used the service, or whether its use may have violated any department policies.

Nearly a third of the agencies that acknowledged employees used the technology either initially denied it or said they were unaware of its use, according to BuzzFeed. In some cases, departments told BuzzFeed they stopped officers from using the service as soon as they learned of its use.

‘The community has the right to know’

Demonstrators respond to the presence of two drones and a police helicopter on June 2, 2020, hours after Vallejo Police Det. Jarrett Tonn killed Sean Monterrosa, an unarmed 22-year-old, in a Walgreens parking lot. Credit: Geoffrey King / Open Vallejo

The revelations about Vallejo’s use of Clearview AI come as the Vallejo Police Department faces increasing scrutiny for its acquisition of new surveillance technologies.

Fueled by concerns over past abuses by officers, as well as the city’s failure to follow both state and federal law, advocates have begun to push back on the department’s use of surveillance technologies such as license plate-reading cameras, drones, and a cell site simulator — a device that intercepts cell phone signals by mimicking a cell tower in order to pinpoint a mobile phone’s location. 

“It’s incredibly concerning that all this is happening without transparency and without community input,” said Raquel Ortega, an organizer with the American Civil Liberties Union of Northern California.

“The community has the right to know and be in control of what’s in their neighborhoods.”

Ortega is part of an effort by organizers and community members to form an ad hoc surveillance advisory board that would investigate the city’s acquisition and use of surveillance technologies, advise the city council on them, and draft model legislation to control their use. Ortega expects the city council to decide whether to approve the committee later this month. 

Interest in an oversight committee stems from the police department’s growing arsenal of surveillance tools. Since 2015, the department has installed stationary license plate readers throughout the city, as well as cameras mounted on police vehicles. 

Vallejo police Lt. Michael Nichelini records civil rights attorney Melissa Nold with his phone as members of the city council discuss a proposed union contract on Sept. 24, 2019. Nichelini covertly filmed Nold for nearly 15 minutes as she observed the meeting, according to video obtained by Open Vallejo through a public records request. He is now the union’s president despite being fired for threatening a reporter and for sharing a picture of a Vallejo police badge engraved with a swastika. Credit: Geoffrey King / Open Vallejo

License plate readers are specialized cameras that automatically record the license plate numbers of passing cars. The records they collect are often shared with other agencies through regional databases, allowing law enforcement to track a vehicle’s movements over time as it passes the cameras. Last year, the federal Bay Area Urban Areas Security Initiative granted the Vallejo Police Department $30,000 for license plate readers to combat terrorism at Vallejo’s San Francisco Bay Ferry terminal. The city added more license plate readers in September, and in February the city council voted to add 10 more on Mare Island at a cost of $56,601. The city is now seeking to expand its existing partnership with Flock Safety, an Atlanta-based company that markets license plate readers to members of the public.

Vallejo purchased its most controversial piece of surveillance equipment last March, when the city spent $766,018 on its cell site simulator. The device is capable of recording a phone’s precise location — including inside buildings — by connecting to every phone in a given area. The purchase sparked outcry from residents and the local advocacy group Oakland Privacy, which, along with two Vallejo residents, sued the city last summer. The lawsuit alleged that officials violated state law by acquiring the device without public input. After the group won a preliminary ruling, the city revised its policies for the device based on recommendations by Oakland Privacy. A judge later ruled the city violated California law when it acquired the device, a first-of-its-kind ruling in the state.

False positives

Critics of facial recognition technology have raised concerns about personal privacy implications, false matches, and potential state law violations as researchers learn more about the technology’s reach — and its limitations. Advocacy groups and even some government leaders have said facial recognition is a privacy nightmare, comparing the technology to placing members of the public into a perpetual police lineup. 

Nijeer Parks in Patterson, N.J. on Dec. 28, 2020. Parks is the third person known to be arrested for a crime he did not commit based on a bad face recognition match. Credit: Mohamed Sadek / The New York Times, via Redux

Civil suits against Clearview AI in California, Vermont and Illinois claim the company violated personal privacy laws in those states when it pulled pictures from billions of social media accounts without users’ knowledge or permission. The case in Illinois may soon be headed to the Supreme Court. In its suit against the company, the State of Vermont accused Clearview AI in court papers of building a “dystopian surveillance database.”

Facebook, Google, Twitter and LinkedIn have all sent cease-and-desist letters to Clearview AI after the company scraped photos from their users’ profiles, alleging the service violated their respective usage policies.

Privacy advocates have also called attention to facial recognition’s poor track record of accurately identifying people of color. Nationwide, there have been at least three reported instances of law enforcement arresting Black people based on false matches from facial recognition technologies. 

Clearview representatives, including Chief Executive Hoan Ton-That, claim the technology is 99 to 100% accurate. But in early 2020, a person with access to Clearview’s mobile service ran 30 searches of photos provided by BuzzFeed reporters, including some computer-generated pictures of people who do not actually exist. The service matched two of the computer-generated images with real people, according to BuzzFeed. Both of the fake images — and the real people they were matched with — were people of color. Computer-generated white faces did not result in false positives, according to the report.

But as experts point out, accuracy may soon be less of an impediment as the technology improves and services like Clearview AI actually approach 100% accuracy.

“The question I have is, how is it used?” said Mike Katz-Lacabe, Oakland Privacy’s research director. Police departments have a history of using surveillance technologies in secret, he said, and without department transparency and citizen oversight, residents and privacy advocates are unlikely to even know about the technologies the department uses, much less have a say in how they are used.

“There’s really no restrictions on it,” Katz-Lacabe said. “There’s no one overseeing how they use it.”

Mayor Robert McConnell and members of the Vallejo City Council did not respond to requests for comment for this article.

Brian Howey is an award-winning journalist and master's student at the UC Berkeley Graduate School of Journalism. His reporting has focused on homelessness, extremism and police accountability.