A ban on facial recognition technology in West Lafayette failed last December, but supporters say they plan to try a new version this week.
West Lafayette City Council member David Sanders says he will reintroduce the technology ban at Monday night's council meeting. In an effort to improve the ordinance's chances of passing after last year's veto, Sanders said it would include exceptions allowing police to use the technology in serious crimes.
At issue is whether there is enough oversight of the software companies that contract with government agencies to provide the controversial technology.
Facial recognition technology uses computer programs to compare individuals captured on video or in photographs against a database of faces. Law enforcement agencies say it is an important investigative tool, but privacy advocates say those databases can include images pulled from social media without people's knowledge or permission.
In Michigan, driver's license photos are shared directly with state police, and false matches there have, in rare cases, led to people being wrongfully arrested.
In December, the West Lafayette City Council voted down the ban on facial recognition technology after the city's mayor vetoed it.
During that hearing, council member David Sanders asked West Lafayette Police Chief Troy Harris how the department would conduct facial recognition searches, and specifically whether he had heard of a major provider of the service, Clearview AI.
“Clearview? I’ve never heard of it,” Chief Harris replied.
Clearview AI is one of the most controversial companies providing facial recognition technology to law enforcement. The company has been criticized for scraping photos from sites like Facebook without users' permission. Those photos are added to a database used to identify suspects.
Many law enforcement agencies across the country have used Clearview's software, and privacy advocates are concerned that the company, its technology, and its use are not adequately regulated.
Jameson Spivack is an associate at Georgetown Law's Center on Privacy and Technology, a think tank focused on surveillance and privacy law. He said traditional facial recognition searches compare a suspect's photo against a database of mug shots or driver's license photos.
“What's unprecedented in the case of Clearview AI is that this database includes faces scraped from across the internet,” he said. “So basically, if your photo is on the internet and publicly accessible, you could be in this database. Most people in this database probably don't know they're in it.”
Spivack says the Privacy Center supports banning law enforcement use of facial recognition technology.
“In most cases, this technology has spread widely,” he said. “Often, elected officials don’t even know it’s being used, because many law enforcement agencies acquire the technology not through the city, state or county budget, but through federal grants or nonprofit police foundations. So the city council may not even know it’s being used.”
In West Lafayette, police said facial recognition requests would be sent to the Indiana Bureau of Motor Vehicles or the Indiana Intelligence Fusion Center. But Detective Lt. Jonathan Eger said the technology is “rarely used.”
“Typically we don’t have a photo of a suspect during an investigation, so it ends up being a resource that we don't use,” he said.
The Indiana Bureau of Motor Vehicles did not respond to WBAA's request for comment.
Captain Ron Galaviz is the Indiana State Police's chief public information officer. He said law enforcement agencies across the state send facial recognition requests to the Indiana Intelligence Fusion Center. According to Galaviz, photos sent there must be obtained “legally.”
“We have a policy that's available to the public on our website,” he said. “At the end of the day, we're very conscious of, and really want to respect, people's rights. We want to operate within boundaries so that civil rights are not violated.”
The state's facial recognition policy says the technology may be used on an image “if there is a reasonable suspicion” that the subject is connected to, or has knowledge of, “possible criminal or terrorist activity.”
According to Galaviz, a request must have a “criminal connection” to be pursued.
“So again, going to a protest or a rally, going to a gathering that people have a right to attend: if there is no criminal connection, the request will not be accepted,” he said.
Galaviz points out that a facial recognition match alone is not enough to charge someone with a crime. The Fusion Center's policy states that facial recognition search results should in all cases be treated as investigative leads and “should not be considered a positive identification of any subject.”
“When it comes to using this type of technology, the Fusion Center certainly needs to develop a step-by-step policy: what are the parameters and limits within which this technology can be used,” Galaviz said.
But privacy advocates such as Spivack worry that policies written by law enforcement agencies themselves are not enough to hold police accountable.
“Law enforcement agencies everywhere say, ‘We're only using this for investigative leads,’” he said. “But nothing holds them to that.”
When first asked whether the Fusion Center uses Clearview AI, Galaviz mentioned only Vigilant Solutions, a different facial recognition technology provider.
The only facial recognition provider named in the Indiana Intelligence Fusion Center's policy documents is Vigilant Solutions, but a 2021 BuzzFeed News investigation revealed that state police had run more than 5,000 Clearview searches between 2018 and the end of February 2020. The Fusion Center's privacy policy is dated June 1, 2019.
And, according to Clearview, the Indiana State Police was the company's first paying customer.
Asked when the Indiana State Police switched from Vigilant to Clearview, Galaviz responded by email: “Both platforms are used as checks and balances.”
According to records obtained by WBAA through a FOIA request, the Indiana State Police used Vigilant Solutions 373 times in 2020 and 2021. Clearview AI, according to the same records, was used 3,067 times over that period, roughly eight times as often.
Asked to explain why Clearview AI is not mentioned in the state's policy documents, Galaviz said, “Clearview provides an extensive database,” and referred back to what he had said earlier about using both platforms as checks and balances.
Spivack, at Georgetown Law, could only guess why Clearview would be used without appearing in the state's policy documents.
“One of the things I suspect is that police are hiding its use because Clearview has become a toxic name,” he said. “Over the past few years, Clearview has violated the terms of service of Facebook, YouTube, and other social media platforms by scraping images from those websites.”
And Spivack said the lack of transparency makes it difficult to ensure that police follow their own policies.
“This leads to confusion and less transparency, and ultimately less accountability,” he said.
The situation in West Lafayette is not unique. According to Spivack, efforts at the federal level to oversee and rein in facial recognition technology have been limited, putting pressure on local governments to act.
“In this gap, states and cities have increasingly stepped in where the federal government has not,” he said. “Because the federal government isn't doing anything about it, and in many cases the states haven't either, local activists and local politicians have tried to do something.”
Across the country, Spivack said, more than 20 cities and two states have imposed restrictions on the use of facial recognition technology, although those restrictions sometimes include exceptions for violent crimes.
West Lafayette council member David Sanders said he hopes the changes to the ordinance will help it pass this time. But he says he remains concerned about how police are using the technology.
“There's a lot of interaction with Clearview AI, and it's not present in their public documents. That tells me something,” he said.