As artificial intelligence tools and facial recognition see increased use in law enforcement, there are understandable concerns about privacy and equity in their application.
How law enforcement officials can use these emerging technologies while fostering trust and creating regulatory standards was the focus of the May 10 conference, “Facial Recognition Technology in Law Enforcement: Regulations and Trust,” organized by the University of Maryland Eastern Shore.
The forum is part of UMES’s research on the use of facial recognition technology in the justice system, funded through a grant from the Governor’s Office of Crime Prevention’s “Building Accountability and Trust” program. The conference’s panels featured remarks from state politicians, federal government officials, researchers, and UMES President Heidi M. Anderson.
“We wanted to invite speakers from different backgrounds who represent different viewpoints based on their roles,” said Dr. Lily Tsai, an associate professor in the criminal justice program at UMES and a co-principal investigator on the research. “They are all experts in terms of using facial recognition and how to regulate these biometric data.”
Dr. Sandeep Gopalan, the Interim Vice President for Research at UMES, said the growing use of artificial intelligence has become a “hot topic” with the public and law enforcement professionals as it continues to impact how they live and work.
“This is a clear example of how AI impacts people in the law enforcement context, because ultimately facial recognition technology relies on machine learning and deep learning, which are techniques in AI,” he said. “People want to know what happens when the police use technology to recognize or identify people based on images procured from the internet.”
In some cases, facial recognition has notable drawbacks, such as diminished accuracy in identifying people with darker skin tones.
“Most people believe that facial recognition technology is 100% accurate when law enforcement agencies use it,” Tsai said of the results from a survey of the general public taken as part of the research. “But the reality is that there is misidentification. There are errors, and this is something that most people don’t understand.”
Gopalan said these limitations with the current technology hit close to home, especially among the members of the UMES campus community.
“That’s a huge problem for us as an HBCU because potentially our students, when they are on campus or externally exposed to this technology, may face bias,” he said. “So, if we understand it, hopefully, some of our students will be on the building side of these algorithms to make sure that these biases are removed to the greatest extent possible.”
As a result of the strong interest in the conference, Tsai is optimistic about UMES becoming both a thought leader and a trendsetter on the subject.
“This is going to put UMES on the map,” she said. “Facial recognition and AI tools are advanced technology, and there’s more research to be done in this area and in the future.”
Gopalan added that the project will ultimately produce recommendations to be submitted to the Governor’s Office. The researchers’ goal is to see better policies and practices for the use of facial recognition technology implemented at the state level.