Facial recognition technology is used in a variety of public settings, often without citizens' knowledge. U.S. lawmakers agree that the technology should not be deployed freely until security, privacy, and accuracy concerns are mitigated and civil liberties are guaranteed.
The House Committee on Oversight and Reform held a hearing on the use of facial recognition (FR) technology on Wednesday, the third in a three-part series. The hearings are an effort to understand how public agencies and private companies are using the technology, so that they can be held to ethical standards.
The use of facial recognition technology is increasing. It can be found in home security systems, social media sites, sports arenas, and elsewhere, where it is used for advertising, security, access control, photo and video identification, and accessibility.
The National Institute of Standards and Technology (NIST) issued a report in December analyzing facial recognition systems from private companies. The report found that "across demographics, false positive rates often vary by factors of 10 to beyond 100 times," and that African and Asian people were more often misidentified.
Rep. Eleanor Holmes Norton (D-District of Columbia) shared her concern that consumers are unaware of the security issues facial recognition on their cell phones poses. She asked the panel whether there were any means by which consumers could confirm whether cell phone manufacturers are storing their biometric or other data on company servers.
Meredith Whittaker, co-founder of the AI Now Institute at New York University, said that this technology "is hidden behind trade secrecy." She continued, "This is a corporate technology that is not open for scrutiny and auditing by external experts. I think it's notable that while NIST reviewed 189 algorithms for their latest report, Amazon refused to submit their recognition algorithm to NIST, and they claimed they couldn't modify it to meet NIST standards."
Whittaker expressed suspicion about the multibillion-dollar company's refusal to participate in the NIST testing, pointing to its global reach and technical capabilities. Whatever the reason for withholding information about its facial recognition technology, she said, "we have to trust these companies, but we don't have many options to say no or to scrutinize the claims they make."
Rep. Brenda Lawrence (D-Mich.) introduced H.R. 153, which calls for the development of guidelines for ethical and transparent AI systems and consideration of their implications. Lawrence said that currently there are no checks on how and when the technology is used, or on what companies are doing with the data.
“Right now, we have the wild, wild west when it comes to AI,” she said.
Lawrence noted that artificial intelligence isn’t the only emerging technology that requires ethical guidelines; the same concerns apply to facial recognition.
The Congresswoman represents a district in Michigan that is 67 percent minority, and she is concerned about the findings of the NIST report, which confirmed that African and Asian people are more often misidentified by these facial recognition algorithms.
Lawrence continued, “We in America have the right to know if we’re under surveillance and what are you doing with it. Another thing: any release of data that you [are] gathering should be required to go through some type of process for the release of that.”
The lawmakers echoed this concern and asked the experts what could or should be done to regulate the industry and protect citizens’ civil liberties.
“I think we need to pause the technology and let the rest of it catch up so that we don’t allow corporate interests and corporate technology to race ahead to be built into our core infrastructure,” said Whittaker, “without having put the safeguards in place.”
Congressman Jim Jordan (R-Ohio) said this technology, left unchecked, could be a threat to Americans’ fundamental civil rights. “You said this facial recognition poses an existential threat to democracy and liberty. My main concern is how [a] government may use this to harm our First Amendment and Fourth Amendment liberty.”
Rep. Rashida Tlaib (D-Mich.) said she is disturbed that in her district, FR is being used in low-income government housing facilities: “I don’t think being poor or being working class means somehow that you deserve less civil liberties or less privacy.”
It was made clear that the public has no substantial knowledge about the use of this technology and its accuracy, and that the NIST report is only a piece of the puzzle. “We don’t have a way to audit whether NIST’s results in the laboratory represent the performance in different contexts, like amusement parks or stadiums, or wherever else, so there’s a big gap in the auditing standards,” said Whittaker. “Although the audits we have right now have shown extremely concerning results.”
Lawmakers and experts agreed that communities should be educated about when this technology is being used and the harm it can do, and should have a say in where it is deployed.