A group of Democratic lawmakers is seeking to ban government use of facial recognition technology with a new bicameral bill announced Thursday.
The Facial Recognition and Biometric Technology Moratorium Act is a response to recent reports of government use of the technology, according to a press release. It would seek to make permanent the pledges IBM, Microsoft and Amazon have made to temporarily stop selling their technology to law enforcement.
The companies announced those moves amid rising scrutiny of surveillance systems tracking widespread protests over the death of George Floyd. The firms cited the lack of regulation as a core reason for their decisions, but this new bill goes further than simply placing guardrails on law enforcement’s use of the technology.
Under the legislation, federal entities would not be allowed to use facial recognition technology or other biometric tools like voice recognition, unless Congress lifts the ban with a separate act. It would also ensure that only state and local entities that have their own moratoria in place could receive federal grant funding. The bill would prohibit federal dollars from being used to fund biometric surveillance systems.
Any data collected in violation of the act would be inadmissible in judicial proceedings. Individuals who believe their biometric data was used in violation of the act would be able to sue. State attorneys general would also be able to enforce the law, and states and localities could enact their own laws on top of the federal legislation.
Sens. Ed Markey, D-Mass., and Jeff Merkley, D-Ore., and Reps. Pramila Jayapal, D-Wash., and Ayanna Pressley, D-Mass., will soon introduce the legislation. The bill is supported by several groups that advocate for racial justice and privacy rights, including Color of Change, MediaJustice, the American Civil Liberties Union and the Electronic Frontier Foundation.
In a press release announcing the bill, the lawmakers pointed to the growing body of research that has found systemic bias in facial recognition technology. Studies have found that facial recognition misidentifies non-white people at a much higher rate than white people, which can have devastating implications. The New York Times reported Wednesday on a recent case in which a Black man was wrongfully arrested based on an inaccurate match by a facial recognition algorithm.
In many cases, the public does not even have access to information about the types of surveillance technologies law enforcement is using. Last week, New York City became the 14th city to adopt a bill promoted by the ACLU that will require the local police department to disclose how it uses surveillance technologies and describe the safeguards it has in place to protect residents' privacy.