Facial recognition software and law enforcement

Recognizing the inherent biases in facial recognition systems and how they can be and have been misused, the question is how these systems can be reformed and regulated to ensure their constructive use by law enforcement. Although a few large technology companies have announced temporary bans on facial recognition technology sales to police departments, federal and state legislators must also play a critical role in this reformation.

The laws they adopt should protect the right to privacy and the other civil liberties threatened by facial recognition technology.

Currently, all states have introduced or adopted a bill to address these concerns in some form [20], but Congress has yet to act. Studies show that this technology exhibits racial bias and can produce discriminatory outcomes. Reform of facial recognition systems should also begin at the development stage, for example by educating and training programmers from diverse backgrounds. Absent issuing everyone a pair of anti-facial-recognition glasses [33] or relying on creative face make-up techniques [34], we need immediate, practical solutions to the negative repercussions of law enforcement's use of facial recognition technology.

Limiting use, imposing strict regulations, and diversifying the technology workforce are logical places to start with a technology that does not magically work in every instance. Unchecked, pervasive surveillance works much like a panopticon prison: a guard in a central tower can see every cell and view all inmates, but the inmates cannot see into the tower. The inmates never know whether or not they are being watched.

Beneficial uses of facial recognition technology are allowed under the Washington FR Bill, including locating or identifying missing persons (among them missing or murdered indigenous women, subjects of Amber and Silver Alerts, and other possible crime victims) and identifying deceased persons, all for the purpose of keeping the public safe.

For racial minorities and women, facial recognition systems have proven disproportionately less accurate. In a widely cited study, MIT Media Lab researcher Joy Buolamwini found that three leading facial recognition tools (from Microsoft, IBM, and Chinese firm Megvii) were incorrect as much as a third of the time when identifying the gender of darker-skinned women, compared with an error rate of only about 1 percent for white males.
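An audit of this kind comes down to computing error rates separately for each demographic group and comparing them. Below is a minimal illustrative sketch in Python; the field names, group labels, and sample records are assumptions for demonstration only, not data or code from the Gender Shades study.

```python
# Minimal sketch of a per-group error-rate audit. The record schema and
# group labels are illustrative assumptions, not the study's actual data.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of dicts with 'group', 'true_gender', 'predicted_gender'."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["predicted_gender"] != r["true_gender"]:
            errors[r["group"]] += 1
    # Error rate per group: misclassifications divided by total examples.
    return {g: errors[g] / totals[g] for g in totals}

if __name__ == "__main__":
    sample = [
        {"group": "darker-skinned female", "true_gender": "F", "predicted_gender": "M"},
        {"group": "darker-skinned female", "true_gender": "F", "predicted_gender": "F"},
        {"group": "darker-skinned female", "true_gender": "F", "predicted_gender": "M"},
        {"group": "lighter-skinned male", "true_gender": "M", "predicted_gender": "M"},
        {"group": "lighter-skinned male", "true_gender": "M", "predicted_gender": "M"},
    ]
    for group, rate in error_rates_by_group(sample).items():
        print(f"{group}: {rate:.0%} misclassified")
```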

Presumably, bias issues in facial recognition will improve over time as the technology learns and data sets improve. But facial recognition can be harder to hold accountable than a human being when it makes a mistake, so unwinding the biases built into this tech is no easy task. Another hurdle facial recognition will have to clear: convincing communities that they can trust their police departments to wield such a powerful tool responsibly.

Part of the challenge is that in many cases, public trust in police officers is divided, especially along racial lines. Some tech companies, such as Microsoft and IBM, have called for government regulation of the technology. But that raises the question: should people trust companies any more than police to self-regulate this tech? Other groups such as the ACLU have created a model for local communities to exert oversight and control over police use of surveillance technology, including facial recognition.

The Community Control Over Police Surveillance laws, which the ACLU developed as a template for local regulation, empower city councils to decide what surveillance technologies are used in their area and mandate community input. More than a dozen cities and local jurisdictions have passed such laws, and the ACLU says efforts are underway in several others. As long as police departments continue to use facial recognition in this information vacuum, the backlash against the technology will likely grow stronger, no matter the potential upside.

Passing robust federal-level legislation regulating the tech, working to eradicate the biases around it, and giving the public more insight into how it functions would be a good first step toward a future in which this tech inspires less fear and controversy.

Facial recognition technology has the potential to help conduct faster investigations, bring offenders to justice and thus resolve, stop and prevent crimes. At the same time, its eventual widespread use by law enforcement agencies raises concerns over the potential risk of wrongful arrests, surveillance and human rights violations.

A new initiative represents the most comprehensive policy response to the risks associated with facial recognition technology in law enforcement investigations. It comprises a set of principles for action that defines what constitutes responsible use of facial recognition for law enforcement investigations, covering all relevant policy considerations, and a self-assessment questionnaire that details the requirements law enforcement agencies must respect to ensure compliance with those principles. Tests of this framework will start in January.


