December 15, 2025
Technology

Police use of facial recognition technology subject of upcoming public NIST meeting


The U.S. National Institute of Standards and Technology’s (NIST) National Artificial Intelligence Advisory Committee will hold a virtual public meeting to receive a briefing from its Law Enforcement Subcommittee (NAIAC LE) on the benefits and drawbacks of using AI in law enforcement that are specific to facial recognition technology.

NIST said the NAIAC LE briefing will address AI as it pertains to the needs of law enforcement and other agencies to identify individuals for a wide variety of reasons. The agency said “video and photographic evidence obtained from surveillance footage, bystanders, social media, and other sources may provide crucial evidence about potential suspects, victims, witnesses, or community members in distress,” and that facial recognition technologies (FRTs) can allow law enforcement officers to identify these individuals “with greater frequency, speed, and accuracy.”

But, NIST explained, “therein lies both the potential and the risk of facial recognition technology. Unrestrained use of FRTs could bring more people unwillingly or unwittingly into the scope of law enforcement investigations, and the system could be misused in a manner that violates constitutional rights or community norms.”

To prepare for the public briefing and meeting on September 4th, NAIAC invites public comment on the July 2024 discussion draft document, Discussing a Framework for the Responsible Use of Facial Recognition Technology in Law Enforcement. NIST said the discussion is intended to inform future recommendations from NAIAC LE to the full NAIAC committee concerning the limited and responsible use of FRTs.

The draft NIST document says that “while some communities and civil rights organizations oppose all use of FRTs by law enforcement,” public opinion overall remains mixed, with 46 percent believing that “widespread use of facial recognition technology by police would be a good idea,” compared to 27 percent who say it would be a “bad idea.”

NIST put forth a framework it said “creates the structure for legal requirements and best practices that should steer the responsible use of FRTs.” These are the four basic findings NIST said provide the backdrop for the framework:

  • When used appropriately, FRTs have the potential to improve the quality of law enforcement’s efforts, including both its criminal investigations and its community caretaking functions;
  • Unconstrained use of FRT poses a serious risk to civil rights and civil liberties, including but not limited to accuracy and bias concerns, risks to free expression, and privacy invasions;
  • Current law does not adequately direct or constrain law enforcement FRT use to ensure that law enforcement is capturing the benefits of the technology while also guarding against its risks;
  • If policing agencies are going to continue to use — or start using — FRT, they should do so subject to carefully considered guardrails.

NIST said the framework that is outlined in detail in the draft document provides “preliminary recommendations for future recommendations” as well as alternative recommendations.

“Because of the unprecedented nature of FRT, and the fact that reasonable and knowledgeable individuals will have differing opinions about how to plan for uncertainty and how to manage conflicts between competing values, we have not, and could not, reach consensus on each and every major issue relating to FRTs,” NIST explained. “For this reason, we have noted which issues caused significant fractures among our members so that NAIAC may have a well-informed discussion about the competing interests involved.”

NIST identified a set of FRT uses that it says are primarily “surveillance” uses “for which we do not yet have a framework and set of preliminary recommendations.”

The discussion draft says “LEAs [law enforcement agencies] should not purchase FRT software from a vendor, use the results of FRT software from a vendor, or produce their own FRT systems unless the vendor or producer:

  • Can demonstrate, using results from NIST testing, high accuracy across the demographic groups present in real-world use;
  • Discloses information about their FRT systems sufficient to enable independent, expert assessment of their FRT systems’ performance for intended law enforcement use cases;
  • Provides instruction and documentation on image quality and other relevant technical specifications required to maintain low error rates across demographic groups for the particular system(s) sold to law enforcement;
  • Provides LEA users with ongoing training, technical support, and software updates needed to ensure their FRT systems can maintain high accuracy across demographic groups in real-world deployment contexts;
  • Builds their FRT technology to facilitate auditing regarding who is using the technology and for what purpose; and
  • Can demonstrate compliance with data security best practices.”

NIST’s draft discussion framework further states that law enforcement agencies “should maintain and publish a comprehensive FRT acceptable use policy” which should, at a minimum, specify:

  • FRT uses that are authorized or prohibited;
  • Protocols and procedures that will ensure consistent and lawful use;
  • Authorized users of FRT;
  • Rules for data collection and retention; and
  • Restrictions on data access, analysis, or release.

NIST further stated that the use of FRTs “for criminal investigations includes the archetype case in which a law enforcement agency applies FRT to identify a suspect from an image captured at the scene of a crime,” but “by contrast, when FRT is used to identify an incapacitated person, or to limit access to a high-security building or area, such use is non-criminal.”

“There are, however, no bright lines that can cleanly separate criminal from non-criminal use,” NIST explained, pointing out that “the most difficult examples involve the use of FRT to identify crime victims or witnesses to a crime who may become reluctant participants in a criminal investigation or prosecution, and who may become criminal defendants in other criminal investigations.”

In that situation, NIST said it recommends that law enforcement consider FRT use along the law enforcement-to-non-law enforcement range “through three crude categories: suspects, victims or witnesses, and non-law enforcement use.”

This open meeting will be held via web conference on Wednesday, September 4, 2024, from 2:00 p.m.-5:00 p.m. Eastern Time.

Article Topics

biometric identification  |  criminal ID  |  facial recognition  |  NIST  |  police  |  U.S. Government  |  United States
