Minnesota HF 3661 would prohibit all state and local government entities from acquiring or using facial recognition technology, mandate deletion of previously collected facial recognition data, and give individuals a private right of action for damages when the government violates the ban.

That is a significant sentence. It deserves unpacking.

Overview

Most government surveillance reform follows a familiar pattern: risk-based frameworks, bias audits, human-in-the-loop requirements, annual reporting. HF 3661 skips all of that. The bill adopts a categorical prohibition — not a regulatory regime, but a ban. Minnesota public entities cannot acquire facial recognition technology. They cannot use it. And they must purge whatever facial recognition data they have already collected.[1]

This is the governance equivalent of choosing demolition over renovation. And the enforcement mechanism — a private right of action for damages — ensures the prohibition is not merely aspirational.

Key Provisions

The Prohibition: Acquisition and Use

HF 3661 targets two activities: acquiring and using facial recognition technology. The conjunction matters. A government entity cannot procure a new system. It also cannot continue operating an existing one. This forecloses the common workaround where agencies stop buying new tools but grandfather legacy deployments indefinitely.

The scope covers state agencies and political subdivisions. That means county sheriffs, municipal police departments, the Bureau of Criminal Apprehension, corrections facilities, and any other unit of Minnesota government. The breadth is deliberate.

A key implementation question: does "use" include submitting images to a third-party vendor that operates facial recognition on the government's behalf? If a county sends a photograph to a commercial service like Clearview AI and receives match results, the county has arguably "used" facial recognition technology even though it never ran the algorithm itself. The bill's treatment of vendor-managed systems and data-sharing arrangements will determine whether this obvious circumvention route remains open.

The Deletion Mandate

HF 3661 is not merely prospective. It imposes a retroactive data lifecycle obligation: all previously collected facial recognition data must be deleted.

This provision raises immediate operational questions. "Facial recognition data" could mean raw images, biometric templates, faceprint embeddings, or indexed databases linking facial data to identity records. Each category presents different technical challenges for deletion and different evidentiary implications.

Consider a law enforcement agency that has built an investigative database over years, linking facial recognition matches to case files. Deleting the underlying biometric data may be straightforward. But what about derivative evidence — case notes, arrest records, or prosecution files that reference or rely on facial recognition matches? The bill's interaction with evidence preservation obligations and Minn. Stat. ch. 13 (Minnesota Government Data Practices Act) will require careful parsing.[2]

The deletion timeline — whether the bill specifies 30 days, 90 days, or some other period — materially affects compliance feasibility. A large agency with facial recognition data distributed across multiple systems, backup tapes, and shared databases cannot purge everything overnight.

The Private Right of Action

This is where HF 3661 gets serious.

Most public-sector surveillance restrictions rely on internal compliance mechanisms: inspector general oversight, exclusionary rules in criminal proceedings, or administrative penalties. These mechanisms depend on government policing itself. A private right of action shifts enforcement power to the people the ban is designed to protect.

The practical effect is litigation exposure. Every deployment of facial recognition technology by a Minnesota public entity becomes a potential lawsuit. Every failure to delete legacy data becomes a potential lawsuit. The threat of damages claims functions as a deterrent that no internal compliance memo can match.

Several details in the bill text will determine the private right of action's actual force:

  • Available damages: Actual damages alone may be insufficient to motivate litigation where individual harm is diffuse. Statutory damages (a fixed amount per violation) or punitive damages would dramatically increase enforcement incentive.
  • Attorneys' fees: Fee-shifting provisions determine whether plaintiffs' lawyers will take these cases. Without fee-shifting, the economics of suing a government entity over surveillance — where individual monetary harm may be modest — often do not work.
  • Sovereign immunity interaction: Minnesota's tort claims framework, governed by Minn. Stat. § 3.736 (state entities) and Minn. Stat. § 466.01 et seq. (political subdivisions), imposes caps on damages and procedural requirements for claims against government defendants.[3] If HF 3661 does not expressly override these frameworks, damages claims could be capped at levels too low to drive meaningful deterrence.
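The interaction of these three details can be made concrete with simple arithmetic. Every figure below is hypothetical, chosen only to illustrate why statutory damages and damages caps dominate the enforcement calculus:

```python
# Illustrative only: all dollar figures are hypothetical, not from HF 3661
# or from Minn. Stat. § 3.736 / ch. 466.

affected_individuals = 500           # hypothetical number of claimants
actual_damages_each = 25.0           # diffuse individual harm
statutory_damages_each = 1_000.0     # hypothetical fixed per-violation amount
hypothetical_cap = 500_000.0         # hypothetical aggregate tort-claims cap

actual_total = affected_individuals * actual_damages_each
statutory_total = affected_individuals * statutory_damages_each
capped_total = min(statutory_total, hypothetical_cap)

print(f"actual-damages exposure:    ${actual_total:,.2f}")     # $12,500.00
print(f"statutory-damages exposure: ${statutory_total:,.2f}")  # $500,000.00
print(f"after hypothetical cap:     ${capped_total:,.2f}")     # $500,000.00
```

On these assumed numbers, actual damages alone would rarely justify litigation costs, while a fixed per-violation amount changes the exposure by more than an order of magnitude — until a damages cap claws it back. This is why the cap-override question is a threshold issue.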

Compliance Implications

For Government Entities

Compliance requires three immediate workstreams:

  1. Procurement freeze. All pending acquisitions of facial recognition technology must be halted. Existing contracts must be reviewed for facial recognition features or services, and those provisions must be terminated or renegotiated.

  2. Data inventory and purge. Agencies must identify all facial recognition data in their possession — including templates, embeddings, indexed databases, and any data held by vendors on their behalf. A deletion program with documented verification must follow.

  3. Audit and attestation infrastructure. To defend against damages claims, agencies will need audit logging demonstrating non-use of facial recognition technology. This means not just removing systems, but implementing procurement gating controls and technical attestations that adjacent computer-vision tools (video analytics, object detection) do not include facial recognition modules.

The third workstream is the most technically demanding. Modern computer-vision platforms are modular. A video analytics system used for traffic monitoring might include a facial recognition capability that is disabled but present in the codebase. Does possessing software with a dormant facial recognition feature constitute "acquisition" of facial recognition technology? Conservative compliance counsel will say yes.
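A procurement gating control along these lines might scan a vendor's module manifest and flag facial recognition capabilities whether or not they are enabled. The manifest format, module names, and keyword list below are hypothetical, not any real vendor's schema:

```python
# Hypothetical module manifest for a modular computer-vision platform.
manifest = {
    "platform": "video-analytics-suite",
    "modules": [
        {"name": "object_detection", "enabled": True},
        {"name": "license_plate_reader", "enabled": True},
        {"name": "face_recognition", "enabled": False},  # dormant but present
    ],
}

# Illustrative keyword list; a real control would use a curated taxonomy.
FR_KEYWORDS = ("face_recognition", "facial_recognition", "face_match", "faceprint")

def fr_modules(manifest: dict) -> list[dict]:
    """Return every facial recognition module, whether enabled or not.

    Under the conservative reading of 'acquisition' described above, a
    dormant module still fails review, so enabled status does not filter
    the result.
    """
    return [
        m for m in manifest["modules"]
        if any(k in m["name"] for k in FR_KEYWORDS)
    ]

flagged = fr_modules(manifest)
print(flagged)  # [{'name': 'face_recognition', 'enabled': False}]
```

The design choice worth noting is that the filter ignores the `enabled` flag entirely: if dormant capability counts as acquisition, the gate must trigger on presence, not activation.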

For Vendors

Technology vendors serving Minnesota public entities face contract disruption. Companies providing identity verification, access control, or video analytics must either:

  • Offer feature-disabled versions of their products with technical attestations that facial recognition is not included, or
  • Exit the Minnesota public-sector market for affected product lines.

Feature flags and model removal become compliance-critical technical controls. A vendor cannot simply promise not to activate facial recognition; it must demonstrate — through architecture documentation, code audits, or third-party attestation — that the capability does not exist in the deployed system.
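One way a vendor could support such a demonstration is an attestation record built from the deployed artifact itself, showing that no known facial recognition model file ships with the product. This is a minimal sketch under assumed conventions; the directory layout, filenames, and prohibited-artifact list are hypothetical:

```python
import hashlib
import pathlib

# Hypothetical list of model artifacts whose presence would indicate
# facial recognition capability in the deployed build.
PROHIBITED_MODEL_FILES = {"face_embedder.onnx", "face_matcher.onnx"}

def attest_absence(deploy_dir: pathlib.Path) -> dict:
    """Hash the deployed file manifest and confirm that no known facial
    recognition model artifact is present in the deployment."""
    files = sorted(p for p in deploy_dir.rglob("*") if p.is_file())
    violations = [p.name for p in files if p.name in PROHIBITED_MODEL_FILES]
    return {
        "file_count": len(files),
        "violations": violations,
        # Digest of the file manifest, so the attestation is tied to a
        # specific, reproducible deployment state.
        "manifest_digest": hashlib.sha256(
            "\n".join(p.name for p in files).encode()
        ).hexdigest(),
        "fr_capability_absent": not violations,
    }

# Usage (path is illustrative):
# record = attest_absence(pathlib.Path("/opt/vendor/video-analytics"))
```

An artifact-level check of this kind is still only one layer — it catches known model files, not obfuscated or dynamically fetched capability — which is why the text above pairs it with architecture documentation and code audits.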

Fiduciary Relevance Framework Analysis

Pillar 1: Duty of AI Due Care and Loyalty

HF 3661 operationalizes a maximalist version of the duty of care. Rather than requiring government entities to exercise reasonable care in deploying facial recognition — through bias testing, accuracy thresholds, or human review — the bill concludes that no level of care is sufficient. The duty runs to Minnesota residents, and it is discharged only through non-use.

This is a fiduciary posture. Government entities hold a position of trust relative to the people they serve. Facial recognition deployed without consent, often against communities that have no meaningful ability to opt out, represents a breach of that trust that procedural safeguards cannot cure. HF 3661 makes that judgment explicit.

Pillar 2: Transparency and Explainable Redress

The private right of action is a redress mechanism. It does not require individuals to navigate administrative complaint processes or wait for an inspector general's report. It provides direct access to courts and damages — the most legible form of accountability in the American legal system.

The deletion mandate also serves transparency goals. Data that does not exist cannot be misused, misclassified, or concealed. Purging facial recognition data eliminates an entire category of government information that, under Minn. Stat. ch. 13, would otherwise be subject to complex classification and access disputes.

Pillar 3: Access to Justice and Liability

The private right of action is the bill's most important structural feature from an access-to-justice perspective. It converts a government obligation into an individual right. The question is whether the remedy is robust enough to be exercised. Sovereign immunity caps, absence of fee-shifting, or restrictive standing requirements could render the right of action theoretical rather than practical.

Pillar 4: Privacy and Meaningful Data Minimization

The deletion mandate is data minimization taken to its logical endpoint: zero retention. This echoes the lifecycle control principles embedded in the Minnesota Digital Trust & Consumer Protection Act framework, where collection limitation and mandatory purge obligations prevent the accumulation of surveillance infrastructure that, once built, is nearly impossible to dismantle through ordinary political processes.

Broader Significance

HF 3661 matters beyond Minnesota for three reasons.

First, it represents the ban-versus-regulate choice that every jurisdiction deploying AI governance must confront. The EU AI Act, Regulation (EU) 2024/1689, restricts certain real-time biometric identification uses in public spaces but permits others subject to safeguards.[4] HF 3661 rejects the safeguards approach entirely for government use. As other states consider facial recognition legislation, the Minnesota bill provides a clean template for the prohibitionist position.

Second, the private right of action sets a precedent for enforcement design in AI governance. Most AI regulation relies on agency enforcement — the FTC, state attorneys general, or specialized AI oversight bodies. HF 3661 distributes enforcement power to individuals. That choice has consequences: it increases litigation volume, creates case law through adversarial proceedings, and makes compliance failures expensive in ways that regulatory fines often are not.

Third, the deletion mandate confronts the data accumulation problem directly. Government biometric databases, once built, tend to grow. They acquire institutional constituencies — law enforcement agencies that depend on them, vendors that profit from them, bureaucracies that maintain them. Mandating deletion is a structural intervention against institutional inertia. It is far harder to rebuild a deleted database than to expand an existing one.

The bill's substrate-agnostic quality also deserves attention. HF 3661 does not target a specific vendor, algorithm, or model architecture. It targets a capability: the ability to identify individuals by their facial features. As facial recognition technology evolves — from traditional feature-extraction algorithms to deep learning models to whatever comes next — the prohibition remains operative. This is durable governance.

Whether HF 3661 survives the legislative process is a political question. Whether its design choices are sound is a governance question. On the governance merits, the bill makes a clear argument: some uses of AI by government are incompatible with the trust relationship between the state and its residents, and the appropriate response is prohibition backed by enforceable individual rights.

That argument will travel.

Notes

  1. The bill text is available at the Minnesota Revisor's Office (https://www.revisor.mn.gov/bills/94/2026/0/HF/3661/versions/latest/). All specific provisions — including definitions, deletion timelines, remedy details, and any carve-outs — should be confirmed against the latest enrolled version before reliance.
  2. Minn. Stat. ch. 13 (Minnesota Government Data Practices Act) governs the classification, collection, and retention of government data. Its interaction with HF 3661's deletion mandate — particularly regarding data classified as part of active investigations or litigation holds — requires careful analysis of specific statutory sections.
  3. Minn. Stat. § 3.736 governs tort claims against the state; Minn. Stat. § 466.01 et seq. governs tort claims against political subdivisions. Both impose damages caps and procedural prerequisites. Whether HF 3661 creates an independent statutory cause of action that supersedes these frameworks is a threshold question for the bill's enforcement viability.
  4. Regulation (EU) 2024/1689 (the EU Artificial Intelligence Act) restricts certain uses of real-time remote biometric identification in publicly accessible spaces, with enumerated exceptions for law enforcement. Specific article references should be confirmed against the final published text in the Official Journal of the European Union.