FTC’s New Policy on Biometric Information Creates New Federal and State Legal Risks
At a Glance
- After more than a decade of relative silence on the topic, the FTC has issued a new policy statement on the regulation of biometrics under Section 5 of the FTC Act.
- The policy’s “guidance” to businesses is in some places obvious, but in other places unrealistic and vague.
- The issuance of the policy suggests the FTC will soon bring enforcement actions.
- We expect both the plaintiffs’ class action bar and state attorneys general to use the policy to advance their own state law theories in states without laws regulating biometrics.
The Illinois Biometric Information Privacy Act (BIPA) is not the only law that governs private entities’ collection, use and storage of biometric data — there are biometric-specific laws in Washington State and Texas; biometric-specific municipal ordinances in New York City, Baltimore and Portland; and an increasing number of comprehensive state data privacy and data breach laws that reference or include biometrics within their scope. But BIPA has been the only real source of exposure involving biometrics.
With a recent Federal Trade Commission (FTC) policy statement, this may be about to change: Companies will likely no longer be able to avoid regulatory interest and litigation simply by staying out of Illinois.
On May 18, 2023, the FTC voted 3-0 to adopt a new policy statement on Biometric Information and Section 5 of the Federal Trade Commission Act. The new policy is the commission’s first significant effort to address biometrics since 2012, when the FTC issued a staff report titled Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies. The decision to adopt the new policy reflects the commission’s “significant concerns” about biometric information and related technologies with respect to privacy, security, and the potential for bias and discrimination. And it may spawn new lawsuits under state consumer protection laws known as “little FTC Acts.”
The FTC Plans to Proactively Address Certain Biometric Practices
Section 5 of the FTC Act has long given the commission broad enforcement powers against entities that engage in unfair or deceptive acts or practices. For example, in 2021, photo storage app provider Everalbum entered into a consent agreement with the FTC over allegations that it deceived customers about the settings of its facial recognition feature. See Decision and Order, In re Everalbum, Inc., FTC File No. 1923172.
But in her opening remarks, Chair Lina M. Khan stressed “the need to assess and address risks proactively” on the part of both biometric “developers and the end-users of the technologies.” She therefore reminded companies that the FTC’s enforcement powers under Section 5 allow the commission to act against unfairness proactively, where there is a mere likelihood of harm, rather than only after consumers have been injured.
Building on its previous enforcement efforts, the FTC’s policy statement thus sets out a nonexhaustive list of practices it will review to determine whether businesses are acting deceptively or unfairly in violation of Section 5.
Deception
With respect to deceptive acts, the policy identifies two areas of focus: (i) making false or unsubstantiated claims about the efficacy of biometric technology, and (ii) making deceptive statements about the collection or use of biometric information.
The FTC’s warning about efficacy is directed primarily at developers of the technology. First, developers should not oversell the capabilities of their products, because doing so will harm their competitors; harm business customers who rely on the misrepresentations; and ultimately harm consumers who are denied benefits, wrongly accused or otherwise adversely affected by a faulty product. Second, developers’ claims of efficacy must be substantiated by tests or audits that replicate “real world conditions.” The FTC stated that it will “carefully scrutinize” developers’ representations, but singled out four claims that appear most prevalent: reductions in rates of theft, violent incidents and fraud, and the elimination of bias in hiring. Developers therefore should document their testing and closely monitor their marketing to ensure that what they represent can be backed up with data, particularly with regard to those four claims.
The FTC’s warning about collection and use is directed primarily at consumer-facing businesses. The FTC identified two practices of concern: misleading consumers about whether and how a business collects biometric information, and failing to fully disclose all uses of that information. While intended as a warning, the commission’s highlighting of its two prior enforcement actions, one against Everalbum and one against Facebook, serves only to underscore how little attention the FTC has devoted to this issue until now. The new policy signals there are likely to be more enforcement actions in the near future.
Unfairness
More worrying are the policy’s sweeping statements with regard to “unfairness” in the context of biometrics. The policy recites the FTC’s prior general guidance that a practice is unfair if it: (i) “causes or is likely to cause substantial injury to consumers,” (ii) “is not reasonably avoidable by consumers,” and (iii) is “not outweighed by countervailing benefits to consumers or competition.” With the first factor all but settled (the FTC points to its prior discussion of the risks of collecting and using biometrics), the policy highlights the need for businesses to pay particular attention to the second and third factors: Is the collection avoidable, and does the benefit outweigh the harm?
The policy identifies at least five factors the FTC will consider in addressing these two questions.
First, has the business conducted a “holistic assessment” of the risks of collection and/or use? The FTC does not list all the considerations it thinks make this assessment sufficient, but it does urge businesses to consider the context of the collection and use, the testing of the technology and the circumstances of those tests, the involvement of human operators, and the potential for demographic biases in the algorithms. This appears to require businesses to conduct a comprehensive audit, which in turn will require significant cooperation between developers and their business customers.
Second, is a business engaged in surreptitious or unexpected collection? Surreptitious collection has long been a concern of the FTC — it was a principal focus of its 2012 staff report on biometrics — and according to the policy, surreptitious collection “may be unfair in and of itself.” The policy makes clear that going forward, companies deploying biometrics should “clearly and conspicuously” disclose the use of the technology and institute mechanisms for consumers to submit complaints about it.
Third, has a business vetted the business partners (affiliates, users, vendors) who will have access to the information? The policy states that strong contractual language is the starting point, but it also suggests that contract terms alone will not be enough and that businesses should “supervise, audit, or monitor” their partners’ compliance.
Fourth, has a business appropriately trained the employees and contractors involved in the collection and use of the information?
Fifth, has a business conducted ongoing monitoring of the technology it either develops or uses to determine whether it is being used or is functioning as intended?
What This Means for Companies
In the absence of a comprehensive national data privacy regime, the policy demonstrates how the FTC is prepared to use its existing authorities under Section 5 of the FTC Act to take action against businesses that develop and use biometric technology in ways the commission deems deceptive or unfair. The FTC’s guidance, however, creates more questions than answers, particularly regarding how businesses can avoid running afoul of the FTC’s view of unfairness when it comes to biometrics. And some of that guidance appears unrealistic — particularly the FTC’s view of the need for a “holistic assessment” and what that entails.
Additionally, many little FTC Acts expressly require state regulators to look to FTC guidance in applying their own laws, and judges often consult the FTC’s pronouncements regarding Section 5 when interpreting state consumer protection statutes. It is therefore reasonable to expect state attorneys general and the plaintiffs’ class action bar to attempt to enforce these same principles under state law throughout the country. Although BIPA applies only in Illinois, anything less than its strict requirements for written notice, signed consent and a specific biometrics policy may soon be argued to be “unfair” under state consumer protection laws nationwide.