By David Kalat

Many technologies that sci-fi promised us have yet to appear: jetpacks, teleporters, and rayguns remain the stuff of fantasy. Some sci-fi tech, however, has become increasingly common: thanks to biometric data, it is now possible to unlock your electronic devices with your fingerprint; security providers use facial recognition software to sort potential threats from law-abiding citizens; in some areas of the world, drivers can unlock their cars with their voice, just as James Bond did.

Biometric access controls can be convenient (no need to remember passwords or keep track of keycards) and more reliable (for example, employers can prevent employees from clocking in on behalf of tardy or absent friends). Amid the benefits, though, lurk troubling risks—if a hacker steals your password, you can just change it, but what can you do if someone steals your fingerprint, your retina scan, or your face?

In 2008, this was more than a mere speculative concern. A company called Pay by Touch developed biometric technology for consumers to pay for goods and services using a mere swipe of a finger. Founded in 2002, Pay by Touch attracted high-profile backing from the likes of billionaires Gordon Getty and Ron Burkle. Suddenly, in 2008, the company collapsed into bankruptcy as its CEO was accused of domestic abuse, drug abuse, and abuse of his investors’ money. Left behind in the aftermath of Pay by Touch’s demise was a database of some three million customers’ fingerprints, the security of which was unknown.

In response to this concern, the Illinois legislature passed the Biometric Information Privacy Act (BIPA) in 2008. It was the first law of its kind in the country and remains a pioneering outlier. While states such as Texas and Washington have laws limiting the collection and use of biometric information by businesses, and Missouri, Maine, and New Hampshire limit government collection and use of biometrics, Illinois alone provides a private right of action.

Notably, BIPA’s language is oriented toward providing a framework for proactive protections of individuals’ biometric data, as opposed to providing a mechanism for redress when that data is compromised or misused. Essentially, the law requires organizations that collect and store biometric data to provide appropriate notice to the individuals affected, to gather the individuals’ consent, and to ensure responsible data-archiving protections and a clearly articulated data-retention schedule. No actual exposure or misuse of a person’s biometric data is necessary to trigger liability.

For close to ten years, BIPA sat on the books with little attention, but beginning in late 2017 that changed. Over fifty lawsuits have been filed since July 2017, many of them putative class actions. Like the Telephone Consumer Protection Act (TCPA), BIPA provides statutory damages for violations ($1,000 per negligent violation; $5,000 per reckless or intentional violation), and this makes the law an attractive target for the plaintiffs’ bar.

Some businesses sued in Illinois for BIPA violations have sought to remove the actions to federal district court in order to avail themselves of defenses not readily available at the state level. A recent Second Circuit decision offers some guidance to defendants facing similar BIPA claims: Vigil v. Take-Two Interactive Software (No. 17-303, 2017 US App. LEXIS 23446 (2d Cir. Nov. 21, 2017)). Take-Two is a video game publisher whose “MyPlayer” feature allows gamers to create personalized avatars. To do so, the player slowly scans her own face using a camera to create an in-game representation. Plaintiffs argued that this constituted biometric information as defined by BIPA, and that Take-Two failed to provide proper notice or obtain the players’ consent prior to collecting that data.

Before the Second Circuit, the defendant successfully argued that, under Spokeo v. Robins, the plaintiffs lacked standing to state a claim. The US Supreme Court’s Spokeo decision held that a plaintiff must articulate “concrete harm,” not just “allege a bare procedural violation.” In the case of Take-Two, the Second Circuit found that because the MyPlayer feature did include a notification, albeit one that did not conform to the specific BIPA language, plaintiffs were unable to plead a material risk of harm from a purely technical violation. Additionally, the Second Circuit found that since a player had to spend some fifteen minutes contorting her face in close proximity to the camera in order to scan her face, “no reasonable person” would have failed to understand her face was being scanned, and that it made no sense to argue that a player might withhold her consent to such an involved process she voluntarily undertook. In other words, while plaintiffs could argue that Take-Two’s actions may fall short of the specific notice and consent requirements called for under BIPA, the court found they were unable to articulate material risk of harm that could result from those bare procedural violations.

Dozens of other cases are pending, with an array of factual, procedural, and jurisdictional differences that may diverge from the Take-Two precedent. On February 26, 2018, the US District Court for the Northern District of California rejected Facebook’s Spokeo-based defense in In re: Facebook Biometric Information Privacy Litigation (case number 3:15-cv-03747). Judge James Donato found that “BIPA vested in Illinois residents the right to control their biometric information by requiring notice before collection and giving residents the power to say no by withholding consent,” and that “[t]he abrogation of the procedural rights mandated by BIPA necessarily amounts to a concrete injury.”

One aspect of BIPA that did not get an airing under Take-Two involves the law’s requirements regarding security measures to protect the biometric information. The Second Circuit rejected the plaintiffs’ allegations that Take-Two had insufficient data privacy controls in place, and found that plaintiffs would need to state specific allegations that the defendant’s security measures created a material risk that the biometric data would be improperly accessed. Crucially in this context, the technologies used to collect, store, and manage biometric data, such as fingerprints, retina scans, and facial recognition, are widely variable, and individual technological details may play a significant role in arguing a defense.

By way of example, the Touch ID fingerprint scanner on an Apple iPhone does not store a copy of the user’s fingerprint, but rather a mathematical representation derived from it. BIPA defines biometric information as “any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual,” which would seem to encompass Touch ID’s mathematical model derived from a fingerprint. However, it is important to note several characteristics of the Touch ID technology: 1) the mathematical representation can only be used to validate whether each new fingerprint scan matches the model, but cannot be used to reverse engineer the fingerprint itself; 2) the mathematical representation is encrypted; and 3) the representation is stored within a special processor that is itself encrypted and integrated into the phone’s circuitry. Taken together, the Touch ID system on an iPhone provides several layers of security, such that even if a malicious actor gained unauthorized access to the iPhone, disassembled it, extracted the processor, decrypted the processor, and decrypted its contents, the hacker would still be unable to use the resulting data to obtain the user’s fingerprints. This is merely one type of technology in one specific application, however, and the individual facts of each technology will be relevant in determining how to evaluate the risk of harm.
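The underlying design principle, storing a one-way derived template rather than the biometric itself, can be sketched in a few lines of Python. This is a toy illustration only: real systems such as Touch ID perform fuzzy matching on fingerprint features inside dedicated hardware, whereas this sketch uses an exact-match keyed hash purely to show why a stolen template cannot be reversed into the original scan. The function names and the use of HMAC here are illustrative assumptions, not Apple's actual implementation.

```python
import hmac
import hashlib
import os

# Illustrative sketch: store a keyed one-way template of a biometric
# scan instead of the raw scan itself. (Real biometric systems use
# fuzzy matching on extracted features in secure hardware; exact-match
# hashing is a simplification for demonstration.)

def enroll(raw_scan: bytes) -> tuple[bytes, bytes]:
    """Derive a one-way template from a raw scan.

    Returns a per-device secret key and the template. Only the
    template (and key) are retained; the raw scan is discarded,
    and the template cannot be reversed to recover it.
    """
    device_key = os.urandom(32)  # secret key that never leaves the device
    template = hmac.new(device_key, raw_scan, hashlib.sha256).digest()
    return device_key, template

def verify(device_key: bytes, template: bytes, candidate_scan: bytes) -> bool:
    """Check a new scan against the stored template.

    The comparison validates a match without ever storing or
    reconstructing the original scan.
    """
    candidate = hmac.new(device_key, candidate_scan, hashlib.sha256).digest()
    return hmac.compare_digest(template, candidate)
```

Even an attacker who obtained both the key and the template could only test candidate scans against it; nothing stored is sufficient to reconstruct the fingerprint, which is the property the Touch ID discussion above turns on.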

Defendants seeking to explore and develop defenses along these lines would likely need the services of expert witnesses in computer technology to review and opine on the specific types of biometric data collection and data-security measures involved.


The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.