Our Bodies, Our Data
A commentary by Tiffany Li, visiting clinical assistant professor in the Technology Law Clinic.
Protecting biometric privacy.
In 2015, the US Office of Personnel Management suffered one of the biggest cybersecurity breaches the US government has faced to date. Included among the leaked data were the raw fingerprint scans of 5.6 million people who had worked in the federal government. This means that, for 5.6 million Americans, someone out there has their fingerprint data and can use it for potentially nefarious ends.
The fingerprint data breach is one example of the concerns we face in the world of biometric privacy, or the security of data related to or arising from the body. This category includes fingerprints, eye/retina data, motion/gait data, faces and face prints, DNA data, and more. While privacy in general is a hotly debated topic today, biometric privacy often does not get the attention it deserves.
Biometric data is unique among classes of data because it is intimately connected to our bodies and, by extension, ourselves. There’s a common saying in privacy and cybersecurity: “Biometrics are user names, not passwords.” User names are identifying and often public, while passwords should be secret and changeable. You cannot change most aspects of your biometric data, and you often cannot keep it secret. It is crucial that technological systems be built to reflect this difference. For example, many apps today collect face data for sign-in or for use in photo or messaging functions. These apps often do not provide adequate protections for face data, which is quite sensitive. You only have one face, after all. Once your face data is out there for use by potential adversaries, it’s out there forever.
Biometric privacy harms are not limited to apps and consumer products. Facial recognition is now part and parcel of law enforcement surveillance programs worldwide, including in the US. Sometimes this occurs in conjunction with consumer products. (One high-profile example of this is the vast number of partnerships Amazon has made to share data from its Ring video doorbell with US law enforcement authorities.) While there may be benefits to the use of facial recognition for public safety, there is also great potential for harm. Facial recognition has also been used in illiberal countries for surveillance of political dissidents, leading to devastating consequences.
The harms range from the immediate and common, like potential misuse of DNA data in criminal procedure, to the more far-ranging and technologically advanced, like deepfakes—audio, photo, and video recordings that falsely appear to depict a real person doing or saying something they did not actually do or say. Deepfakes have frightening implications for our future: imagine if we come to a day when we cannot trust any reporting or recording of the truth. However, one of the more immediate harms is the use of deepfake technology to create explicit images and videos of individuals, often celebrities and women, without their consent—an issue Professor Danielle Citron has written about extensively.
On a practical level, biometric privacy is an important topic for lawyers working with clients in technology, health sciences, and other industries that may collect or use biometric data. It can also be important for lawyers working with clients who do not deal at all with health or technology. For example, one issue that arises often is how employers may collect, use, or transfer the biometric data of employees. You may have heard of cases where employees are offered free Fitbits in exchange for health credits or employers who offer DNA testing as a perk. How the data from these health and wellness programs is collected and used is important in terms of privacy law compliance as well as generally considering ethical corporate conduct.
At the Technology Law Clinic, where I am a visiting clinical assistant professor, we regularly deal with clients who work with biometric data. Our clinic is unique for BU Law and also for the country, as it is one of the few law clinics whose clients are also students—in our case, BU and MIT students. These student clients come to us with research, start-up, and social impact projects, and some of these include issues surrounding biometric data. We have served clients who have managed innovative, technology-driven health nonprofits and research projects that seek to create scientific advances that may one day help all of humanity. Other past clients we counseled created useful (and hopefully profitable) health tech products, including complex medical technology products driven by artificial intelligence and machine learning. Our clinic students advise these clients on biometric privacy and other technology law issues that often involve difficult, unsettled areas in law and regulation.
Biometric privacy is certainly one of those difficult, unsettled areas. In the EU, the General Data Protection Regulation (the region’s comprehensive privacy law) categorizes biometric data as a special protected category, requiring higher levels of protection for compliance. Right now, only a smattering of states (Illinois, Texas, Washington, California) have laws on biometric privacy. Massachusetts has a bill currently under consideration. While regulations like HIPAA cover biometric data, and specific laws like the 21st Century Cures Act govern discrete aspects of health and biometric privacy, such as genetic privacy, there is no comprehensive federal privacy law in the US, and certainly no federal biometric privacy law.
With the rising ubiquity and prominence of technologies collecting and using biometric data, it is all but certain that we will see a greater push for biometric privacy laws and regulations at the state and federal levels. In the meantime, this is a space where technology companies can prove themselves in terms of setting industry self-regulatory standards. Additionally, while these laws are being shaped, it is the ideal time for civil society and advocacy groups to weigh in on what laws should look like and how they will be enforced.
In the Technology Law Clinic, we teach students to evaluate biometric privacy issues with two priorities in mind: protection of privacy rights and protection of innovation. Both of these priorities are crucial in any current and future laws on biometric privacy. We need stronger and better laws to protect biometric privacy, and we need them before it’s too late.
“LAW Reviews” is an opinion series that provides commentaries from BU Law faculty on a variety of legal issues. The views expressed are solely those of the author and are not intended to represent the views of Boston University School of Law.