Computers v. Congress: Graduate Student Fellow Studies the Intersection of Digital Privacy and Law

BY ALEX JOHNSON & GINA MANTICA

When most people hear the word “crypto”, their minds jump to cryptocurrencies. Meaning “hidden” in Greek, “crypto” is also short for “cryptography” – a field of research dedicated to encoding and decoding data. Modern cryptography has been around since the 1950s and is almost as old as the very first computers. Today, people like Graduate Student Fellow Marika Swanberg use cryptography and other digital privacy technologies to protect people’s personal information. However, even though some of these ideas and systems have been around for a long time, it is still a challenge to create effective laws around them.

During her undergraduate studies, Marika was drawn to math. “The puzzles, expressivity, and creativity in mathematics captured my interest–it was a breath of clarity for me,” said Marika. She later found a deep passion for digital privacy after publishing a paper with Adam Groce, an Assistant Professor of Computer Science at Reed College. “A big part of cryptography and privacy is thinking like a malicious attacker and anticipating what could go wrong. It was fun,” she said.

Marika’s interest in cryptography led her to pursue a PhD in Computer Science at Boston University. A growing area of study that Marika participates in is understanding how computational definitions can satisfy legal requirements in safe and ethical ways. In an ideal world, the legal system and cryptographic technologies would work in tandem to protect the rights of data users. In reality, however, the rapid pace of technological developments challenges the robustness of existing laws and overwhelms the slow process of enacting legislation.

Marika works at the intersection of digital privacy and law, piecing together the puzzle of how technology can evolve in a way that keeps everyone’s information safe and secure. She notes that computer science is actually an area that excels in precision and detail. “Computer science theory shines in defining things. What does it mean for something to be secure, what does it mean for something to be private–these are fundamental questions that we have answers to,” she said.

Computer science is rooted in the well-defined axioms of mathematics, while legal lingo is riddled with the ambiguities of human language. Privacy can mean something very different when it is presented in a legal sense to convey broader ideas and concepts, whereas the algorithms of computer science are clear cut. “Within cryptography we think of security within the framework of a well-specified definition. Either something satisfies the definition or it doesn’t. In law, it is less clear whether an algorithm satisfies a legal notion of privacy or security,” said Marika. Marika studies how the precision and clarity of computer science can be used to capture the spirit of current laws, and what the societal impacts of using certain algorithms are.

Marika’s challenge is to understand privacy definitions in the context of law, and one specific definition is getting a lot of legal attention: differential privacy. Differential privacy (DP) is a rigorous definition of “anonymization”. Algorithms that satisfy DP do not reveal information about specific individuals’ data, while still providing useful insights and statistics about a population at large. Differential privacy has been adopted by companies such as Apple, Google, Microsoft, and LinkedIn as well as by the US Census Bureau. These implementations are providing key insights into the practical challenges and effects of deploying DP algorithms in the real world, though much is still unknown.
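To make the idea concrete, here is a minimal sketch (not drawn from Marika's work) of one standard DP technique, the Laplace mechanism: a count query gets calibrated random noise added, so any single person's presence or absence barely changes the released number. The function name, data, and epsilon value are illustrative assumptions.

```python
import random

def dp_count(values, predicate, epsilon):
    """Release a count of matching records with epsilon-differential privacy.

    Changing one person's record shifts the true count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    # A Laplace(0, 1/epsilon) sample is the difference of two
    # independent Exponential(epsilon) samples.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Hypothetical example: count people aged 65+ without exposing any individual.
ages = [23, 45, 67, 34, 71, 52]
noisy = dp_count(ages, lambda a: a >= 65, epsilon=1.0)
```

The noisy answer stays useful in aggregate (its average over many releases is the true count, here 2), yet no single release pins down whether any particular person is in the data.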

Marika and others won a grant from the BU Center for Antiracist Research to study how the use of differential privacy in the 2020 United States Census could impact marginalized communities. After the US Census Bureau found that its previous methods for ensuring confidentiality were vulnerable to attacks, it implemented a new disclosure avoidance system, called TopDown, that uses DP techniques to guarantee confidentiality. TopDown introduces a number of distortions that are specific to this algorithm: for example, it overestimates population counts for rural districts and underestimates counts for densely populated districts. Marika and colleagues are investigating how these distortions could affect Voting Rights Act cases against racial gerrymandering.

Marika aspires to turn computer science theory into practice by analyzing the side effects of real-world algorithms. “If we’re using a new algorithm we need to understand its possible impacts,” said Marika.


Interested in learning more about the research happening at the Hariri Institute? Sign up for our newsletter here.