A Case for Privacy Regulation – April 2018

I attended the Midwest Security Workshop last weekend, along with about 200 other people from academic institutions across the Midwest, including 9 others from the University of Michigan. It was a great time thanks to the work of the organizers and our hosts at UIUC. The workshop offered plenty of chances to talk research and bring the Midwest’s security and privacy academics together.

One session I’ve kept thinking about is the panel on Privacy Enhancing Technologies, and in particular the discussion of Cambridge Analytica and what it means for the field. With such a hot topic and strong opinions in the room, the panel soon dissolved into a community discussion. Although many people offered an opinion, I think most fell along one of two lines.

The first stance was that, in the wake of the public turning on Facebook, people might be convinced to take action and reclaim their privacy. Perhaps we could aid in a “hearts and minds” campaign to convince people that it is worth sacrificing a little convenience so that our corporate overlords can’t learn everything about them and use it to manipulate their thoughts (hyperbole intended).

This has legs, and I think we need to seize the moment and use the Cambridge Analytica press to drive public opinion in a way that helps us. Much like the Snowden revelations drove public opinion on government surveillance and led to modest gains in policy, we could achieve the same with corporate surveillance. What struck me was that I walked away with the impression that some people holding this first stance disagreed with the other major stance.

The other major stance is that perhaps we should apply pressure to federal regulators to require better privacy protections for ordinary people. While we have public opinion on our side, we should use the same mechanisms that keep the American people safe in so many other industries to keep their personal information safe as well.

I am of the opinion that this naturally follows as a way to capitalize on public opinion swinging toward privacy, but not everyone in the room agreed. And I understand this! Using federal regulations to enforce privacy practices places power and trust in the regulators. Regulations are less helpful if a few govvies who know nothing about the online ad economy are making all of the decisions. But is that really a reason to abandon the mechanism altogether?

We will find out soon what the effects of the GDPR are. I’ve been told the GDPR is incredibly vague about exactly what technical changes it requires, and I would argue that is close to the worst case, since the legislation then has to be interpreted and litigated by non-technical parties. But it already seems to have pushed practice in favor of privacy.

Something that was not mentioned during the PET panel was the option of industry self-regulation. Another CA, Certificate Authorities, is an interesting example of this in security. Certificate Authorities and browsers have instituted technical measures that enable informed decisions about which parties should have access to the root of trust for the Web. A few powerful players in this ecosystem, Google and Mozilla, have driven strong security requirements into practice with the leverage they hold as agents for the user.
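To make the analogy concrete, here is a minimal sketch (my own illustration, not something from the panel) of what that root of trust looks like from a client’s side: the platform ships a curated set of root CAs on the user’s behalf, and every TLS connection is validated against it. The host example.com and the use of Python’s standard library here are just illustrative assumptions.

    import socket
    import ssl

    # Load the platform's trusted root CAs -- the curated list that browsers
    # and operating systems maintain on users' behalf.
    ctx = ssl.create_default_context()

    # On some systems roots are loaded lazily from a directory, so this count
    # may come back empty; where available, it shows how small the club is.
    print(len(ctx.get_ca_certs()), "root CAs trusted out of the box")

    # Validate a site's certificate chain against that store and see which
    # authority vouched for it. (example.com is only an illustration.)
    with socket.create_connection(("example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            issuer = dict(item[0] for item in tls.getpeercert()["issuer"])
            print("certificate issued by", issuer.get("organizationName"))

The leverage in the CA ecosystem comes from exactly this choke point: a handful of root programs decide what ends up in that trusted list, and they can set requirements for anyone who wants to stay in it.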

The prospect of self-regulation in the privacy space has two issues standing immediately in its path. The first is a lack of transparency into the ecosystem: we currently have very little insight into who is selling what data, so it is hard to know when a company violates a norm. The second is a lack of coordinated action by users or their agents. As far as I am aware, users have not yet taken significant action to claim their privacy back, nor is any single party acting as an agent for many users and drawing lines in the sand that privacy researchers would agree with. If these two issues resolve themselves, hopefully the future of privacy will be stronger.

I want to be clear that I am absolutely in favor of technical solutions that allow a user to unilaterally take back their privacy. In fact, even our existing tools offer stronger privacy than most regulation I think could actually pass.

Regulation, however, provides a unique opportunity to benefit all users, even those who do not have the luxury of considering their own privacy. A lot can go wrong when crafting privacy regulation, such as ambiguity and undue power handed to the regulator, but in my mind these risks may be worth the trade-off, and regulation should not be dismissed out of hand.


22 April 2018