Promotor: prof.dr. B.P.F. Jacobs (RU)
Copromotor: dr. J.-H. Hoepman (RU)
Radboud Universiteit Nijmegen
Date: 9 October 2017, 16:30
Digital technologies play an ever-increasing role in our daily lives. However, these technologies are also frequently data-driven. Governments and companies alike collect more and more data about us to offer better services, to fight crime and terrorism, and to prevent fraud. The result: we have less privacy than ever.
Yet, privacy is important. A lack of privacy harms individuals by limiting their freedom to live their personal lives and by limiting their personal development. The chilling effect of a lack of privacy harms people by changing how they behave: they change their behavior not because they do anything illegal, but because of how that behavior could be construed. The actual collection and aggregation of personal data enabled by a lack of privacy is, of course, similarly harmful. It can, for example, lead to exclusion, to incorrect conclusions being drawn about a person, or to the unexpected spreading of personal information.
A lack of privacy, however, does not just harm individuals but also societies. Without the protection of privacy, it is much harder to develop the critical mindset that is so essential for a democratic society to function properly.
While privacy is important, many arguments are made for why privacy should not be increased. In this thesis we focus on two common arguments against (increasing) privacy and show that the situation is more nuanced.
The first of these arguments is that an increase in privacy results in a decrease in security. In Chapter 3 we reintroduce the notion of revocable privacy to show that it is possible to build systems that offer privacy and security simultaneously. As long as users follow the rules of a revocable-privacy system, they are fully anonymous; only if they violate the rules can their anonymity be reduced. To show the usefulness of this approach, Chapter 3 highlights scenarios that could benefit from revocable privacy and indicates which existing systems already implement this notion.
A common reason for online platforms to disallow anonymous users is the potential for abuse. In Chapter 4 we introduce the revocable-privacy system vote-to-link. It enables Wikipedia and other online platforms to give editors (or users in general) anonymous access, while still allowing the platform to recover from abuse by misbehaving users. By default a user’s actions are unlinkable. However, if moderators deem an action to be abusive, they can vote on this action. Once an action accumulates sufficient votes, the system can link all other actions by the same user within a limited time frame. In this way, the other potentially malicious actions by that user are distinguished from all remaining actions, and can thus quickly be examined or removed.
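The threshold mechanism at the heart of vote-to-link can be illustrated with plain Shamir secret sharing: think of each moderator's vote as releasing one share of a per-action linking token, so that only once enough votes accumulate can the token be reconstructed and used to link the user's other actions. This is only a toy sketch under simplified assumptions; the actual scheme in Chapter 4 is built from threshold public-key techniques, and all names and parameters below are illustrative.

```python
# Toy illustration of the threshold idea behind vote-to-link:
# a per-action linking token is Shamir-shared among moderators;
# any t "votes" (shares) reconstruct it, fewer reveal nothing.
import random

P = 2**127 - 1  # prime field modulus (a Mersenne prime)

def share(secret, t, n):
    """Split `secret` into n Shamir shares with threshold t."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for xj, yj in shares:
        num = den = 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

linking_token = random.randrange(P)      # per-action secret
shares = share(linking_token, t=3, n=5)  # 5 moderators, threshold 3
votes = random.sample(shares, 3)         # any 3 votes suffice
assert reconstruct(votes) == linking_token
```

With fewer than three shares the token is information-theoretically hidden, which mirrors the rule that no small coalition of moderators can deanonymize an action on its own.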
Whereas the vote-to-link system implements the rule that an action should not be marked malicious by too many moderators, the distributed encryption scheme from Chapter 5 implements the rule that a party should not cause events at too many different locations. The distributed encryption scheme solves the canvas-cutters problem (canvas cutters are criminals who rob trucks parked at highway rest stops by cutting open their canvas covers) in a privacy-friendly manner, by identifying only those cars that stop at many rest stops. We simplify Hoepman and Galindo’s original distributed encryption scheme, add proper key evolution (essential in many scenarios), and propose a batched solution that is more efficient for small plaintext domains such as the set of license plates.
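The "too many locations" rule can be sketched as follows: each location turns an observed license plate into one point on a polynomial that is derived deterministically from that plate, with the plate itself as the constant term. A combiner who does not know the derivation seed learns a plate only once it holds shares from k distinct locations. This is a deliberately simplified toy (it gives all locations a common seed, which the actual construction in Chapter 5 avoids), and all names below are illustrative.

```python
# Toy sketch of the distributed-encryption idea: a plate observed at
# location `loc` becomes the share (loc, f_plate(loc)), where f_plate
# has degree K-1 and f_plate(0) encodes the plate. K shares from
# distinct locations reveal the plate; fewer reveal nothing to a
# combiner who lacks the seed.
import hashlib

P = 2**127 - 1
K = 3  # threshold: plates seen at >= 3 rest stops are identified

def _coeff(seed, plate, i):
    h = hashlib.sha256(f"{seed}|{plate}|{i}".encode()).digest()
    return int.from_bytes(h, "big") % P

def share_at_location(seed, plate, loc):
    coeffs = ([int.from_bytes(plate.encode(), "big")] +  # f(0) = plate
              [_coeff(seed, plate, i) for i in range(1, K)])
    return (loc, sum(c * pow(loc, i, P) for i, c in enumerate(coeffs)) % P)

def combine(shares):
    """Interpolate K shares at x = 0 and decode the constant term."""
    val = 0
    for xj, yj in shares:
        num = den = 1
        for xm, _ in shares:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        val = (val + yj * num * pow(den, -1, P)) % P
    return val.to_bytes((val.bit_length() + 7) // 8, "big").decode()

seed = "dealer-secret"
observations = [share_at_location(seed, "AB-123-C", loc) for loc in (2, 5, 7)]
assert combine(observations) == "AB-123-C"
```

A car that stops at only one or two rest stops contributes too few points to interpolate, so its plate stays hidden; only plates seen at K or more locations become recoverable.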
The second argument against privacy that this thesis addresses is that privacy-friendly solutions are not practical. We show, however, that the vote-to-link system and the distributed encryption scheme are efficient enough to use in practice.
Attribute-based credentials (ABCs) are digital alternatives to identity documents, loyalty cards, and the like. ABCs have strong privacy guarantees; however, to protect the security of the system when a credential carrier is lost, stolen, or abused, it must be possible to revoke credentials. In Chapter 6 we propose the first privacy-friendly revocation scheme for ABCs that is fast enough to be practical even when the ABCs are implemented on smart cards.
Private information retrieval (PIR) allows a client to retrieve records from a database without the operators of that database learning which records were retrieved. Achieving this privacy property is computationally intensive for the database servers. In Chapter 7 we show how to batch queries from many clients to reduce the load on the database servers. This new scheme is efficient enough to apply PIR to Certificate Transparency, a system to detect misbehaving certificate authorities, thereby making it privacy-friendly.
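The core PIR guarantee can be illustrated with the classic information-theoretic two-server construction: the client sends each server a random-looking bit vector, the two vectors differing only at the queried position, so XORing the servers' answers yields the desired record while neither server alone learns the index. This toy sketch shows only that basic idea, not the multi-client batching of Chapter 7; the database and names below are illustrative.

```python
# Toy two-server XOR-based PIR: each server XORs together the records
# selected by its query vector; the client XORs the two answers.
import secrets

db = [b"recA", b"recB", b"recC", b"recD"]  # toy DB, equal-length records

def query(n, index):
    """Build the two query vectors; each alone looks uniformly random."""
    q1 = [secrets.randbelow(2) for _ in range(n)]
    q2 = q1.copy()
    q2[index] ^= 1  # the vectors differ only at the wanted position
    return q1, q2

def answer(db, q):
    """Server side: XOR of all records whose query bit is set."""
    out = bytes(len(db[0]))
    for rec, bit in zip(db, q):
        if bit:
            out = bytes(a ^ b for a, b in zip(out, rec))
    return out

q1, q2 = query(len(db), 2)
a1, a2 = answer(db, q1), answer(db, q2)
record = bytes(x ^ y for x, y in zip(a1, a2))
assert record == b"recC"
```

The cost that motivates batching is visible here: each server touches every record to answer a single query, so serving many clients naively multiplies that full-database scan per query.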
These results show that the two common arguments against privacy addressed in this thesis are not universally true. In fact, we can build practical systems that achieve security and privacy simultaneously, demonstrating that security can often be achieved without negatively impacting privacy. Hence, reasoning against privacy requires more nuanced arguments than that more privacy always harms security or is never practical.