Differential Privacy

Differential Privacy: A New Approach to Dealing with Social Big Data

The goal of the project is to research the practical application of differential privacy and other privacy-enhancing methods.

While the theoretical side of differential privacy is widely researched, its actual application in industry and society has so far been very limited.

Differential privacy is a formal property that data-processing algorithms can satisfy so that valid statistical conclusions can be drawn from a dataset without compromising the privacy of the individuals in it (Dwork 2011). The technique involves adding carefully calibrated noise to data or query results so that the output of an analysis cannot be reverse-engineered to recover any individual's sensitive inputs.
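To make this concrete, the following is a minimal sketch of the Laplace mechanism, the textbook way of adding such calibrated noise; the function name, example data, and parameter choices here are ours and purely illustrative, not part of the project's code:

    import numpy as np

    def laplace_mechanism(true_value, sensitivity, epsilon):
        """Return a differentially private estimate of true_value.

        Noise is drawn from a Laplace distribution with scale
        sensitivity / epsilon, the calibration used by the classic
        Laplace mechanism.
        """
        return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

    # Example: a counting query has sensitivity 1, because adding or
    # removing one individual changes the count by at most 1.
    ages = [23, 35, 47, 52, 61]
    true_count = sum(1 for a in ages if a > 40)
    private_count = laplace_mechanism(true_count, sensitivity=1, epsilon=0.5)
    print(f"true count: {true_count}, noisy count: {private_count:.2f}")

Smaller values of epsilon add more noise and give stronger privacy; choosing epsilon well is the problem addressed in contribution 2 below.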

Some of the research questions are:

  • What are the opportunities of differential privacy from a technical and societal perspective?
  • What are the risks of differential privacy for users and society?
  • How can differential privacy and other privacy-enhancing methods be taught in an accessible and applicable way?

This project is conducted in cooperation with the TUM chairs of Political Data Science and Operating Systems, and is funded by the Bavarian Research Institute for Digital Transformation (bidt), an institute of the Bavarian Academy of Sciences and Humanities.

Open-source contributions to the privacy community:

1. Implementation of an existing paper: Differential identifiability, https://blog.openmined.org/differential-identifiability/

2. Implementation of an existing paper: Choosing epsilon for differential privacy, https://blog.openmined.org/choosing-epsilon/

3. Global sensitivity from scratch, https://blog.openmined.org/global-sensitivity/

4. Local sensitivity from scratch, https://blog.openmined.org/local-sensitivity/ (a sketch contrasting the two sensitivity notions follows below)
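As a rough illustration of the difference between the two sensitivity notions in items 3 and 4: global sensitivity bounds how much a query's output can change between any pair of neighboring datasets, while local sensitivity measures that change at one concrete dataset. The sketch below, with a median query, example data, and helper names of our own choosing (and "remove one record" as the neighboring-dataset convention), shows why the two can differ:

    import statistics

    def median_query(dataset):
        return statistics.median(dataset)

    def local_sensitivity_median(dataset):
        """Largest change in the median when any single record is
        removed from this particular dataset."""
        base = median_query(dataset)
        return max(
            abs(base - median_query(dataset[:i] + dataset[i + 1:]))
            for i in range(len(dataset))
        )

    data = [10, 20, 30, 40, 1000]
    print("median:", median_query(data))
    print("local sensitivity here:", local_sensitivity_median(data))

    # Global sensitivity maximizes over *all* possible datasets, so for
    # a median over an unbounded range it can be arbitrarily large; that
    # is why refinements such as local (or smooth) sensitivity are used.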