Most business models within the internet of things (IoT) are based on the analysis of big data resources that contain personal and private data: social media platform operators, for instance, gain most of their profits by analysing personal and person-related user data. Besides the threat of data leakage (e.g. the Cambridge Analytica case in 2018) or a potential data breach, these big data sets, combined with the continuous advancement of data analysis methods, challenge the protection of privacy. “Differential privacy” is a new approach that enables big data analysis on data sets containing personal data without invading privacy. Classical depersonalisation methods face a trade-off: even when the individual is fully anonymised, her features and characteristics still remain in the data set. In contrast to these classical approaches, differential privacy methods add noise to all personal features and characteristics within the data, so that individual privacy is fully preserved while valid statistical analysis remains possible. In principle, this can be done by computing random noise to “cover” the data set, or by adding the computed noise to the analysis procedure itself.
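The second variant, adding calibrated noise to the result of an analysis rather than to the raw data, can be illustrated with the Laplace mechanism, a standard building block of differential privacy. The following sketch is purely illustrative (function names, the toy data and the parameter values are our own assumptions, not part of the project):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5          # uniform in (-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(data, predicate, epsilon: float) -> float:
    """Answer "how many records satisfy the predicate?" with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person changes
    the result by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for record in data if predicate(record))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy count over toy data (true count of ages >= 30 is 5).
ages = [23, 35, 41, 29, 52, 38, 27, 60]
noisy = private_count(ages, lambda a: a >= 30, epsilon=0.5)
```

Smaller values of epsilon give stronger privacy but noisier answers; repeated queries consume the privacy budget additively.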
Our project “Differential Privacy: New approaches for managing social big data” focuses on a comprehensive evaluation of these new ways of combining privacy protection with the possibilities of big data analysis. Our goal is to provide the public with the “essentials” of differential privacy and to pave the way for further development, research and implementation of this method. The project is subdivided into four work packages that address the following questions:
- From a technical and social perspective: What are the opportunities and challenges of differential privacy?
- What are the risks of differential privacy for users, operators and the public?
- From a software engineering perspective: How can differential privacy approaches be implemented in practice, and what are possible use cases?
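One possible use case for the implementation question is local differential privacy, where noise is applied on the user's device before any data is collected; a classic instance is randomized response for sensitive yes/no survey questions. The sketch below is our own illustration under assumed parameters, not an implementation from the project:

```python
import math
import random

def randomized_response(true_answer: bool, epsilon: float) -> bool:
    """Report the true answer only with probability e^eps / (1 + e^eps).

    Each respondent's reported bit is epsilon-differentially private,
    so no single answer can be traced back with certainty.
    """
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_answer if random.random() < p_truth else not true_answer

def estimate_proportion(reports, epsilon: float) -> float:
    """Undo the known response bias to recover the population proportion."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    # E[observed] = true * p + (1 - true) * (1 - p); solve for true.
    return (observed + p - 1.0) / (2.0 * p - 1.0)

# Simulate 10,000 respondents, 30% of whom truthfully answer "yes".
truths = [i < 3000 for i in range(10_000)]
reports = [randomized_response(t, epsilon=1.0) for t in truths]
estimate = estimate_proportion(reports, epsilon=1.0)
```

Each individual report is plausibly deniable, yet the aggregate estimate converges to the true proportion as the sample grows.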
On the one hand, we'll publish our findings in policy papers and professional articles and contribute to political consulting. On the other hand, we'll develop an interactive, platform-based learning environment where users can test differential privacy methods and find background information. Finally, we'll conclude with a feasibility study that evaluates the possibilities of establishing a differentially private data source for scientists and scientific users.